On Disinformation and Large Online Platforms

This week I was invited to be a panelist, together with other digital ministers, at a side event organized by Ukraine in Davos during the World Economic Forum. The topic was disinformation, and I’d like to share my thoughts on it. The video recording is here, but what follows is not a transcript – it’s an expanded version.

Bulgaria seems to be more susceptible to disinformation than most countries, for various reasons. A majority of the population holds positive sentiments about Russia, for historical reasons, and disinformation campaigns have been running since before the war and after it started. The typical narratives pushed every day are about the bad, decadent West; the Slavic, traditional, conservative Russian government; the evil and aggressive NATO; the great and powerful, yet peaceful, Russian army; and so on.

These disinformation campaigns are undermining public discourse and even public policy. COVID vaccination rates in Bulgaria are among the lowest in the world (and, accordingly, the mortality rate is among the highest). Propaganda and conspiracy theories took hold in our society and literally killed our relatives and friends. The war is another example – Bulgaria ranks first in the share of people who think the West (EU/NATO) is at fault for the war in Ukraine.

The Kremlin uses the same propaganda techniques it developed during the Cold War, but applies them on the free internet much more efficiently. It uses the European value of free speech to undermine those same European values.

The Kremlin’s main channels are the social networks, which seem to remain blissfully ignorant of local contexts like the one described above.

What we’ve seen – and what has been leaked and discussed for a long time – is that troll factories amplify anonymous websites. They share content and like content, making it seem noteworthy to the ranking algorithms.

We know how it works. But governments can’t just block a website because they think it carries false information. A government may easily go beyond good intentions and slide into censorship. In four years I won’t be a minister, and the next government may decide I’m spreading “western propaganda” and block my profiles, my blogs, my interviews in the media.

I said all of that in front of the Bulgarian parliament last week. I also said that local measures are insufficient, and risky.

That’s why we have to act smart. We need to strike down the mechanisms for weaponizing social networks – for spreading disinformation to large portions of the population – rather than block the information itself. Brute force is dangerous, and it helps the Kremlin with its narrative about the bad, hypocritical West that talks about free speech but has the power to shut you down if a bureaucrat says so.

The solution, in my opinion, is to regulate recommendation engines at the European level – to make these algorithms find and demote these networks of trolls (something they currently fail at: Facebook claims it found just three Russian-linked accounts in January).

How do we do it? It’s hard to answer without the data and the details of how the engines currently work. Social networks can try to cluster users by IP addresses, autonomous systems, VPN exit nodes, content similarity, DNS and WHOIS data for websites, photo databases, etc. They can consult national media registers (where they exist) via APIs, to check that a source is an actual media outlet and not an auto-generated website stuffed with pre-written false content (which is what actually happens).
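To illustrate just one of those signals – content similarity – here is a minimal, self-contained sketch in Python. The account names, domains, and threshold are invented for the example; a real system would combine many more signals (IPs, AS numbers, timing, etc.) that only the platforms themselves hold:

```python
from itertools import combinations

# Hypothetical data: each account maps to the set of domains it shared recently.
# In practice this would come from platform logs, which are not public.
shares = {
    "acc1": {"news-site-a.example", "news-site-b.example", "blog-c.example"},
    "acc2": {"news-site-a.example", "news-site-b.example", "blog-c.example"},
    "acc3": {"news-site-a.example", "news-site-b.example"},
    "acc4": {"cooking.example", "sports.example"},
}

def jaccard(a, b):
    """Overlap between two sets: 1.0 means identical sharing behavior."""
    return len(a & b) / len(a | b)

def suspicious_pairs(shares, threshold=0.8):
    """Flag account pairs whose shared-domain sets overlap suspiciously."""
    flagged = []
    for u, v in combinations(sorted(shares), 2):
        if jaccard(shares[u], shares[v]) >= threshold:
            flagged.append((u, v))
    return flagged

print(suspicious_pairs(shares))  # acc1 and acc2 share identical domain sets
```

Accounts flagged this way wouldn’t be blocked – they would simply be demoted by the recommendation engine, which is exactly the distinction between suppressing speech and refusing to amplify coordinated inauthentic behavior.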

The regulation should shift the focus of social networks from moderating everything to not promoting coordinated inauthentic behavior.

Europe and its partners must find a way to regulate algorithms without curbing freedom of expression, and I was in Brussels last week to underline that. We can use the Digital Services Act to do exactly that, and we have to do it wisely.

I’ve been criticized – why am I taking on this task when I could be doing just the cool things, like eID, eServices and removing bureaucracy? I’m doing those, of course, without delay.

But we are here as government officials to tackle the systemic risks. The eID I’ll introduce will do no good if we lose the hearts and minds of people to Kremlin propaganda.

3 thoughts on “On Disinformation and Large Online Platforms”

  1. These are all good thoughts; however, users now have the option to “favorite” a page, and thus the algorithm is rendered useless. Click on the Favorites feed in Facebook and you are presented with your usual lies. Facebook groups are the next bad thing, where algorithms don’t work.

    Another way to address that is building a base “clout” or reputation score for a “source”. The process may involve representatives from both governments and media moderators, done in an open, fact-based way. “Sources” with lower reputation do not get promoted anywhere, and their posts appear labeled with a warning. If someone wants to dispute that, sure, go ahead and prove it.

    A second option is legislation ensuring that all social networks make “best efforts” to sanitize massive automated news input in some meaningful manner, or be held liable for spreading misinformation. Sort of like GDPR Article 32.

  2. The reason is that the narrative is dictated by people who have been in positions of power for nearly 80 years: a party which YOU are in a coalition with, along with your Yogi The Bear party leader. A party well known to have been recruiting trolls to spread misinformation online for over a decade and a half. You also showed support for the pro-Russian president. Typical Java mentality: convolute the facts in such a way that it seems like it’s the reflection of the lights off Saturn that caused a system to be crap, and not your own inadequate, convoluted, selfish, egotistical and incompetent behavior. Or as a friend of mine likes to say: “Call him a Javist, and don’t insult him any further.”

  3. I, for one, support the new regulation, which will ban the European Parliament from promoting its posts and Facebook from promoting propaganda posts by the European Parliament.
