In recent years, several journalists have investigated Kremlin-sponsored troll factories. While their impact on policy and political contests is debatable, less examined—though no less important—are the interconnected ecosystems of disinformation, which the online troll community feeds and sustains. Finding them is remarkably easy. Joining them is even easier, thanks to the anonymity of the internet.
For this study on Russian disinformation in Romania, all I needed was a fake Facebook profile dedicated to following pro-Russian pages, groups and profiles on social media. Facebook opened many doors. It became a point of access to pro-Kremlin disinformation narratives and outlets in Romania, as well as to other international, pro-Putin groups or profiles acting as aggregators for local pages. This was a good lesson about Russia’s distribution machine in Romania and potentially elsewhere: Facebook is a prime vehicle for channeling viewers to blogs and websites.
As a newly minted faux troll, I did not necessarily learn anything new about the types of narratives being propagated—mostly anti-NATO, anti-EU, anti-Western values and sometimes overtly pro-Russia stories. However, I did gain valuable insights into how toxic concepts spread. Here are some of them:
First, it is easy to build an audience. In my first two days with the new account, I received over 100 friendship requests, which I accepted in order to appear like a “real” person. If my intention had been to propagate fake news, I would have had a fast-growing following. With a few simple online tools used frequently by pro-Kremlin trolls—such as audience boosting or translation software—it is possible to reach a huge and potentially susceptible audience. Most of my new “friends” were also drawn to mystical or conspiratorial articles, which are a favorite breeding ground for pro-Russian propaganda.
Second, there was an obvious difference between content generators and consumers. Some trolls act as nodes of information in charge of maintaining a constant flux of articles and comments. These are posted on individual Facebook walls or groups. The profiles of content generators soon become easy to spot. Typically, they have thousands of friends; post extremely frequently; have almost exclusively political (e.g. pro-Kremlin) content; and are very active in commenting and reacting to various groups. They also often act in tandem by cross-posting and commenting on similar content. These trolls are also administrators of various pro-Kremlin groups, with very suggestive titles like “Putin’s Friends” or “We don’t want to fight against Russians!” Most of the groups I joined under the fake profile required pre-approval by the administrators—which was granted—and have thousands of members.
Third, troll networks sometimes seem loosely linked. For example, some groups of profiles always tag each other in posts or reinforce each other’s comments. But at other times there seems to be competition, and they undermine each other in comments and posts. In the fog of information war, it can be hard to tell whether some of these individuals work for different employers, or whether they are true believers who became connected with pro-Kremlin groups the same way my faux troll joined these communities: by pursuing content and people who were saying the things my fake profile wanted to read.
Fourth, these communities are a hall of mirrors. Most content rolled out on Facebook is militant and focuses on narratives that reinforce conspiracies and alternative realities. At the same time, pro-Russian trolls don’t like anybody—not the government or any political party, not the “Soros NGOs,” not Romania’s anti-corruption institutions, and not NATO or the EU. Everything the Western expert community sees as beneficial is deemed to pull Romanians away from their pure essence and great destiny. And of course salvation comes from a return to traditional values, of which Russia is the great defender. Anyone who questions the logic of these narratives is immediately trolled, insulted and eventually banned.
Finally, trolls are paranoid virtual creatures. They are deeply worried about being trolled themselves. Recently, a few of my troll “friends” became concerned about intelligence agents infiltrating their network of Facebook friends. They started a flurry of warning posts about trolls, Facebook censorship—which is probably an encouraging sign that people report inappropriate content on Facebook—and anyone who seemed to disagree with them.
One of the more popular posts featured a definition of trolls published by Beatrice Mcartney, who is among the most fervent pro-Kremlin users. The irony of this self-portrait aside, the definition is a useful tool—straight from the source—for anyone who wants to spot Facebook trolls and stay away from them:
“Trolls sell their soul and country to foreign interests. They are people of no character, paid to track key words and react to posts. They would aggress and offend you, post tendentious questions and comments, mainly contradicting you and thus trying to ruin your comments and decredibilize it. Also, they would pick fights and try to derail the discussion from the topic of the post so that others would voluntarily leave the page because of all the dirt being thrown there on purpose.” Few experts could have offered a better definition of a troll than this troll herself.
My two months of immersion in this pro-Kremlin hyper-reality were a good test of its power to pull users away from the real world. I was there to examine the habits and profiles of participants. But regular readers could easily be drawn into these communities with every increment of fake news and wild conspiracy. In this hyper-real disinformation space, the artificially created troll becomes a “real” person who then shapes the views of unsuspecting content consumers.
Perhaps most troubling, Facebook’s multitude of sources and voices may also blur the line between paid agents of a foreign power and regular consumers of disinformation who become “converts.” Hundreds of thousands of shares and likes can translate into voting patterns. While this may not yet be enough to sway elections, it could bring localized political fringes into the mainstream of political debate—and advance the Kremlin’s effort to sow chaos and subversion in the West. This risk alone should elicit more interest and action from policy makers and the tech community.
The case of Romania’s troll ecosystem is one example. We need more maps of troll networks and more tools to understand how strong their appeal is to various online communities. But understanding the world of pro-Kremlin trolls is not only a tool for experts and policy makers. It needs to become part of every user’s personal Facebook hygiene.