How to spot a Russian troll on social media

Jason Reed/The Daily Dot

Russian trolls are not as easy to spot as you may think.

More than two years after the 2016 presidential election, Russian trolls are still spreading disinformation and propaganda on the internet.

Two Senate-commissioned reports released in December 2018 found that despite (arguably lackluster) efforts by social media giants Facebook and Twitter and federal law enforcement agencies to combat them, Russian social media campaigns are in full swing. The Russia-sponsored Internet Research Agency (IRA) nearly doubled the number of Facebook and Instagram posts it sent out in 2017 from the year prior, noted one of the reports, authored by the Computational Propaganda Project at Oxford University and Graphika.

And on Thursday, Facebook and Twitter again announced they were suspending malicious actors on their platforms who were engaging in state-sponsored, coordinated misinformation campaigns.

“With at least some of the Russian government’s goals achieved in the face of little diplomatic or other pushback, it appears likely that the United States will continue to face Russian interference for the foreseeable future,” wrote the researchers at New Knowledge, who authored the other report.

Russian trolls are not only multiplying their efforts across multiple platforms; they’re becoming more creative in how they spread their message. Russians broke down their messaging to target specific groups in America, ranging from Black Americans to Muslims to the LGBTQ community. Long-form blog content, promotion via Google AdWords and YouTube, and retweets from influential people (including our commander-in-chief) were among the many tactics that lent legitimacy to the Russian propaganda operations.

Given that their methods of disguise are so crafty, how do you spot a Russian troll on social media? Here are some signs to look out for on all the major social media platforms:

How to spot Russian trolls

Twitter

They’re new but already super active

Twitter notes the month and year a user joined the website directly below the user’s bio. If you notice a user who joined just last month but posts several times an hour, that could be a sign that the account is automated. In an analysis of Twitter bots, Ben Nimmo of the Atlantic Council’s Digital Forensic Research Lab (DFRLab) set the benchmark at 72 tweets per day.

“For the purposes of this analysis, a level of activity on the order of 72 engagements per day over an extended period of months—in human terms, one tweet or like every 10 minutes from 7am to 7pm, every day of the week—will be considered suspicious. Activity on the order of 144 or more engagements per day, even over a shorter period, will be considered highly suspicious. Rates of over 240 tweets a day over an extended period of months will be considered as “hypertweeting”—the equivalent of one post every three minutes for 12 hours at a stretch.”

Even heavy human users of Twitter, such as journalists, celebrities, techies, and the like, very rarely have sent out more than 50,000 or so tweets in the span of their tweeting career. (Yes, Chris Cillizza has tweeted an almost inhuman 127,000 times, but it’s yet to be revealed whether he’s a Russian bot. Besides, he’s verified.) If an account isn’t a public official, influencer, or other person whose career relies on distributing information, yet seems to be retweeting anti-AOC news articles every few minutes, chances are it’s a bot.
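As a rough sketch, Nimmo’s thresholds boil down to a simple rate calculation: divide an account’s total tweet count by its age in days. The helper below is hypothetical, not part of any official tool, and uses the 72/144/240 benchmarks quoted above:

```python
from datetime import date

def classify_activity(total_tweets: int, joined: date, today: date) -> str:
    """Classify an account's average daily tweet volume against
    Ben Nimmo's benchmarks (72 / 144 / 240 engagements per day)."""
    days_active = max((today - joined).days, 1)  # avoid division by zero
    rate = total_tweets / days_active
    if rate >= 240:
        return "hypertweeting"
    if rate >= 144:
        return "highly suspicious"
    if rate >= 72:
        return "suspicious"
    return "unremarkable"

# An account created a month ago that has already sent 5,000 tweets
# averages roughly 167 tweets a day:
print(classify_activity(5_000, date(2019, 1, 1), date(2019, 1, 31)))
# → highly suspicious
```

By this yardstick, even Cillizza’s 127,000 tweets spread over a decade-plus account work out to well under the 72-per-day threshold.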

Very fond of RT, Sputnik, and junk news websites

Here’s an easy one. Does the account seem to consist solely of retweets from Kremlin-owned news sites such as RT and Sputnik? Or of articles from news sources you don’t recognize with dubious-sounding names? The primary function of Russian trolls on Twitter appears to be promoting propaganda, which is why retweeting Kremlin-backed or other fake news sites is a major red flag.

NYU researchers noted in a report that the most common type of bot is one that tweets headlines without links to the original source of news.

“This suggests that an important strategy in the use of bots for the purposes of propaganda might be to promote specific news stories and news media in the rankings of search engines,” says Richard Bonneau in a press release for the report.

Fake avatars

Troll Twitter accounts very often have either no profile picture (displaying the generic egg instead) or a cartoon or some other non-human image. If they do use a human photo, in most cases it’s being used without that person’s consent.

To figure out whether a Twitter avatar is stolen, perform a reverse image search. Right-click on the image and select either “Search Google for image” or “Copy image address,” then paste the copied URL into Google Images. If the image pops up as a stock photo or attached to someone else’s social media account, you may have spotted a troll.
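If you have the avatar’s address copied, you can also build the reverse-search link directly. The sketch below assumes Google’s long-standing `searchbyimage` endpoint, which Google does not formally document and may change; the avatar URL shown is a placeholder:

```python
from urllib.parse import urlencode

# Assumption: Google's informal reverse-image-search endpoint.
GOOGLE_REVERSE_SEARCH = "https://www.google.com/searchbyimage"

def reverse_search_url(image_url: str) -> str:
    """Build a Google reverse-image-search link for an avatar URL."""
    return f"{GOOGLE_REVERSE_SEARCH}?{urlencode({'image_url': image_url})}"

# Placeholder avatar address; substitute the URL you copied from Twitter.
print(reverse_search_url("https://pbs.twimg.com/profile_images/example.jpg"))
```

Opening the printed link in a browser runs the same search as the right-click menu option.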

Run the account by an online tool

If all else fails, check out the many tools that have been created to detect Russian bots. Botcheck.me is a tool that will detect many of the behaviors described above, such as hyper-tweeting, retweeting politically polarizing content, and amassing a large number of followers in a short amount of time. Botometer, a joint effort by the Indiana University Network Science Institute (IUNI) and the Center for Complex Networks and Systems Research (CNetS), is a similar tool.

Facebook

Use Facebook’s tool to identify propaganda pages

Following the discovery of Facebook pages created by the IRA, the company created a tool that will inform you whether you followed any of its propaganda Facebook or Instagram accounts. The trouble is, the tool doesn’t help with existing Russian propaganda pages—it only lets you know if you ever liked a propaganda page from a list of groups that Facebook has already eliminated.

Be alert to pages with hyper-partisan content or that cater to a specific demographic

According to the New Knowledge report, of the 81 Facebook pages identified as IRA-run, 30 targeted Black Americans and 24 targeted conservative Americans.

“The most prolific I.R.A. efforts on Facebook and Instagram specifically targeted Black American communities and appear to have been focused on developing Black audiences and recruiting Black Americans as assets,” said the report.

Much of the content seeks to rally people from a certain political party or racial group, intensify political divisions and spread fake information with memes and links to fake news articles. If you’re a member of some of these Facebook pages, pay attention to the content. Can you verify the information given with mainstream news sources? If the page is affiliated with a real-life organization, is it verified by Facebook?

That being said, distinguishing a legitimate Facebook page from one created by a Russian propaganda mill is easier said than done. Sometimes they can look virtually identical. The IRA-created Facebook page names identified by New Knowledge vary widely and suit virtually every demographic and political persuasion; they include Being Patriotic, Stop A.I. (All Invaders), Blacktivist, United Muslims of America, Army of Jesus, Brown Power, and BM (Black Matters).

For examples of Russian propaganda on Facebook, check out the ones compiled by the House Intelligence Committee.

Instagram

Be aware that you’re using Russia’s favorite tool for propaganda

Russian propaganda is the hardest to identify on Instagram, where everything from fitness “before and afters” to doctored selfies, and even the users themselves, can be fake. The Senate reports found that 187 million people engaged with Russian propaganda on Instagram in 2017—nearly twice the number of people who did so on Facebook. Even worse, Instagram’s efforts to take down the IRA-linked accounts appear to be fruitless. Many of the memes shared by the deactivated accounts keep resurfacing as they get re-grammed, or screenshotted and posted again by real human users.

If you’re curious to see what Russian propaganda on Instagram looks like, check out this repository of Russian propaganda accounts.

Amrita Khalid

Amrita Khalid is a technology and politics reporter who specializes in breaking down complex issues into practical, useful terms. A former contributor to CQ, a Congressional news and analysis site, she's currently a master's candidate in international relations at the University of Leeds.