
Tweeting at Windmills

How troll campaigns drive misinformation — and what to do about them.

Words: Kate Kohn
Pictures: Chad Madden

The internet has been a great boon for open-source intelligence. Researchers have made incredible discoveries: tracking down obscure military webpages, unearthing the social media accounts of lesser princes around the world, and using satellite imagery to spot missile silos under construction in northern China. The internet is also the great bane of the innovative researchers who uncover these things.

You see, every army has a dedicated information operations team whose sole job is to post: to sway hearts and minds by writing shocking, and occasionally fantastical, social media posts. Although some of these people are paid to do it, the real force multiplier is the far larger crowd that will do it for free, spreading misinformation about the research that drowns out the original findings themselves.

After the discovery of the Stuxnet worm, opinion-makers clamored about the coming age of cyberwarfare. However, our popular conception of cyberwarfare looks less like viruses attacking centrifuges and more like viral memes. If we want insight into the possible shape of disinformation warfare, and its effects on the fidelity of open-source intelligence research, we should consider how social media was used to manufacture and spread the reaction to the public disclosure of missile silos being built in China, not the discovery itself.

Here is the SparkNotes version: using satellite imagery, open-source researchers at the Federation of American Scientists and the James Martin Center for Nonproliferation Studies found evidence of mass construction of something big in the deserts of western and central China. The researchers published their conclusion that these were missile silos, a finding STRATCOM would later confirm. Immediately upon publication, and again upon STRATCOM's confirmation, the researchers were met with sour responses, indignation, and minor outrage.


So what do you do when a bunch of internet trolls start accusing you of being a CIA asset and flinging insults at you nearly 24/7? Well, you try to find out whether they are actually state-sponsored trolls or are just doing it for the love of the game.

And a lot of people are doing it for the love of the game. Take influencers. Because they are real folks, with real feelings and real opinions, they can make a larger social impact than a bot network. Disinformation watchers often keep lists of known influencers who promote and amplify false narratives, but it is up to social media companies to decide when those influencers have broken the rules and should face consequences, a call the companies are liable to flub. However, our awareness of a few public-facing trolling groups should not give us license to chalk up every disagreement to paid actors. Sometimes, those engaged in “spammy behaviors” are just people engaging in spammy behaviors.

The “campaign” targeting researchers at FAS and CNS was not much of a campaign at all: after analyzing data downloaded from Twitter's API, we learned that the responses were not automated. That does not mean there will not be automated attacks in the future. Automated or not, people will be making big political decisions with the information they come across online, decisions about nuclear proliferation and maybe even nuclear weapon use, based on information that is at best faulty.
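How might one check for automation from API data? As a minimal sketch, and not the analysis the researchers actually ran: one noisy but common signal is posting-time regularity, since simple bots often reply at machine-regular intervals. The file name, data layout, and threshold below are hypothetical.

```python
import json
import statistics
from datetime import datetime

def automation_score(timestamps: list[str]) -> float:
    """Coefficient of variation of the gaps between consecutive tweets.

    Near 0 means clockwork-regular posting (bot-like); human posting
    is typically far burstier. Returns inf when there is too little data.
    """
    # Timestamps are assumed to be ISO-8601 "created_at" strings
    # as returned by Twitter's v2 API, e.g. "2021-07-01T12:00:00Z".
    times = sorted(
        datetime.fromisoformat(ts.replace("Z", "+00:00")) for ts in timestamps
    )
    gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
    if len(gaps) < 2:
        return float("inf")
    mean = statistics.mean(gaps)
    return statistics.stdev(gaps) / mean if mean > 0 else 0.0

# "replies.json" is a hypothetical dump of reply metadata pulled from
# the API: a mapping of account handle -> list of "created_at" strings.
with open("replies.json") as f:
    accounts = json.load(f)

for handle, stamps in accounts.items():
    score = automation_score(stamps)
    if score < 0.5:  # arbitrary, illustrative threshold
        print(f"@{handle} posts suspiciously regularly (CV={score:.2f})")
```

In practice, analysts weigh many signals together (account age, client metadata, duplicated text, coordinated timing across accounts), and no single one is conclusive on its own.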

The ultimate goal of these troll campaigns — organic or manufactured — is to drown out the good information. If you overwhelm informational channels with your view of things, it simply becomes true. You don't need to feed bad information to lawmakers directly; that would be difficult anyway, since they don't monitor their own social media. But a staffer's uncle might see something wild and tell his niece about it.

For instance: in Germany in 2016, a thirteen-year-old girl was allegedly kidnapped and raped by refugees, the German government allegedly covered it up, and thousands protested the government's handling of the case. Here's the thing: it didn't happen. The story was fabricated, and Russian state media and the Russian foreign minister amplified it. For an example closer to home, think of when Sen. Jim Inhofe brought a snowball onto the Senate floor to prove that global warming isn't real, the fruit of decades of disinformation from fossil fuel companies. Now the whole world is at risk.

Disinformation leaking into public policy decisions is a bit like the computer in WarGames. It’s equal parts stupid and avoidable.

Instead of spending time identifying bot networks that might not exist, open-source intelligence organizations should seek novel ways to protect their credibility and promote their research. One easy way is to provide lawmaker-friendly versions of research that can influence policymaking with an expert lens: bullet points, one-pagers, colorful charts, and maps get the point across faster than lengthy essays. As much as you love writing long reads, the people at the top don't have all the time in the world to read them.

Organizations should also bolster their cybersecurity, especially if they write about known US adversaries. Bad guys can and will try to break into a bank account or two to intimidate researchers.

And communications shops will want to rethink their messaging: you might have to tweet like the bad guys do. In addition to the facts, you need to help readers navigate what they should take away from your work. To capture the public's attention effectively, you need to make it as easy as possible for them to agree with you, which might mean emotional or political appeals to the reader rather than just cold facts in your social posts. But it's all a small price to pay to win the narrative battle.

Most importantly, organizations should uphold and promote the work of rival organizations. Think of this like a social blockchain: the veracity of a claim is upheld, or at least protected, by multiple experts with multiple perspectives and multiple interests. Together, you strengthen the credibility of each other's work and push back against claims of bias or nefarious funding. Legitimacy and accuracy are social concepts.

As the saying goes, knowing is half the battle. Persuasion is the other half.

Kate Kohn is a writer from Washington, DC. She holds an MA in political communication from American University. 

