
Always Verify

THE BABES BLUF looks at the roots of disinformation and its role at home and abroad.

Words: Kate Hewitt
Pictures: The Climate Reality Project

BLUF: Sharing false or misleading information is not a new concept. Russia’s use of disinformation and propaganda dates back to decades before the Ukrainian invasion or social media. Yet, in the current age of tech advancing at the speed of wildfires, what gets circulated at home and abroad is starting to have dire consequences, including influencing war. It is more important than ever to verify information and ensure that we are arming ourselves, especially the vulnerable amongst us, with credible facts because they may very well become a matter of life and death. 

The world has watched in disbelief as Russia piled its troops around the Ukrainian border for months before invading the sovereign country last week. There are plenty of good explainers and updates on this tragic and unnecessary war, so we won’t discuss that in depth. But one facet of Russian warfare for some time (we are talking decades) has been the use of disinformation and propaganda. That tactic is playing out front and center with the Ukrainian crisis.

If you’re reading this from your computer or iPhone, you should be aware that you’re going to be faced with a new type of warfare, and one that is waged online. What is disinformation? What are its Russian origins? How will it manifest itself with Ukraine? And what can you do about it? Let’s get to work. 


Disinformation, as a concept, is actually quite simple. It is the deliberate sharing of false or misleading information to bias, sway, or change public opinion. It's an effective tool that capitalizes on the basic idea that human perception can be exploited through deception. Sometimes the information is blatant, but usually it's subtle. That is how disinformation works: one tiny, well-thought-out seed of doubt, exploited to make you start questioning what is real and what is fake. Add a multi-billion-dollar industry, advanced technology, a society with 24-hour news cycles, a population enamored with social media, and war, and the world has disinformation on steroids. And no one does disinformation better than the Russians.


In 1923, Joseph Stalin set up a special disinformation unit to conduct active intelligence operations. The office was designed to cause diversions by creating and distributing misleading information through the media of open societies (aka democracies, like the US), and Russia has been mastering the craft ever since. It is important to note that Russia approaches war much differently than open, democratic countries: in Russian doctrine, nonmilitary instruments (like disinformation) can rival the effectiveness of conventional weaponry. Valery Vasilyevich Gerasimov, Russian Army General and current Chief of the General Staff of the Armed Forces of Russia, once said, “The information space opens wide asymmetrical possibilities for reducing the enemy’s fighting potential.” To make it simple: What someone lacks in beauty (advanced military technology), they can make up for in book smarts (disinformation).

The US’ first major brush with disinformation was Operation INFEKTION (also known as FORWARD II or DENVER), a 1980s campaign by the KGB to plant the idea that the US had created and spread HIV/AIDS as part of a biological weapons research project (similar to disinfo campaigns now circulating about COVID-19). Journalist Joshua Yaffa described Operation INFEKTION as a strategy that involved “an extraordinary amount of effort — funding radio programs, courting journalists, distributing would-be scientific studies.”

According to the US Department of State, the goal of this Soviet campaign was to undermine US credibility, isolate Americans at home and abroad, and cause problems for us in countries that hosted our military bases. Some analysts believe the intent was also to distract from the Soviets’ own offensive biological warfare program, or to retaliate for US accusations in what was later called the “yellow rain” incident. Because even grown men don’t always handle things maturely.


Putting the Russia/Ukraine war aside for one minute, the US is not immune to our own issues of disinformation. Over the past few midterm and presidential elections, there were multiple reports of disinformation campaigns aimed at more than just the vote itself. The goal of recent disinformation by foreign adversaries has been threefold: 1) influence votes, 2) cause internal divisions and chaos in the US, and 3) shake the confidence of the American electorate in their political leadership and government. Now, some experts say the threat is moving closer to home, as disinformation is being formulated and pushed domestically.

Several US officials and government reports have called disinformation one of, if not the top, threat to US national security. As Fiona Hill, author of “Mr. Putin: Operative in the Kremlin,” known for her time in Trump’s administration as an advisor on Europe on the National Security Council and for her testimony in President Donald Trump’s impeachment hearings, said in a CBS 60 Minutes interview in March 2020: “Putin, sadly, has got all of our political class, every single one of us, including the media, exactly where he wants us. He’s got us feeling vulnerable…on edge, and he’s got us questioning the legitimacy of our own systems.”

According to the State Department, a false story reaches 1,500 people on average six times more quickly than a factual story. This is true of false stories about any topic, but false political stories spread the most. As Carina Kaplan pointed out, the role of disinformation in wartime and crisis “is exposing how the power of the global disinformation network, in conjunction with the platforms that enable it, are having real offline consequences.” We’ve seen this play out with elections in the US, abroad with the Rohingya crisis, and now with Ukraine.


Russia has been planning this invasion for months, and not just physically by surrounding its border with Ukraine with Russian soldiers, but also through false information. Here’s how Russia is using disinformation with Ukraine:

1. Justifying the invasion: 

As Tatiana Serefino recently pointed out, Russia is creating an alternative reality, flipping the narrative to create the conditions for invasion. “Disturbing narratives of neo-Nazis in Ukraine, the genocide of ethnic Russians in Donbas (Russian officials have even gone so far as to say that they have found mass graves), and Ukraine’s nuclear threat are all gaining traction.” These allegations, repeated routinely by Putin, are not new, and they are not necessarily meant for anyone in the West. Instead, Russia is trying to convince its own population and those in Ukraine that this war is necessary.

One article stated that anti-Ukrainian and anti-NATO rhetoric “increased 75 times online since last October in the Czech Republic alone” and “has replaced COVID-19 as the main topic of disinformation.” And one Slovak opinion poll found that 44% of respondents believed the US and NATO were responsible for the escalating tensions between Russia and Ukraine, while only 35% held Russia responsible.

2. Downplaying Russian dissatisfaction:

Russian support for the invasion is divided, and disinformation has played a role in swaying some. In December 2021, only 9% of Russians thought Putin should arm Ukrainian separatists. But now, according to Dennis Volkov, director of the Levada Center, Russia’s leading polling center, more than half of Russians support Putin’s decision to invade Ukraine. “It should have been done a long time ago,” a Moscow resident told the Associated Press. “These poor people who identify as Russian, who mainly identify as Orthodox, who cannot wait anymore and live expecting to be killed.” Support for Russia has also come from Belarus, China, Cuba, India, Iran, Syria, and Venezuela. Yet over 100,000 people have signed petitions, tens of thousands have rallied in Moscow, and more than 2,000 people have gone to jail for protesting the war breaking out next door. Support among the Russian elite and ruling class, typically those who back Putin without question, has also appeared to crack in meetings and interviews, casting doubt on the sincerity of internal Russian support for the invasion.

3. Rewriting the course of the war: 

As Russian troops advanced on the capital of Ukraine, Russian propaganda was flooding state-backed media outlets with disinformation that President Volodymyr Zelensky had fled Ukraine. But Ukraine was able to fire back and debunk the false information on social media when Zelensky appeared with his cabinet members, including the prime minister, and said: “We’re all here. We are in Kyiv. We defend Ukraine.” The disinformation being spread by Russia is also attempting to panic the public. False claims were dispelled by Ukraine’s Cyber Police force after Ukrainians received text messages alleging that ATMs had stopped working. Russia’s working overtime in Ukraine to threaten troops, trick Ukrainians, and convince Russians that its offensive is perhaps more successful than it really is. 


From a domestic standpoint, Americans are finding it harder and harder to spot disinformation when they see it, and there is no shortage of reporting on real disinformation campaigns. Perhaps you remember the controversy of Kendall Jenner holding up a “Black Lives Matter” sign — a picture that went viral and was later proven to be photoshopped — and that’s a Kardashian controversy… not, you know, war. Disinformation might be “fake news,” but it is a very real weapon being used to target all of us every day in big and small ways. And as we are witnessing in real time, disinformation now plays a pivotal role in modern warfare.

To see how advanced disinformation technology already is, peek at this website, which shows you, every time you refresh the page, a person who looks very real but is not! How? The site uses a complicated algorithm to produce computer-generated faces that we recognize as human, when none of those people are real. Want to test how well you would do when confronted with disinformation? Another guide to election security and disinformation gives you plenty of tools, skills, and info to help you spot the nasty junk, then gives you a chance to check your skills. Maybe instead you need some quick, easy tools to fact-check information? I got you!

Tool #1 – Check your source: Ask yourself where you are seeing this. If it’s social media, try Google. Who shared it? Is the source credible (if you aren’t sure, see tools #2, #4, and #5) or biased? If you are reading something originating from Russian-backed news (RT, Sputnik News, or TASS), go ahead and assume it’s propaganda.

Tool #2 – Be on bot alert: If you see something on social media, double check that the original poster is not a bot. How often are they posting (all day, even in the middle of the night? probably a bot)? Do they have a real picture of themselves (if not, it might be a bot)? Does their profile username make sense, or is it a bunch of random letters and numbers (if the latter, likely a bot)?
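If you like, you can think of that bot checklist as a toy score. Here's a minimal sketch; the function name, thresholds, and weights are illustrative assumptions on my part, not a real detection algorithm (platforms use far richer signals):

```python
import re

def bot_suspicion_score(posts_per_day: float,
                        has_real_photo: bool,
                        username: str) -> int:
    """Return 0-3; higher means more bot-like, per the three checks above."""
    score = 0
    if posts_per_day > 50:      # posting around the clock, even at 3am
        score += 1
    if not has_real_photo:      # default or stock avatar
        score += 1
    # Username ending in a long run of digits, e.g. "patriot19284756"
    if re.search(r"\d{5,}$", username):
        score += 1
    return score

# A round-the-clock poster with no photo and a numeric handle:
print(bot_suspicion_score(200, False, "freedom19284756"))  # prints 3
```

A score of 2 or 3 doesn't prove anything; it just means slow down and apply tools #3 through #5 before trusting or resharing.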

Tool #3 – Snopes: A quick way to search an issue and see if it’s real.

Tool #4 – Fact-checking: Media Bias/Fact Check is a website that shows you the media bias of the outlet you are reading.

Tool #5 – Verifying: Triple check the story, picture, or video, and always see if you can find it in three credible places.

PRO TIP: Don’t burn out… we’ll be on the disinformation-warfare defensive probably for the rest of our lives, so I am not telling you to check every single thing you read when it comes to Ukraine (or anything else for that matter). Pick and choose your battles. The rule of three here: if you’re resharing it, if it’s noticeably upsetting you, or if you’re using it to make an argument or judgment, triple check it. Otherwise, move the hell on.
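As a memory aid, that rule of three boils down to a single any-of-three check. A tiny sketch, with hypothetical names of my own choosing:

```python
def should_triple_check(resharing: bool, upsetting: bool,
                        making_an_argument: bool) -> bool:
    """The 'rule of three': any one trigger is enough to warrant a check."""
    return resharing or upsetting or making_an_argument

# Scrolling past something you won't reshare or act on? Move on.
print(should_triple_check(False, False, False))  # prints False
# About to cite it in an argument? Verify it in three credible places first.
print(should_triple_check(False, False, True))   # prints True
```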


Tech giants are moving to block some outlets and information in Ukraine, at the request of the Ukrainian government, to counter Russian disinformation campaigns. But the move is complicated, raising questions about private-company responsibility vs. censorship. Obviously, Russia sees the move as censorship. Experts on the subject have warned that it is just as important to get good, accurate information into the hands of Ukrainians as it is to counter and block disinfo campaigns.

It’s extra important to remember that none of us are immune to weapons of mass distraction, and all of us are likely already victims (insert girl-raising-hand emoji — it happens to me too!). What we post and reshare shapes what kind of information spreads — when it comes to Ukraine and everything else. As technology advances and disinformation grows more effective, it will only become more pervasive and difficult to spot. The best thing we can all do is practice objectivity and verify before we trust.

That’s all for this one, babes.

No pressure. No bullshit. Just, THE BABES BLUF.

THE BABES BLUF (bottom line up front) is a different kind of current affairs and lifestyle blog that talks about issues in a way women (and men!) can relate to and enjoy. To read more from THE BABES BLUF, visit and subscribe to never miss a #BLUF, and check them out on Twitter or Instagram. For more THE BABES BLUF pieces, see here.

Kate Hewitt


Kate Hewitt currently works in national security and is the founder of THE BABES BLUF, a current affairs and lifestyle blog with a monthly column for Inkstick Media. Previously, she was a Herbert Scoville Jr. Peace Fellow and Research Assistant with the Foreign Policy program at The Brookings Institution focused on nuclear security and strategy issues. She also served as a Community and Organizational Development Adviser in Peace Corps Moldova and held internships with the Massachusetts Institute of Technology and Energy Northwest. Kate was a recipient of The Bulletin of the Atomic Scientists’ Rieser Award (2018), an N Square Nuclear Security Innovation Fellow (2018), and a Farsi Foreign Language and Area Studies Fellow (2017). She has authored articles, reports and book chapters on national security, foreign policy, and the importance of women in STEM and national security — the latter of which is a passion of hers that she exercises by sitting on the Board of Advisors for Girl Security. She holds an M.A. in Global Studies from the University of North Carolina at Chapel Hill and a dual-BA in Political Science and Philosophy from Gonzaga University.

