
Arms Control in Today’s (Dis)Information Environment: Part III

How to enable trust in the digital age.

Words: Jaclyn Kerr
Pictures: Charles Deluvio

When the Soviet Union covertly continued its massive biological weapons program after signing the Biological Weapons Convention in 1972, it disguised programs aimed at the weaponization of smallpox, bubonic plague, anthrax, and other pathogens as civilian medical research. It fabricated alternative explanations for deadly outbreaks caused by facility accidents and arranged elaborate “Potemkin tours” for international inspectors. When the United States seemed close to fielding its new enhanced radiation weapon under the administration of President Jimmy Carter, the Soviets launched a massive covert propaganda campaign through media outlets and front organizations to shift public opinion about the new weapon in Europe and the United States, mobilize protests that threatened NATO cohesion, and ultimately alter US strategic decisions around the “neutron bomb.”

Information manipulation and covert influence campaigns have long been tools of sub-threshold strategic competition, used to try to influence arms race dynamics, arms control decisions, and the enforceability of compliance and verification regimes. During the Cold War, such massive covert operations were feasible only for great powers. Today, not only are there more actors with potential stakes in arms control decisions, but global connectivity and digitization, combined with a panoply of new Digital Age tools, make it easier to obfuscate, deny, and manipulate the information environment around arms control.

AN OLD PROBLEM, MADE NEW

Accurate and mutually shared information is critical to successful arms control processes at all stages, from negotiation and ratification to implementation and compliance verification. Agreeing on a common foundation of objective facts and clear procedures for measuring compliance can be an immense diplomatic challenge in fraught negotiation settings where trust is limited and mutual interests are only thinly acknowledged. Scientific and technical expertise and intelligence gathering play critical roles but can themselves become politicized or subject to manipulation. Agreements must not only hold up against a gauntlet of demands by the negotiating parties, but must also win some level of legitimacy and acceptance from their respective publics and political elites. Both the facts necessary as a basis for negotiation and verification and the public legitimacy of these undertakings can be targets of information manipulation campaigns. Such campaigns can be covertly launched by parties to the arms control process or by third-party spoilers to alter decision-making processes and outcomes.

Today the possible tools for adversarial information manipulation are much faster, cheaper, and more scalable. The centrality of data, algorithms, and digital networks to modern life and today’s fragmented, social media-fueled information ecosystems together open new vulnerabilities to manipulation relevant to current and future arms control efforts. As sub-threshold cyber aggression remains hard to deter or defend against, critical data, communications, and intelligence sources can be subject to exfiltration, manipulation, poisoning, leaks, and other abuses. While individual data and data-driven advertising and recommendation algorithms can be used to microtarget messages at scale to manipulate large populations, spear-phishing and hack-and-leak operations can aim to compromise key individuals, such as negotiators or political leaders, at critical moments in arms control processes. One possible example is the recently leaked audio recording of Iran’s foreign minister, made while he was engaged in the renegotiation of the Joint Comprehensive Plan of Action (JCPOA).

The current fragmented media and discourse ecosystem enables cheap, scalable campaigns to influence public opinion. The growing strategic use of cyber-enabled disinformation in international competition has asymmetrically benefited the United States’ non-democratic competitors and has the potential to weaken the leverage of the US and its allies in current and future arms control processes.

Through years of experimentation in controlling and manipulating the information environments of their own domestic publics, authoritarian states including Russia, China, and Iran have developed online influence capabilities that have in turn facilitated new forms of adversarial disinformation campaigns as tools of international competition. Even as the US developed capabilities and strategy to contest the cyber domain, the governance of, and potential security issues emerging from, the online media and discourse ecosystem were largely left out of these strategic and conceptual developments, ultimately leaving the US and its allies open to strategic surprise from countries that had always taken a combined strategic approach to these different facets of the information environment. Recent Russian disinformation campaigns to spread alternative narratives and bolster deniability around uses of chemical weapons and violations of the INF Treaty are good examples. Likewise, the GRU’s attempted hacking of the Organisation for the Prohibition of Chemical Weapons (OPCW) in 2018 shows how adversarial cyber operations against key actors or institutions can be combined with mass disinformation campaigns in attempts to undermine attribution processes.

Ongoing changes in the information mediation ecosystem and emerging technological advances further exacerbate these challenges. Deepfake and shallowfake technologies permit malicious uses of simulated likenesses of public figures as elements of disinformation campaigns (an effect, as political leaders in several countries recently discovered after being contacted by a supposed Alexei Navalny aide, that can also be achieved through some mix of spear-phishing, look-alikes, and video-conferencing). Improved tools of microtargeting, combined with sentiment analysis capabilities and chatbots that can pass the “Turing Test,” have the potential to fuel even more precise and scalable targeting of vulnerable individuals and groups for manipulation. As recent breakthroughs in synthetic text generation show, even further automation and scaling of campaigns to manipulate the public discourse space may be on the near horizon.

Meanwhile, changes in the social media industry, including the rise of end-to-end encrypted, many-to-many private messaging applications and the increasing use of niche platforms that arise to protect the “free speech” of those removed from mainstream alternatives, mean that much of the most abusive information-manipulation activity may happen out of clear public sight. In future arms control processes, these new tools and platforms could bolster the precision and impact of disinformation campaigns targeting public opinion, while also permitting new forms of technical data falsification and manipulation aimed at key institutions and decision-makers.

TECHNOLOGICAL SOLUTIONS?

In light of the potential negative impact of deliberate manipulation of the information environment on future arms control efforts, the United States and allies should work to establish the necessary practices, tools, and institutions to limit the impact of disinformation in the arms control space. Clear government leadership is needed to identify and mitigate the risks disinformation poses to current and future arms control processes. Solutions will have a necessarily technological character — both in relation to detection of manipulation and to countering its effects — though specifics will vary depending on the particular mechanism of information manipulation being addressed, the stage of the arms control process in question, and the target of the disinformation. Technological solutions alone will mean very little, however, if not leveraged in conjunction with practices and organizational structures built to coordinate efforts across stakeholders, recognize shared interests, and facilitate the possibility of trust.

Mitigating the harm of mass cyber-enabled disinformation campaigns is a whole-of-society problem. Social media platform efforts to limit the systemic spread of disinformation will, for example, be a critical piece of disrupting adversarial efforts to manipulate public opinion on arms control-related issues. A variety of data analytic tools and cooperative engagements can likewise help detect and reduce the spread and influence of disinformation across platforms. Enhanced systemic data privacy and cybersecurity protections will also be critical in protecting against some types of disinformation campaigns, whether through regulation, public-private partnerships, enhanced cyber defense capabilities, or more widespread use of privacy-enhancing technologies (e.g., federated learning, secure multi-party computation, differential privacy, or homomorphic encryption). While the race between synthetic content production algorithms and the tools to detect their products might remain locked in a cat-and-mouse pattern, alternative approaches to prevalence reduction, in which platforms require that posted content be marked with a verifiable signature, might help limit the impact of AI-generated text, photos, videos, audio, and data as vehicles of disinformation.
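
To make the “verifiable signature” idea concrete, here is a minimal sketch of content signing and verification in Python, using the open-source cryptography package. The publisher, content, and workflow are invented for illustration; this is a sketch of the underlying primitive, not any platform’s actual provenance scheme.

```python
# Minimal sketch of content provenance via digital signatures,
# using the "cryptography" package (pip install cryptography).
# The publisher, content, and workflow are illustrative only.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# A publisher generates a long-lived keypair; the public half is
# distributed out of band (e.g., through a platform's key registry).
publisher_key = Ed25519PrivateKey.generate()
public_key = publisher_key.public_key()

# The publisher signs each piece of content before posting it.
content = b"Official statement on treaty inspection schedule."
signature = publisher_key.sign(content)

# Anyone holding the public key can check that the content is
# unaltered and really came from the keyholder.
try:
    public_key.verify(signature, content)
    print("Signature valid: content is authentic and unmodified.")
except InvalidSignature:
    print("Signature invalid: content was altered or is not from this source.")
```

Deployed provenance standards layer key distribution, content metadata, and revocation on top of this basic sign-and-verify primitive, but the trust property is the same: altered or synthetic content no longer matches its signature.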

While we must collectively pay attention to these systemic challenges of cyber-enabled disinformation, the particular risks of disinformation campaigns in the arms control space should also be specifically and concretely addressed. Expert working groups with this focused tasking should be stood up to support new negotiations or ongoing implementation and compliance activities. Steps should be taken to maximize the cybersecurity of key data, facilities, and public figures involved in arms control processes. Critical intelligence sources should be vetted for potential vulnerabilities to manipulation or data poisoning. False narratives and disinformation campaigns relevant to current arms control issues should be specifically anticipated, tracked, and disrupted. Reliable, authoritative sources of relevant information should be made available to journalists and the public. These are all doable tasks, but they require the explicit attention and cooperation of experts from across government, the private sector, and society.

In addition to mitigating the mass disinformation campaigns that can influence arms control processes, assuring the technical basis of trust between the governments involved is also crucial and should be an area of focused effort and research. While preventing unauthorized access to some data assets is vital to non-proliferation, shared access to trustworthy data sources plays an equally essential role in negotiation processes and verification regimes. Establishing mutually agreed-upon records (e.g., of existing stockpiles, inventories, facilities, materials, and equipment), as well as shared metrics and systems for how these can be objectively measured, interpreted, and verified in the future, is a vital part of arms control negotiation processes and forms the basis of trust in subsequent verification regimes. IAEA safeguards, for example, use inspections and monitoring to verify that reports and declarations made by states correctly account for all of their nuclear materials and activities.
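
As a toy illustration of how parties might fix a mutually agreed record so that later tampering is detectable, consider a simple hash commitment, sketched below in Python. The declared facility and inventory figures are entirely hypothetical, and real safeguards obviously rest on far more than a hash; the sketch only shows the commit-then-reveal pattern.

```python
# Toy sketch of a commit-then-reveal scheme for a declared record.
# All record contents and the workflow are hypothetical examples.
import hashlib
import json
import os

def commit(record: dict, nonce: bytes) -> str:
    """Hash a canonical encoding of the record together with a secret nonce."""
    encoded = json.dumps(record, sort_keys=True).encode() + nonce
    return hashlib.sha256(encoded).hexdigest()

# A state declares a baseline inventory and shares only the commitment.
declaration = {"site": "Facility-A", "item": "centrifuge", "count": 164}
nonce = os.urandom(16)  # kept secret until the reveal step
commitment = commit(declaration, nonce)
print("Published commitment:", commitment)

# Later, at inspection time, the record and nonce are revealed;
# inspectors recompute the hash to confirm nothing was changed.
assert commit(declaration, nonce) == commitment
print("Reveal matches the earlier commitment; the record is unchanged.")
```

The commitment can be exchanged at negotiation time without revealing the record itself; at inspection time the revealed record either reproduces the published hash or it does not.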

Despite the many perils of today’s adversarial disinformation environment, emerging data security and assurance technologies might prove valuable tools for enhancing trust by facilitating the development and maintenance of reliable shared data. In some cases, for example, they might allow breakthroughs in the accounting and control of sensitive materials, reducing reliance on intrusive inspection activities. Combined applications of blockchain technologies, Internet of Things sensors, smart contracts, and digitally signed provenance-tracking systems have the potential to transform how some critical data is reliably measured, shared, verified, and protected.
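
A minimal sketch of the hash-chaining idea underlying such provenance-tracking systems appears below, again in Python. The monitoring-sensor readings and field names are invented for illustration; a deployed system would add digital signatures, replication across parties, and tamper-resistant hardware on top of this core structure.

```python
# Minimal sketch of a tamper-evident, hash-chained log of sensor
# readings, the core idea behind blockchain-style provenance
# tracking. All readings and field names are illustrative only.
import hashlib
import json
import time

def entry_hash(body: dict) -> str:
    """Hash a canonical encoding of a log entry's contents."""
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append(log: list, reading: dict) -> None:
    """Append a reading, linking it to the hash of the previous entry."""
    prev = log[-1]["hash"] if log else "0" * 64
    entry = {"time": time.time(), "reading": reading, "prev_hash": prev}
    entry["hash"] = entry_hash(entry)
    log.append(entry)

def verify(log: list) -> bool:
    """Recompute every hash and link; any tampering breaks the chain."""
    prev = "0" * 64
    for entry in log:
        body = {k: v for k, v in entry.items() if k != "hash"}
        if entry["prev_hash"] != prev or entry_hash(body) != entry["hash"]:
            return False
        prev = entry["hash"]
    return True

log: list = []
append(log, {"sensor": "portal-monitor-1", "count_rate": 12})
append(log, {"sensor": "portal-monitor-1", "count_rate": 11})
print("Chain intact:", verify(log))

log[0]["reading"]["count_rate"] = 99  # simulated tampering
print("Chain intact after tampering:", verify(log))
```

Because each entry embeds the hash of its predecessor, altering any past reading invalidates every subsequent link, which is what makes such a log useful as a shared, mutually checkable record.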

ARMS CONTROL IN THE DIGITAL AGE

Today, an urgent reimagining of arms control for the Digital Age is needed.

With more actors involved in global competition, new and interdependent military domains, and emerging, dual-use, data-centric Fourth Industrial Revolution technologies, the complexity of mitigating catastrophic risk is greater than ever before. Yet, even amid calls for new arms control treaties and enforcement protocols for AI, cyber, space, bio, and other threats, there is also recognition that Cold War-era approaches to arms control and compliance verification may no longer be sufficient. Today’s technological and information environment makes adversarial use of disinformation to influence arms control processes much more potent, just as the assurance of accurate, mutually trusted data becomes more central than ever to these processes.

In today’s difficult-to-control and easy-to-manipulate information environment, new mechanisms are needed to generate trust, a vital ingredient of successful arms control processes. Technological solutions can help, but relationships must follow. Ultimately, the challenge of mitigating the effects of disinformation on arms control processes is a human challenge as well as a technological one: Nuclear arms and other weapons of mass destruction remain an existential threat to the future of all life on the planet, a fact that can be broadly recognized and understood even within a competitive and contentious international environment.

Jaclyn Kerr is a Senior Research Fellow at National Defense University’s Institute for National Strategic Studies. The views expressed in this article are those of the author and do not represent the official policy or position of the National Defense University, the Department of Defense, or the US government.

This is the final article in a series of three on “Arms Control in Today’s (Dis)information Environment” by National Defense University’s Sarah Jacobs Gamberini, Justin Anderson, and Jaclyn Kerr. The series aims to contribute to a discussion of how disinformation challenges future arms control processes. The first article examined the heightened challenge of developing the trust necessary for arms control in an age of disinformation. The second article focused on the repercussions of this changed context for future nuclear arms control negotiations.
