Accidents, Paradoxes and the Epistemic Future of Nuclear Policy

The nuclear community must learn to imagine the unimaginable and sit with uncomfortable facts.

Words: Aditi Verma
Pictures: Deon van Zyl

On March 11, 2011, at 2:46 pm Japan Standard Time, a magnitude 9.0 earthquake struck off the east coast of Japan. At the time of the earthquake, three reactors were in operation at the Fukushima Daiichi site, owned by the Tokyo Electric Power Company (TEPCO). Three others were shut down for scheduled inspections. All six reactors at the site were boiling water reactors based on American General Electric designs. The strong ground motion triggered an automatic shutdown of the three operating reactors.

The earthquake also damaged transmission towers, causing an immediate loss of offsite power to the plant. Following this loss of power, the emergency diesel generators (EDGs) at each of the three operating reactor units started up automatically to remove decay heat from the recently shut down reactors and to cool the spent fuel pools.

A tsunami triggered by the earthquake arrived at the Fukushima Daiichi site approximately 45 minutes later. Its maximum wave height of 49 feet (15 meters) far exceeded the 16-foot (5-meter) seawall around the plant. Seawater surged over the seawall, submerging and damaging the EDGs, and Units 1 through 5 lost AC power. The tsunami also disabled two of the three EDGs at Unit 6, but the remaining generator, which was air-cooled and located at a slightly higher elevation, continued to supply emergency AC power to Units 5 and 6.

Despite efforts to depressurize the reactors and inject water into the cores, fuel temperatures began to rise in Units 1, 2 and 3. As the fuel overheated, the zirconium in its cladding reacted with steam, producing hydrogen.

On March 12, a hydrogen explosion destroyed the upper portion of the Unit 1 reactor building.

Hydrogen explosions also occurred at Unit 3 on March 14 and at Unit 4 on March 15.

The overheating ultimately melted the fuel in Units 1, 2 and 3. The accident was rated at Level 7, the maximum on the International Atomic Energy Agency’s International Nuclear Event Scale. While estimates of the damages caused by the accident vary, they run upwards of ¥20 trillion, or close to $200 billion. Yet even these figures do little to capture the massive human displacements and resettlements the accident caused.

I am a nuclear engineer by training, albeit one with a sociological imagination. I work at the intersection of technology and policy, and of engineering and the social sciences. Specifically, I have studied how designers of nuclear reactors make decisions in the foundational early stages of design – particularly how they think about risk and safety.

When accidents, particularly nuclear accidents, occur, there is a desire – almost an imperative – to identify the ‘root causes’ of that particular accident. This is not surprising: such accidents provoke sector-wide existential crises and a determination never to repeat the mistakes that led to so severe an outcome. But identifying the root causes of an accident is seldom possible. The causes are typically large in number and complex in their relationships to each other.

These causes may be specific to the plant site – unprotected diesel generators, insufficiently high seawalls, containments without filtered vents – but they are also likely to be broader and systemic: a captured regulator, a poor national ‘safety culture’, an industry that is not transparent and breaches the trust of the people it serves. In practice, it is nearly impossible to identify all of these root causes, and so very often they are resolved only rhetorically. We carve out the problems we choose to pay attention to – through new regulations, new narratives, new tools for the measurement of safety – and other causes are gradually forgotten. That is not to say these processes of forgetting are deliberate or malicious. More often they are an inevitable by-product of modern bureaucracies and siloed organizational structures.

Accidents also present several paradoxes. The first is that accidents are often not entirely unexpected. There are always precursors; we can see them coming, yet we choose not to see them, and we are nevertheless surprised when they occur. The second paradox – one I borrow from Scott Knowles, a historian who studies disasters – is that accidents are seldom purely natural or purely technological. This was clearly the case for the Fukushima Daiichi accident. Often we try to sort accidents into one category or the other – natural or technological – in order to say which were of our own making and which were simply beyond imagination and design, too catastrophic to be designed for. This too we saw at Fukushima. The third paradox is the temporality of accidents. Accidents are not confined to discrete points in time, or perhaps even to well-defined periods. In many ways, the Fukushima accident is still unfolding. But accidents are also not discrete in another sense: they sometimes begin well before we are able to sense them. When we fail to maintain our technological, institutional and epistemic infrastructures – when we defer the maintenance of these infrastructures, and of our relationships with the publics who have placed their trust in us – we set off slow-moving chains of events that ultimately end in catastrophe.

So what, then, is the answer to these paradoxes? If it is so hard to resolve the temporal and causal dimensions of accidents, how should we – how could we – even presume to anticipate and prepare for them? Here I would like to suggest a set of ideas, gathered under one heading: epistemic plurality. When we seek to learn from accidents in the usual ways – by identifying root causes, developing best practices, creating gold standards for organizational and institutional design – we inevitably leave out inconvenient truths and facts, what Steve Rayner has called “uncomfortable knowledge.” In so doing, we put on blinders until the next accident. We create what Bronk and Jacoby have called an “analytical monoculture.” We should instead preserve that uncomfortable knowledge, preserve a diversity of beliefs and interpretations of facts. We should create the opposite of an analytical monoculture – epistemic plurality. As an engineer, I know this is a truly uncomfortable thing to do, because engineers are trained to find right answers – and generally, a single right answer. It is uncomfortable to think that there could be more than one right answer to a given question.

Yet this is where we begin to see some particularly fertile (no pun intended) opportunities for research and interdisciplinary collaboration. We have to think about how to break our analytical monocultures and build the possibility of epistemic plurality into the designs of our technologies, our organizations and our institutions. For technologists – scientists and engineers – this calls for significant introspection about our ways of making knowledge and our pedagogical practices. How can we train future engineers, particularly future nuclear engineers and nuclear professionals, to be intellectually agile and able to sit with uncomfortable knowledge without seeking to make it reductive? Every field of engineering grapples with this challenge, but perhaps none more so than nuclear engineering. We ought to learn from our past failures and lead the way as engineering disciplines writ large strive to become more reflective and equitable.

For the humanists and social scientists – particularly those for whom the nuclear sector is a research site – there is work for you too. Work with us engineers. Hold up a mirror, so that we may not forget the uncomfortable facts. Show us the deficits in our thinking and, if I may, allow us to interrogate yours where they exist. Work with us to reimagine institutional and technological infrastructures so that we can create the necessary conditions for epistemic plurality. Such interdisciplinary communion is vital for the design and governance of any complex and risky technology.

And finally, perhaps most importantly: for researchers, practitioners and policymakers from diverse intellectual and professional backgrounds to work with each other, we have to learn to speak a more common language, less cluttered with the jargon of our respective fields – so that when we talk about risk, about nuclear safety and security, we are talking to each other and not past each other, and so that we can work together to manage the atom.

Dr. Aditi Verma is a Stanton Nuclear Security Fellow at Harvard’s Belfer Center for Science and International Affairs, where she is jointly appointed by the Project on Managing the Atom and the International Security Program. She is broadly interested in how nuclear technologies specifically and complex technologies broadly – and their institutional infrastructures – can be designed in collaboration with publics such that traditionally excluded perspectives can be brought into these design processes.

Author’s note: This piece is based on the author’s closing remarks at a conference on “Nuclear Safety and Security After Chernobyl and Fukushima: Lessons Learned and Forgotten” hosted by the Project on Managing the Atom at Harvard’s Belfer Center for Science and International Affairs. The author thanks Professor Matthew Bunn, Professor Todd Allen and Dr. Denia Djokić for thoughtful comments on an earlier draft of this piece.
