PART ONE
MAGNIFICENT DESOLATION
A couple of years ago, distracting myself at work, I saw a link on Twitter to a YouTube video that caught my attention. It was a computer-animated re-creation of the sinking of the Titanic in real time, all two hours and forty minutes of it. I did not watch the whole video, but I skipped around and watched parts, interested especially in the few interior views, where you can watch the water level slowly rising at an angle—since the ship pitched forward as it sank—in the white-painted hallways of the lower decks, and later, in the ballroom and grand staircase, as wicker chairs bob around.
The strangest thing about the video is that it includes no people—no cartoon passengers. There is no violin music, no voice-over. The ship is lit up, glowing yellow in the night, but the only sound, apart from a few emergency flares and engine explosions, is of water sloshing into and against the ship. The overall impression is of near silence. It’s almost soothing.
This is true until the last few minutes of the video, when the half-submerged ship begins to groan and finally cracks in half. Only then, as the lights go out and the steam funnels collapse, do you hear the sound of people screaming, which continues for another thirty seconds after the ship has disappeared. A caption on the screen reads: “2:20—Titanic is gone. Rescue does not arrive for another hour and forty minutes.” A few lifeboats (empty) are seen floating on the calm black ocean, under a starry sky. Then, another caption: “2:21—Titanic is heard beneath the surface breaking apart and imploding as it falls to the sea floor.” The video ends on this disturbing note, with no framing narrative to lend a pseudo-happy ending.
At once, I was obsessed with the story of the Titanic. I rewatched the James Cameron movie (which I first saw in high school—still ridiculous, still gripping); I read a Beryl Bainbridge novel (Every Man for Himself) based on the night of the sinking, which felt like a novelization of the Cameron movie, though the book predates it, just; I read thousands of words on Wikipedia and what you might call fan sites, if you can be a fan of a disaster—lists of “facts” and conspiracy theories. I watched a documentary (Titanic’s Final Mystery) about a weird new theory of the root cause of the disaster: One scientist thinks that a sudden and extreme drop in temperature caused a mirage on the horizon that obscured the iceberg from the men in the lookout until they were nearly upon it. The same illusion could explain why a nearby ship, the SS Californian, did not see that the Titanic was clearly in distress. It is, of course, just a theory.
The Hollywood version of the narrative, which puts the blame on hubris, has a lot of pull—the Titanic sank because they dared to call it unsinkable. It’s the Icarus interpretation: Blinded by a foolhardy overconfidence, we flew too close to the sun, melting our wings, and so on. It’s the easiest explanation, appealing in its simplicity, its mythic aura.
* * *
When I ran out of freely available Titanic material, I moved on to other disasters. I had an overwhelming desire for disaster stories, of a particular flavor: I wanted stories about great technological feats meeting their untimely doom. I felt addicted to disbelief—to the catharsis of reality denying my expectations, or verifying my worst fears, in spectacular fashion. The obvious next stop was 9/11.
So far, 9/11 is the singular disaster of my lifetime. People who were in New York City at the time always comment on how “beautiful” and “perfect” that September morning was, with “infinite visibility”—pilots call those conditions “severe clear.” As I recall, it was a bright blue day in Houston too. I was driving from my apartment to the Rice University campus a couple of miles away when I heard radio reports of a plane hitting one of the Twin Towers. I continued driving to school, parked my car in the stadium lot, and went into the student center, where a few people were watching the news on TV with that air of disbelief that can appear almost casual.
The live footage of a massive steel skyscraper with smoke pluming from a hole in its side was shocking, but I felt it dully—shock in the form of incomprehension, maybe denial. I don’t remember truly feeling horror—that is, understanding—until people began to jump from the buildings. They were specks against the scale of the towers, filmed from a distance, but you knew what they were. They became known as the “jumpers”: people trapped in the upper floors of the buildings, above the planes’ impact and unable to get out, who were driven to such desperation by the extreme heat and lack of oxygen that they broke the thick windows with office furniture and jumped to the pavement a hundred stories below. Leslie E. Robertson, the lead structural engineer of the towers, later wrote that “the temperatures above the impact zones must have been unimaginable.” The people nearby, and still in the buildings, could hear the bodies landing.
An Associated Press photo dubbed “The Falling Man” captures one of these jumpers: a man “falling,” as if at ease, upside down and in parallel with the vertical grid of the tower. (It’s a trick of photography; other photos in the series show him tumbling haphazardly, out of control.) The photo was widely publicized at first but then met with vehement critique. Some people found this particular image too much to take, an insult to their senses. And though the jumps were witnessed by many, the New York City medical examiner’s office classifies all deaths from the 9/11 attacks as homicides. Of course, the deaths were forced, forced by suffering—but they were also voluntary. It seems akin to prisoners held in solitary confinement (or otherwise tortured) killing themselves—murder by suicide.
When I think of the jumpers, I think of two things. I think of images of women covering their mouths—a pure expression of horror. They were caught on film, watching the towers from the streets of Manhattan. I do this sometimes—hand up, mouth open—when I see or read something horrible, even when alone. What is it for? I think, too, of the documentary about Philippe Petit, who tightrope-walked between the tops of the towers in 1974. At the time, they were the second-tallest buildings in the world, having just been surpassed by the Sears Tower in Chicago. It was an exceptionally windy day (it’s always windy at 1,300 feet) and when a policeman threatened him from the roof of one building, Petit danced and pranced along the rope, to taunt him. This feels like one of the craziest things a man has ever done. For the jumpers, death was not a risk but a certainty; they jumped without thinking. It’s more horrible to contemplate than many of the other deaths because we know the jumpers were tortured. Death is more fathomable than torture.
A Discovery Channel documentary that I found on YouTube called Inside the Twin Towers provides a minute-by-minute account of the events on September 11, a mix of reenactments and interviews with survivors. One man who managed to escape from the North Tower—he was four floors below the impact—recounts a moment when he opened a door and saw “the deepest, the richest black” he had ever seen. He called into it. Instead of continuing down the hall to see if anyone was there, he retreated back to his office in fear. He says in the film, “If I had gone down the hallway and died, it would have been better than living with this knowledge of ‘Hey, you know what, when it came right down to it, I was a coward.’ And it was actually our two coworkers down that hallway, on the other side, that ended up dying on that day. And I often think now, Perhaps I should have continued down that hallway.”
This is a classic case of survivor’s guilt, sometimes known as concentration-camp syndrome: the sense that your survival is a moral error. Theodor Adorno, in an amendment to his somewhat misunderstood line about poetry after Auschwitz, wrote:
Perennial suffering has as much right to expression as a tortured man has to scream; hence it may have been wrong to say that after Auschwitz you could no longer write poems. But it is not wrong to raise the less cultural question whether after Auschwitz you can go on living—especially whether one who escaped by accident, one who by rights should have been killed, may go on living. His mere survival calls for the coldness, the basic principle of bourgeois subjectivity, without which there could have been no Auschwitz; this is the drastic guilt of him who was spared. By way of atonement he will be plagued by dreams such as that he is no longer living at all.
This syndrome, along with post-traumatic stress disorder, goes some way toward explaining why so many Holocaust survivors have committed suicide.
* * *
There is survivor’s guilt, but there is also survivor’s elation, survivor’s thrill—a thrill felt only by those a little farther from disaster. The September 24, 2001, issue of The New Yorker included a symposium of responses to the attacks. A few were able to acknowledge the element of thrill in observation. Jonathan Franzen wrote:
Unless you were a very good person indeed, you were probably, like me, experiencing the collision of several incompatible worlds inside your head. Besides the horror and sadness of what you were watching, you might also have felt a childish disappointment over the disruption of your day, or a selfish worry about the impact on your finances, or admiration for an attack so brilliantly conceived and so flawlessly executed, or, worst of all, an awed appreciation of the visual spectacle it produced.
I find Franzen’s moral hierarchy here questionable, that “worst of all” most puzzling. Because to me, more than worry, or admiration (!), the most natural and undeniable of reactions would seem to be awe.
It’s the spectacle, I think, that makes a disaster a disaster. A disaster is not defined simply by damage or death count; deaths by smoking or car wrecks are not a disaster because they are meted out, predictable. A disaster must not only blindside us, but be witnessed, and re-witnessed, in public. The Challenger explosion killed only seven people, but like the Titanic, which killed more than 1,500, and like 9/11, which killed almost 3,000, the deaths were both highly publicized and completely unexpected. Disasters are news because they are news.
All three of these incidents forced people to watch huge man-made objects, monuments of engineering, fail catastrophically, torn apart or exploding in the sky. These are events we rarely see except in movies. The destruction of the Challenger and the collapse of the World Trade Center are now movies themselves, clips we can watch again and again. The ubiquity of cameras, which we now carry all the time in our pockets, makes disaster easier to witness and to reproduce; it may even create a kind of cultural demand for disasters. We also get to watch the reaction shots—both the special effects and the human drama.
Roger Angell’s version of survivor’s thrill in the same New Yorker issue is less chastising:
When the second tower came down, you cried out once again, seeing it on the tube at home, and hurried out onto the street to watch the writhing fresh cloud lift above the buildings to the south, down at the bottom of this amazing and untouchable city, but you were not surprised, even amid such shock, by what you found in yourself next and saw in the faces around you—a bump of excitement, a secret momentary glow. Something is happening and I’m still here.
Angell is saying this is not an aberration; it is the norm. It is one of the terrible parts of disaster, our complicity: the way we glamorize it and make it consumable; the way the news turns disasters into ready-made cinema; the way war movies, which mean to critique war, can really only glorify war.
We don’t talk about it now, but I always found the Twin Towers hideously ugly, in a way not explainable by their shape alone—they were long rectangular prisms, nothing more. Their basic boxiness was somehow an affront. I find the Empire State Building and the Chrysler Building beautiful. I find the Eiffel Tower beautiful. It must be their tapering sweep, the way they diminish as they ascend, their detail suggesting fragility. How could anyone ever have found the Twin Towers beautiful? They seemed designed only to represent sturdiness, like campus buildings in the brutalist tradition that were said to be riot-proof.
A friend, a New Yorker, disagrees. She tells me the buildings “did amazing things with the light.” Another, also from New York, says they were “sexy at night.” But all skyscrapers are sexy at night, from below if not from afar, by virtue of their sheer dizzying size, their sheer sheerness. They stand like massive shears, stabbed into the sky.
Despite their imposing, even ominous height, the towers fell in less than two hours; the Titanic took only a little longer to sink. But that happened gradually. When you watch a building collapse, it seems like it suddenly decides to collapse. It’s a building, and then it’s not a building, just a crumbling mass of debris. There is no transition between cohesion and debris. It is terrifying how quickly an ordered structure dissolves. Where does it all go? Buildings, like anything, are mostly empty space.
* * *
In the vocabulary of disaster, the word “debris” is important—from the French débriser, to break down. A cherishable word, it sounds so light and delicate. But the collapse of the World Trade Center produced nearly two million tons of it. The bits of paper falling around the city led some people to mistake the attack for a parade. In space flight, or even on high-speed jets, tiny bits of foreign object debris (FOD) can cause catastrophe. Space food is coated in gelatin to prevent crumbs, which in a weightless environment could work into vulnerable instruments or a pilot’s eye. Debris on the runway could get sucked into a jet engine and cause it to fail.
The Challenger explosion, like the sinking of the Titanic, is usually chalked up to hubris. But if hubris is overconfidence—“presumption toward the gods”—the explanation is unsatisfying. Engineers at NASA’s Marshall Space Flight Center knew that the O-ring seals, which helped contain hot gases in the rocket boosters, were poorly designed and could fail under certain conditions—conditions that were present on the morning of the launch, which was unusually cold. The O-rings were designated as “Criticality 1,” meaning their failure would have catastrophic results. But the engineers did not take action to ground all shuttle flights until the problem could be fixed. As the very first sentence in the official Report of the Presidential Commission on the Space Shuttle Challenger Accident puts it: “The Space Shuttle’s Solid Rocket Booster problem began with the faulty design of its joint and increased as both NASA and contractor management first failed to recognize it as a problem, then failed to fix it and finally treated it as an acceptable flight risk.” What shocks me most when I read about the space program is the magnitude of the risks. The Challenger exploding on live TV in front of 17 percent of Americans was unthinkable to most of those viewers, but not unthinkable to workers at NASA.
From what I understand, NASA has always embraced risk. In his memoir Spaceman, the astronaut Mike Massimino, who flew on two missions to service and repair the Hubble telescope, recounts the atmosphere at NASA after the space shuttle Columbia broke up on reentry in 2003:
When I walked in I saw Kevin Kregel in the hallway. He was standing there shaking his head. He looked up and saw me. “You know,” he said, “we’re all just playing Russian roulette, and you have to be grateful you weren’t the one who got the bullet.” I immediately thought about the two Columbia missions getting switched in the flight order, how it could have been us coming home that day. He was right. There was this tremendous grief and sadness, this devastated look on the faces of everyone who walked in. We’d lost seven members of our family. But underneath that sadness was a definite, and uncomfortable, sense of relief. That sounds perverse to say, but for some of us it’s the way it was. Space travel is dangerous. People die. It had been seventeen years since Challenger. We lost Apollo 1 on the launch pad nineteen years before that. It was time for something to happen and, like Kevin said, you were grateful that your number hadn’t come up.
The culture of risk at NASA is so great that in place of survivor’s guilt there is only survivor’s relief. But knowing the risks and doing it anyway must require some level of cognitive dissonance. This is apparent when Massimino writes that “like most accidents, Columbia was 100 percent preventable.” This is hindsight bias; only past disasters look 100 percent preventable. The Columbia shuttle broke apart due to damage inflicted on the wing when a large chunk of foam insulation flew into it during launch. This was observed on film, and the ground crew questioned whether it might have caused any damage. However, insulation regularly broke apart during launches and had never caused significant damage before. Further, NASA determined that even if the spacecraft had been damaged, which it had no way of verifying, there was nothing the flight crew could do about it, so NASA officials didn’t even inform the crew of the possible problem.
When Columbia came apart during reentry, disintegrating and raining down parts like a meteor shower over Texas and Louisiana, an investigation was launched. At first, no one believed that the foam could have done enough damage to cause the accident. It was “lighter than air.” Massimino writes, “We looked at the shuttle hitting these bits of foam like an eighteen-wheeler hitting a Styrofoam cooler on the highway.” Not until they actually reenacted the event by firing a chunk of foam at five hundred miles per hour toward a salvaged wing and saw the results did they accept it as the cause of the disaster. Anything going that fast has tremendous force. This was not like the failure of the O-ring; the risks of the insulation were not understood. Or, more properly, they were simply not seen—it’s basic, though unintuitive, physics. The same type of accident is 100 percent preventable now only because the disaster happened, triggering a shuttle redesign. When redesigns cost billions of dollars, if it isn’t broke, they don’t and probably can’t fix it.
The concept of hubris lets us off too easy. It allows us to blame past versions of ourselves, past paradigms, for faulty thinking that we’ve since overcome. But these scientists we might scoff at now were incredibly smart and incredibly well prepared. The number of things that didn’t go wrong on all the space missions is astounding. It’s easy to blame people for not thinking of everything, but how could they think of everything? How can we?
Not knowing the unknowable isn’t hubris. There is danger in thinking, “We were dumb then, but we’re smart now.” We were smart then, and we are dumb now—both are true. We do learn from the past, but we can’t learn from disasters we can’t even conceive of. While disasters widen our sense of the scope of the possible, there are limits. We can’t imagine all possible futures. Yet we call this hubris. Perhaps it’s comforting to believe that disasters are the result of some fixable “fatal flaw,” and not an inevitable part of the unfolding of history.
To say there are limits to technological progress—we can’t prepare ourselves completely for the unforeseen—is not to say that progress is impossible, but that progress is tightly coupled with disaster. As the French cultural theorist Paul Virilio famously said, “The invention of the ship was also the invention of the shipwreck.” Not until we experience new forms of disaster can we understand what it is we need to prevent. Overreliance on the explanatory power of hubris is itself a form of hubris, a meta-hubris. And without hubris pushing us, however blinkered, forward, would there be any progress at all? Don’t we need hubris to enable and justify advances in technology? NASA seems to take hubris in stride; they see occasional disaster as the fair cost of spaceflight.
In his “Letter from Birmingham Jail,” Martin Luther King, Jr., warned of “the strangely irrational notion that there is something in the very flow of time that will inevitably cure all ills.” You could say the same of technological progress; it is tempting to believe that progress occurs on a linear curve, that eventually all problems will be solved, and all accidents will be completely preventable. But there’s no reason to assume that the curve of progress is linear, that the climb is ever increasing.
* * *
I want to come back to the Titanic, and some common misconceptions. One is that there were not enough lifeboats on board for frivolous reasons—because proprietors felt they would look unattractive on deck, or because they were regarded as mere symbols, serving only to comfort nervous passengers on a ship designers believed was literally unsinkable. This isn’t the case. Rather, the thinking at the time was that the safest method of rescue, in the event of an emergency, was to ferry passengers back and forth between the sinking ship and a rescue ship. Because the Titanic would sink slowly, if at all, people would actually be safer on the ship, for some time, than in a lifeboat. Therefore, the lifeboats didn’t need to accommodate the entire capacity of the ship in one go.
So why did the Titanic sink so fast? The surprising truth is that if the ship had hit the iceberg head-on, instead of narrowly missing it at the bow and then scraping along its starboard side, it would not have sunk. The ship was capable of sustaining major damage from an impact like an iceberg—it could have stayed afloat with the first four of its sixteen watertight compartments flooded. But the iceberg tore into the ship in such a way that five compartments were breached. This event was not, realistically, foreseeable; no iceberg in history had done that kind of damage to a ship, and none has done that kind of damage since. It was, in essence, a freak accident.
There are echoes of this in the World Trade Center’s collapse. It’s well-known that the buildings were designed to survive the impact of an airplane. However, the engineers were envisioning emergencies like a small, slow-flying plane hitting one of the towers by accident—in fact, a bomber flying in near-zero visibility had hit the Empire State Building in 1945—not a modern jet being flown purposely into a tower at top speed. Still, there was a false sense of security. After the first impact, the PA system in the building told people to remain at their desks when of course they should have been evacuating. Some building staff also told workers it would be safer to stay where they were.
Is this hubris, or something else? Disasters always feel like a thing of the past. We want to believe that better technology, better engineering will save us. That the more information we have, the safer we can make our technology. But we can never have all the information. In creating new technology to address known problems, we unavoidably create new problems, new unknowns. Progress changes the parameters of possibility. This is something we strive for—to innovate past the event horizon of what we can imagine. And with so much that is inaccessible, opaque, and in flux, we can’t even hold on to what we already know.
* * *
As they stepped out of the lunar module and began their moon walk, Neil Armstrong said to Buzz Aldrin, “Isn’t that something! Magnificent sight out there.” Aldrin’s cryptic, poetic response was “Magnificent desolation.” I think of this quote when I see footage of disasters. Especially after years of buffer, years of familiarity, have lessened the sting, it’s easy to see these events as, in their way, magnificent. Magnificent creations beget magnificent failures. It is awesome that we built them; it was awesome when they fell. Horror and awe are not incompatible; they are intertwined.
Is it perversity or courage that allows some people to admit to survivor’s thrill? On the afternoon of September 11, I remember meeting my then boyfriend on campus for lunch. He was a contrarian type, but his reaction still disturbed me—he was visibly giddy, buzzed by the news. It’s not that I don’t believe other people were excited, but no one else had revealed it. In 2005, before the levees broke in New Orleans, a friend of mine asked if I wasn’t just a little bit disappointed that Hurricane Katrina hadn’t turned out as bad as predicted. Just hours later, she regretted saying it.
Often, when something bad happens, I have a strange instinctual desire for things to get even worse—I think of a terrible outcome and then wish for it. I recognize the pattern, but I don’t understand it. It’s as though my mind is running simulations and can’t help but prefer the most dramatic option—as though, in that eventuality, I could enjoy it from the outside. Of course, my rational mind knows better; it knows I don’t want what I want. Still, I fear this part of me, the small but undeniable pull of disaster. It’s something we all must have inside us. Who can say it doesn’t have influence? This secret wish for the blowout ending?
2016
DOOMSDAY PATTERN
On May 31, 1945, U.S. Secretary of War Henry Stimson called a meeting of experts to advise President Harry Truman on the atomic bomb: Should we use it or not? J. Robert Oppenheimer, the scientist heading the Manhattan Project, was asked to explain the difference between the new bombs and the firebombs already in use. That spring, General Curtis LeMay had been firebombing Japan with napalm, a highly flammable and “sticky” mixture of gasoline and gelling agents. Almost a million people in sixty cities were “scorched, boiled, and baked to death,” in LeMay’s own words, in these napalm raids. It must have been hard to believe that the A-bomb could be dramatically more deadly—so what would it accomplish?
Oppenheimer’s response was that anything living within two-thirds of a mile of the atomic bomb’s blast site would be irradiated, and further, the appearance of the explosion would have its own impact. The meeting notes read: “It was pointed out that one atomic bomb on an arsenal would not be much different from the effect caused by any Air Corps strike of present dimensions. However, Dr. Oppenheimer stated that the visual effect of an atomic bombing would be tremendous.”
At the time, this was purely theoretical. But six weeks later, Oppenheimer was present for the Trinity test, the first detonation of a nuclear weapon, in the desert of New Mexico. On that early morning of July 16, 1945, after an incredibly bright explosion (witnesses without eye protection were temporarily blinded), the light turned white, then red, then purple. This “purple luminescence,” the effect of ionized atmosphere, smelled like a waterfall. The physicist Robert Serber said that “the grandeur and magnitude of the phenomenon were completely breathtaking.”
The people who worked on the bomb understood that some of its power was symbolic—that the difference between nuclear warfare and previous classes of weaponry was partly aesthetic. Stimson even worried that the power of the symbol might be lost if the bomb were dropped on an already devastated country. He wrote in his diary, “I was a little fearful that before we could get ready the Air Force might have Japan so thoroughly bombed out that the new weapon would not have a fair background to show its strength.” But Oppenheimer was right about the tremendous effect. The bombs the United States dropped on Hiroshima and Nagasaki felt qualitatively different, even if, in the end, the death toll didn’t match that of the firebombs. As Laurens van der Post, then a prisoner of war in Japan, said, there was “something supernatural” about the atomic blasts.
I’ve often heard that the residents of Hiroshima were warned about the bomb—that the military dropped leaflets on the city instructing them to evacuate. This is something of a myth. The warnings were vague and not specific to any particular city; LeMay had been dropping leaflets with lists of possible bomb targets for weeks. Although the people of Hiroshima were preparing for attack, they had expected more firebombing and were clearing out fire lanes. They heard air-raid sirens on the morning of August 6, but they heard those every morning. They were not prepared for an entirely new kind of weapon, and the new kind of terror it would bring. As M. Susan Lindee puts it in Suffering Made Real: American Science and the Survivors at Hiroshima, “They had been eating an orange, working in a garden, or reading a book. Minutes later they wandered, without feeling, past corpses, neighbors trapped in burning mounds of rubble, or children without skin.”
The Japanese word for the survivors of the bombings at Hiroshima and Nagasaki is hibakusha. This is not the word for “survivor.” It is usually translated as “bomb-affected people” or “explosion-affected persons”—a euphemism, almost politically correct. They avoid the more direct term seizonsha (“survivors”) because, as John Hersey writes in Hiroshima, “in its focus on being alive it might suggest some slight to the sacred dead.”
This sounds well intentioned, but for all its sensitivity toward the departed, the term in practice placed a stigma on the living, who were feared and considered unclean. The Wikipedia page for hibakusha shows a woman with black cross-hatchings on her back and arms—the pattern of the kimono she was wearing burned into her skin. The hibakusha were not inclined to identify themselves as such because it made them less employable and marriageable. There was little financial incentive either, since the Japanese government didn’t offer the victims health care or other compensation until 1957.
I read Hiroshima in junior high, and the detail I always remembered most clearly from Hersey’s account of the hibakusha was that their eyeballs melted. Those words, that image. I have remembered and re-remembered it so many times—their eyeballs melted—that I started to think it was a false memory, an invention of my imagination. It seems possible only as a metaphor, but it isn’t. Hersey writes:
On his way back with the water, he got lost on a detour around a fallen tree, and as he looked for his way through the woods, he heard a voice ask from the underbrush, “Have you anything to drink?” He saw a uniform. Thinking there was just one soldier, he approached with the water. When he had penetrated the bushes, he saw there were about twenty men, and they were all in exactly the same nightmarish state: their faces were wholly burned, their eyesockets were hollow, the fluid from their melted eyes had run down their cheeks. (They must have had their faces upturned when the bomb went off; perhaps they were anti-aircraft personnel.)
This passage informed my entire conception of war. For decades, I have found it difficult to accept that the bombs were necessary. The logical argument has trouble competing with the emotional impact of that etched-in detail.
Now, in its one-sidedness, the little yellow paperback with a red sun on the cover has the whiff of propaganda—but propaganda about what? Is it against nukes or war in general? Was the war necessary? Chillingly, I’ve had the same feeling, that I’m looking at propaganda, in Holocaust museums. How are we to compare these two horrors, if it’s even possible? Am I supposed to choose sides?
Reading about the Hiroshima and Nagasaki attacks, I see propaganda everywhere—Axis or Allies, pro- or anti-war. The persistent belief that the cities were warned—isn’t that American propaganda? A kind of victim-blaming, as in, they had their chance to escape? In the month before the attacks, Truman wrote in his diary (I’m almost touched that these men of war kept diaries):
Even if the Japs are savages, ruthless, merciless and fanatic, we as the leader of the world for the common welfare cannot drop this terrible bomb on the old Capitol or the new … The target will be a purely military one and we will issue a warning statement asking the Japs to surrender and save lives. I’m sure they will not do that, but we will have given them the chance.
This reads like rationalization, like self-propaganda: They deserve it, even if they don’t deserve it. We can’t do it, but we will. Later, after the bombing on August 6, Truman would say over the radio, “It is an awful responsibility that has come to us. Thank God it has come to us instead of our enemies, and we pray that He may guide us to use it in His ways and for His purposes.” When the journalist Wilfred Burchett visited Hiroshima in September 1945, he described the symptoms of acute radiation sickness (severe nausea, vomiting, and diarrhea; swollen, bleeding tissue; hair loss) and called it “atomic plague.” American scientists thought this was Japanese propaganda; they believed that if you were close enough to be irradiated, you’d be dead.
In 1980, The New York Review of Books published a letter to the editors and a response to that letter under the title “Was the Hiroshima Bomb Necessary?” In 1981, Paul Fussell wrote that it was “surely an unanswerable question.” This was in an essay first published in The New Republic as “Hiroshima: A Soldier’s View,” which Fussell later retitled “Thank God for the Atom Bomb.” It is written largely as a response to the “canting nonsense” of moralists “who dilate on the special wickedness of the A-bomb droppers.” Fussell notes that most of the people who feel the use of the atomic bomb was wrong were not lined up for combat in Japan, as he was (“the farther from the scene of horror, the easier the talk”), and goes to some lengths to disabuse the reader of any idea that wartime in the pre-nuclear era was less horrific. He describes marines “sliding under fire down a shell-pocked ridge slimy with mud and liquid dysentery shit into the maggoty Japanese and USMC corpses at the bottom, vomiting as the maggots burrowed into their own foul clothing.” He quotes Glenn Gray, a veteran and author, who wrote, “When the news of the atomic bombing of Hiroshima and Nagasaki came, many an American soldier felt shocked and ashamed.” Fussell’s response: “Shocked, OK, but why ashamed? Because we’d destroyed civilians? We’d been doing that for years.” (In Errol Morris’s film The Fog of War, Defense Secretary Robert McNamara, who helped plan the firebombing strategy, said that General Curtis LeMay once told him that if the United States had lost the war, he and the others involved would have been prosecuted as war criminals. McNamara wondered, “But what makes it immoral if you lose and not if you win?”)
Fussell’s aim in writing this provocative essay, he later explained, was “to complicate, even mess up, the moral picture,” which he felt had been oversimplified by the “historian’s tidy hindsight.” He quotes various men of war: John Fisher, British admiral of the fleet: “Moderation in war is imbecility.” Sir Arthur Harris, marshal of the Royal Air Force: “War is immoral.” General William Tecumseh Sherman: “War is cruelty, and you cannot refine it.” Louis Mountbatten, admiral of the fleet: “War is crazy.” General George S. Patton: “War is not a contest with gloves. It is resorted to only when laws, which are rules, have failed.” If we follow these arguments, there can be no war crimes—war is war, and the only objective is to kill more of them than they kill of us. War must be total.
However, Fussell’s arguments seem to follow from a premise that he does not complicate or question: “The purpose of the bombs was not to ‘punish’ people but to stop the war.” In this, he says, they were effective; they prevented a land invasion that might have killed him, an American. But it’s far from an undisputed point. Eisenhower thought nuclear weapons were “completely unnecessary,” and a postwar analysis by the U.S. Strategic Bombing Survey determined that Japan would have surrendered anyway, with or without the bombs and even without an invasion. Craig Nelson writes in The Age of Radiance, “They had lost sixty cities; Hiroshima and Nagasaki were just numbers sixty-one and sixty-two. If they hadn’t given up after losing Tokyo, after all, they certainly wouldn’t because of Nagasaki.” There is evidence that the bombs were used in part to justify their enormous cost, as well as to send a signal to the USSR—quite the power move.
Nukes, like poison gas, can end up killing your own men and not just your enemies, depending on which way the wind blows, so they don’t make very good weapons of war. But in their theoretical potency, their supernatural mystique—Russia’s Tsar Bomba has the force of 3,333 Little Boys, and there are more than enough nukes in existence to destroy all life on this planet—they work very well as weapons of fear. They function as a threat: of punishment by annihilation. As such, Nelson claims, the atomic bombs “did not signal the end of World War II” but “the start of the Cold War.”
The coldness of a cold war depends on reciprocal threat, the idea that mutually assured destruction will act as a deterrent against either side actually deploying the weapons. If the system works, the nuclear weapons stay symbols, and we all agree to live in constant low-level fear: the pre-apocalypse. But this system of shared risk, known as “brinkmanship” (as in a willingness to go to the brink of war), works only if both parties are rational—if the “adversary is not suicidal,” as Evan Osnos wrote in a 2017 piece in The New Yorker, “The Risk of Nuclear War with North Korea.” It’s not clear, however, that either adversary in this case, Donald Trump or Kim Jong Un, is rational. North Korea has been unsure how to interpret Trump’s aggression. Osnos’s guide, Pak Song Il, said, “He might be irrational—or too smart. We don’t know.”
When Osnos asked Pak if his country was really prepared for the possibility of nuclear attack, Pak seemed unfazed:
“A few thousand would survive,” Pak said. “And the military would say, ‘Who cares? As long as the United States is destroyed, then we are all starting from the same line again.’” He added, “A lot of people would die. But not everyone would die.”
Can any game theory of war account for this level of befogged daredevilism? It throws a wrench in the works of brinkmanship if mutually assured destruction is seen as a point in nuclear warfare’s favor.
* * *
I must have been profoundly uninterested in the news as a child, because I have no direct memory of two major events of 1986 that now obsess me: the Challenger explosion and Chernobyl. Chernobyl is by general consensus the “worst” nuclear disaster in history. But what does “worst” mean? One assumes this is measured by the number of casualties, or perhaps some combination of the number of casualties and the cost of the damage. But you learn very quickly when you’re reading about nuclear disasters that it’s difficult to fact-check anything; sources contradict each other and even self-contradict. This is understandable when you consider that the nuclear energy industry was born out of the nuclear weapons industry; the Cold War military complex was naturally prone to secrecy, and the energy industry inherited a serious PR problem.
I have read over and over that Chernobyl was the worst/biggest/deadliest nuclear accident ever (of course not counting the bombings in Japan, which were intentional), but according to The Age of Radiance, it was “merely the fourteenth most lethal nuclear accident in USSR history”—Nelson cites, for example, a 1957 accident at a plutonium plant in the Urals that irradiated 270,000 people and 14,000 square miles of land. The other thirteen were kept under wraps until glasnost. I had difficulty finding more information on these deadlier-than-Chernobyl accidents. Our cultural calculus on what constitutes the “worst” disasters must include how much publicity they get.
In any case, the people most affected by Chernobyl were not aware of those earlier incidents; they had been told, and believed, that nuclear power was safe. In fact, the Chernobyl accident resulted from a series of mistakes made during a safety test. The plant operators were trying to determine if the plant could function properly in a power loss due to a nuclear attack. Unfortunately, Soviet nuclear plants at the time were designed to double as production facilities for weapons-grade plutonium, so instead of the usual containment shell designed to protect the environs from radiation leaks in a worst-case scenario, they had a removable lid that facilitated fuel changes.
When Chernobyl exploded, workers at the plant and in the nearby town of Pripyat experienced something very like the Trinity test: a purple-and-pink glow in the sky; a fresh, clean scent like ozone. “It was pretty,” one witness said. They went out and watched it from their balconies like an L.A. sunset. If they were quite close, they tasted something metallic. You see this in reports of radiation exposure—a taste of metal, like tinfoil, or in one case, “a combination of metal and chocolate.” Cancer patients receiving radiotherapy describe the same sensation. It’s not the flavor of waves or particles but a phantom taste—a sign of nerve damage.
Workers who realized what had happened called their wives and provided instructions: Swallow iodine, wash your hair, wipe down the counters, throw out the rag. If there’s laundry drying, put it back in the wash. (I was surprised to learn that some radiation is superficial; you can wash it away.) But those workers themselves and, later, the many, many soldiers and volunteers who were called on to put out the fires and clean up the accident—known as the liquidators—absorbed obscene amounts of radiation. Some could work for only forty seconds at a time before reaching the lifetime limit of exposure. According to The Chernobyl Nuclear Disaster, a textbook-like account by W. Scott Ingram, Chernobyl’s director, Viktor Brukhanov,
realized then that his closest assistants were in a condition of shock. He became even more alarmed when he asked a health worker that he encountered to take a reading of the radioactivity in the atmosphere. The instruments measured radioactivity in units called rems. A reading of 3.6 rems was considered high. The health worker told Brukhanov that the needle went off the dial at 250 rems. In other words, most of the people in the building and on the grounds had received deadly doses of radiation.
The cleanup workers faced a troublesome choice: The protective clothing was so heavy that it made them move slowly, and it was hard to get in and out of the site of the accident quickly. Many simply didn’t wear it.
The people in the zone of evacuation were incapable of processing the disaster. Life had not given them the training. In Chernobyl Prayer, Svetlana Alexievich collects three hundred pages of testimony from survivors—the hibakusha of Chernobyl, the “Chernobyl people.” (“You’ve got a wife, children. A normal sort of guy. And then, just like that, you’ve turned into a Chernobyl person.”) In a chapter titled “The author interviews herself on missing history and why Chernobyl calls our view of the world into question,” Alexievich writes:
The night of April 26, 1986. In the space of one night we shifted to another place in history. We took a leap into a new reality, and that reality proved beyond not only our knowledge but also our imagination. Time was out of joint. The past suddenly became impotent, it had nothing for us to draw on; in the all-encompassing—or so we’d believed—archive of humanity, we couldn’t find a key to open this door. Over and over in those days, I would hear, “I can’t find the words to express what I saw and lived through,” “Nobody’s ever described anything of the kind to me,” “Never seen anything like it in any book or movie.”
When a tsunami rises over a city, or a plane flies into a skyscraper, we say it’s “just like a movie.” This suggests that disaster movies help us process disaster—it’s the only exposure most of us get, outside of news clips, to deadly spectacles. There’s no script or template for a novel disaster.
In Survivor Café, Elizabeth Rosner notes, “When I ask Holocaust survivors to tell me their stories, I notice them flinch at the word. It’s as though ‘story’ implies something invented, a fairy tale.” Chernobyl Prayer does not feel like a collection of stories, with structures imposed retroactively. It is simply people talking, relating their experience. Many speak of their fondness for jokes: “I don’t like crying. I like hearing new jokes.” Here’s a good one:
There’s a Ukrainian woman sells big red apples at the market. She was touting her wares: “Come and get them! Apples from Chernobyl!” Someone told her, “Don’t advertise the fact they’re from Chernobyl, love. No one will buy them.” “Don’t you believe it! They’re selling well! People buy them for their mother-in-law or their boss!”
The Chernobyl people don’t like to dwell:
I was struck by the indifference with which people talked about the disaster. In one dead village, we met an old man. He was living all alone. We asked him, “Aren’t you afraid?” And he answered, “Of what?” You can’t be afraid the whole time, a person can’t do that; some time goes by, and ordinary life starts up again.
There are dozens of comparisons to war—it was the closest available analogue. (Nuclear accidents are usually spoken of in terms of “Hiroshimas.” It’s become a unit of measurement.) “We’d grown used to the idea that danger could only come from war.” “It was a real war, an atomic war.” “Just like in 1937.” “Instead of assault rifles they gave us spades.” “Is this what nuclear war smells like? I thought war should smell of smoke.” “They call it ‘an accident,’ ‘a disaster,’ but it was a war. Our Chernobyl monuments resemble war memorials.”
If it was a war, it was a war with no clear enemy: “To answer the question of how we should live here, we need to know who was to blame. Who was it? The scientists, the staff at the power plant? Or was it us, our whole way of seeing the world?” Another:
At first, it was baffling. It all felt like an exercise, a game.
But it was genuine war. Nuclear war. A war that was a mystery to us, where there was no telling what was dangerous and what wasn’t, what to fear and what not to fear. No one knew.
When Wilhelm Röntgen discovered X-rays in 1895, he named them X because they represented the unknown. This gets at what was, and is, so uncanny about radiation: You can’t see it; you can only see its effects. One cameraman sent to film the scene at Chernobyl after the fact said, “It wasn’t obvious what to film. Nothing was blowing up anywhere.” But some people, it seems, are immune to this fear of the unseeable; they refused to evacuate or later returned to the contaminated land, the zone of exclusion, because, one said, “I don’t find it as scary here as it was back there.” They chose contamination over exile, the invisible over the visible threat. “This threat here, I don’t feel it. I don’t see it. It’s nowhere in my memory. It’s men I’m afraid of. Men with guns.”
As a whole, Alexievich’s book is stunning, but difficult to take. It is bookended with two long monologues from women who lost the loves of their lives in the accident. They watched their husbands become unrecognizable: “His nose got somehow out of place and three times bigger, and his eyes weren’t the same anymore. They moved in opposite directions.” He begs for a mirror and she refuses. “I just didn’t want him to see himself, to have to remember what he looked like.” The other woman was pregnant, and while she sat at her husband’s bed in the hospital, the baby inside her absorbed radiation “like a buffer.” It was born two weeks early and died within four hours.
In their gut-wrenching grief, these monologues remind me of Marie Curie’s mourning journal:
They brought you in and placed you on the bed … I kissed you and you were still supple and almost warm … Oh! How you were hurt, how you bled, your clothes were inundated with blood. What a terrible shock your poor head, that I had caressed so often, taking it in my hands, endured. And I still kiss your eyelids which you close so often that I could kiss them, offering me your head with the familiar movement which I remember today, which I will see fade more and more in my memory.
Pierre Curie, severely weakened from radiation exposure, had fallen in the street and had his head run over by a horse and carriage.
In 1989, a group of journalists from the Chugoku Shimbun, a newspaper based in Hiroshima, began writing a series of global investigative reports, now collected in a book called Exposure: Victims of Radiation Speak Out. One of these reports quotes an antinuclear activist in the Soviet Union: “All the radiation sufferers of the world have to unite!” Another details the phenomenon of “radiophobia”: “To describe the state of mind whereby a person becomes paranoid about radiation and its effects, the Soviet media often uses the word radiophobia. It expresses the feelings of the Soviet people, who are torn between the truth as told to them by the government, and the rumors they hear through unofficial channels.”
In the same way that the Japanese concept of hibakusha can be extended to survivors of any nuclear accident or attack, Soviet-style radiophobia can be found anywhere with nuclear power. Conspiracy theories bloom around nuclear technologies because there is so much misinformation and conflicting information. Paul Fussell supported the use of the atomic bomb—and he repeats the received notion that Hiroshima was properly warned before the attack—but not nuclear energy, or “the capture of the nuclear-power trade by the inept and the mendacious (who have fucked up the works at Three Mile Island, Chernobyl, etc.).” Is this radiophobia? Long-term studies of survivors in Japan, Ukraine, and Belarus have shown that ranges of exposure previously thought to be highly dangerous are only slightly dangerous—with incidences of cancer perhaps 5 percent higher than the normal population. (We all have some exposure to radiation through daily living, not just from X-rays but from ordinary activities like eating bananas or taking a walk.) The people most at risk in a nuclear disaster, it turns out, are emergency workers and children, who are especially prone to thyroid cancers.
But this doesn’t tell us much about the psychological effects of exposure. After Chernobyl, many were diagnosed with “panic disorder” or something called “vegetovascular dystonia,” terms that, like “hysterical,” seem like little more than dismissals, euphemisms for “crazy.” A report by the International Atomic Energy Agency supposed that “the designation of the affected population as ‘victims’ rather than ‘survivors’ has led them to perceive themselves as helpless, weak and lacking control over their future.” The nuances of the terminology reflect degrees of stigma, but they influence stigma too—the names we give to people’s discomfort affect how uncomfortable those people make us.
Other studies have shown elevated stress levels in populations exposed to nuclear accidents even years later (for example, those living near Three Mile Island). Hersey, when he visited Hiroshima again forty years after the bombing, described a “lasting A-bomb sickness” marked by weakness, fatigue, dizziness, digestive problems, and “a sense of doom.” We know that chronic stress is bad for the body; it can lead to heart disease, diabetes, immune disorders, any number of conditions that might kill you. Is that not real? Is fear not real? There’s a tension in the literature around nuclear disasters, between the need to accurately describe them as they are—a kind of nightmare sublime—and to balance out “hysteria” by presenting cold “facts.” It’s difficult to reconcile the horror of nuclear disasters with our ability to move on. Where “Chernobyl people” tell jokes, the Japanese say “Shikata ga nai”—roughly, “It can’t be helped.”
I want to talk about the hibakusha without succumbing to fearmongering and nuclear phobia. So here’s a fact: There have been many more deaths, orders of magnitude more, from accidents in the fossil-fuel industries than in nuclear energy. But I can’t think of a particular accident with as much disaster capital as Chernobyl. In 2010, there were “the 33,” the trapped coal miners in Chile, but they all survived and became heroes.
Why are some deaths more horrifying than others?
* * *
In the spring of 2013, I often drove north on Route 93 in Colorado from Golden to Boulder. It’s a gorgeous, hilly route, through yellow-green grassy fields, with misty blue mountains on your left, but dangerous in the snow; I know someone who totaled their car on that road. It snowed a lot that spring, to a maddening degree, once or twice a week right up through the end of May, but the snow made it even more gorgeous and misty, and sometimes I saw herds of antelopes.
About midway between Golden and Boulder, you pass Rocky Flats on your right. This area housed a plant that made plutonium triggers for nuclear bombs; it closed in 1992. What was the plant is now a Superfund site. The plant’s first accident occurred on September 11, 1957 (the same year as that mysterious accident in the Urals), with another major and nearly catastrophic fire on Mother’s Day in 1969. Waste was found to be seeping into open fields. In 1970, airborne radiation was detected in Denver. But the unsafe conditions continued for years, until informants tipped off the EPA and FBI, triggering a raid in 1989. Where was our glasnost? The hibakusha of Colorado filed a lawsuit, but after twenty years it was denied.
Until recently, there was a bar across the street from the site called the Rocky Flats Lounge—a truly great bar, kind of a cowboy bar, with an open back so you could watch the sun set over the mountains to the west. There were horses in view. They had karaoke on the weekends, and I once heard the bartender, a woman, sing a devastating version of “Fake Plastic Trees.” They sold T-shirts and tank tops bearing mushroom clouds and the words I GOT NUCLEAR WASTED AT ROCKY FLATS. The bar is now permanently closed; it kept catching on fire.
I’m telling you this because I keep thinking about it. I keep thinking about Hurricane Irma; there were upward of eighty Superfund sites in its path. What will become of them? The EPA is being dismantled. I keep thinking about Fukushima, the new hibakusha it created. Japan sees earthquakes and tsunamis all the time; they have a culture of disaster preparedness. But preparation takes time. Before 2011, most seismologists believed that earthquakes with magnitudes of higher than 8.4 weren’t possible in Japan. Climate change accelerates natural disasters. Earthquakes can trigger tsunamis and volcanic eruptions, and volcanic eruptions can in turn trigger earthquakes and tsunamis; global warming leads to increases in all three. You can’t prepare for the worst-case scenario when the scenario keeps getting rapidly worse.
After we talked on Twitter about the Cold War, a writer I know named Michael Farrell Smith sent me a link to a lyric essay he wrote that includes this snippet of faux dialogue, an apt depiction of life in the pre-apocalypse:
Q. Could you talk about the Challenger explosion in the context of the Cold War?
A. I … guess so. Well, all those shuttles were a product of the U.S./U.S.S.R. space race, for one thing. And after the Challenger, when Chernobyl exploded and burned three months later, it felt as if some doomsday pattern was beginning. Everything was going to explode. Nothing was safe. Of course, I was just a kid, and what did I know, the world’s more-or-less fine.
I feel this way all the time now. Nothing is safe. Everything’s fine.
2017
Copyright © 2020 by Elisa Gabbert