ONE DAY, WHEN I was a student at Christ the King Elementary School in my hometown of Richland, Wash., the nuns gathered all the kids, two by two, and walked us outside to the parking lot. There sat a mobile van emblazoned with the logo of the Atomic Energy Commission and the words “Whole Body Scanner.”
One at a time, we were led into the van, where we lay on a white-sheathed table beneath a large, scary, medical-looking machine. There was a whirring sound, and after a minute or two we were told to get up and make room for the next child. We weren’t told what the process was for, but it’s safe to assume that the government was interested in the effects of radiation on those of us who were “downwinders” from one of the nation’s largest nuclear complexes.
Richland was (and is) the bedroom community for the Hanford Nuclear Reservation. Hanford was built in the 1940s as part of the Manhattan Project, the massive wartime program that led to the atomic bombs dropped on Hiroshima and Nagasaki at the end of World War II. Hanford’s role was the production of plutonium for the world’s first nuclear weapon, the “test” bomb detonated in New Mexico a few weeks before Hiroshima, and for the bomb that destroyed Nagasaki three days after Hiroshima.
Those weapons were dropped 69 years ago, but the debate about their morality continues. It emerged again this spring when the two Missouri senators proposed renaming D.C.’s Union Station after Harry S. Truman, who authorized history’s only nuclear attack on people. One commenter in a related discussion wrote, “I have a problem with judging past cultures by today's standards. To end WWII we dropped bombs on cities filled with innocent civilians. By today's standards that would be condemned. Are you willing to say we should not have done that to end WWII?”
Even by the standards of the time, the intentional targeting of tens of thousands of civilians was a barbaric act (and unneeded militarily—Gen. Douglas MacArthur called it “unnecessary,” since “Japan was already prepared to surrender”). Before World War II, the rules of “civilized” warfare prohibited the intentional killing of civilians. Admittedly, this was an oft-violated principle; civilians have always suffered greatly in war. But the general intent was that armies fought against armies, not civilians.
Early in the war, even Hitler refrained from bombing civilians. According to a 1961 history of the Battle of Britain, Hitler ordered the Luftwaffe that “The war against England is to be restricted to destructive attacks against industry and air force targets” and that “every effort should be made to avoid unnecessary loss of life amongst the civilian population.” That soon changed, and cities and towns became targets; in one of the largest raids on London, almost 3,000 civilians died.
The Allies, for their part, engaged in massive air attacks on German cities; one of the more horrendous examples was the 1945 firebombing of Dresden, which killed an estimated 25,000 people. That March, hundreds of U.S. bombers carpeted Japan’s largest city, Tokyo, with thousands of incendiary weapons and cluster bombs carrying napalm (jelled gasoline) and white phosphorus. More than 100,000 civilians were killed, and a million more were injured. That was the context of the attacks on Hiroshima and Nagasaki, in which two bombs killed more than 200,000 people.
With the distance of time, these horrific actions will likely be viewed as among the most shameful in human history.
Vincent Harding, an eminent chronicler of the Southern freedom movement who passed away this spring, was pictured in his New York Times obituary wearing a button that proclaimed “War is Terrorism.” Given that one of the definitions of terrorism is the intentional targeting of civilians, and given the realities of modern warfare, it’s hard to argue with that statement.
Jim Rice is editor of Sojourners.
Image: Japanese children make cranes from origami to remember the children victims of the 1945 nuclear attack, Attila JANDI / Shutterstock