The tragedy of the Manhattan Project
Framing nuclear as a military-first technology has been disastrous for mankind
If you were writing a Manhattan Project movie as a work of fiction, you would tell a different story than any of the contested versions of reality.
The fictional version might go something like this: the United States starts way behind in nuclear physics, with fission having been first demonstrated in Berlin in December of 1938 and the world’s premier quantum scientist, Werner Heisenberg, leading the German effort. But Albert Einstein and Leo Szilard, two Jewish refugees from Hitler’s dictatorship, write to FDR that the United States needs to explore the creation of a fission bomb. Robert Oppenheimer, from a wealthy assimilated Jewish family, leads a team that features critical contributions at every level from exiles and emigres. They beat Heisenberg to the bomb — in part because Niels Bohr refuses to help the Nazis and in part because Hitler thinks quantum mechanics is “Jewish science” — and, because of this nuclear breakthrough, win the war.
As the movie “Oppenheimer” shows, this is more or less the narrative the Manhattan Project protagonists thought they were living out.
But they were wrong. Their work was incredibly impressive from a technical standpoint, and it did end up getting used in the war. But Germany had already surrendered, and despite the endless debates over the validity of its use on Japan, no one seriously argues that the United States would have lost the war without nuclear bombs. Instead, the causal importance of the Manhattan Project for the world was all about its influence on the postwar era. Again, as detailed in the film, the American government was initially optimistic about maintaining a monopoly on nuclear weapons, but that hope proved short-lived, in part thanks to Soviet espionage.
And what I think is really interesting is that while the Manhattan Project indisputably accelerated nuclear science, by turning nuclear research into a military-first undertaking, it probably retarded the long-run development of nuclear electricity for civilian purposes. This is the kind of thing that is inherently speculative. But as a person who loves counterfactual historical speculation, I think it’s correct — if World War II had somehow not happened, commercial nuclear reactors would have been built later but in a more sustainable way that eventually led to their more widespread adoption.
The Manhattan Project didn’t accomplish its goals
People continue to debate the morality of the decision to drop the bomb on Hiroshima and Nagasaki, but in important ways there never was a “decision” in the way that people often think about this. Harry Truman established presidential control over the use of nuclear weapons after the bombing of Nagasaki. That’s not to say he was somehow out of the loop on the earlier bombings; he always accepted responsibility for them as part of the overall ethic of “the buck stops here.”
But on an operational level, no distinction was drawn between nuclear weapons and other forms of ordnance. The United States was at war with Japan, and military commanders were empowered to deploy available weapons in order to try to defeat their enemy.
The interesting high-level policy decision was the decision to build the bombs in the first place. This was a monumental undertaking that cost roughly $2 billion at a time when annual U.S. GDP was around $100 billion. Importantly, the bulk of that money did not go to Oppenheimer and his famous team of scientists at Los Alamos. Everything you see in the movie in the New Mexico desert — the town built from scratch, the international all-star team of physicists, the testing site, etc. — accounted for about 10% of the program’s budget. The bulk of the expense was building the multiple nuclear reactors and supportive infrastructure required to manufacture the quantity of fissile material needed to build the bombs. In other words, it wasn’t just a very expensive science project (though it was that), it was also an even more expensive manufacturing program.
Having achieved success, General Leslie Groves and the even higher-level officials who authorized the program obviously weren’t going to say “well, this bomb isn’t even that useful.” They used it in hopes of ending the war faster, and in this case it worked.
But a separate question from the wisdom of using the bomb once it existed is whether it made sense ex post to have built it in the first place. A science project that wasn’t useful against Germany isn’t just disappointing from a narrative standpoint; the entire strategic concept of the United States from 1940 onward was “Europe First.” If you’d proposed a very expensive and difficult undertaking that wasn’t helpful in beating Germany, it would have been rejected. But it’s hard to predict the future, and ex ante it seemed like a good idea, given the purported threat of the Nazi Bomb. And the sheer sunk cost ended up shaping postwar history.
A world without Hitler
But what if Hitler had never come to power? One way of generating that counterfactual outcome involves assuming something different happened in World War I. Those WWI counterfactuals are near and dear to my heart, but a little wacky for these purposes.
Instead, let’s posit that the center-right Brüning Cabinet inaugurated in 1930 embarked on a program of fiscal and monetary stimulus that successfully fought the Great Depression. The likely upshot is that in the 1932 federal elections, the various non-Nazi rightist parties would have gained vote share rather than losing it. The non-Nazi right of the Weimar era was not a particularly friendly or admirable group of people, but they wouldn’t have launched a huge war, provoked an exodus of scientists from Europe, or compelled left-wing American scientists to push for a crash military physics program. Germany itself would have settled down into something resembling “normal” left-right politics, the United States would have continued to be somewhat isolationist in its outlook, and Britain and France would have remained focused on imperial issues.
In this context, nobody is looking at the early fission experiments of 1938–1939 and saying the government needs to throw money at this.
And without military funding, nuclear reactor construction proceeds much more slowly (it’s expensive!) and primarily for pure scientific research. In real history, Chicago Pile-1 was the world’s first reactor and it was built for research, but immediately after that the U.S. government started building reactors to manufacture fissile material. Counterfactually, we’d have seen multiple research reactors rolled out over a period of years. These reactors would have posed the question “is there a cost-effective way to use this technology to make electricity?” and work would have proceeded over time on trying to find positive answers to that question. And of course as is often the case in peacetime, there would have been military spinoffs as civilian technology advanced. The internal combustion engine was not developed in order to power tanks, but once ICE vehicles were around we got military designs.
Obviously there would have been a safety question with nuclear reactors. But as a technology developed for the purpose of generating electricity, they would have been regulated alongside other electricity-generating technologies — initially by state governments as part of their broader framework for public utility regulation, and then later under the regulatory umbrella of the Environmental Protection Agency. This is what I’ve described as the nuclear policy America needs: an integrated regulatory framework that doesn’t just say “nuclear power should be as safe and clean as possible” but rather that “the American energy sector should be as safe and clean as possible.” In that world, nuclear comes to crowd out coal over time because, regulated at safety parity, coal would lose its cost advantage.
Instead, though, nuclear technology was developed as a military technology with its own framework.
The atomic age
“Oppenheimer” shows but does not explain that at the end of the war, the U.S. government set up the Atomic Energy Commission to regulate nuclear technology.
Crucially, this included both civilian and military uses of nuclear technology because the two were seen as inextricably linked. One of the beats of the film is a regulatory dispute about whether we should authorize the export of certain kinds of isotopes that are useful for medical purposes or whether that would pose a risk to America’s nuclear secrets. In that case, civilian use won out, but the point is that there was a kind of presumption of military control over the whole thing.
The world’s first nuclear power station, Calder Hall in the U.K., was completed in 1956 — over a decade after the Trinity Test — and its primary purpose was manufacturing weapons-grade plutonium for the U.K. nuclear weapons program. Generating electricity was conceived as a fun co-benefit of bomb manufacturing. America’s first nuclear power station came online one year later. That reactor was not manufacturing bombs. Instead, it had been originally intended to power a nuclear aircraft carrier. But the carrier was cancelled, so they kept the reactor and used it to make electricity.
Under Eisenhower, the AEC launched Project Plowshare in 1957, a program designed to identify and promote non-military uses of nuclear technology. But this wasn’t like “we could use nuclear fission to boil water and use boiling water to make electricity.” Instead they came up with stuff like “maybe we could detonate five hydrogen bombs in Alaska to carve out an artificial port.” That idea, Project Chariot, never came to fruition due to a mix of (understandable!) local concern about radiation and the fact that nobody could really explain what the benefit of a port in a remote area of Alaska would be.
So while the AEC crew was genuinely optimistic about nuclear, the short-term imperative was the arms race with the Soviet Union, and the proximate goal of civilian use was to generate cross-subsidies to defray the arms race’s ruinous costs.
This created the institutional legacy of regulating civilian nuclear power separately from other forms of electricity generation — first by the AEC and then later by the NRC — and also shaped the trajectory of technological development. The main kind of reactor in use today is the pressurized water reactor (PWR). The PWR was originally created for use in submarines. Nuclear propulsion is very attractive in a submarine for reasons that have nothing to do with cost-effectiveness; it’s just highly desirable to have a sub that can stay underwater for extremely long periods of time. And if you need your power plant to be able to move itself, it’s important to maximize the ratio of power generated to the mass of the reactor. PWRs are good for that, boats were the first major use of nuclear power, the regulatory agency had a military-first mindset, and so the whole industry ended up standardizing around this technology.
Later, the AEC explored molten salt reactors (MSRs) as potentially useful in aviation, but came to the conclusion that this wasn’t workable. MSRs are also a worse fit than PWRs for ships because of the weight issue. For civilian purposes, though, the fact that an MSR can’t melt down seems important. Yet in our world, instead of going down that route, all the work went into building PWRs. Then safety-minded people argued they needed more regulation, and we eventually backed ourselves into a corner where our PWRs are, in fact, super-duper safe and are also too expensive to build.
Atoms for peace
Apart from the specific institutional legacy, the military-first nature of nuclear technology has framed the politics of nuclear power in a profound way.
Civilian nukes really were an adjunct to the Cold War arms race. At a time when air pollution and climate change weren’t salient, that made anti-nuclear activism a natural left-wing cause as part of broader Cold War politics. The powerful and ultimately destructive German anti-nuclear movement was against nuclear electricity, but it gained tremendous momentum from protests against Pershing missiles. Anti-nuclear activism has deep resonances with the anti-human neo-pastoralist strain of environmentalism, but if that were all there was to it, it never could have become mainstream. The idea that nuclear energy, unlike fossil fuels, is a tool of death and destruction lends credibility to the whole movement, and it has really damaged human health.
And as Alex Trembath pointed out to me, survey results suggest that people continue to strongly associate nuclear with bombs. If you specifically mention “generating electricity” as the purpose of your nuclear power plant, support for it goes up.
The susceptibility of polling results to question-wording effects also suggests that these opinions are often weakly held. Survey items about nuclear that specifically reference generating electricity consistently return higher support — a finding documented as early as the 1970s, and one the nuclear industry predictably favors in its own polling. The wording may simply cue respondents to consider a familiar benefit they value, but it may also make them less likely to conflate nuclear energy with nuclear weapons, an association that studies find to be common.
That’s in part just one of those weird-but-familiar quirks of public opinion. But I do think it’s relevant that bomb-manufacturing really was the original primary purpose of reactors and electricity was a secondary factor. Now as it happens, you can make bombs out of lots of stuff that is also useful for non-bomb purposes (fertilizer, for example). And it’s not at all unusual for a technology to be developed first for military purposes (radar, jet planes, and synthetic fabrics all from World War II) and then later adapted for civilian purposes.
But nuclear technology has a unique institutional and ideological legacy as a military-first technology. As it happens, it’s a military-first technology that thankfully has only rarely been used for military purposes. That’s in part because leaders have shown sufficient wisdom to avoid blowing the world up, but also because it’s just not that useful. Having nuclear bombs didn’t help America win in Vietnam and didn’t help the Soviets or Americans win in Afghanistan. By contrast, using nuclear fission to generate zero-emission electricity is potentially very useful but has only sporadically panned out commercially. That failure is something we need to try to dig our way out of through regulatory reform. And I also think it’s arguably the most important legacy of the Manhattan Project.