Last week, government scientists at the Lawrence Livermore National Laboratory achieved a long-sought milestone in developing clean fusion energy. For the first time, the amount of energy produced by a fusion reaction exceeded the energy required to produce it.
The press dutifully reported the news, but there has been little celebration outside of scientific circles. For most people, fusion remains a futuristic pipe dream, constantly lurking around the corner, never materializing.
There are reasons for skepticism: Few scientific endeavors have been dogged by so many dead ends and false claims. But this has blinded us to the fact that, disappointments aside, scientists have been making slow but steady progress on fusion far longer than many people realize.
The ideas behind fusion originated with a paper delivered by British astrophysicist Arthur Eddington at a conference held in 1920. A devout Quaker and brilliant scientist, Eddington ventured an answer to an age-old question: How do stars like the sun generate energy?
He speculated that the immense pressure inside stars fused hydrogen atoms together, creating helium. This “fusion” converted some of the original matter into raw energy. As Eddington put it: the stars’ “sub-atomic energy is…freely used to maintain their great furnaces…”
Eddington admitted to his listeners that he was more or less spitballing. But everything he said that day proved eerily accurate, including his warning that control of this latent power could be used for the benefit of the human race — “or its suicide.”
In the 1930s, physicist Ernest Rutherford and two collaborators began conducting experiments with a heavy isotope of hydrogen known as deuterium. In 1934, the team slammed deuterium atoms together, turning the isotope into helium while simultaneously producing what they described as “an enormous effect” — a blast of energy.
This was fusion in miniature. Four years later, German physicist Hans Bethe figured out the precise subatomic sequence of events that undergirds the process. That same year, two young scientists read Bethe’s article on the subject and resolved to put his ideas into practice.
The eccentric duo, Arthur Kantrowitz and Eastman Jacobs, worked at a government research facility focused on aircraft performance. Building a fusion reactor had nothing to do with their jobs, so they dubbed their creation a “Diffusion Inhibitor,” a vague but pretentious phrase that deterred superiors from asking too many questions.
Their design, foreshadowing later developments, featured a metal doughnut, or “torus,” lined with magnets designed to contain and control the reaction. Lasers hadn’t been invented, so they opted for radio waves to superheat the hydrogen. This consumed so much power that they had to conduct experiments at night to avoid taking down the power grid.
Ultimately, they flipped the switch and nothing happened. Not long afterward, their superiors caught on and shut down the project. No one realized it at the time, but the pair had come remarkably close to building the first fusion reactor, save for some flaws in the containment structure.
It wasn’t until after World War II that scientists resumed work on fusion, all too aware of its speculative nature. James Tuck, a British physicist who had cut his teeth working on the Manhattan Project, designed an early fusion reactor he dubbed the “Perhapsatron,” because “perhaps it will work and perhaps it will not.”
Far less amusing was an episode that helps explain the longstanding skepticism of the new technology. In the late 1940s, Argentina’s populist dictator, Juan Domingo Perón, funded the fusion research of an obscure Austrian scientist named Ronald Richter. In 1951, Perón proudly announced that Richter — who had close ties to former Nazis — had created the world’s first fusion reactor. Subsequent scrutiny unmasked Richter’s research as fundamentally flawed, if not fraudulent.
The following year, however, two developments underscored why fusion could not be ignored. First came news that the United States had detonated the world’s first hydrogen bomb — effectively, an uncontrolled fusion reaction — reviving the suicide-of-the-human-race problem that Eddington originally identified.
No less consequential was the work of theoretical physicist Lyman Spitzer at Princeton University on how to control the superheated gas, or plasma, at the heart of the fusion reactor. This state of matter is like a subatomic orgy, where atomic nuclei and electrons, formerly monogamous, promiscuously mingle. In order to contain the chaos, Spitzer sketched out a figure-eight apparatus he called the stellarator.
An avid mountaineer, the physicist christened his work Project Matterhorn on account of the long and arduous climb he foresaw ahead. Through the 1950s, Spitzer and his collaborators built a series of prototypes that marked a giant leap forward. At the same time, a group of Soviet physicists led by Andrei Sakharov and Igor Tamm developed their own design, known as a tokamak, a Russian acronym for a toroidal chamber with magnetic coils — yet another magnetic doughnut, or torus.
So began a new phase in fusion research as scientists built ever larger stellarators and tokamaks. From the late 1950s onward, fusion moved from a theoretical, fanciful concept to something concrete. Unfortunately, these advances also led flamboyant promoters to get ahead of themselves, imagining a future defined by inexpensive, limitless power.
Typical of the genre was a breathless article in Popular Mechanics in 1959, “Fusion Power for the World of Tomorrow,” predicting, “It may come sooner than you think!”
This hype proved damaging as well as unrealistic. Many commentators from the 1960s onward became increasingly disenchanted with fusion. Though the energy shortages of the 1970s brought more funding and renewed hopes, the results inevitably fell short, bolstering the cynical view.
Lost in all the hoopla was the fact that scientific teams around the world continued to make slow but steady progress on turning fusion into a reality, gradually solving the technical challenges associated with containment while producing ever larger bursts of energy.
These piecemeal advances, not especially eye-catching when viewed in isolation, were overshadowed by failures and frauds like the infamous “cold fusion” controversy of 1989, when two researchers erroneously claimed they had created a stable fusion reaction at room temperature.
Fusion skeptics also delighted in pointing out that decades of research had never managed to achieve a so-called “net-energy gain.” Every time researchers whipped hydrogen isotopes into a frenzy, they ended up with less energy than they put in.
That’s why last week’s announcement is so critical. No, it doesn’t mean commercialization is imminent. But after many decades of trying, researchers have finally achieved a critical milestone in their quest to develop fusion power, bringing the world considerably closer to the vision Arthur Eddington first articulated more than a century ago.
Stephen Mihm, a professor of history at the University of Georgia, is coauthor of “Crisis Economics: A Crash Course in the Future of Finance.”