weapon, they had “invented modern mathematical modeling.” But they needed the machines to make it practical.28
Immediately after the encounter on that station platform, von Neumann used his authority as a top-flight scientific adviser to the war effort to jump into this nascent and obscure computer project and promote its development. By June 1945 he had written a 101-page paper that became “the technological basis for the worldwide computer industry.” He started designing and building a new prototype computer in Princeton at the Institute for Advanced Study.
But to what to apply this new tool? Von Neumann identified “the first great scientific subject” for which he wanted to use this newly discovered computer power: “the phenomena of turbulence,” or, put more simply, forecasting the weather. He recognized the similarities between simulating atomic explosions and making weather predictions; both were nonlinear problems in fluid dynamics that needed vast amounts of computation at breakneck speed.29
The complexity of the weather cried out for the rigorous mathematical analysis that von Neumann loved and that only the computer made possible. The strategic significance made it urgent. The intellectual challenge appealed to him. He feared that the Soviets might add weather modification to their arsenal and wage “climatological warfare” against the United States. He himself gave some favorable thought to using better knowledge of the weather to “jiggle the earth,” as he put it—that is, modify the weather and create a warmer semitropical climate around the world. Frankly, he thought, people would like that.
In seeking navy funding for computing and climate studies, he argued that high-speed computing “would make weather predictions a week or more ahead practical.” He thereafter supervised the building of MANIAC—for Mathematical Analyzer, Numerical Integrator and Computer. The New York Times would call it a “giant electronic brain.”30
By 1948 the Numerical Meteorology Project was up and running. A new recruit, Jule Charney, a mathematician and meteorologist, took the lead in figuring out the mathematical formulas to conjoin climate modeling with the advances in computing. What they were trying to do was express the physical laws governing the dynamics of heat and moisture in the atmosphere in a series of mathematical algorithms that could be solved by a computer as they unfolded over time. By the early 1950s Charney and the group were producing their first computer simulations of climate. By the 1960s the Princeton initiative had morphed into the GFDL—Geophysical Fluid Dynamics Laboratory, now part of the National Oceanic and Atmospheric Administration—which became one of the leaders in developing climate-change models.31
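The core idea Charney's group pursued—restating a physical law as an update rule a computer can step forward in time—can be illustrated with a minimal sketch. This is not the actual Charney or GFDL model; it is a toy one-dimensional heat-diffusion scheme on a ring of grid cells, standing in for the vastly more complex atmospheric equations:

```python
# Hypothetical illustration, not the historical model: a finite-difference
# form of the diffusion equation dT/dt = k * d2T/dx2, advanced step by step.

def step(temps, diffusivity=0.2):
    """Advance the temperature field one time step.

    Each grid cell exchanges heat with its two neighbors on a ring
    (periodic boundary), mimicking how a numerical model "unfolds
    over time" from an initial state.
    """
    n = len(temps)
    return [
        temps[i]
        + diffusivity * (temps[(i - 1) % n] - 2 * temps[i] + temps[(i + 1) % n])
        for i in range(n)
    ]

# Start with a single warm cell and let the simulation run.
field = [0.0] * 9
field[4] = 100.0
for _ in range(50):
    field = step(field)

# Heat spreads outward; this symmetric scheme conserves the total.
print(round(sum(field), 6))
```

The point of the sketch is the structure, not the physics: a state (the temperature field), a rule derived from a physical law, and repeated application of that rule—exactly the pattern that made weather and climate problems tractable once machines could perform the arithmetic fast enough.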
Von Neumann’s quest to understand stratospheric circulation and atmospheric turbulence was giving rise to increasingly sophisticated simulations of how the global atmosphere worked—the patterns and flows by which the air moved around the world. These became known as general circulation models. They had to be global because the earth had only one atmosphere. The modelers were constantly striving to make their models more and more realistic, which meant more and more complex, in order to better understand how the world worked.
Climate modeling was very difficult, taxing, and definitely pioneering. “The computer was so feeble at the time,” recalled Syukuro Manabe, recruited to the GFDL from the meteorology faculty at Tokyo University and one of the most formidable of all the climate modelers. “If we put everything into the model at once, the computer couldn’t handle it. I was there and was watching the model blow up all the time.”
But already in 1967 Syukuro Manabe and Richard Wetherald, members of the Princeton lab, were hypothesizing, in what became a famous paper, that a doubling of CO2 would increase global temperatures by three to four degrees. They backed into the subject by accident. “I wanted to see how sensitive the model is to cloudiness, water vapor, ozone, and to CO2,” said Manabe. “So I was changing greenhouse gases, clouds . . . playing and enjoying myself. I realized that CO2 is important, as it turned out, I changed the right variable and hit the jackpot,” he continued. “At that time, no one cared about global warming . . . Some people thought maybe an ice age is coming.”
Notwithstanding his conviction that “probably this is the best paper I wrote in my whole career,” Manabe led further breakthroughs on modeling in the mid-1970s. Over the years data from satellites provided a benchmark against which to test the accuracy of the ever-more-complex models. And yet that 1967 hypothesis—that a doubling of CO2 would bring a three-to-four-degree increase in the average global temperature—would become a constant in the debate over