During World War II, German troops occupying the Netherlands banned the export of food and coal to its northern parts. Trains were stopped and roads blockaded. Travel on the waterways came to a halt. The cranes, ships, and quays of the port of Rotterdam were blown up with explosives, leaving a “tortured and bleeding Holland,” as one radio broadcaster described it.
Heavily crisscrossed by waterways and barge traffic, Holland was not just tortured and bleeding. She was also hungry. Amsterdam, Rotterdam, Utrecht, and Leiden depended on regular transportation of supplies for food and fuel. By the early winter of 1944, wartime rations reaching the provinces north of the Waal and Rhine Rivers had dwindled to a bare trickle, and the population edged toward famine. By December, the waterways were reopened, but now the water was frozen. Butter disappeared first, then cheese, meat, bread, and vegetables. Desperate, cold, and famished, people dug up tulip bulbs from their yards, ate vegetable skins, and then graduated to birch bark, leaves, and grass. Eventually, food intake fell to about four hundred calories a day—the equivalent of three potatoes. A human being “only [consists of] a stomach and certain instincts,” one man wrote. The period, still etched in the national memory of the Dutch, would be called the Hunger Winter, or Hongerwinter.
The famine raged on until 1945. Tens of thousands of men, women, and children died of malnourishment; millions survived. The change in nutrition was so acute and abrupt that it created a horrific natural experiment: as the citizens emerged from the winter, researchers could study the effect of a sudden famine on a defined cohort of people. Some features, such as malnourishment and growth retardation, were expected. Children who survived the Hongerwinter also suffered chronic health issues: depression, anxiety, heart disease, gum disease, osteoporosis, and diabetes. (Audrey Hepburn, the wafer-thin actress, was one such survivor, and she would be afflicted by a multitude of chronic illnesses throughout her life.)
In the 1980s, however, a more intriguing pattern emerged: when the children born to women who were pregnant during the famine grew up, they too had higher rates of obesity and heart disease. This finding too might have been anticipated. Exposure to malnourishment in utero is known to cause changes in fetal physiology. Nutrient-starved, a fetus alters its metabolism to sequester higher amounts of fat to defend itself against caloric loss, resulting, paradoxically, in late-onset obesity and metabolic disarray. But the oddest result of the Hongerwinter study would take yet another generation to emerge. In the 1990s, when the grandchildren of men and women exposed to the famine were studied, they too had higher rates of obesity and heart disease. The acute period of starvation had somehow altered genes not just in those directly exposed to the event; the message had been transmitted to their grandchildren. Some heritable factor, or factors, must have been imprinted into the genomes of the starving men and women and crossed at least two generations. The Hongerwinter had etched itself into national memory, but it had penetrated genetic memory as well.
But what was “genetic memory”? How—beyond genes themselves—was genetic memory encoded? Waddington did not know about the Hongerwinter study—he had died, largely unrecognized, in 1975—but geneticists cannily saw the connection between Waddington’s hypothesis and the multigenerational illnesses of the Dutch cohort. Here too, a “genetic memory” was evident: the children and grandchildren of famine-starved individuals tended to develop metabolic illnesses, as if their genomes carried some recollection of their grandparents’ metabolic travails. Here too, the factor responsible for the “memory” could not be an alteration of the gene sequence: the hundreds of thousands of men and women in the Dutch cohort could not have mutated their genes over the span of three generations. And here too, an interaction between “the genes and the environment” had changed a phenotype (i.e., the propensity for developing an illness). Something must have been stamped onto the genome by virtue of its exposure to the famine—some permanent, heritable mark—that was now being transmitted across generations.
If such a layer of information could be superimposed on a genome, it would have unprecedented consequences. First, it would challenge an essential feature of classical Darwinian evolution. Conceptually, a key element of Darwinian theory is that genes do not—cannot—remember an organism’s experiences in a permanently heritable manner. When an antelope strains its neck to reach a tall tree, its genes do not record that effort, and its children are not born with giraffe-like necks (the direct transmission of an adaptation into a heritable feature, remember, was the basis of Lamarck’s discredited theory of evolution).