couples”—in which both parents happen to possess at least one copy of the mutant gene—could choose not to conceive a child, or to adopt children. Over the last decade, the combination of targeted parental screening and fetal diagnosis has reduced the incidence of children born with cystic fibrosis by about 30 to 40 percent in populations where the frequency of the mutant allele is highest. In 1993, a New York hospital launched an aggressive program to screen Ashkenazi Jews for three genetic diseases: cystic fibrosis, Gaucher’s disease, and Tay-Sachs disease (mutations in these genes are more prevalent in the Ashkenazi population). Parents could freely choose to be screened, to undergo amniocentesis for prenatal diagnosis, and to terminate a pregnancy if the fetus was found to be affected. Since the launch of the program, not a single baby with any of these genetic diseases has been born at that hospital.
It is worth taking stock of the transformation in genetics that occurred between 1971—the year that Berg and Jackson created the first molecule of recombinant DNA—and 1993, the year that the Huntington’s disease gene was definitively isolated. Even though DNA had been identified as the “master molecule” of genetics by the late 1950s, no means then existed to sequence, synthesize, alter, or manipulate it. The genetic basis of human disease was largely unknown; aside from a few notable exceptions—sickle-cell anemia, thalassemia, and hemophilia B—scarcely any human disease had been definitively mapped to its causal gene. The only human genetic interventions available clinically were amniocentesis and abortion. Insulin and clotting factors were being isolated from pig organs and human blood; no medicine had been created by genetic engineering. A human gene had never intentionally been expressed outside a human cell. The prospect of changing an organism’s genome by introducing foreign genes, or by deliberately mutating its native genes, was far outside the reach of any technology. The word biotechnology did not exist in the Oxford dictionary.
Two decades later, the transformation of the landscape of genetics was remarkable: human genes had been mapped, isolated, sequenced, synthesized, cloned, recombined, introduced into bacterial cells, shuttled into viral genomes, and used to create medicines. As the physicist and historian Evelyn Fox Keller described it: once “molecular biologists [had discovered] techniques by which they themselves could manipulate [DNA],” there “emerged a technological know-how that decisively altered our historical sense of the immutability of ‘nature.’
“Where the traditional view had been that ‘nature’ spelt destiny, and ‘nurture’ freedom, now the roles appeared to be reversed. . . . We could more readily control the former [i.e., genes], than the latter [i.e., the environment]—not simply as a long-term goal but as an immediate prospect.”
In 1969, on the eve of that revelatory decade, the geneticist Robert Sinsheimer wrote an essay about the future. The capacity to synthesize, sequence, and manipulate genes, he argued, would unveil “a new horizon in the history of man.”
“Some may smile and may feel that this is but a new version of the old dream of the perfection of man. It is that, but it is something more. The old dreams of the cultural perfections of man were always sharply constrained by his inherent, inherited imperfections and limitations. . . . We now glimpse another route—the chance to ease and consciously perfect far beyond our present vision this remarkable product of two billion years of evolution.”
Other scientists, anticipating this biological revolution, had been less sanguine about it. As the geneticist J. B. S. Haldane had described it in 1923, once the power to control genes had been harnessed, “no beliefs, no values, no institutions are safe.”
* * *
I. In 1978, two other researchers, Y. Wai Kan and Andree Dozy, had found a polymorphism of DNA near the sickle-cell gene and used it to follow the gene’s inheritance in patients.
II. The high prevalence of the mutant cystic fibrosis gene in European populations has puzzled human geneticists for decades. If CF is such a lethal disease, why was the gene not culled out by evolutionary selection? Recent studies posit a provocative theory: the mutant cystic fibrosis gene may provide a selective advantage during cholera infection. Cholera in humans causes a severe, intractable diarrhea accompanied by the acute loss of salt and water; this loss can lead to dehydration, metabolic disarray, and death. Humans with one copy of the mutant CF gene have a slightly diminished capacity to lose salt and water through their membranes and are thus relatively protected from the most devastating complications of cholera.
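The puzzle this footnote raises can be made quantitative with the textbook population-genetics model of heterozygote advantage; the sketch below is that standard result, with illustrative fitness values assumed rather than taken from the studies cited above. If the normal homozygote suffers a small disadvantage s_1 (vulnerability to cholera), the CF homozygote a near-lethal disadvantage s_2, and the carrier neither, selection holds the mutant allele at a stable equilibrium frequency:

\[
  w_{AA} = 1 - s_1, \qquad w_{Aa} = 1, \qquad w_{aa} = 1 - s_2
  \qquad\Longrightarrow\qquad
  \hat{q} = \frac{s_1}{s_1 + s_2}.
\]

Under the assumed values of a 2 percent carrier edge (s_1 = 0.02) against a near-lethal disease (s_2 = 0.98),

\[
  \hat{q} = \frac{0.02}{0.02 + 0.98} = 0.02,
\]

an allele frequency of about 2 percent, implying a carrier frequency (2pq) of roughly 4 percent, close to the one-in-twenty-five carrier rate reported for CF in some European populations.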