that you (the doctor) could direct multiple low-intensity rays at the tumor from different directions, leaving healthy tissue intact, but converging at the tumor site with enough collective intensity to destroy it. Just like how the general divided up his troops and directed them to converge at the fortress, and how the fire chief arranged neighbors with their buckets around the burning shed so that their water would converge on the fire simultaneously.
Those results are from a series of 1980s analogical thinking studies. Really, don’t feel bad if you didn’t get it. In a real experiment you would have taken more time, and whether you got it or not is unimportant. The important part is what it shows about problem solving. A gift of a single analogy from a different domain tripled the proportion of solvers who got the radiation problem. Two analogies from disparate domains gave an even bigger boost. The impact of the fortress story alone was as large as if solvers were just straight out told this guiding principle: “If you need a large force to accomplish some purpose, but are prevented from applying such a force directly, many smaller forces applied simultaneously from different directions may work just as well.”
The scientists who did that work expected that analogies would be fuel for problem solving, but they were surprised that most solvers working on the radiation problem did not find clues in the fortress story until they were directed to do so. “One might well have supposed,” the scientists wrote, that “being in a psychology experiment would have led virtually all subjects to consider how the first part [of the study] might be related to the second.”
Human intuition, it appears, is not very well engineered to make use of the best tools when faced with what the researchers called “ill-defined” problems. Our experience-based instincts are set up well for Tiger domains, the kind world Gentner described, where problems and solutions repeat.
An experiment on Stanford international relations students during the Cold War provided a cautionary tale about kind-world reasoning, that is, drawing only on the first analogy that feels familiar. The students were told that a small, fictional democratic country was under threat from a totalitarian neighbor, and they had to decide how the United States should respond. Some students were given descriptions that likened the situation to World War II (refugees in boxcars; a president “from New York, the same state as FDR”; a meeting in “Winston Churchill Hall”). For others, it was likened to Vietnam (a president “from Texas, the same state as LBJ,” and refugees in boats). The international relations students who were reminded of World War II were far more likely to choose to go to war; the students reminded of Vietnam opted for nonmilitary diplomacy. That phenomenon has been documented all over the place. College football coaches rated the same player’s potential very differently depending on what former player he was likened to in an introductory description, even with all other information kept exactly the same.
With the difficult radiation problem, the most successful strategy employed multiple situations that were not at all alike on the surface, but held deep structural similarities. Most problem solvers are not like Kepler. They will stay inside of the problem at hand, focused on the internal details, and perhaps summon other medical knowledge, since it is on the surface a medical problem. They will not intuitively turn to distant analogies to probe solutions. They should, though, and they should make sure some of those analogies are, on the surface, far removed from the current problem. In a wicked world, relying upon experience from a single domain is not only limiting, it can be disastrous.
• • •
The trouble with using no more than a single analogy, particularly one from a very similar situation, is that it does not help battle the natural impulse to employ the “inside view,” a term coined by psychologists Daniel Kahneman and Amos Tversky. We take the inside view when we make judgments based narrowly on the details of a particular project that are right in front of us.
Kahneman had a personal experience with the dangers of the inside view when he assembled a team to write a high school curriculum on the science of decision making. After a full year of weekly meetings, he surveyed the entire team to find out how long everyone thought the project would take. The lowest estimate was one and a half years, the highest two