crew leadership not as decision making, but as sensemaking. “If I make a decision, it is a possession, I take pride in it, I tend to defend it and not listen to those who question it,” Gleason explained. “If I make sense, then this is more dynamic and I listen and I can change it.” He employed what Weick called “hunches held lightly.” Gleason gave decisive directions to his crew, but with transparent rationale and the addendum that the plan was ripe for revision as the team collectively made sense of a fire.
On the night of the Challenger conference call, following procedure in the face of uncertainty was so paramount that NASA’s Mulloy asked Thiokol to put its final launch recommendation and rationale on paper and sign it. Last-minute sign-off had always been verbal in the past. Thiokol’s Allan McDonald was in the room with Mulloy, and refused. One of McDonald’s bosses in Utah signed and faxed the document instead. Even Mulloy, who had demanded data, must have felt uneasy with the decision, while at the same time feeling protected by NASA’s ultimate tool—its hallowed process. The process culminated with more concern for being able to defend a decision than with using all available information to make the right one. Like the firefighters, NASA managers had merged with their tools. As McDonald pointed out, the quantitative data alone actually supported NASA’s stance that there was no link between temperature and failure. NASA’s normal quantitative standard was a dearly held tool, but the wrong one for the job. That night, it should have been dropped.
It is easy to say in retrospect. A group of managers accustomed to dispositive technical information did not have any; engineers felt like they should not speak up without it. Decades later, an astronaut who flew on the space shuttle, both before and after Challenger, and then became NASA’s chief of safety and mission assurance, recounted what the “In God We Trust, All Others Bring Data” plaque had meant to him: “Between the lines it suggested that, ‘We’re not interested in your opinion on things. If you have data, we’ll listen, but your opinion is not requested here.’”
Physicist and Nobel laureate Richard Feynman was one of the members of the commission that investigated the Challenger disaster, and in one hearing he admonished a NASA manager for repeating that Boisjoly’s data did not prove his point. “When you don’t have any data,” Feynman said, “you have to use reason.”
These are, by definition, wicked situations. Wildland firefighters and space shuttle engineers do not have the liberty to train for their most challenging moments by trial and error. A team or organization that is both reliable and flexible, according to Weick, is like a jazz group. There are fundamentals—scales and chords—that every member must overlearn, but those are just tools for sensemaking in a dynamic environment. There are no tools that cannot be dropped, reimagined, or repurposed in order to navigate an unfamiliar challenge. Even the most sacred tools. Even the tools so taken for granted they become invisible. It is, of course, easier said than done. Especially when the tool is the very core of an organization’s culture.
* * *
As Captain Tony Lesmes described it, his team at Bagram Air Base in northeast Afghanistan only went to work when someone got really unlucky. Lesmes commanded a team of Air Force pararescue jumpers, PJs for short, a division of Special Operations designed for harrowing rescue missions, like parachuting into enemy territory at night to save downed pilots. Cross a soldier, a paramedic, a rescue diver, a firefighter, a mountain rescue specialist, and a parachutist, and you get a PJ. Their emblem depicts an angel with arms wrapped around the world, and the words “That others may live.”
There was no typical day for the PJs at Bagram. One day they were rappelling down a mountain to rescue a soldier who fell into an unmarked well. Another day they were rushing to treat Marines injured in a firefight. PJs could accompany units out on missions, but mostly they stayed on twenty-four-hour alert, waiting for a “9-line,” a form (with nine lines) that provided basic information about an active emergency. Like one that came in on an autumn day in 2009. It was a category alpha, traumatic injuries. Within minutes, the team would be airborne.
Intel was sparse. A roadside bomb had exploded in the middle of an Army convoy of armored vehicles. The site was approximately a half hour away by