Second, our brains are wired to take systematic shortcuts in decision‐making. Kahneman distinguishes two types: System 1 and System 2 thinking. System 1 thinking is fast.19 It’s generally unconscious, automatic, and effortless. The brain in System 1 mode assesses the situation and responds without conscious control. Roughly 98 percent of our thinking occurs here. System 2, on the other hand, is slow. It is conscious, deliberate, effortful, and rational. The brain in System 2 mode seeks anomalies and missing information, and it consciously makes decisions – roughly 2 percent of our thinking occurs here. This division is purposeful – imagine having to think about contracting and lowering our diaphragm in order to inhale every single breath.20
Unfortunately, System 1 (fast) can interfere with or influence System 2 (slow) through numerous cognitive biases, without our conscious knowledge. For example, confirmation bias is the tendency to search for, interpret, and recall information that confirms one’s prior beliefs or preconceptions.21 A 2021 study of soccer commentators, for instance, showed that they were six times more likely to associate attributes of physical power with players of darker skin tone than with players of lighter skin tone.22 This is an example of how the stereotype that darker‐skinned people have greater physical prowess and lack intellectual prowess manifests in everyday, unconscious ways. Confirmation bias shows up in how we assess job candidates as well as how we evaluate business proposals. After all, when 98 percent of our thinking is System 1 (fast), we are subject to these biases unless we consciously slow down or create systems to help.
The third influence on decision‐making relevant to this book is Type 1 and Type 2 decision‐making. It’s a framework that categorizes different kinds of decisions and can help corral the best of our brainpower to make good ones. Type 1 decisions are irreversible; Type 2 decisions are reversible. It is crucial to distinguish which type you are facing before committing to an action. For example, if a decision is Type 1 (irreversible) and consequential, rigorously gather facts and perspectives from multiple, diverse sources, and actively mitigate risk. On the other hand, that’s way too much effort for a reversible Type 2 decision. Instead, it’s better to decide quickly, especially if you can run an experiment.
I learned a useful twist on decision‐making when listening to author Jim Collins talk about his research with Morten Hansen on the impact of time and risk.23 They learned that the best decision makers could assess when risk would increase. For Type 1 decisions that were irreversible and consequential, if risk was rising quickly, they would use the best available information and make a decision fast. If risk wasn’t going to change for a long time, they would take that time to rigorously gather data (see Figure I.3).
I experienced this in my own life. When my father was diagnosed with stomach cancer in 2010, he had multiple treatment options, but he had time to decide which option to pursue before the tumor grew too big and limited his choices. We rigorously sought new data from different doctors, researchers, and other patients. We decided to remove his stomach because we were absolutely clear it was the best choice. It was a Type 1 irreversible decision, but the risk was not increasing rapidly. In contrast, when the Tubbs Fire erupted in 2017 in rural northern California near Calistoga, risk was increasing by the second because of 70 mph wind gusts and grasses dried out by drought. “Evacuate!” was a Type 1 irreversible decision with no time to deliberate, and it was the right call.
FIGURE I.3 Type 1 versus Type 2 decisions.
However, there’s an interesting development in that story. In Chapter 3, I profile a volunteer disaster response leader who was involved in the 2017 California wildfire season. That year, the size, speed, and scale of the fires were so great that they quickly overwhelmed the normal emergency systems. For residents, there was chaos, because the established, reliable systems for communication, exit protocols, and shelter infrastructure were insufficient for the complexity of wildfires at that scale. One of the insights responders had between 2017 and 2018 was that they had to evolve their perspective from treating a wildfire as a singular event to planning for a wildfire season. In subsequent years, as police, firefighters, public agencies, and emergency response teams reconceived their systems around a fire season, they developed protocols and training so that residents could be more informed and prepared.24
So how did that affect Type 1 decision‐making? I got an unexpected perspective from a friend who lives in Santa Rosa, one of the towns partially destroyed by those 2017 fires. She said that now, when the local emergency air horn sounds, it’s actually a comfort. They have prestocked emergency bags, everyone in the house has practiced their routines, and there are clear directions about where to go. They’ve built a system to respond to a Type 1 (irreversible) decision, and they’ve developed a routine to train the System 1 (fast) thinking brain so actions are automatic and life‐saving instead of chaotic, confusing, and potentially fatal. This sequence of encountering a complex problem (wildfires at scale), changing one’s perspective (wildfire season), responding with experiments (new protocols), and building infrastructure to make decisions easier (emergency routines) is core to the Move to the Edge, Declare It Center framework.
In summary, we humans are less rational than we believe, we are wired to create mental shortcuts that can lead to biased decisions, and we have to pay attention to whether decisions are reversible and when risk changes. The best we can do – which can become pretty good with practice – is to be aware of these influences, then design habits, practices, and systems to mitigate them. However, all the theory in the world can't save you if your brain is hijacked by neurochemicals screaming, “Run away!” That brings us to the fourth principle: the reactions to stress.
Our Bodies: Reacting to Stress
The COVID pandemic provided a vivid lens for observing common patterns of stress reactions that often lead to poor decision‐making. I’m sure we have all felt many, if not all, of the following.
We freeze, delaying decisions out of fear of doing the wrong thing.
We go into flight, failing to confront reality, ignoring or minimizing it, and neglecting to gather facts that might challenge our beliefs about the crisis.
We go into fight mode, defending or blaming without thinking, often doing more collateral damage.
We go into friend/fawn mode, soothing our uncertainty by taking actions to make sure we are liked and appreciated by others, instead of making decisions that we fear will make us unpopular – even if it's the right call.
We return to the familiar – the old decision‐making models, playbooks, intuitions, or people, ignoring or failing to see that the context has changed and new rules apply.
During my research for this book, I learned about another perspective on these stress patterns, emerging from the work of Shelley Taylor. She noted that Walter Cannon’s original fight‐or‐flight research was conducted only on men and male animals.25 Taylor’s own experience suggested that women reacted to stress