2 Difficulties understanding statistical or complex scientific information related to unfamiliar activities or technologies: A variety of cognitive biases and related factors impair people’s understanding of probabilities. This difficulty hampers discussions about risks between experts and nonexperts. For example, risk experts are often confused by the public’s rejection of the argument that a risk from a new activity or technology is acceptable if it is smaller than the risks people face in their daily lives.
3 Personalization: People often personalize the risk. Sample question: What if I am the person who is harmed or adversely impacted?
4 Trustworthiness: People often raise questions of trust. Sample question: Why should I believe you on this issue, given that you have previously made mistakes or changed your mind about risks and threats?
5 Cumulative Risks: People often raise concerns about cumulative risks. Sample question: I already have enough risks in my life. Why should I take on even one more?
6 Benefits: People often question whether the risks are worth the benefits. Sample question: Will the benefits of the new activity or technology significantly outweigh the risks?
7 Ethics: People often raise ethical questions. Sample question: Who gave you the right to make decisions that violate the moral principles and rights of others? Complicating the perception of fairness is the difficulty people have understanding, appreciating, and interpreting small probabilities, such as the difference between 1 chance in 100,000 and 1 chance in 1,000,000 (the first sketch following this list gives a simple numeric illustration). These same problems hamper discussions between technical experts and nonexperts about what is remotely possible and what is probable.

Given these difficulties, to be effective, risk communication strategies must address the experiences, attitudes, beliefs, values, and culture of those receiving information about a risk or threat. Effective risk communication skills are built on a foundation of understanding how people perceive risks.
8 Strong emotional responses to risk information: Strong feelings of fear, worry, anger, outrage, and helplessness are often evoked by exposure to unwanted or dreaded risks. These emotions often make it difficult for leaders, risk managers, and technical experts to engage in constructive discussions about risks in public settings. Emotions are most intense when people perceive the risk to be involuntary, unfair, not under their personal control, managed by untrustworthy individuals or organizations, and offering few benefits. More extreme emotional reactions often occur when the risk affects children, when the adverse consequences are particularly dreaded, and when worst‐case scenarios are imagined. Strong emotional responses to risk information are not necessarily wrong or contrary to knowledge. They can be based on practical or experiential knowledge and emphasize what people value, such as fairness and equity. Nor are strong emotional responses necessarily opposed to reason. The idea that rationality and emotion are opposed derives from the belief that the human brain perceives reality in two distinct ways: one emotional, instinctive, intuitive, and spontaneous; the other rational, analytical, and statistical, a mode that emerged later in human evolution. In practice, however, strong emotional responses to risk information can be rational, as when fear keeps people from engaging in dangerous activities.
9 Desires and demands for scientific certainty: People often display a marked aversion to uncertainty and use a variety of coping mechanisms to reduce the anxiety it generates. This aversion frequently translates into a clear preference for statements of fact over statements of probability, the language of risk assessment. People often demand that technical experts tell them exactly what will happen, not what might happen. For example, the changing recommendations during the COVID‐19 pandemic on questions such as whether face masks were effective or whether a person without symptoms could spread the disease frustrated many people and eroded their trust in science.
10 Strong beliefs that resist change: People tend to seek out information that confirms and supports their beliefs and often ignore evidence that contradicts them. Beliefs often operate on a polarized scale of True or False, with little gray in‐between; opinions operate on a different scale – Favorable or Unfavorable. According to the Four Hit Theory of Belief Formation, on average four unanswered risk communication messages (hits) from trustworthy sources can crystallize an opinion into a belief; fewer than four hits typically leave it an opinion, and a hit from one side can be negated by a hit from the other side (a toy model after this list illustrates this counting rule). Once formed, a belief is difficult or even impossible to change.

Strong beliefs about risks or threats, once formed, change slowly and are extraordinarily persistent even in the face of contrary evidence. Initial beliefs about risks structure the way subsequent evidence is interpreted. Fresh evidence – e.g., data provided by a technical expert – appears reliable and informative only if it is consistent with the initial belief; contrary evidence is dismissed as unreliable, erroneous, irrelevant, or unrepresentative.
11 Opinions can be manipulated by how information is presented: When people lack strong prior beliefs, subtle changes in the way risk information is presented and framed can have a major impact on opinions. For example, two groups of physicians were asked to choose between two therapies – surgery or radiotherapy.5 Each group received the same information; however, the probabilities were expressed either in terms of dying or in terms of surviving. Both framings conveyed the same underlying probability, yet they produced dramatic variation in the choice of therapy: physicians responded more favorably to the survival framing (the first sketch following this list shows the equivalence of such framings). The effects of information framing are, however, modified by factors such as risk aversion, experience, beliefs, the level and type of risk, and the costs of risk mitigation.
12 Ignoring or dismissing risk information because of its perceived lack of personal relevance: Risk data typically describe risks to society as a whole. Such data are often of minimal interest to individuals, who are more concerned about risks to themselves than about risks to society.
13 Using risks as proxies or surrogates for other personal, societal, economic, cultural, or political agendas and concerns: The specific risks that people focus on reflect their beliefs about values, social institutions, and moral behavior. Risks and crises may be exaggerated or minimized in line with personal, societal, economic, cultural, or political agendas, priorities, and concerns. Debates about risks often serve as proxies or surrogates for debates about high‐concern issues. The debate about nuclear power, for example, is sometimes less about the specific risks of nuclear power than about other issues such as the proliferation of nuclear weapons, the adverse effects of nuclear waste disposal, the value of large‐scale technological progress and growth, and the centralization of political and economic power in the hands of a technological elite.
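The two numeric points raised in items 7 and 11 – that survival and mortality framings of a statistic are logically equivalent, and that small probabilities such as 1 in 100,000 versus 1 in 1,000,000 are hard to grasp in the abstract – can be made concrete with a few lines of code. The following Python sketch is illustrative only: the function names and example figures are assumptions for demonstration, not data from this chapter.

```python
# Illustrative sketch only: names and example numbers are assumptions,
# not figures from the source text.

def as_framings(survival_rate: float) -> tuple[str, str]:
    """Return the same statistic framed as survival and as mortality."""
    mortality_rate = 1.0 - survival_rate
    return (f"{survival_rate:.0%} of patients survive",
            f"{mortality_rate:.0%} of patients die")

def expected_cases(risk: float, population: int) -> str:
    """Restate a per-person probability as expected cases in a population."""
    return f"about {risk * population:,.0f} case(s) in {population:,} people"

if __name__ == "__main__":
    # (a) Two framings of one probability, as in the physician study above.
    for line in as_framings(0.90):
        print(line)  # "90% of patients survive" / "10% of patients die"

    # (b) 1 in 100,000 versus 1 in 1,000,000, made concrete for a population.
    for risk in (1 / 100_000, 1 / 1_000_000):
        print(expected_cases(risk, 1_000_000))  # ~10 cases vs ~1 case
```

Restating a per-person probability as expected cases in a familiar population, as in part (b), is one common way risk communicators make small probabilities interpretable.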
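Similarly, the counting rule of the Four Hit Theory described in item 10 can be expressed as a toy model. The sketch below is an assumed illustration of the rule as stated above – four or more net unanswered hits crystallize an opinion into a belief, and a hit from one side negates a hit from the other – not an implementation from the source.

```python
# Toy model of the Four Hit counting rule; an illustrative assumption,
# not code or data from the source text.

BELIEF_THRESHOLD = 4  # "four unanswered hits," per the theory as stated above

def net_hits(messages: list[str]) -> int:
    """Net unanswered hits: +1 per 'pro' message, -1 per 'con' (a negation)."""
    return sum(1 if m == "pro" else -1 for m in messages)

def stance(messages: list[str]) -> str:
    """An opinion crystallizes into a belief at four net unanswered hits."""
    return "belief" if abs(net_hits(messages)) >= BELIEF_THRESHOLD else "opinion"

if __name__ == "__main__":
    print(stance(["pro", "pro", "pro"]))                # opinion: 3 net hits
    print(stance(["pro", "pro", "pro", "pro"]))         # belief: 4 net hits
    print(stance(["pro", "con", "pro", "pro", "pro"]))  # opinion: one hit negated
```

On this reading, the practical point of the rule is that timely responses from the other side keep views in the opinion stage, where they remain open to evidence and framing.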
Table 3.5 Factors that affect the ability of people to make informed decisions about risks.
Inaccurate perceptions of risk
Difficulties in understanding statistical or complex scientific information related to unfamiliar activities or technologies
Strong emotional responses to risk information
Desires and demands for scientific certainty
Strong beliefs and opinions that resist change and distort understanding
Weak beliefs and opinions that can be manipulated by the way information is presented and framed
Ignoring or dismissing risk information because of its perceived lack of personal relevance
Using risks as proxies or surrogates for other personal, societal, economic, cultural, or political agendas and concerns
Cultural factors, such as values, norms, social networks, group memberships,