If we understand consistency in its philosophical meaning, as the property that I should not hold contradictory beliefs, it is a very desirable property. If you were to claim that a certain cat inside a box is, at the same time, both alive and not alive, that would be seen, at best, as a paradox. If you are a respected scientist making that claim, people might actually take it seriously, maybe even too seriously, even if what you meant was not that this would actually happen, but that a certain interpretation of quantum mechanics had to be wrong because it led to that conclusion. Still, in most situations, you would be justifiably dismissed as a lunatic. You can still make claims of uncertainty, of course. Saying that the cat might be alive and might be dead is not inconsistent.
We may also understand consistency as commitment. In this sense, commitment means being willing to support a cause, to maintain your ideas even in the face of contrary evidence. Commitment of this kind also seems to have a good reputation among us, but that reputation is far better than it should be. Being committed to an idea when the evidence points elsewhere is a rather bad choice: it prevents learning, and it prevents progress.
There are some circumstances where commitment to an idea might still be acceptable. For example, you can be committed to a value; let’s say, something like saving lives. Saying that saving lives is a good thing is not a description of the world; it is a choice about what we think good is. If there is a moral code we must follow, it seems saving lives would be included in it. If we hold life as a value, we can still change our minds about the best strategies to achieve that goal. And, of course, we might want to inspect our other values when making any decision. There might be inconsistencies there (of the bad, illogical type), as well as difficulties in achieving all our goals at once. If you also value avoiding suffering, you will find circumstances where the two goals conflict, and you will have to make choices. But that is not inconsistency. It is only the unavoidable fact that life is complicated, and we can never have it all. Your commitment to both values may still exist, even if sometimes they cannot both be achieved. But a commitment to an idea about how the world is seems to be a different matter. It is not up to us to decide how the world is. That kind of commitment can cause people to become biased, unable to learn and change their minds.
Rational arguments sometimes do little to convince us. Emotional discourse seems to have a much stronger effect, but the desire to belong to a group is certainly not our only emotional desire. We are far more complex individuals than that. We want, for example, to feel good about ourselves. Cohen and his team observed how that could influence our opinions. They tested how people react to information, including information with which they disagree, and observed a much stronger tendency to agree with the conflicting information when it was presented in a way that improved the volunteers’ sense of self-worth (Cohen et al. 2000).
Affective influences can have an impact on how we perceive many things and how we react to them. Naturally, stronger emotions have a more significant impact (Clore and Schnall 2005). Politicians have long known how to use our emotions to persuade populations; the classical example is their common appeal to fear (Witte and Allen 2000). However, it is not only the basic emotions in a message that matter. Emotional states associated with certainty make us feel more sure about our decisions than emotions associated with uncertainty (Tiedens and Linton 2001). Sounding confident, once more, convinces better. Interestingly, adopting views that do not fit with the majority of the group also seems to have an emotional side. Imhoff and Lamberty (2017) observed that in supporters of conspiracy theories: the need to feel special and unique seems to be linked to a stronger tendency to accept conspiracy beliefs.
Convincing others and ourselves should be a rational process, but it is driven far more by our emotions. That is a problem that might not have a permanent solution. We can, of course, learn to limit some of the damage. For example, we should pay more attention to how some people try to manipulate us through emotional discourse. Some do it consciously; others are not aware of it but do it just the same. We might also want to avoid situations where our emotions would work against our judgment, that is, if we care about getting as close as possible to true or best answers. If all we want is to fit into our groups, influence them, and ascend socially, we may already be very well adapted to the task.
Up to now, we have seen that our intuition and our feelings can fool us. We need to find ways around that problem. That means we now need to investigate whether there are better tools we can use to form our opinions and make our decisions.
References
Abelson, R. P., and D. A. Prentice. 1989. “Beliefs as Possessions: A Functional Perspective.” In Attitude Structure and Function. London: Psychology Press.
Amodio, D. M., J. T. Jost, S. L. Master, and C. M. Yee. 2007. “Neurocognitive Correlates of Liberalism and Conservatism.” Nature Neuroscience 10(10), 1246–47.
Asch, S. 1955. “Opinions and Social Pressure.” Scientific American 193(5), 31–35.
Asch, S. E. 1956. “Studies of Independence and Conformity: A Minority of One against a Unanimous Majority.” Psychological Monographs 70(416), 70.
Bakker, B., G. Schumacher, C. Gothreau, and K. Arceneaux. 2019. “Conservatives and Liberals Have Similar Physiological Responses to Threats: Evidence from Three Replications.” PsyArXiv, https://psyarxiv.com/vdpyt/.
Clore, G. L., and S. Schnall. 2005. “The Influence of Affect on Attitude.” In D. Albarracin, B. T. Johnson, and M. P. Zanna (Eds.), Handbook of Attitudes. Mahwah, NJ: Erlbaum.
Cohen, G. L., J. Aronson, and C. M. Steele. 2000. “When Beliefs Yield to Evidence: Reducing Biased Evaluation by Affirming the Self.” Personality and Social Psychology Bulletin 26(9), 1151–64.
Dominguez D, J. F., S. A. Taing, and P. Molenberghs. 2016. “Why Do Some Find It Hard to Disagree? An fMRI Study.” Frontiers in Human Neuroscience 9, 718.
Eisenberger, N. I., M. D. Lieberman, and K. D. Williams. 2003. “Does Rejection Hurt? An fMRI Study of Social Exclusion.” Science 302(5643), 290–92.
Galton, F. 1907. “Vox Populi.” Nature 75(1949), 450–51.
Gilead, M., M. Sela, and A. Maril. 2018. “That’s My Truth: Evidence for Involuntary Opinion Confirmation.” Social Psychological and Personality Science 10(3), 393–401.
Hill, G. W. 1982. “Group versus Individual Performance: Are n+1 Heads Better Than One?” Psychological Bulletin 91(3), 517–39.
Imhoff, R., and P. K. Lamberty. 2017. “Too Special to Be Duped: Need for Uniqueness Motivates Conspiracy Beliefs.” European Journal of Social Psychology 47(6), 724–34.
Janis, I. L. 1972. Victims of Groupthink: A Psychological Study of Foreign-Policy Decisions and Fiascoes. Boston: Houghton Mifflin Company.
Jost, J. T., J. Glaser, A. W. Kruglanski, and F. J. Sulloway. 2003. “Political Conservatism as Motivated Social Cognition.” Psychological Bulletin 129(3), 339–75.
Kahan, D. 2010. “Fixing the Communication Failure.” Nature 463(7279), 296–97.
Kahan, D. M. 2013. “Ideology, Motivated Reasoning, and Cognitive Reflection.” Judgment and Decision Making 8(4), 407–24.
Kahan, D. M., E. Peters, E. C. Dawson, and P. Slovic. 2013. “Motivated Numeracy and Enlightened Self-Government.” Yale Law School, Public Law Working Paper No. 307.
Kahan, D. M., E. Peters, E. C. Dawson, and P. Slovic. 2017. “Motivated Numeracy and Enlightened Self-Government.” Behavioural Public Policy 1(1), 54–86.
Kanai, R., T. Feilden, C. Firth, and G. Rees. 2011. “Political Orientations Are Correlated with Brain Structure in Young Adults.” Current Biology 21(8), 1–4.
Kerr, N. L., and R. S. Tindale. 2004. “Group