Figure 1.4 Examples of Representational Images of Nature
©Ann Sloan Devlin
On the one hand, a theory provides a necessary roadmap for normal science to proceed; on the other hand, we should remember Shermer’s first problem in scientific thinking, that theory influences observations, and his second, that the observer changes the observed. Kuhn references a well-known early study by psychologists Bruner and Postman (1949) in which anomalous playing cards (e.g., a red six of spades) as well as normal playing cards are used. A few participants in the study fail to identify the anomalies, even with extended exposure to these misfit cards. In these cases, their experience appears to limit their ability to perceive. Kuhn comments, “In science, as in the playing card experiment, novelty emerges only with difficulty, manifested by resistance against a background provided by expectation” (p. 64).
A major point of Kuhn’s book is that researchers who operate within a paradigm or scientific tradition have a difficult time accepting anomalous data that do not fit within the theories and laws of that framework. When old frameworks or paradigms are overthrown, it is often young researchers, perhaps outside of that tradition, who forge the new framework: these researchers are “so young or so new to the crisis-ridden field that practice has committed them less deeply than most of their contemporaries to the world view and rules determined by the old paradigm” (p. 143). Similarly, as Murray Sidman (1960) states in his volume Tactics of Scientific Research, “sometimes it is the younger scientists, who enter the field unencumbered by the prejudices of past controversies, who pick out the threads of continuity from the tangle of theory, data and pseudo-problems that form a part of every stage of scientific progress” (p. 41).
Though not always resulting in the overthrow of a paradigm, some of the major changes in science have come from young researchers who perhaps were not fully wedded to a single theory or methodology (that hypothesis itself might be testable). Consider the experimental study of memory, where questions of capacity and duration have been studied since at least the time of Ebbinghaus in the late 19th century. As but one example of breaking out of a procedural paradigm, George Sperling’s (1960) doctoral thesis at Harvard University transformed the way scientists think about the storage capacity of very short-term visual memory (immediate memory) by introducing the partial report technique. Prior to that time, using the whole report technique, participants in research on immediate visual memory storage had to call out or recall everything that they had just seen in a brief exposure (~50 msec) to visual stimuli. Sperling’s breakthrough was to have participants call out the information presented in only one row of the three-row visual array (Figure 1.5).
Figure 1.5 Example of Sperling’s (1960) Partial Report Technique
Source: Adapted from Sperling, 1960, p. 3.
Sperling (1960) argued that to recall the information successfully from this one row, participants must have had ALL of the rows available at the time the particular row in question was cued. The cue was presented through an auditory tone (high, medium, or low) to correspond to the position of the rows of information on the page (top, middle, bottom). This approach is a masterful example of tradition (continuing the study of memory through visual exposure) and innovation (changing what was asked of the participant, which in turn dramatically transformed our thinking about the capacity of immediate visual memory).
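To make the inference behind the partial report technique concrete, here is a minimal sketch in Python of how the capacity estimate works. The 3 × 4 array and the accuracy value are illustrative assumptions for a classroom exercise, not values taken from Sperling (1960): if a participant can report most of whichever row the tone happens to cue, and the cue arrives only after the display is gone, then the same proportion of every row must have been available at the moment of the cue.

```python
# A minimal sketch of the partial-report inference (illustrative numbers only;
# the array size and accuracy value are assumptions, not data from Sperling, 1960).

ROWS = 3            # rows in the letter array, cued by a high, medium, or low tone
ITEMS_PER_ROW = 4   # assumed number of letters per row

# Suppose a participant reports, on average, 3.2 of the 4 letters
# in whichever single row the tone cues.
mean_correct_in_cued_row = 3.2

# Because the tone sounds only after the display has vanished, the participant
# cannot prepare for a particular row; the partial-report logic therefore infers
# that the same proportion of EVERY row was available when the cue arrived.
estimated_letters_available = mean_correct_in_cued_row * ROWS

print(f"Estimated letters available: {estimated_letters_available:.1f} "
      f"of {ROWS * ITEMS_PER_ROW} presented")
```

With the whole report technique, participants could typically produce only about four or five items, so an estimate in this range is what suggested that far more information is briefly available in immediate visual memory than can ever be reported.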
Revisit and Respond 1.2
Give an example from each one of Shermer’s (1997) categories:
Problems in scientific thinking
Problems in pseudoscientific thinking
Logical problems in thinking
Psychological problems in thinking
Of these four categories, which do you think has the most potential to undermine the research process and why?
Explain why science is a combination of tradition and innovation.
Research and the Value of Common Sense
You might be a bit discouraged about how limitations in thinking affect the research process. There is reason for concern; on the other hand, humans have some remarkable cognitive assets.
In 1995, Marvin Minsky gave an address at Connecticut College at the dedication of its new interdisciplinary science center, funded by the Olin Foundation. His address was as wide-ranging and stimulating as his research. Minsky, who died in 2016, is considered a founder of artificial intelligence, and one of his corporate affiliations was as a fellow of Walt Disney Imagineering (http://web.media.mit.edu/~minsky/minskybiog.html). Imagineers, as the name suggests, were part of a research and development think tank and worked on imagining ideas that might result in possibilities for entertainment (Remnick, 1997). In David Remnick’s article describing the Disney Corporation’s view of amusement in the future, Minsky was reported to have accepted the offer to be an Imagineer because it “reminded him of the early days at the Artificial Intelligence Lab” (Remnick, 1997, p. 222). In describing his view of the future, Minsky said, “I’m telling you: all the money and the energy in this country will eventually be devoted to doing things with your mind and your time” (p. 222). Speaking about what he thought future amusements might have in store, he said, “you’ll put on a suit and a special pair of glasses, and you’ll become involved in an experiential form of entertainment” (p. 222). Virtual reality and Google Glass? This article was published over 20 years ago.
Minsky was obsessed (not too strong a word) with the workings of the mind. Among Minsky’s many contributions, his book The Society of Mind (1985), written for a general audience, stands out because it provides a perspective on what makes the human mind amazing and distinctive. The book is presented as a series of short topics and reflects Minsky’s wide-ranging approach to discourse. Researchers often focus exclusively on the errors we make (Kahneman, 1991); in this book, Minsky also points out some of the cognitive assets of humans, in particular, common sense.
Discussing all of the processes that must be involved when making something with children’s blocks, Minsky stated, “In science, one can learn the most by studying what seems the least” (1985, p. 20). Furthermore, “What people vaguely call common sense is actually more intricate than most of the technical expertise we admire” (1985, p. 72). Minsky argued it is easier to represent expertise than common sense because with expertise you are dealing with a limited domain of knowledge; humans, on the other hand, bring to bear many different kinds of expert systems in solving the simplest of everyday problems. Hence, common sense is anything but common, according to Minsky.
Much of what Minsky said can be applied to the research process. Research does not have to be sophisticated to be powerful; in fact, you could argue that the most powerful research is simple and elegant (think of Sperling’s [1960] partial report technique described earlier in this chapter). According to Rachel Kaplan, in a very nice piece titled “The Small Experiment: Achieving More With Less” (R. Kaplan, 1996), small studies such as those one might do in a research methods class provide the opportunity to fill in the gap between what is known at the local level and shared wisdom. People often complain that results in the social sciences are obvious, that is, that we just demonstrate what everyone already knows—the we-knew-it-all-along effect, which is also called hindsight bias. But many such findings are not obvious until after you conduct the research. Common sense may lead us to ask questions that have been overlooked. Don’t be afraid to ask questions that others would view as “obvious,” that is, as commonsensical. After research emerged showing that patients have positive judgments of therapists whose offices are neat but also personalized (Nasar & Devlin, 2011), a therapist is reported to have commented, “Isn’t that obvious?” If it were obvious, then why did so many of the therapists’ offices used in this series of studies fail to conform to these criteria?
Hindsight bias: After an event has occurred, we have the tendency to claim that it could have been easily predicted.