Other anthropologists studying the use of hallucinogens in so-called traditional societies reported that the drugs’ embedment in ritual settings and cosmological worldviews prevented the disruptive effects they had on American and European youth. Elsewhere, it seemed, psychedelics even served to stabilize the social order. In a Huichol initiation ceremony, for example, the ingestion of peyote turned the adolescent into a full member of his tribe. The drug experience allowed the young person to get to know for himself the supernatural spirit realm that provided the group with a normative structure and ethical orientation. In these settings, “doing drugs” validated the moral and religious order according to which the tribe lived (Furst 1972; Myerhoff 1975, 1976; Dobkin de Rios and Smith 1977). Against this background, anthropological studies of drug use in other cultures appeared to be a promising and timely way of counteracting the aggravating drug problem Western governments were facing. In nonmodern societies, ritual rather than legal means sufficed to control the consumption of mind-altering substances (Dobkin de Rios 1984: 205–214). Instead of prohibiting their use altogether, such ritual guided it toward specific cultural goals. Thus, somewhere far from home, anthropologists might learn from other peoples how to integrate hallucinogens into their own societies, rendering the recently declared and ultimately futile “War on Drugs” superfluous.
Furthermore, broadly based cross-cultural comparisons were meant to reveal an almost universal use of intoxicants by different ethnic groups all over the world and in all periods of human history. By demonstrating that, from a global perspective, Western opposition to ecstatic states was the exception rather than the rule, anthropology helped to legitimate the pharmacological quest for altered states of consciousness and corroborated the assumption of a perennial philosophy (Weil 1972; Bourguignon 1973; Furst 1976; Dobkin de Rios 1984). The cultural historian Andy Letcher (2006: 25–48) argues that these claims to the universality of hallucinogen use might tell us more about the utopian sentiments accompanying the psychedelic revolution of the 1960s than about other cultures or the human condition.
HALLUCINOGENS TODAY: FROM WONDER AND SHAME TO INQUIRY
Irrespective of these scholarly endeavors, both hallucinogen hype and scare eventually took care of themselves. The drugs did not bring about the cultural revolution announced by Timothy Leary and other proselytizers. Alongside the high hopes of the countercultural sixties, the widespread enthusiasm for these odd substances simply waned. By now, despite the revival of psychedelic research, hallucinogens are no longer a cause for major public concern. Their consumption has reportedly stagnated or declined since the mid-1970s. Even in the neopsychedelic techno and rave scene emerging in the 1980s, the drugs of choice were amphetamines and especially MDMA (ecstasy), while psychedelics proper remained marginal (Reynolds 1999). The German authorities noticed a decline in LSD seizures (Amendt 2008: 103–104). Although concerned about their marketing on the Internet and by so-called smartshops, an EU report from 2006 stated: “The proportion of current users among those who have ever used is lower for the use of hallucinogenic mushrooms than it is for cannabis and ecstasy. It has been reported that the effects of hallucinogenic mushrooms limit the appeal of regular use” (Hillebrand et al. 2006: 9).6 In the same year, the US Drug Enforcement Administration (2006) announced that “LSD trafficking and abuse have decreased sharply since 2000, and a resurgence does not appear likely in the near term.” The resumption of a moral panic would sound different.7
Today, hallucinogens are located in a problem space very different from those of the early ethnographies of peyotism or the anthropological cultural critiques of the 1960s and 1970s. This inquiry departs from a less timely problematic and follows a very different anthropological trajectory. It grew out of an existential rather than political concern. When I took LSD for the first time in 1993, shortly after my eighteenth birthday, I temporarily suffered from a loss of self. But I did not become one with the universe. It is a sociological commonplace that peer leaders are members of a group that others identify with. Under the influence of LSD, however, I literally mistook myself for the classmate whom I was emulating by taking the drug. As I resurfaced from the depths of this deeply delusional experience, I was filled with joy about being me rather than the other person. I felt reconciled with myself and the world. Everything is as it should be, I thought. “This sense of happiness,” I wrote in my adolescent diary, even though I was taking pride in my materialism and abhorred all things ecclesiastical, “must have something to do with God.” As a fervent rationalist, I was dumbstruck by this experience of cosmic comfort. In its wake, I prayed for the first time since my childhood (for my mother and her partner, who were about to separate). In the diary entry, I was quick to counteract this awkward piety with a set of slightly precocious and naïve scientific questions: “How about the activity of the locus coeruleus in children? Is it stronger than in adults? Do children experience the world like adults under LSD?” These questions merged Huxley’s (2009/1954: 25) claim that a drug-induced breakdown of what he called the “cerebral reducing valve” enabled the eye to recover some of the perceptual innocence of childhood with what Solomon Snyder’s popular science book Drugs and the Brain (1986) had taught me about the neuroanatomical substrates of the LSD experience.
After all, it was the Decade of the Brain. The neurosciences were on the rise, and I wanted to become a brain researcher myself. When talking to my friends about my drug experiment, which I took to be one of the most important experiences of my young life, I felt perfectly confident speaking about the neurochemistry underlying its breathtaking aesthetic dimension. But I felt too ashamed to mention either my self-loss or that, even long after the drug effects had worn off, I continued to think of my first trip as a profound, if ill-defined, spiritual experience.
Shame is an affect marking the return of social consciousness after having lost oneself in one way or another (Fisher 2002: 65–70). My secular orientation made it difficult for me to acknowledge any kind of religious sentiment. Max Weber (1958/1919: 155) articulated the contempt of the moderns—and modern I deemed myself in every respect—for those unable to endure the disenchantment of the world: “To the person who cannot bear the fate of the times like a man, one must say: may he rather return silently, without the usual publicity build-up of renegades, but simply and plainly. The arms of the old churches are opened widely and compassionately for him. After all, they do not make it hard for him. One way or another, he has to bring his ‘intellectual sacrifice’—that is inevitable. If he can really do it, we shall not rebuke him.” These condescending sentences were spoken in 1917 to university students who, in Weber’s eyes, were all too prone to give up science for the sake of religious enthusiasm. Today, Weber’s pathos sounds antiquated. I never felt that my chemically mediated glimpses into a spiritually transfigured world compromised my philosophical or scientific work. Even though I had also got to know the psychotic dimension of