All of this—the despair, the injustice, the inhumanity, the cruelty—pours out of Tommy as he weeps and rages in the headlights of Kathy’s car. And, standing with him, we know in our hearts that this society has sold itself out to a technology that rips people’s lives and dreams away from them, so that those with the privilege of not being labeled “clone” can live longer and healthier lives.
This, to me, is the message that stays long after watching Never Let Me Go—that if we are not careful, technology has the power to rob us of our souls, even as it sustains our bodies, not because it changes who we are, but because it makes us forget the worth of others. It’s a message that’s directly relevant to human cloning, should we ever develop the technology to the point that it’s widely used. But it also applies to other technologies that blur our definitions of “worth,” including those that claim to predict how someone will behave, as we’ll see in our next movie: Minority Report.
Minority Report: Predicting Criminal Intent
“If there’s a flaw, it’s human—it always is.”
—Danny Witwer
Criminal Intent
There’s something quite enticing about the idea of predicting how people will behave in a given situation. It’s what lies beneath personality profiling and theories of preferred team roles. But it also extends to trying to predict when people will behave badly, and taking steps to prevent this.
In this vein, I recently received an email promoting a free online test that claims to use “‘Minority Report-like’ tech to find out if you are ‘predisposed’ to negative or bad behavior.” The technology I was being encouraged to check out was an online survey being marketed by the company Veris Benchmark under the trademark “Veris Prime.” It claimed that “for the first time ever,” users had an “objective way to measure a prospective employee’s level of trustworthiness.”
Veris’ test is an online survey which, when completed, provides you (or your employer) with a “Trust Index.” If you have a Trust Index of eighty to one hundred, you’re relatively trustworthy, but below twenty or so, you’re definitely in danger of showing felonious tendencies. At the time of writing, the company’s website indicates that the Trust Index is based on research on a wide spectrum of people, although the initial data that led to the test came from 117 white-collar felons. In other words, when the test was conceived, it was assumed that answering a survey in the same way as a bunch of convicted felons is a good indicator that you’re likely to pursue equally felonious behavior in the future.
Naturally, I took the test. I got a Trust Index of nineteen. This came with a warning that I’m likely to regularly surrender to the temptation of short-term personal gain, including cutting corners, stretching the truth, and failing to consider the consequences of my actions.
Sad to say, I don’t think I have a great track record of any of these traits; the test got it wrong (although you’ll have to trust me on this). But just to be sure that I wasn’t an outlier, I asked a few of my colleagues to also take the survey. Amazingly, it turns out that academics are some of the most felonious people around, according to the test. In fact, if the Veris Prime results are to be believed, real white-collar felons have some serious competition on their hands from within the academic community. One of my colleagues even managed to get a Trust Index of two.
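To get a feel for how a test like this can go so badly astray, here’s a deliberately toy sketch in Python. It is purely illustrative, not Veris Prime’s actual method: the survey statements, the numbers, and the model are all invented. The point is simply that a classifier whose only examples of “untrustworthy” people are convicted felons will hand a rock-bottom score to anyone whose answers happen to resemble theirs, law-abiding academics included.

```python
# A toy illustration of how a narrow training set can misfire.
# Purely hypothetical: the statements, scores, and model are invented,
# not how Veris Prime actually works.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Imaginary survey: each row is agreement (1-5) with three statements:
# "I enjoy taking risks", "I question rules", "I work best independently".
felons = rng.integers(4, 6, size=(117, 3))       # convicted felons: mostly 4s and 5s
comparison = rng.integers(1, 3, size=(117, 3))   # comparison group: mostly 1s and 2s

X = np.vstack([felons, comparison])
y = np.array([0] * len(felons) + [1] * len(comparison))  # 0 = felon, 1 = "trustworthy"

model = LogisticRegression(max_iter=1000).fit(X, y)

def trust_index(responses):
    """Scale the model's probability of 'trustworthy' to a 0-100 score."""
    return int(round(100 * model.predict_proba([responses])[0, 1]))

# A curious, independent, law-abiding academic answers much like the felons did...
print(trust_index([5, 5, 4]))  # close to 0: flagged as untrustworthy
print(trust_index([1, 2, 1]))  # close to 100: deemed trustworthy
```

The classifier has no way of knowing that risk-taking, rule-questioning independence also describes plenty of people who have never broken a law; it only knows who it was trained on.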
One of the many issues with the Veris Prime test is the training set it uses. It seems that many of the traits that are apparently associated with convicted white-collar criminals—at least according to the test—are rather similar to those that characterize curious, independent, and personally motivated academics. It’s errors like this that can easily lead us into dangerous territory when we attempt to use technology to predict what someone will do. But even before we get to such errors, there are tough questions about the extent to which we should be trying to use science and technology to predict and prevent criminal behavior at all. And this leads us neatly into the movie Minority Report.
Minority Report is based on the Philip K. Dick short story of the same name, published in 1956. The movie centers on a six-year-old crime prevention program in Washington, DC, that predicts murders before they occur and leads to the arrest and incarceration of “murderers” before they can commit their alleged future crime. The “Precrime” program, as it’s aptly called, is so successful that it has all but eliminated murder in the US capital. And as the movie opens, there’s a vote pending to take it nationwide.
The Precrime program in the movie is astoundingly successful—at least on the surface. The program is led by Chief John Anderton (played by Tom Cruise). Anderton’s son was abducted six years previously while in his care, and was never found. The abduction destroyed Anderton’s personal life, leaving him estranged from his partner, absorbed in self-pity, and dependent on illegal street narcotics. Yet despite his personal pain, he’s a man driven to ensuring others don’t have to suffer a similar fate. Because of this, he is deeply invested in the Precrime program, and since its inception has worked closely with the program director and founder Lamar Burgess (Max von Sydow) to ensure its success.
The technology behind Precrime in the movie is fanciful, but there’s a level of internal consistency that helps it work effectively within the narrative. The program depends on three “precogs”: genetically modified, isolated, and heavily sedated humans who have the ability to foresee future murders. By monitoring and visualizing their neural activity, the Precrime team can see snatches of the precogs’ thoughts, and use these to piece together where and when a future murder will occur. All they then have to do is swoop in and arrest the pre-perpetrator before they’ve committed the crime. And, because the precogs’ predictions are trusted, those arrested are sentenced and incarcerated without trial. This incarceration involves being fitted with a “halo”—a neural device that plunges the wearer helplessly into their own incapacitating inner world, although whether this is a personal heaven or hell we don’t know.
As the movie opens, we’re led to believe that this breakthrough in crime prevention is a major step forward for society. Murder’s a thing of the past in the country’s capital, its citizens feel safer, and those with murderous tendencies are locked away before they can do any harm. That is, until Chief Anderton is tagged as a pre-perp by the precogs.
Not surprisingly, Anderton doesn’t believe them. He knows he isn’t a murderer, and so he sets out to discover where the flaw in the system lies. And, in doing so, he begins to uncover evidence that there’s something rotten in the very program he’s been championing. On his journey, he learns that the precogs are not, as is widely claimed, infallible. Sometimes one of them sees a different sequence of future events, a minority report, which is conveniently scrubbed from the records in favor of the majority perspective.
Believing that his minority report—the account that shows he’s innocent of a future murder—is still buried in the mind of the most powerful precog, Agatha (played by Samantha Morton), he breaks into Precrime and abducts her. In order to extract the presumed minority report she’s carrying, he takes her to a seedy pleasure joint that uses recreational brain-computer interfaces to have her mind “read.” And he discovers, to his horror, that there is no minority report; all three precogs saw him committing the same murder in the near future.
Anderton does, however, come across an anomaly: a minority report embedded in Agatha’s memory of a murder that is connected with an earlier inconsistency he