them. More disturbing, the images were spreading virally. Lots of my friends were sharing them. And there were new images every day.

      I knew a great deal about how messages spread on Facebook. For one thing, I have a second career as a musician in a band called Moonalice, and I had long been managing the band’s Facebook page, which enjoyed high engagement with fans. The rapid spread of images from these Sanders-associated pages did not appear to be organic. How did the pages find my friends? How did my friends find the pages? Groups on Facebook do not emerge full-grown overnight. I hypothesized that somebody had to be spending money on advertising to get the people I knew to join the Facebook Groups that were spreading the images. Who would do that? I had no answer. The flood of inappropriate images continued, and it gnawed at me.

      More troubling phenomena caught my attention. In March 2016, for example, I saw a news report about a group that exploited a programming tool on Facebook to gather data on users expressing an interest in Black Lives Matter, data that they then sold to police departments, which struck me as evil. Facebook banned the group, but not until after irreparable harm had been done. Here again, a bad actor had used Facebook tools to harm innocent victims.

      In June 2016, the United Kingdom voted to exit the European Union. The outcome of the Brexit vote came as a total shock. Polling had suggested that “Remain” would triumph over “Leave” by about four points, but precisely the opposite happened. No one could explain the huge swing. A possible explanation occurred to me. What if Leave had benefited from Facebook’s architecture? The Remain campaign was expected to win because the UK had a sweet deal with the European Union: it enjoyed all the benefits of membership, while retaining its own currency. London was Europe’s undisputed financial hub, and UK citizens could trade and travel freely across the open borders of the continent. Remain’s “stay the course” message was based on smart economics but lacked emotion. Leave based its campaign on two intensely emotional appeals. It appealed to ethnic nationalism by blaming immigrants for the country’s problems, both real and imaginary. It also promised that Brexit would generate huge savings that would be used to improve the National Health Service, an idea that allowed voters to put an altruistic shine on an otherwise xenophobic proposal.

      The stunning outcome of Brexit triggered a hypothesis: in an election context, Facebook may confer an advantage on campaign messages based on fear or anger over those based on neutral or positive emotions. It does this because Facebook’s advertising business model depends on engagement, which can best be triggered through appeals to our most basic emotions. What I did not know at the time is that joy also works, which is why puppy and cat videos and photos of babies are so popular, but not everyone reacts the same way to happy content. Some people get jealous, for example. “Lizard brain” emotions such as fear and anger produce a more uniform reaction and are more viral in a mass audience. When users are riled up, they consume and share more content. Dispassionate users have relatively little value to Facebook, which does everything in its power to activate the lizard brain. Facebook has used surveillance to build a giant profile on every user and provides each user with a customized Truman Show, similar to the Jim Carrey film about a person who lives his entire life as the star of his own television show. It starts out giving users “what they want,” but the algorithms are trained to nudge user attention in directions that Facebook wants. The algorithms choose posts calculated to press emotional buttons because scaring users or pissing them off increases time on site. When users pay attention, Facebook calls it engagement, but the goal is behavior modification that makes advertising more valuable. I wish I had understood this in 2016. At this writing, Facebook is the sixth most valuable company in America, despite being only fifteen years old, and its value stems from its mastery of surveillance and behavior modification.

      When new technology first comes into our lives, it surprises and astonishes us, like a magic trick. We give it a special place, treating it like the product equivalent of a new baby. The most successful tech products gradually integrate themselves into our lives. Before long, we forget what life was like before them. Most of us have that relationship today with smartphones and internet platforms like Facebook and Google. Their benefits are so obvious we can’t imagine forgoing them. Not so obvious are the ways that technology products change us. The process has repeated itself with every technology generation since the telephone: radio, television, personal computers. On the plus side, technology has opened up the world, providing access to knowledge that was inaccessible in prior generations. It has enabled us to create and do remarkable things. But all that value has a cost. Beginning with television, technology has changed the way we engage with society, substituting passive consumption of content and ideas for civic engagement, digital communication for conversation. Subtly and persistently, it has contributed to our conversion from citizens to consumers. Being a citizen is an active state; being a consumer is passive. A transformation that crept along for fifty years accelerated dramatically with the introduction of internet platforms. We were prepared to enjoy the benefits but unprepared for the dark side. Unfortunately, the same can be said for the Silicon Valley leaders whose innovations made the transformation possible.

      If you are a fan of democracy, as I am, this should scare you. Facebook has become a powerful source of news in most democratic countries. To a remarkable degree it has made itself the public square in which countries share ideas, form opinions, and debate issues outside the voting booth. But Facebook is more than just a forum. It is a profit-maximizing business controlled by one person. It is a massive artificial intelligence that influences every aspect of user activity, whether political or otherwise. Even the smallest decisions at Facebook reverberate through the public square the company has created, with implications for every person it touches. The fact that users are not conscious of Facebook’s influence magnifies the effect. If Facebook favors inflammatory campaigns, democracy suffers.

      August 2016 brought a new wave of stunning revelations. Press reports confirmed that Russians had been behind the hacks of servers at the Democratic National Committee (DNC) and Democratic Congressional Campaign Committee (DCCC). Emails stolen in the DNC hack were distributed by WikiLeaks, causing significant damage to the Clinton campaign. The chairman of the DCCC pleaded with Republicans not to use the stolen data in congressional campaigns. I wondered if it were possible that Russians had played a role in the Facebook issues that had been troubling me earlier.

      Just before I wrote the op-ed, ProPublica revealed that Facebook’s advertising tools enabled property owners to discriminate based on race, in violation of the Fair Housing Act. The Department of Housing and Urban Development opened an investigation that was later closed, but reopened in April 2018. Here again, Facebook’s architecture and business model enabled bad actors to harm innocent people.

      Like Jimmy Stewart in Rear Window, I did not have enough data or insight to understand everything I had seen, so I sought to learn more. As I did so, in the days and weeks after the election, Dan Rose, my contact at Facebook, exhibited incredible patience with me. He encouraged me to send more examples of harm, which I did. Nothing changed. Dan never budged. In February 2017, more than three months after the election, I finally concluded that I would not succeed in convincing Dan and his colleagues; I needed a different strategy. Facebook remained a clear and present danger to democracy. The very same tools that made Facebook a compelling platform for advertisers could also be exploited to inflict harm. Facebook was getting more powerful by the day. Its artificial intelligence engine learned more about every user. Its algorithms got better at pressing users’ emotional buttons. Its tools for advertisers improved constantly. In the wrong hands, Facebook was an ever-more-powerful weapon. And the next US election—the 2018 midterms—was fast approaching.

      Yet no one in power seemed to recognize the threat. The early months of 2017 revealed extensive relationships between officials of the Trump campaign and people associated with the Russian government. Details emerged about a June 2016 meeting in Trump Tower between inner-circle members of the campaign and Russians suspected of intelligence affiliations. Congress spun up Intelligence Committee investigations that focused on that meeting.

      But still there was no official concern about the role that social media platforms, especially Facebook, had played in the 2016 election. Every day that passed without an investigation increased the likelihood that the interference would continue. If someone did not act quickly, our democratic processes could be overwhelmed by outside forces; the 2018 midterm election would