With Sheryl on board as chief operating officer in charge of delivering revenues, Facebook quickly developed its infrastructure to enable rapid growth. This simplified Zuck’s life so he could focus on strategic issues. Facebook had transitioned from startup to serious business. This coming-of-age had implications for me, too. Effectively, Zuck had graduated. With Sheryl as his partner, I did not think Zuck would need mentoring from me any longer. My domain expertise in mobile made me valuable as a strategy advisor, but even that would be a temporary gig. Like most successful entrepreneurs and executives, Zuck is brilliant (and ruthless) about upgrading his closest advisors as he goes along. In the earliest days of Facebook, Sean Parker played an essential role as president, but his skills stopped matching the company’s needs, so Zuck moved on from him. He also dropped the chief operating officer who followed Parker and replaced him with Sheryl. The process is Darwinian in every sense. It is natural and necessary. I have encountered it so many times that I can usually anticipate the right moment to step back, and when the time comes, I try not to take it personally.
Knowing that we had accomplished everything we could have hoped for at the time I began mentoring him, I sent Zuck a message saying that my job was done. He was appreciative and said we would always be friends. At this point, I stopped being an insider, but I remained a true believer in Facebook. While failures like Beacon had foreshadowed problems to come, all I could see was the potential of Facebook as a force for good. The Arab Spring was still a year away, but the analyst in me could see how Facebook might be used by grassroots campaigns. What I did not grasp was that Zuck’s ambition had no limit. I did not appreciate that his focus on code as the solution to every problem would blind him to the human cost of Facebook’s outsized success. And I never imagined that Zuck would craft a culture in which criticism and disagreement apparently had no place.
The following year, 2010, was big for Facebook in surprising ways. By July, Facebook had five hundred million users, half of whom visited the site every day. Average daily usage was thirty-four minutes. Users who joined Facebook to stay in touch with family soon found new functions to enjoy. They spent more time on the site, shared more posts, and saw more ads.
October saw the release of The Social Network, a feature film about the early days of Facebook. The film was a critical and commercial success, winning three Academy Awards and four Golden Globes. The plot focused on Zuck’s relationship with the Winklevoss twins and the lawsuit that resulted from it. The portrayal of Zuck was unflattering. Zuck complained that the film did not accurately tell the story, but hardly anyone besides him seemed to care. I chose not to watch the film, preferring the Zuck I knew to a version crafted in Hollywood.
Just before the end of 2010, Facebook improved its user interface again, edging closer to the look and feel we know today. The company finished 2010 with 608 million monthly users. The rate of user growth remained exceptionally high, and minutes of use per user per day continued to rise. Early in 2011, Facebook received an investment of five hundred million dollars for 1 percent of the company, pushing the valuation up to fifty billion dollars. Unlike the Microsoft deal, this transaction reflected a financial investor’s assessment of Facebook’s value. At this point, even Microsoft was making money on its investment. Facebook was not only the most exciting company since Google; it showed every indication of becoming one of the greatest tech companies of all time. New investors were clamoring to buy shares. By June 2011, DoubleClick announced that Facebook was the most visited site on the web, with more than one trillion visits. Nielsen disagreed, saying Facebook still trailed Google, but it appeared to be only a matter of time before the two companies would agree that Facebook was #1.
In March 2011, I saw a presentation that introduced the first seed of doubt into my rosy view of Facebook. The occasion was the annual TED Conference in Long Beach, the global launch pad for TED Talks. The eighteen-minute Talks are thematically organized over four days, providing brain candy to millions far beyond the conference. That year, the highlight for me was a nine-minute talk by Eli Pariser, the board president of MoveOn.org. Eli had an insight that his Facebook and Google feeds had stopped being neutral. Even though his Facebook friend list included a balance of liberals and conservatives, his tendency to click more often on liberal links had led the algorithms to prioritize such content, eventually crowding out conservative content entirely. He worked with friends to demonstrate that the change was universal on both Facebook and Google. The platforms were pretending to be neutral, but they were filtering content in ways that were invisible to users. Having argued that the open web offered an improvement on the biases of traditional content editors, the platforms were surreptitiously implementing algorithmic filters that lacked the value system of human editors. Algorithms would not act in a socially responsible way on their own. Users would think they were seeing a balance of content when in fact they were trapped in what Eli called a “filter bubble” created and enforced by algorithms. He hypothesized that giving algorithms gatekeeping power without also requiring civic responsibility would lead to unexpected, negative consequences. Other publishers were jumping on board the personalization bandwagon. There might be no way for users to escape from filter bubbles.
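The feedback loop Eli described can be sketched in a few lines of code. The numbers below are invented for illustration and have nothing to do with Facebook's actual ranking system: two content types, a user who clicks liberal items slightly more often (60 percent versus 40), and a ranker whose only signal is past engagement, allocating feed slots in proportion to accumulated clicks.

```python
import random

random.seed(42)  # deterministic run for reproducibility

# Illustrative assumptions, not Facebook's real algorithm: a user with a
# mild click preference, and a ranker that rewards whatever got clicked.
click_prob = {"liberal": 0.6, "conservative": 0.4}
clicks = {"liberal": 1, "conservative": 1}  # start from a near-neutral feed

FEED_SIZE = 10
for day in range(500):
    total = sum(clicks.values())
    # The ranker's only signal is past engagement: more clicks, more slots.
    feed = (["liberal"] * round(FEED_SIZE * clicks["liberal"] / total)
            + ["conservative"] * round(FEED_SIZE * clicks["conservative"] / total))
    for item in feed:
        if random.random() < click_prob[item]:
            clicks[item] += 1

share = clicks["liberal"] / sum(clicks.values())
print(f"liberal share of all clicks after 500 days: {share:.2f}")
# A small initial preference compounds: the minority viewpoint's share
# of the feed shrinks toward zero, though the user never asked for that.
```

Each extra click on liberal content buys more liberal exposure, which produces more liberal clicks; once the conservative share rounds down to zero slots, it can never recover. That is the bubble: a modest behavioral bias, amplified by an engagement-maximizing ranker into total exclusion.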
Eli’s conclusion? If platforms are going to be gatekeepers, they need to program a sense of civic responsibility into their algorithms. They need to be transparent about the rules that determine what gets through the filter. And they need to give users control of their bubble.
I was gobsmacked. It was one of the most insightful talks I had ever heard. Its import was obvious. When Eli finished, I jumped out of my seat and made a beeline to the stage door so that I could introduce myself. If you view the talk today, you will immediately appreciate its importance. At the time, I did not see a way for me to act on Eli’s insight at Facebook. I no longer had regular contact with Zuck, much less inside information. I was not up to speed on the engineering priorities that had created filter bubbles or on any plans for monetizing them. But Eli’s talk percolated in my mind. There was no good way to spin filter bubbles. All I could do was hope that Zuck and Sheryl would have the sense not to use them in ways that would harm users. (You can listen to Eli Pariser’s “Beware Online ‘Filter Bubbles’” talk for yourself on TED.com.)
Meanwhile, Facebook marched on. Google introduced its own social network, Google+, in June 2011, with considerable fanfare. By the time Google+ came to market, Google had become a gatekeeper between content vendors and users, forcing content vendors who wanted to reach their own audience to accept Google’s business terms. Facebook took a different path to a similar place. Where most of Google’s products delivered a single function that gained power from being bundled, Facebook had created an integrated platform, what is known in the industry as a walled garden, that delivered many forms of value. Some of the functions on the platform had so much value that Facebook spun them off as stand-alone products. One example: Messenger.
Thanks to its near monopoly of search and the AdWords advertising platform that monetized it, Google knew more about purchase intentions than any other company on earth. A user looking to buy a hammer would begin with a search on Google, getting a set of results along with three AdWords ads from vendors looking to sell hammers. The search took milliseconds. The user bought a hammer, the advertiser sold one, and Google got paid for the ad. Everyone got what they wanted. But Google was not satisfied. It did not know the consumer’s identity. Google realized that its data set of purchase intent would have greater value if it could be tied to customer identity. I call this McNamee’s 7th Law: data sets become geometrically more valuable when you combine them. That is where Gmail changed the game. Users got value in the form of a good