This is the context for the crucial line in Bosworth’s post: “[Trump] got elected because he ran the single best digital ad campaign I’ve ever seen from any advertiser.” In other words, Donald Trump earned the Electoral College victory. Facebook played a crucial role, but merely as a conduit for fair-and-square campaigning. Trump’s campaign was running orders of magnitude more types of ads than the Clinton campaign, and that strategy does seem to have been effective.
It’s easy to imagine the continuation of the argument: Would anyone blame the medium of television for John F. Kennedy’s 1960 victory just because Kennedy was so much better than Richard Nixon on TV? One might even ask the same thing about the direct-mail revolution—and its great kingpin, Karl Rove. Would you ban the mail just because some political operatives got good at using it to win elections?
Since the 2016 election, many different and sometimes conflicting criticisms of the company have sprung up. Among the complaints: an openness to exploitation by bad-faith actors like Russian operatives, the propensity of the system to propel conspiracy theories and fake news, a lingering sense that the company has not fully accepted responsibility for the content that courses through its pipes, and a fractal irresponsibility epitomized by the company’s slow-motion responses to Facebook-inflected human-rights crises, such as that in Myanmar (also called Burma).
Meanwhile, people like the United Nations’ special rapporteur on human rights worry that, without a better grounding in principle, Facebook could become a back door for speech suppression in authoritarian regimes. And the far right accuses Facebook of being unfair to conservatives, despite right-wing content thriving on Facebook in America and around the world.
Bosworth extended his logic to the rest of the platform too, arguing against “limiting the reach of publications who have earned their audience, as distasteful as their content may be to me.”
But as the journalist Joshua Benton pointed out, “earned” is doing a lot of work there. Facebook has worked closely with many media companies over the years, pushing and pulling them with different incentives. Low-end publications have grown by posting stolen content, hoaxes, conspiracy theories, and lies. And it is at least plausible that Facebook’s reward system encourages sensationalistic, schmaltzy, or truthy content. Monthly lists of the most-shared stories tend to show exactly that.
To generalize: Over the years, Facebook has not been good at anticipating the second-order consequences of its actions. It reshapes people’s behavior and companies’ investments, and then is surprised when those changes reshape the system itself.
Bosworth is fully within the individualistic traditions of Silicon Valley, which tend to see people as society-free market particles interacting with one another. In his memo, he compares Facebook to food companies. “What I expect people will find is that the algorithms are primarily exposing the desires of humanity itself, for better or worse,” he wrote. “This is a Sugar, Salt, Fat problem. The book of that name tells a story ostensibly about food but in reality about the limited effectiveness of corporate paternalism. A while ago Kraft foods had a leader who tried to reduce the sugar they sold in the interest of consumer health. But customers wanted sugar. So instead he just ended up reducing Kraft market share. Health outcomes didn’t improve. That CEO lost his job. The new CEO introduced quadruple stuffed Oreos and the company returned to grace.”