Why The Hell Haven't You Deleted Facebook Yet?
You're drinking from a sewer. You should stop that.
BREAKING! ACHTUNG! Facebook is a Doomsday Machine that will kill us all.
Okay, that's not really news in the year 2021. But seriously, why haven't you deleted that filthy hell app yet?
In case you're still swimming in the blue sewer, every major news outlet in the country got its hands on another tranche of leaked Facebook documents this weekend from whistleblower Frances Haugen, and, spoiler alert, it's really, really bad. Turns out, the company consistently prioritized its own profits over the safety of its users and was delighted to monetize content that it knew was harmful to them. All while touting the world-spanning benefits of its "community" and publicly disclaiming responsibility for the poison it was pumping out into the world.
The Atlantic has a picture of the internal blowback at the company on January 7 after the Capitol had been overrun by people who coordinated their efforts on Facebook.
"This is a dark moment in our nation's history," Mark Zuckerberg wrote, "and I know many of you are frightened and concerned about what's happening in Washington, DC. I'm personally saddened by this mob violence."
Chief Technology Officer Mike Schroepfer echoed his boss's brow-furrowing, asking employees to "hang in there."
"We have been 'hanging in there' for years," one staffer shot back.
"All due respect, but haven't we had enough time to figure out how to manage discourse without enabling violence?" said another. "We've been fueling this fire for a long time and we shouldn't be surprised it's now out of control."
"I'm tired of platitudes; I want action items," another responded. "We're not a neutral entity."
Another put it even more bluntly: "History will not judge us kindly."
Facebook's staff was absolutely furious because they'd been screaming bloody murder about the ways the site was optimized to monetize outrage and radicalize users, all the while plastering upper management with policy memos proposing strategies to keep the platform from leading vulnerable people down the rabbit hole to meet up with like-minded maniacs. But nothing ever changed.
Watching in horror on November 5 as the "Stop the Steal" page grew to 330,000 people in its first 24 hours, one Facebook employee wrote, "Not only do we not do something about combustible election misinformation in comments, we amplify and give them broader distribution. Why?"
Well, the "Why" was a little something Zuck and his pals called "MSI" or "meaningful social interactions."
Here's the Washington Post's summary of how Facebook's algorithms got rejiggered to prioritize incendiary content over things you might actually want to see:
Zuckerberg has long been obsessed with metrics, growth and neutralizing competitive threats, according to numerous people who have worked with him. The company's use of "growth-hacking" tactics, such as tagging people in photos and buying lists of email addresses, was key to achieving its remarkable size — 3.51 billion monthly users, nearly half the planet. In Facebook's early years, Zuckerberg set annual targets for the number of users the company wanted to gain. In 2014, he ordered teams at Facebook to grow "time spent," or each user's minutes spent on the service, by 10 percent a year, according to the documents and interviews.
In 2018, Zuckerberg defined a new metric that became his "north star," according to a former executive. That metric was MSI — "meaningful social interactions" — named because the company wanted to emphasize the idea that engagement was more valuable than time spent passively scrolling through videos or other content. For example, the company's algorithm would now weight posts that got a large number of comments as more "meaningful" than likes, and would use that information to inject the comment-filled posts into the news feeds of many more people who were not friends with the original poster, the documents said.
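For the nerds in the audience, the ranking logic the Post describes boils down to a weighted score. Here's a minimal sketch in Python — the function names, weights, and threshold are entirely hypothetical illustrations of the idea, not Facebook's actual code or actual numbers:

```python
# Hypothetical sketch of MSI-style engagement weighting. The weights below
# are illustrative guesses, not Facebook's real values: the point is that
# "active" interactions (comments, reshares) count far more than passive likes.

def msi_score(likes: int, comments: int, reshares: int) -> float:
    """Score a post, weighting comments and reshares far above likes."""
    return 1.0 * likes + 15.0 * comments + 30.0 * reshares

def rank_feed(posts: list[dict]) -> list[dict]:
    """Order candidate posts by descending MSI score."""
    return sorted(
        posts,
        key=lambda p: msi_score(p["likes"], p["comments"], p["reshares"]),
        reverse=True,
    )

feed = rank_feed([
    {"id": "cat_photo", "likes": 500, "comments": 3, "reshares": 1},
    {"id": "outrage_bait", "likes": 40, "comments": 120, "reshares": 60},
])
print([p["id"] for p in feed])  # outrage_bait outranks cat_photo despite far fewer likes
```

Under any weighting shaped like this, a post that provokes a comment brawl beats a post people merely enjoy — which is exactly how "meaningful" came to mean "incendiary."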
MSI was the reason the site was initially reluctant to clamp down on COVID misinformation, even as it was infecting the entire platform: "Mark doesn't think we could go broad," read notes of one meeting. "We wouldn't launch if there was a material trade-off with MSI."
As NBC 's Brandy Zadrozny writes, the consequences were stark, pushing users toward ever more radical content. An internal report titled "Carol's Journey to QAnon" documented the time it took for Facebook's algorithm to start recommending QAnon pages to a dummy profile that claimed to be a mother from North Carolina whose interests included politics, Christianity, and Donald Trump. Forty-eight hours, from Trump to Q.
The Post also reports that Zuckerberg personally made the decision to cave to pressure from the Vietnamese Communist Party to censor anti-government posts in the country, reasoning that the compromise was necessary "to ensure our services remain available for millions of people who rely on them every day." Meanwhile in America, Zuckerberg cited the paramount importance of free speech — and the danger of pissing off Republicans — as a reason to leave dangerous misinformation on the platform. Because sometimes the ends justify the means, and sometimes the means justify the ends.
The Wall Street Journal notes that the CEO took a similarly laissez-faire approach to posts that promote racial violence against Muslims in India. Of course, that problem, like much of what goes on in the non-English speaking world, was due at least as much to neglect and incompetence as to hypocrisy.
In a report titled "Adversarial Harmful Networks: India Case Study," researchers documented hate speech and misinformation flowing from pages associated with the Hindu nationalist Rashtriya Swayamsevak Sangh group, or RSS, promoting the idea of a "Love Jihad," whereby Muslim men seduce Hindu women to convert them, and claims that "Muslim clerics spit on food to either 'make it halal,' or spread Covid-19, as a larger war against Hindus." The posts were allowed to proliferate both because "Facebook lacks sufficient technical systems for detecting material in the Hindi and Bengali languages" and because of "political sensitivities," i.e. the RSS's association with Prime Minister Narendra Modi.
Naturally Facebook has an answer to this, and it is that the news outlets are just jealous that Facebook is eating their lunch.
CNN got an internal memo from Vice President of Global Affairs Nick Clegg, bucking up the troops:
Social media turns traditional top-down control of information on its head. In the past, public discourse was largely curated by established gatekeepers in the media who decided what people could read, see and digest. Social media has enabled people to decide for themselves – posting and sharing content directly. This is both empowering for individuals – and disruptive to those who hanker after the top-down controls of the past, especially if they are finding the transition to the online world a struggle for their own businesses.
Why the hell haven't you deleted that goddamn app yet?
Follow Liz Dye on Twitter!