Big Lawsuits Not Going Great For Social Media Giants
Maybe these products are not good for their users. Where have we heard that before? (Big tobacco. The answer is big tobacco.)
It’s been a rough week for big evil tech companies that are monetizing our attention at every opportunity. On Tuesday, a state court jury in New Mexico found that Meta failed to protect minors who use Instagram and Facebook from online creepers, and approved a civil penalty of $375 million against the company for violating state consumer protection laws. Then on Wednesday, a jury in Los Angeles found Meta and YouTube negligent in a case brought by a young woman who argued their platforms were designed to be addictive, and that the companies had contributed to her mental illness when she was a minor. The jury in that case awarded the now 20-year-old woman, identified only as KGM, $3 million in compensatory damages.
Not that the judgments will hurt the companies’ bottom lines at all, given the huge amounts of money they make. Meta plans to appeal, because it’s a huge soulless megacorporation, but even the $375 million penalty in New Mexico is pocket change compared to the $80 billion that Meta blew on its virtual-reality “Metaverse” concept, on which it has largely pulled the plug because hardly anyone wanted to spend time and money on it. Not even after the VR avatars got legs.
But as New York magazine tech columnist John Herrman points out, the cases are important well beyond the minutes of lost profits they’ll cause, because
the legal theories behind these cases, and the signal they both send about future litigation and regulation, could be enormously consequential. The courts are starting to treat social platforms not as purveyors of protected speech — a safe legal haven for internet companies for the last two decades — but rather like tobacco companies or asbestos manufacturers, firms whose products are obviously addictive, or dangerous, or marketed in misleading ways.
These two cases are among the first in a wave of lawsuits against Big Tech and its products. Maybe the days of huge liability cases like 1998’s landmark $206 billion big tobacco settlement are over, but we do think it would be a good idea to rein in these giant tech companies before they make human workers redundant or get exclusive content licenses for our temporal lobes. (Cue some technolibertarian writing an op-ed to argue “If you want to sell your brain to Google, you should have that right!”)
In the New Mexico case, the state argued, based on an undercover investigation in which investigators set up fake accounts posing as very young adolescents, that Facebook violated consumer protection laws by putting kids at risk of exploitation, and by failing to inform users of the risks. That investigation led to several criminal prosecutions, including of one 47-year-old man who in 2021 had one of his Facebook accounts shut down because he was being a creeper, but just switched to using another, because that’s really easy to do. The guy had 15 accounts in all, according to a stomach-turning Washington Post story (gift link).
One former Meta engineer testified at the trial that his own daughter, 14, got unwanted advances from creepy guys after she created an Instagram account. “The product is very good at connecting people with interests, and if your interest is little girls, it will be really good at connecting you with little girls,” he said.
A second phase of the trial, to begin in May, will determine whether Meta products constitute a “public nuisance” that would require them to fund programs to mitigate harms, just as if they had dumped raw sewage in a playground, only it’s in people’s brains. The state is also urging the company to institute reforms like “enacting effective age verification, removing predators from the platform, and protecting minors from encrypted communications that shield bad actors.” OK, and how about doing something about ads for that fucking “Royal Kingdom” mobile game, which is super annoying as well as redundantly named?
Following the New Mexico trial, a statement from a Meta spokesperson said — presumably in a tinny, electronic monotone — “We respectfully disagree with the verdict and will appeal. We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content.” The spokesperson then probably issued a series of high-pitched beeps and self-destructed.
In the Los Angeles case, KGM’s attorneys argued that the social media companies had deliberately packed their products with features like infinite scroll and algorithms that constantly push (“recommend”) more content to keep people online, the better to get advertising in front of them. The plaintiff testified that when she was a minor, she was using social media constantly, which
caused or contributed to depression, anxiety and body dysmorphia. It “really affected my self-worth,” she said last month.
Speaking about her social media use, K.G.M. testified that she felt she wanted to constantly be on the platforms and feared missing out if she wasn't.
Lawyers for Meta and for Google, which owns YouTube, said, in essence, it was real sad that KGM had “profound challenges,” but that their platforms were entirely innocent and nonaddictive, and maybe she was already messed up by childhood abuse, how about that? Before the case went to trial, two other social media companies, TikTok and Snap, reached settlements with KGM.
Weird Meta CEO Mark Zuckerberg, who for now retains a physical form for occasions like galas, ribbon-cuttings, and liability trials, testified in both cases. In the KGM trial, he said that gosh no, Meta isn’t trying to make Instagram addictive, and if people don’t like how it ruins their lives, they can choose not to use it and the Free Market will save the day. (We are paraphrasing broadly.)
“I’m focused on building a community that is sustainable,” he said […] “If you do something that’s not good for people, maybe they’ll spend more time [on Instagram] short term, but if they’re not happy with it, they’re not going to use it over time. I’m not trying to maximize the amount of time people spend every month.”
“Beep-Boop, Illogical. Norman, please coordinate,” Zuckerberg then added, smoke coming from his ears.
The new trend in lawsuits against tech companies that are fucking up our lives seeks to get past protections for platforms in Section 230 of the Communications Decency Act of 1996, which protects social media companies — including Yr Wonkette, because we have a comments section full of you filthy fuckaducks — from being held responsible for terrible things their users post on their platforms. Instead, the new strategy is to focus on the platforms as products whose features can cause harm, especially to younger users. Rather than handling social media harms in the legal frame of free speech, the suits argue that the platforms themselves are defective, addictive products that are unsafe at any screed.
Even more lawsuits are making their way through the courts, including several cases against Elon Musk’s Grok AI, over how easy it is for skeevy users to create gross nonconsensual sexual images of women and even children. Even though nudity is technically banned by xAI, users quickly found ways around the rule so they could upload real, fully clothed images of kids and get the AI tool to generate child sexual abuse images. Those lawsuits also allege that xAI was negligent in building a thing that could do that, what the fuck is wrong with you, arrrrgh.
Elon Musk and Grok developers glibly said that one of the cool things about Grok’s image generation function was that it would reply to “spicy” prompts that more staid machine-learning systems would avoid, as part of Musk’s mission to fight the “woke mind virus,” so gosh, we feel really bad that those words have made it into the lawsuits as part of the claim that the company is negligent.
It’s all pretty gross, and would be even if Elon Musk weren’t involved. Thankfully, you can stop thinking about That Man, because this is also your OPEN THREAD.
[CNBC / NBC News / New York / WaPo (gift link) / NYT / NBC News]
Yr Wonkette is funded entirely by reader donations. If you can, please become a paid subscriber, or make a one-time (or recurring) donation with this here button, Beep Boop, Danger Will Robinson.