

Surgeon General's Warning: COVID Misinformation May Be Hazardous To EVERYBODY'S Health
Biggest challenge: Putting a warning label on Tucker Carlson's forehead.
Dr. Vivek Murthy has issued his first "surgeon general's advisory" of his term in the Biden administration, calling attention to the health dangers posed by misinformation about COVID-19 and the coronavirus vaccines. Murthy calls on all American institutions, and all Americans, to do everything they can to limit the spread of dangerous health misinformation. He didn't recommend warning labels be tattooed on the foreheads of all Fox News anchors or Republican members of Congress, at least not yet. Instead, he suggests people actually use their heads and seek credible sources of health information. We're doomed.
OK, we kid. Some. Seriously, go read the whole advisory, for your own edification and future Twitter arguments if nothing else. Minus the title page, table of contents, and references, it's just 12 pages, and very good reading!
Murthy announced the campaign against misinformation at Thursday's White House press briefing; here's the video:
Lies Are Hazardous To Your Health
Murthy noted that his report might seem a bit different from other surgeon general's advisories, which often address public health threats like smoking, addiction, and the like, but he added that right now, "we live in a world where misinformation poses an imminent and insidious threat to our nation's health."
We were impressed by Murthy's take on how bad information throws poison into the marketplace of ideas:
[The] truth is that misinformation takes away our freedom to make informed decisions about our health and the health of our loved ones.
During the COVID-19 pandemic, health misinformation has led people to resist wearing masks in high-risk settings. It's led them to turn down proven treatments and to choose not to get vaccinated. This has led to avoidable illnesses and death. Simply put, health [mis]information has cost us lives.
How Do We Fix This?
Murthy called for an "all of society" approach to combating COVID misinformation, starting with all of us: How about we all think twice before forwarding stuff we see online that sounds amazing but might be bullshit? Better to first ask where it's coming from, and see whether it's actually credible or indeed complete bullshit. (We really want Murthy, whose personality is like an acutely smart, well-informed, sciencey version of Kenneth the NBC Page, to just shout cusses some time, because he's so goddamn nice.)
He even suggested a slogan that might fit nicely on an old WWII-style poster: "If you're not sure, don't share."
Because it'll take more than just better individual behavior, Murthy also recommended that other institutions, like professional groups and foundations, help spread the word on how to spot health misinformation, and study how it spreads and what works to counter it. He even touched on one of my own favorite hobby horses, calling on schools and teachers to improve media and health information literacy. (The full report goes into a bit more detail on that as well.)
Along similar lines, Murthy urged media organizations to "proactively address the public's questions without inadvertently giving a platform to health misinformation that can harm their audiences," which is a huge goddamn deal. There too, the full report has excellent guidelines, from training reporters to be more thoughtful about sources, to providing context when discussing health information, particularly warning against pushing "alternative" views that have no validity. (Dok's Hobby Horse time again: the term "truth sandwich," as a means of discussing disinformation without granting it credibility, doesn't appear in the report, but it damn well should.)
Vaccinating Against Social Media Disease
Murthy also called out tech platforms for their role in spreading misinformation:
We're asking them to operate with greater transparency and accountability. We're asking them to monitor misinformation more closely. We're asking them to consistently take action against misinformation super-spreaders on their platforms.
Let's dive into that one a bit more, because the full report is quite good, pointing out that the very things that make social media so addicting (and profitable) also help promote the spread of bad information. We've removed the footnotes here:
First, misinformation is often framed in a sensational and emotional manner that can connect viscerally, distort memory, align with cognitive biases, and heighten psychological responses such as anxiety. People can feel a sense of urgency to react to and share emotionally charged misinformation with others, enabling it to spread quickly and go "viral."
Second, product features built into technology platforms have contributed to the spread of misinformation. For example, social media platforms incentivize people to share content to get likes, comments, and other positive signals of engagement. These features help connect and inform people but reward engagement rather than accuracy, allowing emotionally charged misinformation to spread more easily than emotionally neutral content. One study found that false news stories were 70 percent more likely to be shared on social media than true stories.
Third, algorithms that determine what users see online often prioritize content based on its popularity or similarity to previously seen content. As a result, a user exposed to misinformation once could see more and more of it over time, further reinforcing one's misunderstanding. Some websites also combine different kinds of information, such as news, ads, and posts from users, into a single feed, which can leave consumers confused about the underlying source of any given piece of content.
Yep, yep, and holy crap Yep! The specific recommendations for tech platforms are promising but may also involve some whistling in the dark, since so much of social media is about connecting people with content that triggers their neurotransmitters and reinforces biases.
Hell yes, we would support social media platforms making changes that would discourage the spread of dangerous garbage, but the trick will be getting the companies to act on recommendations like these:
Redesign recommendation algorithms to avoid amplifying misinformation, build in "frictions"—such as suggestions and warnings—to reduce the sharing of misinformation, and make it easier for users to report misinformation.
There's some of that going on already, clumsy though it may be, like popping up links to valid information. But fundamental changes to the algorithm might mean fewer clicks, and fewer eyeballs on ads.
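If you want to see the problem in miniature, here's a deliberately dumb toy model in Python. To be clear, this is entirely hypothetical, made-up-by-us code, not anything resembling any platform's actual ranking system; the post titles, scores, and the `adjusted_score` "friction" penalty are all invented for illustration. It just shows how ranking by engagement alone floats viral garbage to the top, and how even a crude accuracy penalty changes the order:

```python
# Toy model only -- invented numbers, no resemblance to real platform code.
# Each fake post has an engagement score (likes/shares) and an accuracy score.

posts = [
    {"title": "Boring but true vaccine data", "engagement": 120, "accuracy": 0.95},
    {"title": "SHOCKING thing THEY won't tell you", "engagement": 4800, "accuracy": 0.05},
    {"title": "Local clinic opens walk-in hours", "engagement": 300, "accuracy": 0.90},
]

# How engagement-only ranking works: most clicks wins, accuracy be damned.
by_engagement = sorted(posts, key=lambda p: p["engagement"], reverse=True)

# One hypothetical "friction": knock low-accuracy posts down to 1 percent of
# their score before ranking, so virality alone can't carry them to the top.
def adjusted_score(post):
    penalty = 1.0 if post["accuracy"] >= 0.5 else 0.01
    return post["engagement"] * penalty

with_friction = sorted(posts, key=adjusted_score, reverse=True)

print([p["title"] for p in by_engagement])  # the SHOCKING garbage wins
print([p["title"] for p in with_friction])  # the garbage sinks to last place
```

Of course, in our toy model the accuracy scores fall from the sky; in real life, deciding which posts get the penalty is the whole fight.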
Here Comes The Rightwing Freakout
Other recommendations are more proactive, and are already eliciting howls of "censorship!" from the usual paranoid sources, which, to be clear, howl about "censorship" no matter what. Among them, the advisory recommends beefing up moderation, both by AI and by actual humans, particularly when it comes to non-English language posts and livestreams. But the one that's really got the wingnuts freaking out is this'n:
Prioritize early detection of misinformation "super-spreaders" and repeat offenders.
Impose clear consequences for accounts that repeatedly violate platform policies.
Oh noes! The federal government wants to do fascism to unapproved views! Muh free speach! Mind you, social media platforms are not the federal government, not even when lawyers for a disgraced former president say they are.
One problem with the recommendations is that, as we've noted, they simply reinforce the Right's own delusions that they're being repressed, which spurs them to cling to disinformation ever more strongly. We're not sure that's avoidable, though, and if social media platforms really do take measures to shrink the wingnut misinformation bubble, maybe that would still be good for everyone not already inside it?
Don't Expect Tech Platforms To Help Much
It's certainly something that deserves closer study, which, hey, is also one of Murthy's recommendations, both for nonprofits that study how misinformation spreads and can be dealt with, and for the platforms themselves. The advisory calls on platforms to "Give researchers access to useful data to properly analyze the spread and impact of misinformation," which sounds like a great idea, although it flies in the face of the platforms' actual behavior. They consider their algorithms and internal data so vital to their profits that they guard them with a paranoid fervor that may rival wingnuts' attachment to conspiracy theories.
Back in April, Facebook gutted its own data analytics outfit, CrowdTangle, as the New York Times reported just the day before Murthy's advisory. The company reassigned staff from the semi-independent operation and brought it more tightly under Facebook's own control, because CrowdTangle had been too darn transparent, and that was pretty embarrassing to Facebook.
Executives behind the move
argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.
Oops, so much for cooperation and transparency! Facebook data revealed Facebook spreads misinformation and rightwing craziness, so Facebook decided the solution isn't to fix that, but to hide the data.
Damned if we can see any easy answer to that! But Joe Biden saying that Facebook is killing people with misinformation might help shame them. Or not, who knows?
Still, we like Murthy's point that much of the solution to misinformation will involve getting good information to people, one-on-one if needed. (That certainly seems to be what the research on communicating about climate change shows, too.) For all the crazy it revealed, that March Frank Luntz focus group on "vaccine hesitancy" suggested that people do at least trust doctors who give them unemotional facts, so that seems valuable too.
Now, if we could just find some way of making sure people actually can see doctors.
[Confronting Health Misinformation / White House / AP / NYT / NPR]
Yr Wonkette is funded entirely by reader donations. If you can, please help us bring you both the very best information AND fart jokes with a donation of $5 to $10 a month.
Do your Amazon shopping through this link, because reasons.