President Joe Biden and Surgeon General Vivek Murthy spent the last several days hammering social media companies for their platforms’ roles in spreading misinformation about COVID-19 vaccines.
“They’re killing people,” Biden said, when asked about the role of social networks in the spread of misinformation. “Look, the only pandemic we have is among the unvaccinated. They’re killing people.” His comments came after Facebook reportedly stonewalled the White House. For weeks, officials unsuccessfully petitioned Facebook to share details about how it is fighting vaccine misinformation on its platforms, according to a report in The New York Times.
The assault continued on Sunday when Murthy appeared on CNN. “These platforms have to recognize they’ve played a major role in the increase in speed and scale with which misinformation is spreading,” he said. And White House Press Secretary Jen Psaki faulted Facebook last Thursday for the pace of its moderation. “Facebook needs to move more quickly to remove violative posts,” she said. “Posts that will be within their policies for removal often remain up for days. That’s too long. The information spreads too quickly.”
On Saturday, Facebook issued a statement rebutting the White House’s assertions. “At a time when COVID-19 cases are rising in America, the Biden administration has chosen to blame a handful of American social media companies. While social media plays an important role in society, it is clear that we need a whole of society approach to end this pandemic,” Facebook Vice President of Integrity Guy Rosen wrote in a blog post. “President Biden’s goal was for 70% of Americans to be vaccinated by July 4. Facebook is not the reason this goal was missed.” Rosen pointed to internal data suggesting that 85 percent of Facebook users in the US “have been or want to be vaccinated against COVID-19.”
Though people who are hesitant to get the vaccine get their news from a variety of sources, surveys by the Kaiser Family Foundation suggest that social media remains a significant source of “misconceptions” about the shots.
Other companies have caught flak for allowing their platforms to host misinformation, including a Russian campaign that used YouTube channels to undermine the Pfizer vaccine in France. But Facebook has been at the epicenter of misinformation about COVID-19 vaccines and vaccines in general. For years, the company allowed anti-vaccine groups to run ads on the site, and it only began to seriously consider a crackdown on COVID-19 vaccine misinformation months after the vaccines were approved.
Yet even after the crackdown was announced in February, misinformation kept popping up on the social network. One investigation by the Center for Countering Digital Hate found that just 12 people were responsible for a significant portion of the lies and misleading claims about COVID-19 vaccines circulating on the platform.
And misinformation that gets taken down from English-language accounts is often simply tweaked and reposted in other languages. So while Facebook may be seeing some success in the US—though it’s hard to verify the company’s claims without seeing the underlying data—the problematic posts still threaten public health because COVID-19 doesn’t care which language you speak.
Not all social media networks have suffered from similar amounts of misinformation, though. Pinterest has taken a far more aggressive approach to countering lies and misleading posts about COVID-19 and vaccines by carefully curating search results for key terms, suspending people who violate its rules, and refusing to exempt world leaders from the policies.
“Pinterest’s results suggest that if Facebook scaled up its moderation, it might get further,” Neil Johnson, a professor at George Washington University, told Stat News.
Or, as Sen. Amy Klobuchar (D-Minn.) said on CNN yesterday, “There’s absolutely no reason they shouldn’t be able to monitor this better and take this crap off of their platforms.”