The decision comes after a group found that roughly 40 percent of claims debunked by fact-checkers remained on the social networking giant.
LONDON — Millions of Facebook users will soon be told if they saw online posts containing misinformation about the coronavirus pandemic after the social networking giant announced Thursday its latest plans to contain the spread of rumors, half-truths and lies connected to the public health crisis.
The move, which will start over the next three weeks, represents a major step by Facebook — an acknowledgment that its efforts to scrub the platform of falsehoods related to the coronavirus have not been sufficient to stop millions of people sharing, liking and engaging with misinformation.
“Through this crisis, one of my top priorities is making sure that you see accurate and authoritative information across all of our apps,” Mark Zuckerberg, the company’s chief executive, wrote on his Facebook page.
The decision comes in part after the campaign group Avaaz discovered that over 40 percent of the coronavirus-related misinformation it found on Facebook remained on the platform even after fact-checking organizations working with the tech giant had debunked the posts and told the company they were false.
In total, Avaaz said that these fake social media posts — everything from advice about bogus medical remedies for the virus to claims that minority groups were less susceptible to infection — had been shared, collectively, 1.7 million times on Facebook in six languages.
“Facebook, given its scale, is the epicenter for misinformation,” Fadi Quran, Avaaz’s campaign director, told POLITICO, adding that the company’s efforts to combat the problem had steadily improved since the social network announced it would do all it could to stop the spread of such life-threatening falsehoods.
Facebook said Thursday that its existing steps, including pinning government public health warnings to the top of people’s news feeds, had led to 350 million people worldwide clicking through to authoritative sources in search of accurate information.
“Facebook should be proud of this step,” added Quran in reference to the company’s decision to retroactively notify people they had seen misinformation. “But the step doesn’t reflect the full gamut of what we would like to see them do.”
As part of its latest push to quell the spread of coronavirus misinformation, Facebook will show people who engaged with false content that has since been deleted messages debunking those claims, based on fact-checking efforts by the World Health Organization (WHO). Those include claims that 5G mobile networks can spread the disease and rumors that hot climates can render people immune.
The U.N. agency has become a political target after President Donald Trump announced the U.S. was cutting its funding to the WHO.
Facebook would not give a figure on the number of its users who would see the notifications. But as the company said it had removed hundreds of thousands of posts containing misinformation — and because the number of its global users now tops 2.2 billion people — it’s likely that millions of users will be told they have engaged with some form of coronavirus misinformation.
Despite this latest step, the company will not run similar labels next to more politically motivated content, promoted by global leaders such as Trump and Jair Bolsonaro, Brazil’s president, stating that certain medical treatments like hydroxychloroquine may aid recovery from the coronavirus. There is no clear evidence that is the case.
So far, Facebook said that, based on the work of independent fact-checking organizations, it had labeled 40 million Facebook posts with warnings that the content may be false.
“We will need more transparency and better access to data for researchers to fully verify the scope and impact of false content,” Vêra Jourová, the European Commission vice president, told POLITICO in a written statement. She welcomed Facebook’s latest announcement, but added: “I’m worried to see such high volumes of harmful content and misinformation spread in times of pandemic.”
Fake content not removed
Since the global crisis began in late December, coronavirus misinformation has remained widespread on Facebook and its other digital services, Instagram and WhatsApp, often fueled by people’s desperate efforts to protect themselves from a pandemic that, so far, has left almost 140,000 dead.
To check how the social network was handling the flood of false posts, Avaaz tracked 104 claims debunked by fact-checkers to see how quickly they were removed from the platform. Along with its use of artificial intelligence to clamp down on misinformation, Facebook has promoted its work with these independent organizations as a cornerstone of its response to the global pandemic. It is now working with 60 organizations across 50 languages.
While the sample size of its fact-checked reports was relatively small compared with the millions of daily social media posts, Avaaz said that 41 percent of those posts remained online without any warning labels telling people that they contained misinformation.
For English-language posts, that figure fell to 29 percent. But for coronavirus fake reports in both Italian and Spanish, roughly 70 percent of these fact-checked false posts had not been removed from Facebook. In response, the company said it had taken down 17 of the false posts flagged by Avaaz, and it was prioritizing fact-checking work that debunked the most viral forms of digital lies.
While the company subsequently deleted some posts, Quran from Avaaz said identical social media messages, including the same images and texts, had often spread like wildfire globally as others copied the misinformation to share it with their friends and families.
“Facebook takes the ‘mother’ misinformation down, but its ‘babies’ stay online,” he added.
The campaigning group also said that, on average, it took Facebook a week to remove fact-checked fake reports from its platform after being informed by independent groups the posts contained misinformation. The delay, according to Quran, allowed millions of people to view the falsehoods after the company knew they were bogus.
Clara Jimenez Cruz, a co-founder of Maldita.es, the Spanish fact-checking service, who was not involved in Avaaz’s report but works closely with Facebook, said the social networking giant’s existing debunking system, which tells people if they have seen a fake post, was working relatively well. But, she added, misinformation continued to be shared widely, and Facebook had to do a better job of promoting fact-checkers’ work to those being bombarded with digital lies about the coronavirus.
“We still need them to work on how to increase the reach of what we’re doing,” she said. “We need to get better at finding related content.”