Is Facebook finally fighting fake news? Learn about their new policy
Facebook is finally taking steps to combat the flow of grossly incorrect and dangerous fake news about the coronavirus on its platform, just in time for COVID-19 vaccines to begin rolling out around the world.
The real deal
Millions of Facebook users will soon be told if they have seen online posts containing fake news about the coronavirus pandemic. The social networking giant announced its latest plans last week to contain the spread of rumors, half-truths and lies connected to the public health crisis.
Facebook is banning claims about COVID-19 vaccines that have been debunked by public health experts, as governments prepare to roll out the first vaccinations against the virus. That includes posts that make false claims about the vaccines' safety, effectiveness, ingredients and side effects.
“For example, we will remove false claims that COVID-19 vaccines contain microchips, or anything else that isn’t on the official vaccine ingredient list,” Facebook’s head of health, Kang-Xing Jin, said in a blog post. “We will also remove conspiracy theories about COVID-19 vaccines that we know today are false, like specific populations are being used without their consent to test the vaccine’s safety.”
The new ban expands Facebook's existing rules against coronavirus fake news that could lead to imminent physical harm. The company said it removed 12 million such posts from Facebook and Instagram between March and October.
The move, which will begin over the next three weeks, represents a major step by Facebook — an acknowledgment that its efforts to rid the platform of falsehoods related to COVID-19 have not been enough to stop millions of people from sharing, liking and engaging with misinformation.
“Through this crisis, one of my top priorities is making sure that you see accurate and authoritative information across all of our apps,” Mark Zuckerberg, the company’s chief executive, wrote on his Facebook page.
The approach to COVID-19 vaccines is different from Facebook’s general approach to vaccine misinformation. The company has made false claims about other vaccines less visible on its platform but stopped short of removing them. In October, it banned anti-vaccination ads.
Facebook said it was extending the policy because COVID-19 vaccines will soon be rolled out around the world. The U.K. became the first country to approve a vaccine this week, with the first doses expected to be available next week. Regulators in the U.S. are expected to approve vaccines before the end of the year.
On Monday, Facebook CEO Mark Zuckerberg said the company would show users "authoritative information" about vaccines. It's adding details about how vaccines are tested and approved to its coronavirus information center, a section of its site that promotes credible sources.
The reason why
The decision comes in part after the campaign group Avaaz discovered that over 40 percent of the coronavirus-related fake news it found on Facebook — posts already debunked by fact-checking organizations working with the tech giant — stayed on the platform even after Facebook was told they were false.
In total, Avaaz said these fake social media posts — everything from advice about bogus medical remedies for the virus to claims that minority groups were less susceptible to infection — had been shared, collectively, 1.7 million times on Facebook across six languages.
Fake news earthquake
“Facebook, given its scale, is the epicenter for misinformation,” Fadi Quran, Avaaz’s campaign director, told Politico, adding that the company’s efforts to combat the problem had steadily improved since the social network announced it would do all it could to stop the spread of such life-threatening falsehoods.
Facebook said Thursday that its existing steps, including pinning government public health warnings to the top of people’s news feeds, had led to 350 million people worldwide clicking through to authoritative sources in search of accurate information.
“Facebook should be proud of this step,” added Quran in reference to the company’s decision to retroactively notify people they had seen misinformation. “But the step doesn’t reflect the full gamut of what we would like to see them do.”
Social media ‘me toos’
YouTube, owned by Google, and TikTok have also said they will remove false claims about COVID-19 vaccines. Despite efforts by Facebook and other platforms to curb the spread of hoaxes and conspiracy theories, misinformation about the pandemic has spread widely on social media this year.
What’s your take on Facebook’s new policies? Speak your mind in the comments.