As pressure mounts to do more to fight misinformation about the novel coronavirus, Facebook and its photo service Instagram have been trying to direct social media users to trustworthy content from health organizations. It’s been unclear, though, how well those efforts have been working, especially as hoaxes and conspiracy theories continue to spread about COVID-19, the respiratory illness caused by the virus.
On Wednesday, Facebook said that together, its self-named service and Instagram have directed more than 1 billion people to resources from organizations including the World Health Organization and the Centers for Disease Control and Prevention. More than 100 million people clicked through pop-ups on the social networks to learn more about these resources, Facebook said.
The new data gives a sense of whether users are actually reading information from these trustworthy sources. And it sheds some light on a strategy that's meant to help the social network battle falsehoods such as the notions that 5G networks spread the coronavirus or that drinking bleach can cure COVID-19. Roughly 2.6 billion people use at least one of Facebook's services daily, including Instagram and messaging service WhatsApp.
Nick Clegg, Facebook’s vice president of global affairs and communications, outlined the social media giant’s efforts in a blog post Wednesday. The company’s strategy has included launching new features, providing donations and taking down coronavirus information that could cause physical harm.
In January, in several countries heavily affected by the spread of the virus, the social media powerhouse started showing Facebook and Instagram users pop-ups in their feeds directing them to information from the WHO, the CDC and regional health authorities. Users also see this information when they search for COVID-19. Last week, Facebook also started showing users in the US, across Europe and in several other countries a new coronavirus information center, an online hub that displays data, news articles, tips about social distancing and other content.
Despite those efforts, misinformation appears to be slipping through the cracks. Earlier this month, The New York Times reported that it found dozens of videos, photos and posts on Facebook, Twitter and Google that included coronavirus misinformation. BuzzFeed also reported that while Facebook is cracking down on coronavirus misinformation in English, it’s struggling to police similar content in other languages.
Misinformation has even surfaced on private messaging apps, including Facebook-owned WhatsApp, an encrypted service that's popular in developing countries. WhatsApp also launched a coronavirus information hub and donated $1 million to help fact-checkers debunk hoaxes, falsehoods and the like.
About 57% of Americans who rely mostly on social media for news say they’ve encountered some or a lot of news and information about COVID-19 that seemed made up, according to a Pew Research Center survey published Wednesday.
This week, Facebook’s Messenger service said it was launching a new program to connect developers with health organizations for free. And soon Messenger plans to start testing limits on the number of chats people can forward at one time using its service.