The European Union wants social media platforms to submit monthly reports on how they're handling misinformation around the coronavirus pandemic.
The new guidelines were unveiled by the European Commission on Wednesday and apply to companies including Facebook, Google and Twitter that have signed up to an existing EU code of conduct on disinformation.
World health leaders have warned for months about an “infodemic” surrounding the virus, covering everything from its origins to fake cures and how countries are responding to it.
“We have witnessed a wave of false and misleading information, hoaxes and conspiracy theories as well as targeted influence operations by foreign actors,” Commission Vice President Josep Borrell said at a press conference. “Some of these are aimed at harming the EU and its member states, trying to undermine our democracies, the credibility of the EU and of national authorities. What is more, this information in times of the coronavirus can kill.”
The European Commission wants platforms that have signed up to the 2018 code of conduct to provide monthly reports with detailed data “on their actions to promote authoritative content, improve users’ awareness, and limit coronavirus disinformation and advertising related to it.” It also wants social media companies to be more “transparent about implementation of their policies to inform users that interact with disinformation.”
The code of conduct and monthly reports are due to be incorporated into a broader set of binding regulations before the end of 2020.
Věra Jourová, the Commission’s vice president for values and transparency, told reporters that TikTok, owned by China’s ByteDance, had signed onto the code of conduct, and that talks were also being held with WhatsApp, which, although owned by Facebook, has not yet come on board.
The Commission also called out Russia and China for their role in seeding and spreading misinformation about the virus around the world. Officials in Europe and the United States have recently accused both countries of trying to muddy the waters around the coronavirus narrative through a combination of official state outlets and unofficial social media campaigns, with the aim of bolstering their own image and undermining the standing of their adversaries.
All major online platforms have struggled to contain the spread of misinformation around the virus, and all have since instituted measures ranging from banning accounts accused of peddling the worst content to adding more fact-checking tags. Most now try to surface legitimate sources of information whenever a user searches for or posts something related to the virus.
But success has been patchy. A slickly produced conspiracy video called “Plandemic,” which claimed the virus was planned by major world figures and health organizations, garnered millions of views before being taken down.
Google’s YouTube was recently excoriated by senior British lawmaker Yvette Cooper at a parliamentary hearing. She said the video site’s algorithm started recommending anti-vaccination videos after she searched for “David Icke,” a conspiracy theorist whose account had already been deleted because of misleading claims about coronavirus.
In a statement, Google’s president for Europe, the Middle East and Africa, Matt Brittin, said the pandemic has proven that people need accurate information “more than ever” and that the company is working with European and national authorities.
“We’re committed to the Code of Practice and to our work together to find new and creative ways to continue the fight against disinformation,” Brittin said.
A Facebook spokesperson said the company shares the goal of “reducing misinformation about COVID-19” and that it has taken extra steps to curb the spread, including adding warning labels to misinformation and expanding relationships with independent fact-checkers.
Sinead McSweeney, Twitter’s vice president of public policy, said in a statement that the company also supports the efforts “to bring increased transparency and understanding to the fight against disinformation,” noting that Twitter’s application programming interface has been kept open so researchers can better understand how information spreads on the platform.