
Lawmakers, tech companies struggle to curb coronavirus disinformation

Among the wackier bits of disinformation is the claim that new 5G technology is spreading COVID-19

While public health and government officials scramble to contain the spread of COVID-19 around the United States, some lawmakers are seeking to address another scourge: online disinformation about the virus that could pose health risks and sow division among Americans.

False information about the virus has spread quickly across the internet in recent months, taking the form of misinformation — inaccurate information spread unwittingly by social media users, such as claims about supposed vaccines for the virus — and more malicious disinformation, which can be spread by “bots” deployed by Russia, China or other adversaries.

Technology companies Google, Facebook and Twitter have each taken steps to curb the spread of misinformation and disinformation on their platforms, but it’s nearly impossible to stop it completely. Recently, lawmakers on Capitol Hill who have already called on Silicon Valley to do more are beginning to shape legislative action, although it’s unclear what could become law.

“The spread of false and potentially dangerous claims during a lethal pandemic clearly poses a threat to our national security,” said Rep. Lauren Underwood, D-Ill., at a virtual forum on disinformation hosted by the House Homeland Security Committee last month. “When it comes to vital public health information, the stakes are life and death.”

Underwood said she planned to introduce legislation in the coming weeks to “address the impact and the threat of disinformation to public health and safety.”

The legislation being crafted by Underwood would focus on educating state and local governments about the threat of false information related to the virus and fostering research on how it spreads, according to a committee aide. The legislation would also focus on enabling the Homeland Security Department to take the lead on combating virus-related misinformation.


Other Democrats also want more information about how COVID-19 disinformation is spreading. A provision in House Democrats’ latest coronavirus relief legislation, which passed the chamber last month, would provide $1 million for the National Science Foundation and the National Academies of Sciences, Engineering, and Medicine to conduct a disinformation study.

“We are tasking the brightest scientific minds to study the very real threat this growing trend poses to public health,” said Rep. Jennifer Wexton, D-Va., who authored the provision. “Independent, objective scientific analysis will better inform policymakers so that we can tackle this challenge in a bipartisan way.”

Still, the legislation is unlikely to pass the Senate, with Senate Majority Leader Mitch McConnell dismissing the overall bill as a “totally unserious” proposal by Democrats.

Federal agencies are also stepping up to help educate the public about false virus-related information online. A brief published by the Homeland Security Department’s Cybersecurity and Infrastructure Security Agency last month outlines the types of disinformation social media users are encountering, including false narratives about China’s response, harmful supposed treatments, and conspiracy theories alleging that 5G wireless technology is spreading the virus.

“Information manipulation and fabrication about COVID-19’s origin, scale, government response, and/or prevention and treatment surged as creators leveraged people’s increased uncertainty,” CISA said.

Different platforms, different problems

Disinformation has posed a variety of problems for social media companies, each of which operates differently and faces unique challenges. For example, Twitter faces an onslaught of bots that experts say are designed to divide Americans based on whether they believe the economy is ready for “reopening” after states issued widespread stay-at-home orders.

Since January, researchers at Carnegie Mellon’s School of Computer Science have collected more than 200 million virus-related tweets and found that the most influential re-tweeters are overwhelmingly more likely to be bots than real people. More than 80 percent of the top 50 influential re-tweeters are bots, the university said, along with 62 percent of the top 1,000.

Kathleen Carley, a professor who runs the university’s Center for Computational Analysis of Social and Organizational Systems, says the pandemic-related bot activity is up to two times more prevalent than what was measured in previous global events, including elections.

“What we’ve seen is bots both pushing for reopening and against reopening,” Carley told CQ Roll Call. “They are operating on both sides, and there are a lot of them.”

Carley says a huge number of bots are operating as a “propaganda arm” for websites that feature fake news, while others are being used to amplify polarizing opinions. A portion of the bots also appear to be following the Russian and Chinese disinformation playbooks.

“Some of the bots are doing things consistent with common strategies that Russia has used in the past, such as infiltrating online groups and using them to polarize,” Carley said. “They’re also doing things consistent with what we’ve seen Chinese accounts do in the past, such as promote stories that are very strongly pro-China and dismiss stories that are anti-China.”

Twitter says it’s working to limit the reach of disinformation posted on its platform. The company recently announced a new set of labels that would be placed on tweets containing inaccurate information about the virus and says it has removed thousands of misleading tweets.

Facebook and Google are facing challenges of their own. On Facebook, a 26-minute conspiracy theory video called “Plandemic,” which claims a group of global elites is spreading the virus to profit off an eventual vaccine, drew roughly 2.5 million user interactions. A version of the video posted on YouTube, which is owned by Google, was viewed more than 7 million times.

Both companies have worked to remove the video from their sites and taken additional steps to flag inaccurate information posted on their platforms. In April, Facebook placed warning labels on roughly 50 million posts related to the pandemic, the company said last month, and has directed 2 billion people to information provided by the World Health Organization.

Google has also been scrutinized for selling advertisements to websites that peddle conspiracy theories and other inaccurate information about the virus. A recent study by the Global Disinformation Index found that Google placed advertisements on more than 80 percent of a sampling of 49 websites publishing false claims about the virus, according to Bloomberg.

But the company said none of the websites in the study violated its advertising policies.

“We are deeply committed to elevating quality content across Google products and that includes protecting our users from medical misinformation,” a spokesperson, Christina Muldoon, told Bloomberg. “Anytime we find publishers that violate our policies, we take immediate action.”

The pressure is on

Despite their efforts, the social media giants have yet to convince lawmakers that they’re doing everything possible to stop the spread of disinformation. Wexton and others, including House Speaker Nancy Pelosi, say they should take more significant action.

“I think the social media companies definitely need to take an aggressive stand against disinformation,” Wexton said. “They have these policies but there are obviously loopholes and gaps. So they need to be eternally vigilant and police themselves better because they are breeding grounds for the spread of disinformation.”

Speaking to reporters after President Donald Trump issued an executive order in response to allegations of anti-conservative bias by social media companies last month, Pelosi blasted Silicon Valley for what she called a “complete failure to fight the spread of disinformation.”

“Again and again, social media platforms have sold out the public interest to pad their corporate profits,” Pelosi said. “Their business model is to make money at the expense of the truth.”
