Disinformation can hurt you locally, too

And it’s time the government did something about it

A U.S. flag is seen in Talent, Ore., on Sept. 16 amid charred cars and structures destroyed by wildfires. (Paula Bronstein/AFP via Getty Images file photo)
Posted May 10, 2021 at 10:30am

ANALYSIS — If you need another example of how disinformation can actually hurt people and turn an emergency into a crisis, go back to the wildfires that consumed California, Oregon and Washington last September.

The fires, historic in scale, burned some 5 million acres and thousands of homes and structures. Drought-like conditions throughout the Northwest allowed them to rage, and the resulting smoke cast a pall over the entire region. Officials and emergency responders had enough on their hands without having to convince people the threat was real.

But in Oregon, the situation was made worse by a stream of disinformation — rumors, really — on social media. As the fires spread, so did false claims that they had been deliberately set by antifa, a loosely organized movement of far-left activists that clashes with right-wing organizations and has been labeled by conservatives as the source of all violence and disruption.

First posted on niche social media platforms used by QAnon conspiracy theorists, the disinformation graduated to more mainstream networks such as Facebook and Twitter. The Russian government-sponsored television channel and website, RT, amplified the story and repeated falsehoods that antifa members had been arrested for arson.

Soon, police departments and 9-1-1 dispatchers were being bombarded by calls from Oregon homeowners who had heard that roving gangs of antifa members were setting fires. Some neighborhoods mobilized to guard their homes, setting up armed checkpoints.

The phone calls, rumors and checkpoints all interfered with fire, police and rescue personnel trying to compile accurate information about the numerous fires and effectively dispatch resources to fight them, save lives and property, and evacuate as many residents as possible.

The disinformation went so viral that agents from Oregon’s FBI office, the Oregon state police and local sheriffs jointly took to television and radio to tell residents to stop forwarding unverified information. The antifa stories were untrue, the authorities bluntly stated, and the only person arrested for arson was a homeless man with a history of drug use.

It took two to three days of media saturation by state and local officials to finally quell the antifa conspiracy. That is the power of social media to propel a set of lies.

Last month, Glenn S. Gerstell, a former general counsel for the National Security Agency, cited the Oregon saga in testimony before the House Armed Services subcommittee on cybersecurity, innovative technologies and information systems during a hearing on disinformation and what to do about it.

“Russia jumped on a couple of misleading and false statements that were set forth in some QAnon accounts and really weaponized them in a concerted, coherent way, amplified them and turned them into a detailed, rich story of falsehoods about who started the wildfires, claiming that antifa protesters were doing it,” Gerstell, now a senior adviser at the Center for Strategic and International Studies, told the House panel.

“It reached a point because of what Russia was doing that civilians actually set up roadblocks in Oregon in an effort to stop these perceived but erroneous protesters who, of course, weren’t there,” Gerstell continued. “It actually hurt people who were trying to flee the fire so much so that the Douglas County sheriff and the FBI pleaded with the public to stop circulating these falsehoods.”

Disinformation campaigns are sophisticated, but amazingly cheap. And all of the Pentagon’s bombs and bullets, fighter jets, Navy destroyers and nuclear weapons can’t do much about them.

So, what would work?

For a long time, experts have talked about a “whole of government” approach to the problem — a phrase reporters tend to dismiss when we hear it in Washington, which is often. But this time, they might be right. Gerstell and his fellow witnesses had numerous suggestions for lawmakers.

First, regulate social media platforms by changing Section 230 of the Communications Decency Act, or somehow convince them to regulate themselves with an aim, as Gerstell testified, “to limit the virality of falsehoods, to check them before they get spread too widely.”

Second, establish a bipartisan, agreed-upon campaign of civic education and digital literacy to teach people young and old the differences between falsehood and truth and how to recognize the differences online. Nina Jankowicz, the disinformation fellow at the Wilson Center, suggested such a campaign be based in our public library system, one of the most trusted institutions in America.

Next, the Pentagon must elevate the fight against disinformation, with high-level civilians and officers making it a key part of their strategy portfolios. After all, undermining confidence in our democracy and institutions is a strategy, not merely a tactic, used by our enemies to weaken, divide and conquer.

The Pentagon should also examine and strengthen its psychological operations — and perhaps marry them to its cybersecurity teams — and reconsider how spies, the National Security Agency and U.S. Cyber Command carry out their intelligence missions to protect the United States without interfering in free and unfettered domestic debate.

Finally — and this is the hardest Rubicon to cross — there must be a consideration of the extent to which the U.S. should fight fire with fire: Do we engage in falsehoods abroad to weaken our enemies, as they do to us? Agencies like the CIA have done so in the past, but most of the witnesses before Armed Services were reluctant to offer support for such activities.

One of them, Herbert Lin, the senior research scholar at the Center for International Security and Cooperation at Stanford University, put it this way: “Do we want to adopt the tactics of the Russians in this? I’m very uncomfortable about that as an American citizen.”

On the other hand, Lin continued, “it’s pretty clear that speaking the truth, just the truth, doesn’t work very well.”

“And we, the Americans, believe in speaking the truth — that the truth will eventually win,” he said. “Maybe eventually. But there’s good evidence that it doesn’t always win in the short term. And how far are we willing to go down that path? That’s a very tough policy question.” 

But that’s why we elect members of Congress and presidents: to answer the tough questions. With disinformation spreading like wildfire, it’s time they do.

Patrick B. Pexton is CQ Roll Call's technology policy editor.