Facebook bans Trump for inciting violence as social media criticisms mount

Lawmakers frustrated that it took a mob storming the Capitol before social media companies took action against Trump's posts

In this photo illustration, President Donald Trump Twitter messages are displayed on a smartphone screen in front of the U.S. flag and Capitol building during the Congress session to certify the 2020 Presidential election results.  (SOPA Images/LightRocket via Getty)
Posted January 7, 2021 at 5:26pm

Facebook banned President Donald Trump from its site indefinitely on Thursday following pressure from critics who said the social media giant’s earlier refusal to remove Trump from the platform helped lead to the deadly riot by his supporters at the Capitol on Wednesday. 

Mark Zuckerberg, the company’s chief executive officer, accused the president of using the site on Wednesday “to incite violent insurrection against a democratically elected government.” Trump’s ban would last “at least” until he is no longer in office, Zuckerberg said.

Too little, too late, said Democratic lawmakers and online extremism experts who previously warned that social media’s refusal to silence Trump online could result in real-life violence.

“While I’m pleased to see social media platforms like Facebook, Twitter and YouTube take long-belated steps to address the president’s sustained misuse of their platforms to sow discord and violence, these isolated actions are both too late and not nearly enough,” Sen. Mark Warner, D-Va., the top Democrat on the Intelligence Committee, said in a statement.

House Homeland Security Chairman Bennie Thompson, D-Miss., said he was “deeply frustrated that it took a group of domestic terrorists storming the Capitol” for Facebook to finally ban Trump “at least for the next 13 days.”

“I also can’t help but wonder if the decision was an opportunistic one, motivated by the news of a Democratically controlled Congress,” Thompson said in a statement. “[Facebook and Twitter] should announce a permanent ban of his accounts. Nothing short of that will meet this moment.”


Both companies took incremental steps to limit the reach of Trump’s posts on Wednesday, first labeling them for misinformation, then removing posts and eventually announcing temporary, overnight suspensions of his accounts. All the while, violence raged at the Capitol as a mob of his supporters clashed with police, ransacked offices and forced evacuations.

“Clear, publicly discussed red-lines would have allowed both platforms to act in minutes, not hours,” tweeted Alex Stamos, a former chief security officer at Facebook who now runs the Stanford University Internet Observatory.

A spokesperson for Twitter declined to say whether the company would follow Facebook in banning Trump completely, but said the company is continuing to monitor the situation. On Wednesday, Twitter said it would lock his account for 12 hours because of several tweets violating company policies against election misinformation and violent content.

Facebook’s ban on Trump also extended to Instagram, its subsidiary. YouTube, owned by Google, announced policies that would make it easier to remove content containing false information about the 2020 election. Trump was banned from Snapchat outright.

The action by social media companies on Thursday stood in stark contrast to a largely hands-off approach to Trump that lasted, with few exceptions, until last November’s election, when Facebook and Twitter began adding misinformation labels to his posts. Still, the reaction Wednesday from experts who have long called for action was one of exasperation.

The events at the Capitol on Wednesday were “surreal” and “horrifying,” tweeted Kate Starbird, a professor at the University of Washington who tracks the spread of disinformation online. “And yet, for researchers studying these online networks and the participatory dynamics of disinformation, there’s this strange comfort. Everyone else can finally see what we’ve been seeing for years,” she said.

New conspiracies already spreading

Despite the ban on Trump’s accounts, it took only hours for new conspiracy theories about the riot at the Capitol to begin spreading online, specifically about who was rioting.

A story by The Washington Times that falsely claimed a facial recognition company had identified members of Antifa, a loosely organized group of far-left activists who often clash with far-right and white nationalist groups at protests, was shared and liked thousands of times on Facebook before it was removed from the newspaper’s website, BuzzFeed News reported.

“This article is disinformation and, unsurprisingly, is rewarded with hidden virality on Facebook,” Joan Donovan, the research director of Harvard University’s Shorenstein Center on Media, Politics and Public Policy, said on Twitter. “No, a facial recognition firm did not uncover Antifa inside the Capitol. But, if you were wondering, yes, MANY people do believe this!”

The quick spread of disinformation about the riot underscored the challenges still facing social media companies even without Trump active on their platforms: a deeply entrenched network of conspiracy-friendly groups and right-wing accounts used to spread false information and organize real-world events, including Wednesday’s riot.

“Disinformation and extremism researchers have for years pointed to broad network-based exploitation of these platforms,” Warner said, adding that the platforms “have served as core organizing infrastructure for violent, far-right groups and militia movements for several years now — helping them to recruit, organize, coordinate and, in many cases, generate profits.”

How to fight disinformation

While it remains to be seen whether Facebook and Twitter will take additional steps to combat disinformation and violent content on their own, Democrats are all but certain to exert increased pressure on the companies once President-elect Joe Biden takes office and control of the Senate shifts to Democratic leader Charles E. Schumer of New York. Schumer will become majority leader after the two new Georgia senators, Raphael Warnock and Jon Ossoff, take office.

During last year’s campaign, Biden was critical of how social media companies handled disinformation, specifically Trump’s posts, and said they should lose longstanding legal immunity provided to them by a 1996 law that protects online companies from lawsuits related to third-party content.

That law, known as Section 230, emerged as a popular target for Democrats looking for a new avenue to rein in the power of Big Tech companies last year, but no legislation passed despite the introduction of multiple bipartisan bills that would have changed the terms of the companies’ immunity. 

But some on the left, including free speech advocates, oppose changes to Section 230 on the grounds that they could result in increased government censorship and human rights violations. Evan Greer, deputy director of the digital rights group Fight for the Future, said changing the law to punish social media companies would be “utterly counterproductive.”

“Creating carve-outs in Section 230 for certain types of political speech would be devastating for anti-racist and anti-fascist social movements, movements whose voices and organizing are clearly needed now more than ever,” she said. “But this doesn’t mean we let Big Tech off the hook for their role in eroding democracy and emboldening white supremacists.”

Instead of changing Section 230, Greer said, Congress should finalize a bipartisan agreement on federal data privacy legislation, “which would make it much harder for companies like Facebook to harvest people’s data and use it to micro-target misinformation and hate into the feeds of the people who are most susceptible to it.”

Last month, House Democrats led by Rep. Jamie Raskin of Maryland proposed a multi-agency Digital Democracy Task Force to help the incoming Biden administration respond to disinformation-laden events. They also said Biden should dedicate resources to help the departments of Justice and Homeland Security de-radicalize online communities.

“Disinformation and misinformation will continue to evolve and our job as policymakers is to keep up,” the lawmakers said, “while also looking ahead at building a more resilient society.”