At a House Energy and Commerce joint subcommittee hearing Thursday on the spread of disinformation online, lawmakers of both parties repeatedly expressed alarm that social media companies, including Facebook, Google's YouTube and Twitter, were profiting from keeping their users hooked to their platforms.
“The dirty truth is that they are relying on algorithms to purposely promote conspiratorial, divisive or extremist content so that they can take money, make money in ad dollars, and this is because the more outrageous and extremist the content, the more engagement and views these companies get from their viewers,” said Rep. Frank Pallone Jr., D-N.J., the chairman of the full committee.
In summary, Pallone said, more engagement and more views equaled more money for the platforms.
In 2020, Facebook’s profits rose 58 percent to $29.1 billion. Google’s parent company, Alphabet, saw its profit rise 17.5 percent to $40.3 billion last year. Twitter, smaller than the other two companies, reported a loss of $1.14 billion for 2020.
In any other context, a business trying to make more money by keeping its customers hooked to its product or service would be seen as the sine qua non of a capitalist enterprise. But in this case, the user-generated content flowing through social media platforms can range from a middle-school kid bullying a classmate or a sexual predator preying on young people to messages promoting violence against religious and ethnic minorities, or even planning an attack on Congress.
In response to numerous complaints from Congress, researchers and watchdog groups, social media companies have formed several in-house and independent committees to decide what content to keep, what to remove and how to deal with repeat offenders, all while keeping their platforms humming along with clicks, likes and forwarded messages.
Facebook, Google and Twitter also have pledged several hundred million dollars to news organizations and fact-checking bodies.
And yet Congress keeps holding hearings on what to do about online disinformation, and each new hearing appears to come after online discord has been ratcheted up another notch, leading to fresh violence or another rip in the country’s social fabric.
Lawmakers are still divided on how to amend Section 230 — the portion of U.S. law that gives social media companies protection from liability for content posted by their users.
House Energy and Commerce ranking member Cathy McMorris Rodgers, R-Wash., appeared to capture the waning enthusiasm among tech’s early adopters.
“Ten years ago when I joined Big Tech platforms, I thought they would be a force for good,” she said Thursday. “I thought they would help us build relationships and promote transparency in Congress. I can testify today I was wrong.”
Rodgers said her trust in social media platforms had eroded after she saw how “you’ve abused your power to manipulate and harm our children.” She said that, as a parent of three school-age children, she and her husband were “fighting the Big-Tech battles in our household every day” to keep their kids from succumbing to online addiction.
Despite actions by Facebook, Twitter and Google’s YouTube to remove misinformation about election results and the pandemic, Rep. Mike Doyle, D-Pa., said misinformation and disinformation continue to spread on the platforms.
Doyle, chairman of the Energy and Commerce Communications and Technology Subcommittee, said the Jan. 6 attack on the Capitol was “started and nourished” on Facebook. He said misinformation on COVID-19 vaccines continues to be spread by about a dozen “super-spreader” accounts on all three platforms.
“You can take this content down, you can fix this, but you choose not to,” Doyle said.
At the start of the pandemic last year, social media companies cracked down on incorrect information about the virus, which led to a drop in disinformation, Doyle said.
But since then, the companies appear to have stepped back and “time after time you are picking engagement and profit over the health and safety of your users, our nation, and democracy,” Doyle said.
When pressed by Doyle to say if Facebook would take down 12 accounts that are said to be spreading disinformation on the vaccines, Facebook CEO Mark Zuckerberg said he would have to “have my team look at the exact examples” to see how they violated the platform’s policies.
Sundar Pichai, CEO of Google and YouTube’s parent, Alphabet, said the company had clear policies and had removed hundreds of thousands of accounts spreading misinformation. But the company’s policies allow for people to post their “personal experiences” even if they seem like misinformation, Pichai said.
Twitter CEO Jack Dorsey said the company removes any account in violation of its policies.
Attorneys general from 12 states have written to the CEOs of the three companies asking them to take down anti-vaccine information from their sites, arguing that such misinformation is hurting vaccine uptake among the most vulnerable Americans.
While many lawmakers peppered the CEOs with questions demanding a yes or no answer, the executives often responded with a variation of “it’s complicated” or said the question required a more nuanced answer.
After seeing the CEOs avoid a yes or no answer several times, Rep. Bill Johnson, R-Ohio, was frustrated.
“There’s a lot of smugness among you,” Johnson told the CEOs. “There’s this air of untouchableness in your responses to many of the tough questions.”
Johnson then likened the social media companies to Big Tobacco, which, after years of contesting claims that it was promoting an addictive substance, agreed to a $245 billion civil litigation settlement in the late 1990s.
“While this is not your first hearing in front of Congress, I can assure you that this hearing marks a new relationship between all of us here today,” Johnson said. “There will be accountability.”