Policy

Father of slain journalist seeks regulation of internet content

Activist says Google not doing enough to police violent footage available on YouTube

Andy Parker, right, seen with Sen. Mark Warner, D-Va., at a gun control rally in 2015, says YouTube has not done enough to remove videos related to his daughter's murder from its platform. (Tom Williams/CQ Roll Call file photo)

Following Saturday’s mass shooting at a mall in El Paso by a suspect who appears to have been steeped in a white supremacist internet subculture, activist Andy Parker on Tuesday accused Google executives of lying about their efforts to remove objectionable content, including footage of shootings, from its YouTube platform.

Parker also called for a new law revising the Communications Decency Act of 1996, which regulates online platforms, so that it would prohibit “targeted harassment, incitement, and murder videos” and open up technology companies to civil and criminal liability.


Parker, speaking at the National Press Club in Washington, is the father of slain Virginia journalist Alison Parker. She and a cameraman, photojournalist Adam Ward, were gunned down on live television in 2015 by an aggrieved former colleague.

According to Andy Parker, Google has failed to adequately police content posted on YouTube.

Since the killing, which was also recorded and uploaded to Facebook by the shooter, Parker has attempted to have footage of the attack removed from YouTube, but has been stymied by Google’s legal team and the protective umbrella of Section 230 of the Communications Decency Act. He says the law means tech companies have no incentive to respond to complaints.

“[YouTube Legal Director] Lance Kavanaugh swore up and down that Google’s algorithms had blocked this stuff, further proof of their deception and indifference,” Parker said. “It hasn’t been blocked.”


“I implored both Google and YouTube to take down the footage of her murder and the related conspiratorial content, and their response was to suggest that I view and flag the content I found offensive,” Parker said. “Instead of self-policing, they put the onus on me. In essence, they wanted me to watch my daughter’s murder and explain to a robot why it should be removed.”

In response to a request for comment, YouTube said it does not allow ads on what it termed sensitive content, and that content in violation of its community guidelines is removed. YouTube did not address Parker’s specific allegations.

The communications law draws a distinction between a platform and a publisher. Publishers, such as news organizations and book publishers, are responsible for their content and subject to libel and defamation lawsuits. But the law says online platforms are not responsible for third-party content shared by individual users of the platform. That provision, Section 230, is often described as an internet free-speech protection, but it has also been criticized as granting “power without responsibility,” in the words of Georgetown Law professor Rebecca Tushnet.

The law has come under fire for other reasons as well. In June, when a doctored video of House Speaker Nancy Pelosi, intended to falsely make her appear drunk, was posted to Facebook, the company refused to take the video down, though it did label it as a hoax.

The role of online platforms has also come under increased scrutiny after a white supremacist live-streamed an attack on two New Zealand mosques in March that left 51 people dead.

In the aftermath of that attack, YouTube tweeted that “we are working vigilantly to remove any violent footage.” Parker says the company failed to do so.

Parker says there is a financial incentive behind Google’s reluctance to remove videos of his daughter’s murder, as well as similar content hosted on YouTube.

“Every time you click on a video — and one of these videos had over 800,000 views on it — that makes money for Google,” he said. “Early on, they were running banner ads on that murder video.”

Restrictions on internet platforms were tightened slightly in March 2018, when Congress passed a law, known as FOSTA, to address child sex trafficking online by increasing the liability placed on websites. Parker says he would like to see new legislation, named for his daughter, that extends that regulation to digital harassment, incitement to violence, and murder videos.

“I want to put a face on it. I want to call this Alison’s Law,” Parker said. “For the Parkland kids, for the Sandy Hook families, for all of us who have been affected and those yet to be, I urge legislators from both sides of the aisle to adopt and pass Alison’s Law.”
