What Are These 'Facts' You Speak Of?

Big tech's ban of Alex Jones didn't tackle the whole issue at hand.

FOR 20 YEARS, Alex Jones, a radio show host and founder of the Infowars website, has been spreading one off-the-wall conspiracy theory after another, and, for the past decade, social media have amplified his voice and his reach to a level his predecessors on the “paranoid Right” could never have imagined. In early August, Facebook and Google-owned YouTube finally took measures to effectively ban Jones from their platforms. But the way they did it raises more questions than it answers about the possibility of restoring respect for truth to public life in the United States.

Way back in the dying days of the 20th century, Alex Jones started his career ranting about the old conspiracy standbys, such as fluoride in our drinking water. But then 9/11 happened, and Jones took his act to a whole new level, claiming that the attacks on the World Trade Center and the Pentagon were really “inside jobs” unleashed by the secret government to launch a global war and suspend civil liberties.

In days gone by, such a theory would have been passed around on mimeographed fliers, and mainstream journalism, shackled by considerations of fact, wouldn’t have touched it. But the social media era has freed us from all that. Now anybody can say anything, and everybody can hear it. Suddenly Alex Jones had an audience of millions for his Facebook pages, his YouTube channel, and his website; this success seemed to egg him on to ever more outrageous pronouncements. Finally, he hit rock bottom with the claim that the Sandy Hook school shooting was faked (to provide a pretext for seizing Americans’ guns) and all those grieving parents were only acting.

Now some Sandy Hook parents are suing Jones for defamation. But those parents can’t sue Facebook or YouTube, which disseminated the falsehoods to an audience of millions and made millions of advertising dollars in the process.

This is where the problem with the Alex Jones ban comes in. Facebook and YouTube didn’t ban Alex Jones for intentionally spreading defamatory falsehoods, or for shouting fire in a crowded theater. Instead, they banned him for hate speech, citing videos that contained anti-Muslim slurs and depicted cruelty to children.

Facebook and Google won’t go anywhere near anything that suggests they might be making decisions about the truth or falsehood of stories disseminated on their platforms, because doing so would upset their business model, which is built upon Section 230 of the 1996 Communications Decency Act. That portion of the law exempts all “interactive computer services” from any liability for the content users may post on their sites.

As long as social media simply provide an open forum, those harmed by a false story can seek redress only from the individual who posted it. The deep pockets of the real publishers, Facebook and Google, are off limits. If Facebook and Google accept responsibility for determining the truth or falsity of the material they spread, they become legally responsible when harmful falsehoods are published.

For more than 50 years, the standard governing U.S. journalism has held that the publisher must take reasonable measures to determine the truth or falsehood of a story. If the story is known (or should have been known) to be false and is still published, the publisher must pay. In the old days, this meant the owner of the newspaper, not the kid who threw the paper onto your front porch.

This rule has worked pretty well to serve the common good by maintaining a balance of freedom and responsibility. But now most Americans get at least some of their news via social media, and a substantial portion of the population is losing faith in the very existence of independently verifiable facts.

A new regulatory regime for social media is long overdue if we are ever going to be able to talk to each other clearly and truthfully about things that matter.

This appears in the November 2018 issue of Sojourners