
The Internet's Toxic Waste Dump

Souls are being stained and scarred to keep your internet experience relatively clean.

BY NOW MOST of us know that when we use the internet, we are giving up privacy and exposing ourselves to potential surveillance and fraud. But we probably didn’t know until recently that in poor and distant lands, souls are being stained and scarred for the sake of our internet browsing experience.

There’s an old saying that nobody really wants to know the details of making sausage or passing legislation. Now there’s an update: You really don’t want to know how—in a world peopled by thousands of internet-capable sickos, murderers, perverts, and fanatics—your social media feeds remain so remarkably free of beheadings, snuff videos, and child porn.

If you ever thought about that, you perhaps assumed, as I did, that some miraculous algorithm was automatically filtering out all the bad stuff. Well, think again.

In the early days of the internet, porn sites occasionally popped up in the course of ordinary, innocent internet use. But search engine filters and various parental control programs seem, in my experience, to have made that a thing of the past. Today, every evil and dehumanizing image and act under the sun is still out there somewhere on the internet, and we are all much more connected to one another than ever before—yet, for the most part, you have to go looking for the dark side.

But keeping your internet experience relatively clean doesn’t happen by magic. In an article posted on Wired.com on Oct. 23, 2014, Adrian Chen reports that more than 100,000 human beings are employed in the business of internet “content moderation”—viewing and deleting offensive material that users have attempted to post to social networks. That’s roughly twice the total head count at Google and nearly 14 times that of Facebook.

This huge sector of the high-tech world is unknown and invisible to us in part because much of the work is done in the Philippines, where, for salaries of $300 to $500 per month, young college graduates spend their entire eight-hour shifts in cubicles screening social media posts and deleting the ones that violate a site’s policies. As Chen writes, they “soak up the worst of humanity in order to protect the rest of us.”

The Philippines was chosen as the place to filter our filth for the same reason that other countries end up with our physical toxic waste—because many of their people are poor and desperate for employment. The Filipinos are also favored because they are familiar with the English language and U.S. culture owing to a “special relationship” that goes back to our conquest of the islands in 1898.

The moderators are employed by companies that specialize in content moderation and contract with the internet giants to do their dirty work. Chen visited a site outside Manila where workers for a company called TaskUs were doing real-time screening for Whisper, a fast-growing anonymous mobile messaging service. A whiteboard at the front of the room listed the items the moderators were seeking: “pornography, gore, minors, sexual solicitation, sexual body parts/images, racism.”

Before Chen’s piece on Wired.com, there had been little reporting on the content-moderation industry. But now we know what happens, and we can’t un-know it any more than the people who do the moderating can ever completely forget the horrible things they see. The company Chen visited employs a psychologist to screen and counsel the workers. She says that the effect on moderators of their constant exposure to degrading images is “like PTSD. There is a memory trace in their minds.” Chen interviewed a moderator who reports still being haunted by the one minute she watched of a half-hour video of sexual torture. “I watched that a long time ago,” she said, “but it’s like I just watched it yesterday.”

This appears in the January 2015 issue of Sojourners