Online horrors: Social media’s mental toll

Social media content reviewers work to purge the internet of disturbing posts that can depict animal abuse, murder and child pornography.

Shawn Speagle, a former Facebook content moderator, says he is haunted by the work.

“I have seen a video of a babysitter choking a toddler to death,” Speagle said.

His story is one of many in a new report from The Verge, focusing on Facebook moderators employed by Cognizant in Tampa, Florida.

According to the report, the moderators make less than $30,000 a year and get only a nine-minute wellness break to see an on-site counselor. As a result, many moderators say they are experiencing post-traumatic stress disorder.

The hope is that human reviewers can eventually be replaced by artificial intelligence, but the technology is simply not ready.

“Facebook users get mad when you take down their post for the wrong reason or leave up something horrible,” said Casey Newton with The Verge.

A Facebook spokesperson responded to the report, saying, in part: “We go to great lengths to support the people that do this important work, and take any reports that we might not be doing enough incredibly seriously.”

Facebook announced changes for moderators in May, including additional benefits and higher wages for contractors, but some say it’s not enough.

“All of us have to have some empathy and that includes the companies,” said Dr. Alan Hack with the New York State Psychological Association.

The tension between Facebook and its content moderators rose Thursday after a group of contractors in Austin, Texas, posted an internal complaint, according to The Washington Post. The letter demands better pay and changes to the nondisclosure agreements workers are required to sign to secure the jobs.