An ex-Facebook moderator said the pressure to go through a never-ending pile of disturbing material ultimately made her numb to child pornography and bestiality.

Sarah Katz, 27, worked as a content reviewer at Facebook's headquarters in Menlo Park, California, for eight months in 2016 through a third-party contractor, Vertisystem. Her job was simple: figure out whether posts reported to Facebook violated the company's specific community standards.

In practice, this involved eyeballing new and horrific material every 10 seconds and making a snap decision about whether it needed to be removed. Posts that needed reviewing were called "tickets," and there were about 8,000 every day.

To deal with this flood, Facebook had 4,500 moderators like Katz on its payroll last year, and in May 2017 it announced plans to hire another 3,000 to help it battle the darkest corners of its users' output. Facebook is also investing in artificial intelligence to help remove posts that break its rules.

In a transparency report in May, Facebook revealed the scale of its problem with prohibited content. It said that in the first three months of this year it "took action" on 21 million posts containing nudity and sexual activity and 3.4 million featuring graphic violence. Millions of posts containing hate speech, spam, and terrorist content were also removed.

Reviewers have to sign a waiver document about offensive material

Content reviewers begin their Facebook journey by signing a waiver document acknowledging that they are prepared to view disturbing material. It's also designed to protect Facebook from any potential legal action.

The one Katz signed says moderators will be exposed to material that "may be offensive to some people," including pornographic images. It adds that staff members should "promptly notify" Facebook if they "do not wish to continue."

"Facebook has billions of users, and people don't know how to use the platform correctly, so there was a lot of pornography, bestiality, graphic violence," Katz told Business Insider. "There was a lot of content that you might not expect to see shared on Facebook."

Katz worked in an open-plan office in Menlo Park, where free snacks flowed and there was reasonable camaraderie among her colleagues. Reviewers would set to work on their queue of posts, and when in full flow, Katz said, she would make decisions within seconds.

If ticket targets were not met, there would be consequences. Failing to hit a goal "once or twice" would result in a warning, Katz said; more than three times, "you would probably get let go." Katz said that she never witnessed this but that it was informally known among staff members.

"It's kind of a monotonous job after a while," she said. "You definitely grow desensitized to some of the graphic material because you see so much of it. A lot of the content tends to recirculate."

A disturbing image that kept resurfacing

Katz said there was a particularly sinister photo and video that popped up frequently in her queue.

She said it featured two children between 9 and 12 years old standing facing each other, wearing nothing below the waist, and touching each other. It was clear, Katz said, that there was someone behind the camera telling them what to do.

"It would go away and come back. It would appear at multiple times of the day," she said. "Each time the user location would be different — one day shared from Pakistan, another day the US. It's kind of hard to track down the initial source."

At the time, Katz said, she was not asked to report the accounts sharing the material — something she said "disturbed" her.

Inside Facebook's headquarters in Menlo Park, California.

"If the user's account was less than 30 days old, we would deactivate the account as a fake account," she said. "If the account was older than 30 days, we would simply remove the content and leave the account active."

Her experience raises questions about the effectiveness of Facebook's efforts to tackle child exploitation on its platform.

The company signed a deal with Microsoft in 2011 to use its PhotoDNA technology, designed to scan all images on Facebook and Instagram, flag known child porn, and prevent it from being reuploaded. Furthermore, Facebook moderators are trained to recognize and escalate such content internally when they see it.

Facebook told the New York Post in 2012 that it reported all instances of child exploitation to the US National Center for Missing & Exploited Children.

"We have zero tolerance for child pornography being uploaded onto Facebook and are extremely aggressive in preventing and removing child exploitative content," a company spokesman said at the time.

Katz said she was not aware that any of this was in place in 2016.

"Facebook might have a policy [now] where you're supposed to report it, but back then they didn't," she said.

Facebook declined to comment on the discrepancy between Katz's account and its stated policies.