- In 2017, Facebook announced it would
expand the number of people it had
working on safety and security to 30,000,
15,000 of whom would
be content moderators.
Content moderation is
a really difficult job.
You have to take Facebook's policies,
which can change every day,
and then apply them to decide
what stays up on Facebook and Instagram
and what comes down.
A couple of months ago,
I was contacted by some moderators
who worked for Facebook in Tampa, Florida
through a company called Cognizant,
and they told me that they
wanted to go on the record.
Now, this is a really big deal.
Everyone who works for Cognizant
signs a lengthy nondisclosure agreement
in which they pledge not to talk about
the work that they're doing.
But the moderators who I met in Florida
told me that they wanted to tell
their story to the world
so that we all could understand
what it's like to do this job day to day
and about the long-term
mental health consequences
of policing the biggest
social network in the world.
(somber music)
- [GPS] Take the next
left onto West Avenue.
- [Shawn] I was ecstatic.
I thought I was gonna
climb the corporate ladder.
I thought, you know, this is Facebook,
I'm gonna be doing some really good stuff.
- Um, I thought I was gonna be
reading Facebook posts all day,
so, you know, everybody posts
their business on Facebook,
so I was gonna read everybody's stuff
and then be able to, you
know, decide if it has to
stay up or come down, so, I
thought it would be a fun job.
(somber music)
- [Casey] What did they tell
you the job was gonna be like?
- [Shawn] Basically they
told me that I would
be going onto like high-profile
social media accounts,
such as like Disney World
or like Animal Orlando,
and I'd be doing kind of
like some data searching,
like seeing what types of
posts most people react to
and comment on and like.
- At what point did it become clear to you
that you weren't going to be
helping businesses on Facebook?
- Uh, probably the second
or third day of training,
and basically we had an
outline of what we were doing
and none of it was business,
it was graphic violence, hate speech,
sexual solicitation, sexual exploitation,
that kind of stuff.
- So they told you we're gonna put you
in a queue of content that is dedicated
to graphic violence, and hate speech?
- Yes.
You would get the occasional random thing,
but for the most part, it was always
graphic violence and hate speech,
because that's all that
was coming in for us.
- Did anybody ask you
about your mental health
before they assigned you to that queue?
- Nobody ever asked about my
mental health there, yeah.
Nobody said anything about mental health.
(somber music)
- [Casey] What were like
some of the kinds of things
that you would see that
would be really hard for you?
- [Michelle] Oh, where do I start?
Um, animals, mostly animals,
the abuse of animals.
I've seen a puppy
being hanged with a rope,
and I've seen a pit of pigs
that they threw fire into,
and you could hear the pigs screaming.
I don't wanna get emotional
talking about the animals.
- There was one where
there was a baby that was,
they were twin babies.
- [Michelle] Twin babies.
- From like Saudi Arabia, and the mother
was dropping the baby on the ground.
This is one we saw over and over again,
and then choked the baby,
and you hear the baby gurgling,
and trying to breathe,
and for days, it infected my mind.
I had to know what happened to this baby
because I'd seen it over
and over and over again,
and luckily the baby was okay.
(sighing)
(sobbing)
- Sorry, um.
- [Casey] It's okay.
(sighing)
- I just think about all
the animals all the time,
and that's what I'm still thinking about,
even though I left.
- [Casey] Yeah.
Do you remember the
first video that you saw?
- It was a video in a different language,
and it was these two teenagers,
and they came across an
iguana on the street,
and one of the kids grabbed
the iguana by the tail
and they started to smash
the iguana onto the ground,
and you could just hear
the iguana screaming.
And that was one of the first
videos I saw on that queue.
- [Casey] Yeah.
- And they just, they kept
slamming the iguana onto the ground,
and the iguana just kept
screaming and screaming,
and then the screaming stopped.
It was just a bloody pile,
and the kids were just
laughing at the iguana.
- Were you able to remove
that video from Facebook?
- No, since that video had
no title and no caption,
we were supposed to send it
to a different queue for Spanish speakers.
But I don't think there really
was a Spanish-speaking queue
that was taking care of that.
- Killing an animal on screen,
uploading that to Facebook,
at least when you worked
there, that was okay?
- That was okay.
(sighing)
I just think about that.
And we're not helping the
animals either, we're not,
not even humans, we're
not even helping humans.
I have seen videos of a babysitter
choking a toddler to death
and giving bloody noses to babies,
and it stays, and nobody does anything,
and it's just there, it's always there.
You have to always look at it.
You always see death, every single day.
You see pain and suffering.
And it just makes you angry,
because they're not doing anything.
The stuff that does get deleted,
it winds up back there anyway.
(somber music)
- [Michelle] Maybe a
psychological test would help,
because, you know, some people
can handle certain things
better than others, and maybe they won't be
affected by PTSD or any anxiety
or whatever the case
may be that could come
from seeing this stuff
over and over again.
- How did you get through
it during the day?
- I ate.
When you look at bad stuff all day,
sometimes you just wanna
eat something sweet
to make you feel better.
- How else did you change
while you were doing this work?
- I was very snappy with everybody.
I had night terrors,
like almost every night.
I was only getting like
an hour or two of sleep
because I was just so,
I was just always thinking
about the content,
the videos, the pictures,
the people and the animals
that were basically,
you know, their whole
deaths were broadcast.
Like the most cruel things imaginable.
Just, it's there, and
it's allowed to be there.
- Did you go talk to somebody
about your night terrors?
- Yes, I did.
I went to a mental health
facility in Clearwater,
and they diagnosed me with PTSD.
They gave me some medication
for night terrors.
They also prescribed me some Xanax.
It really has helped a
lot with my sleeping.
I'm able to sleep again.
Just knowing that there's this
kind of stuff still going on
just scares the heck out of me.
It's terrifying to know
that that stuff is real.
(somber music)
- [Casey] So Facebook has told me
they don't have quotas for how many jobs
moderators are supposed to do.
How did you feel when you were there?
- There's a quota.
- Yeah, what is it?
- Well, it started where
it used to be higher,
like 354, was it?
And then, right about
the time we left, it was,
they're like, well, we want you
to at least do 250 a day.
That's a quota to me, if you're telling me
to do that many jobs.
- What is the score?
- Oh, the score.
You're supposed to be
getting a 96 to 98 percent,
but nobody in our training class
actually got anywhere close to that.
Everybody was in the
eighties, including myself.
- So the basic idea of the score
is that the 15,000 moderators
Facebook has around the world
should be executing
their policies perfectly
and they should be taking down
everything that should be taken down
and leaving up everything
that should stay up,
and they should do that with
a 98 percent degree of accuracy?
- That is correct.
- But it's not happening in practice,
because as you're just saying,
the policies are changing how often?
(laughing)
- Daily.
- How much pressure is there on moderators
to keep that score high?
- That's their main focus.
- That's probably
the main focus there, and every day, it's,
you've got bad quality,
you've gotta send it back,
you've gotta do, like, all the time,
you've gotta do disputes on this,
oh, why is your quality so low?
Every day, every day.
- Every day there, they're
sort of hammering that home,
that you need to be perfect.
(somber music)
Walk us through, like, your
average day doing this job.
- You sat at your desk,
you put on your headphones,
and you worked all day.
No one came to comfort you.
If you were upset, no one came
to talk to you throughout the day.
If you turned around to talk to a friend,
you were being screamed at
for not looking at your
content and doing your work.
They say all the time, okay, we have
these counselors here to help you,
but we've got nine minutes
of wellness every day.
So I'm supposed to go
talk to this counselor
about the 500 videos I've
looked at today in nine minutes
and I'm supposed to be okay?
It doesn't make any sense.
- It's a toxic environment.
The higher ups don't really care.
They're very nonchalant about
the problems that are there,
such as workers having
sex in the building.
People are drinking alcohol
and smoking weed in the parking lot.
Just a lot of sexual harassment going on
from the higher ups to
the content moderators.
And there was a problem with the bathrooms.
Some employees thought it was funny
to smear feces all over the stalls
and urinate on the floor.
- And how many bathrooms are there
for the 800 employees that work there?
- One bathroom.
There is one bathroom
in the entire building.
- Why didn't you quit while
you were doing this job?
- Well, as I said before,
the market was tough.
It's tough down here, and you know,
I had such difficulties finding a job
and I was scared to find another job,
and it was also just
kind of something weird,
that the managers would
always tell the employees
like, oh, if you quit,
you're gonna have to go
back to call centers,
because I guess
that was the only thing
that a lot of these people
did, was call centers.
- So they're reminding you,
this is the best you can do around here.
- Yes.
(laughter)
- For 15 dollars an hour?
(laughter)
- I was actually really excited for that,
because in college, all
my professors were like
you know, you wanna get that good
30,000-dollar entry-level starting job.
- Absolutely disgusting.
The desks were always disgusting,
pubic hairs on the desks.
Boogers on the desks.
They never did a fire
drill because they said
Facebook wouldn't let us off
the content to do a fire drill
and one time, Facebook
was coming to visit,
and the day before, they had every manager
painting and cleaning the building
so it would look presentable for Facebook,
and that just proves that Cognizant knew
it was not acceptable for the building
to be in this condition.
It's like a sweatshop in
America, it really is.
All they care about is getting
that content moved through.
(somber music)
As long as you were sitting at your desk
not talking to anybody else
and doing your content,
they were happy.
It didn't matter.
Nobody there matters.
- What do you think people
should know about this job
that they don't already?
- When I actually got into all of this,
and they explained what
I was really doing,
they made it feel like you were going
to make a difference on social media,
and there were going to
be people and animals
that you could help bring justice to.
You're not doing that at all.
All that you're doing is
covering up Facebook's mistakes.
- Hey, thanks for watching,
and if you want to know more
about our ever-changing social networks
and their effects on the world,
I invite you to subscribe
to my daily newsletter
The Interface; you can find it
at theverge.com/interface.
And of course, if you want more
great videos from The Verge,
subscribe to this channel.