
I Was a Facebook Content Moderator. I Quit in Disgust.


A lot of people in Austin do this job. I knew that it was moderation. I knew that I would be looking at gross content. There's a lot of turnover at first. Most people are coming from retail or food service. It's a very entry-level position. You're not programming. You're just implementing these really specific rules about the platform.

At the beginning I was making $16.50 an hour, and at the end I was making $18 an hour, and that was better than I could have done at a lot of other places. Still, I think that we should have been paid more.

So the job is moderation. Every space, IRL or online, has some set of speech norms. You have this largely fragmented early internet where people can say whatever they want, because they can just find different spaces. But then Facebook comes into the picture, and all of a sudden the internet isn't anonymous anymore. Facebook is dictating the speech of all these different people. Suddenly these things that you wouldn't really care about, because you could just opt to go somewhere else, become a big deal. The policies they make are partly this STEM-guy quasi-liberalism stuff, and partly just reaction: they get a bunch of pushback, so they do this, they do that. Basically you get this lowest-common-denominator thing, which doesn't make anybody happy.

We had this introductory training period where we had to go through the different policies. It’s really hard to make a policy that actually captures what you want it to. You have to get into this really technical, specific, and often arbitrary detail. You spend a lot of time talking about nipples.

They were always changing up the guidance and never really gave us a specific quota. Things were tremendously mismanaged. When I first started working there and wasn’t totally burnt out, I’d be doing 500 to 700 content moderation decisions per day.

Edge cases, like the various usages of the n-word, which depend on context and whether the word ends in "er" or "a," can take a long time. I need to make this decision, and if this decision gets audited and I got it wrong, what am I going to say to justify it? You need to cite policy, and you need to connect those policies to how you applied them. (Those policies are based on internal documents that aren't accessible to the public.) There are weekly audits where they question us about certain moderation decisions.

In other coverage of this job, there's a lot of misery-porn stuff: the content moderator as the receptacle of all this bullshit, rather than someone with actual thoughts about the job beyond hating it. I wish it were content moderators talking about this. In terms of improving the work, there need to be more demands made about mental health. One thing that would be important to organize around would be giving content moderators more of an active role in crafting policy. There's never going to be an effective policy unless the people implementing it are shaping it in some way.

For most of the time, I was in a group called IG Proactive. It was stuff that was picked up by the AI or whatever. It was sort of everything on Instagram. Then I was on a team called Ground Truth, which is also a catch-all queue, and it's a mixture of stuff that gets picked up by the AI and stuff that is user-reported. That queue isn't actually live; your choices aren't directly affecting the website. It's a research queue. A bunch of people are supposed to get the same job, and they try to refine the policy based on how everyone classifies it, and possibly train the AI.

There are child porn groups on Facebook. They’re taken down, but the people posting seem to keep one step ahead of all this stuff. I did not specialize in that. There was one Instagram page that I saw that wasn’t straight-up porn, but it was for looking at young girls. It was geared around that. It was fucking disgusting. And also, What are they thinking? It’s so confusing. How do you think you’re not going to get caught?

I was super frustrated. I’d wanted to quit for a long time, but I felt trapped. I’d heard stories of people who’d quit and hadn’t been able to find anything for a while. I was afraid of being unemployed. I managed to save up a little bit of money. I interviewed at this coffee shop, and I got the job, and that’s what I’m doing now.

I sent in my resignation at night. The next morning … I wrote the letter about how Facebook's content moderation is failing and posted it. I wanted to prove that as a moderator, I had certain insights. I wanted to prove that the stuff I said in the letter would actually be effective: if you take into account what the people actually doing this job think, you're going to have a better time with all of this. Here are the problems as I see them, and here is what I think the solutions could be.

But also I just kind of wanted to be like, Fuck you.

I saw one thing where someone tweeted: “This seems like a bad job but if I really hated Facebook I would simply not be a tech weenie who works for Facebook.” I am not a tech weenie. I just know this policy. I’m not doing programming. I’m not trying to do programming. I am at the bottom of this industrial totem pole.

I just don’t think content moderation should be anybody’s sole job responsibility. Because it’s gross, and people are better than that.

I wouldn’t want to make a law against Facebook, but it’s a force for bad in the world. I’m no longer on Facebook.

