A look into Facebook's content policy


Commissar MercZ

Notable Loser

300,005 XP

29th January 2005

0 Uploads

27,113 Posts

0 Threads

#1 6 years ago

A few days ago a former employee of oDesk, one of the companies Facebook contracts to help keep the site 'clean' and legal, discussed some of the odder ends of its policy with Gawker. I don't have much to say here, but it might be interesting to those of you interested in 'free speech' and the spread of social networking.

Inside Facebook’s Outsourced Anti-Porn and Gore Brigade, Where ‘Camel Toes’ are More Offensive Than ‘Crushed Heads’

Amine Derkaoui, a 21-year-old Moroccan man, is pissed at Facebook. Last year he spent a few weeks training to screen illicit Facebook content through an outsourcing firm, for which he was paid a measly $1 an hour. He's still fuming over it.

"It's humiliating. They are just exploiting the third world," Derkaoui complained in a thick French accent over Skype just a few weeks after Facebook filed their record $100 billion IPO. As a sort of payback, Derkaoui gave us some internal documents, which shed light on exactly how Facebook censors the dark content it doesn't want you to see, and the people whose job it is to make sure you don't.

Facebook has turned the stuff its millions of users post into gold. But perhaps just as important as the vacation albums and shared articles is the content it keeps out of users' timelines: porn, gore, racism, cyberbullying, and so on. Facebook has fashioned itself as the clean, well-lit alternative to the scary open Internet for both users and advertisers, thanks to the work of a small army of human content moderators like Derkaoui.

"We work to foster an environment where everyone can openly discuss issues and express their views, while respecting the rights of others," reads Facebook's community standards.

But walking the line between keeping Facebook clean and excessively censoring its content is tricky, and Facebook's zealousness in scrubbing users' content has led to a series of uproars. Last April, they deleted an innocent gay kiss and were accused of homophobia; a few months before that, the removal of a nude drawing sparked the art world's ire. Most recently, angry "lactivists" have been staging protests over Facebook's deletion of breast-feeding photos.

These censorship scandals haven't been helped by Facebook's opacity regarding its content moderation process. Whenever Facebook deletes an image it deems objectionable, it refers the offending user to its rambling Statement of Rights and Responsibilities. That policy is vague when it comes to content moderation, and probably intentionally so. If users knew exactly what criteria were being used to judge their content, they could hold Facebook to them. It would be clear what Facebook was choosing to censor according to its policies, and what amounted to arbitrary censorship.

Well, now we know Facebook's exact standards. Derkaoui provided us with a copy of the astonishingly specific guidelines Facebook dictates to content moderators. It's the public's first look at exactly what Facebook considers beyond the pale, and what sketchy content it won't allow in videos, images and wall posts. The document is essentially a map of Facebook's moral terrain.

The content moderation team Derkaoui was a member of uses a web-based tool to view a stream of pictures, videos and wall posts that have been reported by users. They either confirm the flag, which deletes the content, unconfirm it, which lets it stay, or escalate it to a higher level of moderation, which turns the content in question over to Facebook employees.
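As a rough sketch of that three-way decision, here is what the flow might look like in code. This is purely illustrative: the class, function and method names below are invented for this example and are not Facebook's or oDesk's actual tooling.

from enum import Enum

class Decision(Enum):
    CONFIRM = "confirm"      # report is valid: the content gets deleted
    UNCONFIRM = "unconfirm"  # report is not valid: the content stays up
    ESCALATE = "escalate"    # unclear or sensitive: hand off to Facebook employees

def handle_report(item, decision):
    # 'item' stands in for a reported picture, video or wall post;
    # its methods are hypothetical placeholders for illustration only.
    if decision is Decision.CONFIRM:
        item.delete()
    elif decision is Decision.UNCONFIRM:
        item.clear_flag()
    elif decision is Decision.ESCALATE:
        item.send_to_internal_review()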

A one-page cheat sheet in the 17-page manual lays out exactly what must be confirmed and deleted by the team. (The cheat sheet is attached below.) It's divided into categories like "Sex and Nudity," "Hate Content," "Graphic Content" and "Bullying and Harassment." The document was current as of last month.

Facebook, it appears, will delete pretty tame stuff. For example, any of the following content will be deleted, according to the guidelines:

- Blatant (obvious) depiction of camel toes and moose knuckles.
- Mothers breastfeeding without clothes on.
- Sex toys or other objects, but only in the context of sexual activity.
- Depicting sexual fetishes in any form.
- ANY photoshopped images of people, whether negative, positive or neutral.
- Images of drunk and unconscious people, or sleeping people with things drawn on their face.
- Violent speech (Example: "I love hearing skulls crack.").

When it comes to sex and nudity, Facebook is strictly PG-13, according to the guidelines. Obvious sexual activity, even clothed, is deleted, as are "naked 'private parts' including female nipple bulges and naked butt cracks." But "male nipples are OK." Foreplay is allowed, "even for same sex (man-man/woman-woman)." Even the gays can grope each other on Facebook.

Facebook is more lenient when it comes to violence. Gory pictures are allowed, as long as somebody's guts aren't spilling out. "Crushed heads, limbs etc are OK as long as no insides are showing," reads one guideline. "Deep flesh wounds are ok to show; excessive blood is ok to show."

Drugs are a mixed bag. Pictures of marijuana are explicitly allowed, though images of other illegal drugs "not in the context of medical, academic or scientific study" are deleted. As long as it doesn't appear you're a dealer, you can post as many pictures of your stash as you want.

Under "hate content," the guidelines specifically ban "Versus photos... photos comparing two people side by side," which is ironic considering Mark Zuckerberg's first hit, FaceSmash, ranked the attractiveness of female Harvard students.

Some types of content are judged largely on context. For example, school fight videos are deleted only if "the video has been posted to continue tormenting the person targeted in the video." Hate speech is allowed in the case of a joke, and animal abuse videos can stay only if it's clear the user doesn't approve of the abuse.

Facebook has struggled with the issue of abortion, as it can encompass both sex and gore. Last month, a post offering DIY abortion instructions was 'accidentally' deleted from the page of a Dutch reproductive rights activist. Anti-abortion advocates sometimes complain that Facebook censors their graphic pictures of aborted fetuses.

While the guidelines don't mention abortion, a discussion that Derkaoui showed us from a forum used by Facebook content moderators sheds light on Facebook's abortion policy. The lead content moderator writes:

abortion in all its forms is allowed to be depicted, unless it violates the graphic violence or nudity standards. That means users can show images as well as talk about abortion, have discussions, etc. It's only a violation if graphic content such as dismemberment or decapitation is shown, or if the insides are visible.

Certain content must be escalated to a higher level of review. This includes credible threats against people or public figures, suicidal content, and encouragements to commit crimes.

Perhaps most intriguing is the category dedicated to "international compliance." Under this category, any Holocaust denial that "focuses on hate speech," all attacks on the founder of Turkey, Ataturk, and burnings of Turkish flags must be escalated. This is likely to keep Facebook in line with international laws; in many European countries, Holocaust denial is outlawed, as are attacks on Ataturk in Turkey.

A Facebook spokesman told us in a statement that the document "provides a snapshot in time of our standards with regards to [one of our] contractors, for the most up to date information please visit www.facebook.com/CommunityStandards."

Derkaoui found his job through the California-based outsourcing firm oDesk, which provides content moderation services to both Google and Facebook. After acing a written test and an interview, he was invited to join an oDesk team of about 50 people from all over the third world—Turkey, the Philippines, Mexico, India—working to moderate Facebook content. They work from home in 4-hour shifts and earn $1 per hour plus commissions (which, according to the job listing, should add up to a "target" rate of around $4 per hour).

The job posting made no mention of Facebook, and Derkaoui said his managers at oDesk never explicitly said that it was the client. Facebook is secretive about how it moderates content; the company is hesitant to bring attention to the torrent of horrible content it's trying to control. Other former moderators I spoke to mentioned that they had signed strict non-disclosure agreements with oDesk. One even refused to talk to me because she believed I was a disguised Facebook employee trying to test her.

However, a Facebook spokesman confirmed they were oDesk's client. "In an effort to quickly and efficiently process the millions of reports we receive every day, we have found it helpful to contract third parties to provide precursory classification of a small proportion of reported content," he said in a statement. "These contractors are subject to rigorous quality controls and we have implemented several layers of safeguards to protect the data of those using our service."

Derkaoui, who now works as a regional content manager for the New York-based tech company Zenoradio, didn't make it out of oDesk's extensive training process. He missed a crucial test because it was Ramadan and never caught up. But Skype interviews with a number of other former members of the team offer a picture of the strange work of a content moderator, the prototypical job of the social media boom.

The former moderators I spoke to were from countries in Asia, Africa and Central America. They were young and well-educated, which is unsurprising considering their proficiency in English. One is applying to graduate schools to study political science in the U.S. next year; another is currently in an engineering program. For them, Facebook content moderation was a way to make money on the side with a few four-hour shifts a week. Many had done other jobs through oDesk, such as working as a virtual assistant.

Like Derkaoui, most agreed that the pay sucked, while also acknowledging that it was typical of the sort of work available on oDesk. Derkaoui was the only one who cited money as a reason for quitting. The others seemed more affected by the hours they'd spent wading in the dark side of Facebook.

"Think like that there is a sewer channel," one moderator explained during a recent Skype chat, "and all of the mess/dirt/ waste/shit of the world flow towards you and you have to clean it."

Each moderator seemed to find a different genre of offensive content especially jarring. One was shaken by videos of animal abuse. ("A couple a day," he said.) For another, it was the racism: "You had KKK cropping up everywhere." Another complained of violent videos of "bad fights, a man beating another."

One moderator only lasted three weeks before he had to quit.

"Pedophelia, Necrophelia, Beheadings, Suicides, etc," he recalled. "I left [because] I value my mental sanity."

Some firms have recognized the mental toll of content moderation and have begun providing psychological counseling to workers. But former moderators said oDesk just warned them that the job would be graphic.

"They did mention that the job was not for the light of heart before hiring me," said the moderator who quit after three weeks. "I think it's ultimately my fault for underestimating JUST how disturbing it'd be."

And the attached graphic:

[attached image: 88e77406de7ac8efad90dc5d096deaa2.jpg]



Caprica-Six

Toaster

50 XP

24th August 2010

0 Uploads

339 Posts

0 Threads

#2 6 years ago

Very eye opening. I had never really thought about the lack of grotesque content on Facebook; it must be dealt with very quickly. I wonder how many people are moderating it at any one time... maybe we should send them some of our moderators on a forum exchange?




Commissar MercZ

Notable Loser

300,005 XP

29th January 2005

0 Uploads

27,113 Posts

0 Threads

#3 6 years ago
Caprica-Six wrote: Very eye opening. I had never really thought about the lack of grotesque content on Facebook; it must be dealt with very quickly. I wonder how many people are moderating it at any one time... maybe we should send them some of our moderators on a forum exchange?

Looking at some of their accounts of how stressed out they apparently are over it, I don't envy what they are doing. Still, considering Facebook's scope and scale, I guess it isn't surprising that the company resorts to using third parties to help scrub it down.

Twitter made some news earlier when it discussed how it will act in compliance with national laws, which of course invited discussion of how this relates to the role of services like Facebook and Twitter in organizing the protests in Egypt. Would they comply with another government's demands like this in the future?




Admiral Donutz VIP Member

Wanna go Double Dutch?

735,271 XP

9th December 2003

0 Uploads

71,460 Posts

0 Threads

#4 6 years ago

I'm not surprised; we all know that in the US, violence (acted or real) captured on any medium is much more tolerated than media related to sex(uality). Who should we blame for this? FB doesn't remove content that isn't reported (unless it is obviously against US law), does it? If your images aren't public and you happen to share a picture, such as a woman with exposed breasts, with your friends, and nobody reports it, it won't be removed even if some FB staff happened to stumble across it.




Hfx-Rebel VIP Member

AzH owns my ass

50 XP

15th March 2004

0 Uploads

10,426 Posts

0 Threads

#5 6 years ago

From what the article says, the article/image/whatever has to be reported in order for the content moderators to act on it, which makes sense. I heard the other day that they were creeping up on a billion accounts... that's an awful lot of content to actively go through...