Digital News: Why Facebook Content Moderator Mistakes Have Increased
Have you ever felt that Facebook made the wrong call and deleted your content, banned you from the platform, or even put you in "Facebook jail"?
You're not alone. More than 300,000 users (including our profiles) have experienced issues like these!
According to the latest report from NYU Stern, Facebook content moderators review posts, images, and videos that have been flagged by AI or reported by users about 3 million times a day. Even CEO Mark Zuckerberg admitted that moderators “make the wrong call in more than one out of every 10 cases,” which translates into roughly 300,000 mistakes a day.
One of the main problems, according to the report:
“Major social media companies have marginalized the people who do this work, outsourcing the vast majority of it to third-party vendors.”
Facebook employs about 15,000 content moderators, directly or indirectly. If they have three million posts to moderate each day, that’s 200 per person: 25 every hour of an eight-hour shift. That leaves under 150 seconds to decide whether a post meets or violates community standards.
But what if one of the things you need to review is a 10-minute video?
A single long video eats up the time budgeted for several other posts in the queue.
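For readers who want to check the arithmetic, here is a rough back-of-the-envelope sketch using the figures quoted above; the even split across moderators and the eight-hour shift are simplifying assumptions.

```python
# Back-of-the-envelope estimate of the moderation workload described in the article.
posts_per_day = 3_000_000   # flagged items reviewed daily (NYU Stern report)
moderators = 15_000         # moderators working for Facebook, directly or indirectly
error_rate = 0.10           # "more than one out of every 10 cases"

mistakes_per_day = posts_per_day * error_rate      # ~300,000 wrong calls a day
posts_per_moderator = posts_per_day / moderators   # 200 items per person per day
per_hour = posts_per_moderator / 8                 # 25 items per hour
seconds_per_post = 3600 / per_hour                 # 144 seconds per item

# A single 10-minute (600-second) video consumes the time budget of about four other posts.
print(mistakes_per_day, posts_per_moderator, per_hour, seconds_per_post)
```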
That's not an easy task.
And, as you might expect, all this became much more difficult during the coronavirus pandemic. Because of the COVID-19 risk, content moderators were sent home, and since the proper technology, connectivity, and safety requirements could not be met remotely, Facebook’s automated systems took full control. That was an unmitigated disaster, leading to the widespread blocking or deletion of posts mentioning the coronavirus from reputable sources such as The Independent and the Dallas Morning News, not to mention millions of individual Facebook users.
Those automated systems still have problems.
Sherry Loucks, who runs multiple Facebook pages including one on breastfeeding, doesn’t understand how they work or why they’re blocking her.
“All the pages I run are now blocked by Facebook for ‘spam,’ yet I only share credible content from fact-based sources,” she says.
And she’s concerned about double standards. While more than 50,000 people follow the page, Loucks gets routinely blocked for “spam,” a problem that other pages, such as one run by Ben Shapiro with almost seven million followers, don’t seem to have.
“I have 16 Facebook pages but only post from four regularly,” she says. “Ben Shapiro posts far more content than I do.”
What’s the solution to that?
According to the NYU Stern report, ending outsourcing is one: make all content moderators official Facebook employees with adequate salaries. Doubling the number of moderators is another, as is placing content moderation under the dedicated oversight of a senior manager.
The platform should expand oversight in underserved countries too and sponsor research into the mental health impacts of moderating content from all over the world. And last but not least, the company should expand fact-checking to control the spread of fake news.
As the report says, social media without moderation would be simply unusable:
“Picture what social media sites would look like without anyone removing the most extreme content posted by users. In short order, Facebook, Twitter, and YouTube would be flooded not just by spam, but by personal bullying, neo-Nazi screeds, terrorist beheadings, and child sexual abuse.”
That, of course, would be an existential problem for Facebook and other social media platforms.
If the company can reduce the number of false positives, it could also reduce the frustration of legitimate users who are getting blocked or jailed in seemingly arbitrary ways.