
Undercover Reporter Shows Facebook Moderators Keeping ‘Child Abuse’ Online. Company Responds

(Photo: CHRISTOPHE SIMON/AFP/Getty Images)

Eric Lieberman Managing Editor

An undercover reporter embedded at one of Facebook's contractors recently discovered that the platform doesn't always remove content most would agree is unsavory or reprehensible.

Posing as a staff member for an Ireland-based content moderation company known as CPL Resources, the journalist for the British broadcaster Channel 4 covertly filmed the company’s work. Videos of child abuse and other violence were purposefully not taken down, according to Channel 4’s documentary “Inside Facebook: Secrets of the Social Network.”

“It’s clear that some of what is in the program does not reflect Facebook’s policies or values and falls short of the high standards we expect,” Monika Bickert, Facebook’s vice president of global policy management, wrote in a blog post published Tuesday directly responding to the video. “We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again.”

Facebook says it is now requiring “all trainers in Dublin to do a re-training session,” and plans to do the same for all content moderators employed around the world. The tech giant also says the journalist’s work helped spark a review of its policies, leading to the correction of certain “mistakes.” Facebook, however, rejects what it sees as the documentary’s underlying premise: that ignoring or tolerating disreputable content is in its commercial interest.

“Creating a safe environment where people from all over the world can share and connect is core to Facebook’s long-term success,” Bickert said. “If our services aren’t safe, people won’t share and over time would stop using them.”

“Nor do advertisers want their brands associated with disturbing or problematic content,” she continued, referring to advertising, the fundamental way the company makes money and sustains its business.

After the disclosure of the secret footage and the subsequent television report, Facebook responded with a measure of transparency: the company blog post, a written follow-up to the Channel 4 team, and an on-camera interview to explain itself.

Still, Facebook’s main problem lies in the sheer scale of its platform: billions (probably trillions) of pieces of content, and substantial portions of the public want the company to act against the posts deemed too horrible to remain accessible.

There is broad agreement that content like child abuse should be taken down, as should terrorist propaganda and communications.

The undercover journalist was reportedly told that posts racially abusing immigrants are allowed under most circumstances. He was also advised that a specific cartoon depicting the drowning of a girl if her boyfriend is black was permissible. Facebook later said its rules against hate speech should have been applied in that case.

But when it comes to misinformation and the often ill-defined “hate speech,” the public’s wishes are not so clear cut.

Different ends of the political spectrum largely disagree on what social media companies should do about the alleged rise of “fake news,” with the right generally saying it is the responsibility of readers to educate themselves and be skeptical of anything they are told, and the left asking for more aggressive removal measures. (RELATED: Are Faulty Algorithms, Not Liberal Bias, To Blame For Google’s Fact-Checking Mess?)

And then there are political ads, a newer concern for Facebook and some of the public following Russian operatives’ attempts to drive an already intense wedge into the American electorate through contentious content and the unofficial organization of political events.

“More than 1.4 billion people use Facebook every day from all around the world. They post in dozens of different languages: everything from photos and status updates to live videos,” wrote Bickert. “Deciding what stays up and what comes down involves hard judgment calls on complex issues — from bullying and hate speech to terrorism and war crimes.”

To help with the complicated endeavor of content filtering, Facebook says it is doubling the number of people on its safety and security teams to 20,000, of whom 7,500 will review content. Facebook CEO Mark Zuckerberg said in May that the company is hiring so many moderators that it will lose money on political ads. But, as mentioned previously, earning users’ trust may be more important in the long run.

Technology, like machine learning, is also assisting in the process.

Follow Eric on Twitter

Send tips to eric@dailycallernewsfoundation.org.
