
Facebook To Hire 3,000 To Monitor Content

By Piranha

5th May 2017

Facebook has promised to hire 3,000 more people to join its “community operations team” to help combat violent and illegal posts on the site.

These human moderators will be responsible for reviewing and removing unsuitable content, looking out for posts that feature hate speech, child abuse and self-harm.

The announcement comes after Facebook, along with other internet giants such as Google and Twitter, received criticism from MPs for not being proactive enough in tackling violent content.

In a Facebook post, the company’s CEO, Mark Zuckerberg, said: “Over the next year, we’ll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly.”

“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation. And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else.”

One of the main problems is that Facebook’s live-streaming feature makes it incredibly easy for users to broadcast whatever they want. Back in April, a US man’s death was live-streamed on Facebook. In the same month, a man killed his child before ending his own life live on the social media site.

If the company can’t stop people from sharing violent content in the first place, it needs to be able to remove that content immediately. That’s why Facebook has promised to expand its team with thousands more moderators.

In addition to this, they have promised to develop more tools to “make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”

This week, the company also revealed that the social media platform has almost 2 billion monthly users. With this in mind, keeping users safe online has never been so important!

Do you think their latest move will help to reduce violent content on the site? Let us know your thoughts on Twitter or Facebook!
