Facebook will hire 3,000 people to monitor live streams

Facebook CEO Mark Zuckerberg says he plans to hire at least 3,000 content moderators to keep crime, suicides, and other violent acts from being shared through the social network's popular live video feature.

The new hires will join the 4,500-strong content moderation team already employed by the social media giant.

Zuckerberg announced the hires in a Facebook post on Wednesday morning, writing that the videos are heartbreaking and that he has been reflecting on how the company can do better for its community.

He wrote:

Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community.

If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down.

In addition to monitoring streams, the content reviewers will help Facebook get better at removing hate speech and child exploitation, and will help build better tools for keeping people on Facebook safe.

Zuckerberg also stressed the importance of the content reviewers and offered an example of how they can help:

This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate.

Facebook recently added suicide prevention tools to its live video features, including the ability for users to report someone who appears to be broadcasting suicidal intentions.

The new tools were built partly in response to the death of a 14-year-old girl in Miami who killed herself while streaming live on Facebook.

