Facebook To Hire 3,000 To Review Videos Of Crime And Suicide

NEW YORK (CBSNewYork/AP) — Facebook says it will hire another 3,000 people to review videos of crime and suicide, following murders that were shown on its site.

The announcement came from CEO Mark Zuckerberg in a Facebook post Wednesday. That’s on top of the 4,500 people Facebook already has for such reviews.

"Over the last few weeks, we've seen people hurting themselves and others on Facebook -- either live or in video posted later. It's heartbreaking, and I've been reflecting on how we can do better for our community.

If we're going to build a safe community, we need to respond quickly. We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down.

Over the next year, we'll be adding 3,000 people to our community operations team around the world -- on top of the 4,500 we have today -- to review the millions of reports we get every week, and improve the process for doing it quickly.

These reviewers will also help us get better at removing things we don't allow on Facebook like hate speech and child exploitation. And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it -- either because they're about to harm themselves, or because they're in danger from someone else.

In addition to investing in more people, we're also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.

This is important. Just last week, we got a report that someone on Live was considering suicide. We immediately reached out to law enforcement, and they were able to prevent him from hurting himself. In other cases, we weren't so fortunate.

No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need."


Facebook has been criticized recently for not doing enough to prevent videos — such as those showing a murder in Cleveland and the killing of a baby in Thailand — from spreading on its service.

Videos and posts that glorify violence are against Facebook’s terms of service. But in most cases, users must report such content to the company before it is reviewed and possibly removed.

(© Copyright 2017 CBS Broadcasting Inc. All Rights Reserved. The Associated Press contributed to this report.)
