Facebook is nearly doubling the number of workers who monitor Facebook Live video feeds, boosting its efforts to catch violent live streams before they spread across the network.
The social network has had to grapple with the widespread sharing of graphic videos in recent months — including a spate of live-streamed suicides, rapes and the real-time confessions of a killing suspect who posted a video of himself gunning down a Cleveland man.
Chief executive Mark Zuckerberg said in a Facebook post Wednesday that the social network is adding 3,000 workers to its “community operations” team, which is in charge of fielding reports from users who flag inappropriate material on the site. The company would then have 7,500 workers on the team.
The new reviewers “will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” Zuckerberg said. He said Facebook will keep working with “local community groups” — such as suicide prevention groups — and law enforcement to offer assistance to those who post or are seen in the videos who may need help.
The network hopes to shorten the time between when someone reports a violent or inappropriate video and when Facebook can take it down.
Until this announcement, Zuckerberg had said little about these violent Facebook incidents; at the company’s annual conference, he expressed his sympathy for those affected by the crimes live-streamed on Facebook’s platform.
Facebook has faced heavy criticism for not taking sufficient measures to vet and react to users who stream inappropriate content on the social network.