Facebook’s Mark Zuckerberg announced today that the company will hire 3,000 people to help prevent crime and suicide on its platform.

The new hires will join the 4,500 people Facebook already employs to do just that: monitor videos and flag suspicious posts. Facebook policy prohibits users from sharing violent videos and posts. Unfortunately, most posts that break the rules are not removed until other users report them, and by then it’s often too late (e.g., the killing of an elderly man in Cleveland and of a baby in Thailand).

In his statement, Zuckerberg called these occurrences heartbreaking and said the company is stepping up to do better.

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later,” Zuckerberg wrote in a Facebook post on Wednesday. “It’s heartbreaking, and I’ve been reflecting on how we can do better for our community.”

“These reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” he wrote. “And we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it — either because they’re about to harm themselves, or because they’re in danger from someone else.”

Facebook also added a suicide prevention tool to its live video feature. With the tool, users can report someone who appears suicidal while broadcasting live. The feature was developed in response to a Miami teen who killed herself while livestreaming back in January.