YouTube is recruiting thousands of moderators to address questionable content on its video streaming platform.
YouTube CEO Susan Wojcicki wrote in a blog post on Monday that the company will have more than 10,000 people working to address content that could violate its policies in 2018. The firm is also tightening its advertising criteria and expanding its use of smarter technology, according to CNN.
The hiring push will increase the number of human reviewers on YouTube by 25 percent, according to USA Today.
The company has faced a series of controversies this year over the videos on its platform, CNN said. It was forced to adopt additional detection measures on its children's service, YouTube Kids, after reports surfaced of videos containing profanity and violence.
According to CNN, several companies withdrew their advertising over the past year because of the controversies.
Wojcicki said that YouTube employees have reviewed almost 2 million videos for inappropriate content over the last six months. She added that the company is addressing comments by launching new moderation tools and, in some cases, shutting off comments entirely.
The firm has also been closing problematic accounts.
Technology will continue to play a critical role in content moderation, the company said, according to USA Today. Advances in machine learning allow YouTube to remove nearly 70 percent of violent and extremist content within eight hours of upload.
Critics argue that YouTube receives so many submissions that its artificial intelligence is easy to fool, USA Today said. The service receives 400 hours of new video per minute.
"Our goal is to stay one step ahead of the bad actors," Wojcicki wrote.