Now that its bottom line is being affected, YouTube says it will take additional steps to protect its advertisers and creators from inappropriate content on its network. In a blog post by YouTube CEO Susan Wojcicki, the company said it will grow its moderation staff to more than 10,000 in 2018 to better review video content. The news follows a series of scandals on the video-sharing site related to its lack of oversight of content aimed at children, obscene comments on children's videos, disturbing search suggestions, and more.
The company has been facing the consequences of accusations that, for too long, it allowed bad actors to exploit its recommendation algorithms to reach children with videos not intended for younger viewers. At the same time, it has apparently fostered a community of creators whose videos place children in concerning and even exploitative situations.
One example, the ToyFreaks channel, was recently terminated after concerns surfaced about its videos, in which a father filmed his daughters in odd, distressing, and at times seemingly inappropriate situations.
YouTube said at the time that the channel's removal was part of a broader tightening of its child endangerment policies. Last month it also implemented new policies for flagging videos where inappropriate content was aimed at children.
Since then it has removed thousands of videos of children as a result, and pulled advertising from nearly 2 million videos and more than 50,000 channels.
Having policies is one thing, but having the staff to enforce them is another.
That's why YouTube says it now plans to increase its workforce focused on this task. While Wojcicki's blog post only offered the total number of staff planned for next year, a BuzzFeed report notes that this "more than 10,000" figure represents a 25 percent increase over current staffing levels.
However, YouTube still relies heavily on algorithms to help monitor its content. As Wojcicki noted in the blog post, YouTube plans to use machine learning technology to help it "quickly and efficiently remove content that violates our guidelines."
This same technology has helped YouTube flag violent extremist content on the site, leading to the removal of more than 150,000 videos since June.
"Today, 98 percent of the videos we remove for violent extremism are flagged by our machine-learning algorithms," Wojcicki wrote. "Our advances in machine learning now let us take down nearly 70 percent of violent extremist content within eight hours of upload and nearly half of it in two hours, and we continue to accelerate that speed," she added.
The goal now is to turn those technologies toward an area that is more difficult (and sometimes less clear-cut) to police.
While some content is easier to detect — such as videos where children appear to be suffering or being "pranked" by their parents in a cruel way — other videos sit in a much grayer area.
With so many parents having roped their children into their quest for YouTube stardom, it is difficult to draw a clear line between what is appropriate and what is not.

One question worth asking is to what extent a preschooler or grade-schooler can really consent to appearing in mom or dad's daily videos. Shouldn't they be free to play instead of being constantly directed to act out various sketches, or having the camera trained on them nonstop? After all, these channels are not just occasional funny videos; they are often full-time jobs for the parents. In the U.S. there are laws governing child labor, and child actors in particular, but YouTube has continually danced around that line — since it's "not really TV," it doesn't have to follow TV's rules regarding deceptive advertising, junk food ads, and more.
In addition to the new policies and promised staffing increases, YouTube also says it will publish regular reports offering transparency into aggregate data about the flags it receives and the actions it takes to remove videos and comments that violate its content policies.
And most importantly for its business, YouTube says it will consider more carefully which channels and videos are eligible for advertising, using a stricter set of criteria combined with more manual curation.
"We are taking these actions because it's the right thing to do," Wojcicki wrote. "Creators make incredible content that builds global fan bases, and fans come to YouTube to watch, share, and engage with this content. Advertisers who want to reach those people fund this creator economy. Each of these groups is essential to YouTube's creative ecosystem — none can thrive on YouTube without the others — and all three deserve our best efforts."
Personally, I'd love to see YouTube cut off creators' ability to make money from videos featuring kids, period. Perhaps then its too-young stars could finally take a break and go back to being children. But I'm not holding my breath.
Featured image: nevodka / iStock Editorial