
Google wants more humans to help solve the problem of child exploitation on YouTube



In announcing plans this week to hire more human moderators to flag disturbing and extremist content, YouTube has become the latest Silicon Valley giant to acknowledge that software alone will not solve many of the problems afflicting the industry.

YouTube, which is owned by Google, said in a blog post Monday night that it would significantly increase the number of people who monitor such content at the company next year. By 2018, Google will employ approximately 10,000 content moderators and other professionals charged with addressing violations of its content policies. The search giant would not disclose how many employees currently do such work, but a person familiar with the company's operations said the hiring represents a 25 percent increase over current staffing.

The initiative follows reports last month that YouTube videos were surfacing that showed children in disturbing and potentially exploitative situations, including being taped to walls, abducted and forced into washing machines, according to BuzzFeed. Google said it has removed 150,000 videos of violent extremism since June. The company has also taken down hundreds of thousands more videos showing content that exploited or endangered children, said the person familiar with the company's thinking. Some disturbing content appeared on YouTube Kids, the company's app marketed to children.

"I've seen some bad actors exploit our openness to cheat, manipulate, harass or even harm," said Susan Wojcicki, YouTube's executive director. submit. "Our goal is to stay one step ahead of the bad actors, making it harder for them to appear or remain on YouTube."

Google's decision to police content more aggressively comes at a time when Silicon Valley companies are struggling to contain unwanted content on a number of fronts, including violent videos that appear on Facebook Live, hate speech, terrorism and Russian disinformation campaigns that seek to distort political debate.

Last month, in the midst of Capitol Hill hearings about Russian meddling, Facebook said it would hire another 10,000 security professionals to deal with political misinformation and other security threats. Earlier this year, the company said it would hire 3,000 more content moderators on top of the 4,500 it already had.

Google says that many of these content reviewers, some of them low-level contractors and others experts in the field, will work to train computer algorithms to identify and block unwanted content. Ninety-eight percent of the violent extremist videos now removed are flagged by Google's software, up from 76 percent in August, the company said in its blog post. The company's advances in the data-crunching software known as machine learning now allow Google to remove nearly 70 percent of that content within eight hours of it being uploaded.

Google and its Silicon Valley counterparts have said they expect to train software to do most of the policing. But this year's hiring wave shows that a significant amount of undesirable content is slipping through, and that more humans are needed, said Paul Barrett, deputy director of the Stern Center for Business and Human Rights at New York University.

"Companies are recognizing that they will have to dig into their pockets and pay more people if they are going to get their arms around this problem," said Barrett, who emphasized that software would still be an important part of any solution. "Given the volume of material on these sites, it is almost impossible for the answer to be entirely human."

Technology platforms are not legally required to police most of the content published by third parties on their services, thanks to a 20-year-old law, Section 230 of the Communications Decency Act, which grants intermediaries broad immunity from liability. Child pornography, however, has been treated as an exception to Section 230 and is an area that companies police aggressively, particularly as they make inroads into new services aimed at children.

Google's rush to address problems related to extremist and child-exploitation content still leaves aside some of the thorniest questions about how judgments will be made on other forms of unwanted content, especially political misinformation, Barrett said. In October, YouTube dropped RT, or Russia Today, the news site backed by the Russian government, from its premium advertising program, and Alphabet executive chairman Eric Schmidt said the company was working to lower the ranking of RT's sites in search results. But RT remains a major presence on YouTube, with 2.3 million subscribers. Google later told a Russian regulator that the company would not change its ranking algorithms.

"Child abuse is a subject in which they feel very comfortable acting aggressively to control their ground in a focused manner," Barrett said. "My feeling is that they will have to focus on both Russia and child abuse, if they do not, we will see a repetition in 2018 of what happened in 2016"

