YouTube apologizes for autocomplete suggestion 'How to have s*x with your children'




YouTube tweeted an apology on Sunday, hours after one of the worst autocomplete failures yet was reported on the Google-owned video platform.

A user shared a screen capture showing them typing the words "how to" into the site's search bar, at which point YouTube's usual autocomplete suggestions appeared, ranging from "how to tie a tie" to "how to make slime without glue".

But when the user added the word "have" after "how to", one of the autocomplete suggestions that appeared read "how to have s*x with your children".

When the user alerted @TeamYouTube, the account issued an official apology on Sunday night: "This is a horrible autocomplete result and we really appreciate you letting us know. We have removed it and will continue to investigate what caused this."

The latest incident is one of many in which YouTube has been shown to have gotten something wrong with its main video hosting service.

Recently, several major advertisers said they would suspend their advertising campaigns on YouTube after discovering that their ads were being shown alongside videos that had drawn millions of views because they showed children in compromising situations.

In addition, many of the comments that the videos attracted were pedophilic in nature.

According to the Wall Street Journal, the advertisers that have suspended their YouTube ads include heavyweights such as Adidas, Mars Inc., Diageo and Captain Morgan.

The existence of such videos, and the kind of views and comments they received, drew worldwide attention when BuzzFeed reported on them last week.

In its report, BuzzFeed described a "vast, disturbing and wildly popular universe of videos", which included live-action videos showing children in bed, among other things.

In response, YouTube removed many of these videos from its platform and said it would strictly enforce its community guidelines.

The implication was that YouTube had been abetting pedophiles by hosting such videos in plain sight.

Before this, Google had been forced to remove content that promoted terrorism and extremist views, content it had left unmonitored on the platform for years. YouTube has also come under pressure recently to remove content considered disturbing on its YouTube Kids platform.

Among these were cartoons depicting popular children's animated characters performing bizarre and violent acts.

It is in this context that the latest autocomplete fiasco has occurred.

What is happening at YouTube can be seen in the broader context of how social networks, including Twitter and Facebook, operate.

While it is true that the growth of such companies has been driven by their strong support for freedom of expression, the same ideology has also attracted harassment and fringe content to these platforms.

So far, algorithms have not been very successful at curbing such problems, which suggests that effectively monitoring these platforms would require a large amount of human labor.

So far, none of these platforms has presented a well-designed plan to prevent such problems from arising in the future, at least none that the public knows of.
