Apple Photos can correctly identify your ‘brassiere’ photos in search


Apple’s image recognition algorithm has been accused of being a creeper.

Image: Shutterstock / Enrique Arnaiz Lafuente

By Brett Williams

Apple has used image recognition algorithms to search and organize its Photos app since iOS 10 debuted in 2016 — but a viral tweet put the tool in the spotlight after it appeared to be stockpiling photos of women’s bras in a separate storage category within the Photos app.

Before going any further, it’s important to make a few things clear: There isn’t a separate “folder” filled with your intimate pics within the Photos app, and no one else can access your photos without you giving them permission. So the chance of your private photos leaking is exactly the same as it was before.

And what if you already have folders of your intimate photos on your phone, secret or otherwise? More power to you. You do you. Got it? Good. 

Now back to this weird controversy: It all started when Twitter user @ellieeewbu spotted the search category and took to Twitter to spread the word. 

ATTENTION ALL GIRLS ALL GIRLS!!! Go to your photos and type in the ‘Brassiere’ why are apple saving these and made it a folder!!?!!?????

— ell (@ellieeewbu) October 30, 2017

The tweet went viral, as thousands of people across the internet checked their personal Photos and found it to be true. Even Twitter Queen and supermodel Chrissy Teigen weighed in on the Photo category, spreading the news even further.

It’s true. If u type in “brassiere” in the search of your iphotos, it has a category for every ass or cleavage pic you’ve ever taken. Why.

— christine teigen (@chrissyteigen) October 31, 2017

This is just an example of Apple’s image recognition software working exactly as it was designed. The issue, though, is that people aren’t exactly comfortable with how the system works and the search category that kicked off the controversy. 

Apple’s search recognition technology is trained to identify different faces, scenes, and objects in your pics, according to Apple’s website. That means that “brassiere” is a keyword that Apple’s image recognition system can use to identify pics that appear to contain similar qualities to images that it was trained to associate with the term. The photos are on your phone, so they’re ID’d and served up to you when you search using the keyword.

The company doesn’t publicize exactly what those search fields include, but there has been some guidance from developers who have poked through the code, as The Verge notes. One of those devs, Kenny Yin, found in 2016 that Apple’s keywords included the term “brassiere,” along with some other associated words for women’s lingerie like bandeau, bandeaus, bra, bras, and the plural form, brassieres. The information has been out there — it just hasn’t been widely publicized. 

While the Photos app can identify different aspects of your images, it only does so by using the processing power available via your device, according to Apple. The company insists that photos are “yours and yours alone,” and the “on-device intelligence” was a major sell for the new feature during iOS 10’s launch.   
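To make the mechanism concrete, here is a hypothetical sketch (not Apple's actual code) of how on-device photo search like this can work: a classifier running locally assigns labels to each photo, a synonym table folds related keywords (the bandeau/bra/brassiere terms developers found) into one category, and search simply matches the query against those stored labels — nothing is sent off the device. The photo IDs and label sets below are invented for illustration.

```python
# Toy "index": photo ID -> labels a local, on-device classifier assigned.
photo_labels = {
    "IMG_0001": {"beach", "sunset"},
    "IMG_0002": {"brassiere", "selfie"},
    "IMG_0003": {"dog", "park"},
}

# Related keywords from the list developers found (bandeau, bra, etc.)
# all resolve to the same underlying category.
SYNONYMS = {
    "bra": "brassiere",
    "bras": "brassiere",
    "brassieres": "brassiere",
    "bandeau": "brassiere",
    "bandeaus": "brassiere",
}

def search_photos(query):
    """Return IDs of photos whose labels match the query, entirely locally."""
    term = SYNONYMS.get(query.lower(), query.lower())
    return sorted(pid for pid, labels in photo_labels.items() if term in labels)

print(search_photos("bra"))        # resolves via the synonym map to "brassiere"
print(search_photos("underwear"))  # no such label, so no results
```

Note that a query like "underwear" returns nothing here unless the classifier was trained with that label — which mirrors the gap users noticed in Apple's keyword list.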

Even if these photos stay private, the fact that these particular keywords exist clearly struck a nerve with many on Twitter, who saw the viral posts and tweeted their dismay, sensing something particularly lascivious about the feature. The search recognition tool doesn’t appear to recognize broader terms like “underwear” or traditionally male-gendered undergarments like “boxers” or “briefs,” which is a curious double standard. 

Just to be consistent, I checked out a few of the terms using Google Photos. “Brassiere” didn’t yield any results, but the more commonly used “bra” yielded a few innocuous results of subjects in low-cut tank tops. Google Photos, though, doesn’t have an entire category dedicated to a word as specific and uncommonly used as “brassiere,” which makes Apple’s setup seem strange.  

We reached out to Apple about the keywords, but our requests for comment haven’t been answered. We’ll update the story if we hear back.


