Here’s Why You Don’t Need to Freak Out About Apple Recognizing Your Sexy Selfies
Perhaps you've heard the news that Apple now has the technology to recognize your semi-nude selfies (hey, no judgment, sometimes you're feelin' yourself) and automatically group them all into a folder it calls “Brassiere” (ick). After a tweet about the feature went viral earlier this week, Chrissy Teigen confirmed that it is indeed a thing, sparking understandable concern from people who want to know why, exactly, this is happening. But fear not: while it is kind of weird (and mildly sexist), it's nothing to worry about from a privacy standpoint.
ATTENTION ALL GIRLS ALL GIRLS!!! Go to your photos and type in the ‘Brassiere’ why are apple saving these and made it a folder!!?!!?😱😱😱😱
— ell (@ellieeewbu) October 30, 2017
Apple has used image recognition technology to scan, sort, and group photos since iOS 10 dropped in 2016, meaning this brassiere "folder" (really a search category, not a folder you can open) has been on most of our phones for a year without anyone noticing. The AI can recognize more than 4,000 objects and scenes without the user labeling anything, making it easy to open your photo app and search for, say, all of your food pics or all of the photos you have of cats (so, so many). Or, yes, all of the pictures where you're rocking a so-called brassiere.
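If you're curious what that kind of grouping actually amounts to under the hood, it's essentially a label index: a model running on the phone assigns labels to each photo, and "search" is just a lookup in that index. Here's a minimal, purely illustrative Python sketch; `classify_image` is a hypothetical stand-in for Apple's on-device model, which is not public.

```python
# Illustrative sketch of on-device photo tagging and search.
# classify_image is a hypothetical stand-in for a real on-device
# classifier; here it just reads pre-assigned tags from the photo dict.
from collections import defaultdict

def classify_image(photo):
    # A real model would return labels like "cat", "food", or
    # "brassiere" by analyzing pixels. This stub returns stored tags.
    return photo["tags"]

def build_index(photos):
    # Group photo names by recognized label, entirely in local
    # memory -- nothing is uploaded anywhere.
    index = defaultdict(list)
    for photo in photos:
        for label in classify_image(photo):
            index[label].append(photo["name"])
    return index

photos = [
    {"name": "beach.jpg", "tags": ["sunbathing", "brassiere"]},
    {"name": "dinner.jpg", "tags": ["food"]},
    {"name": "mittens.jpg", "tags": ["cat"]},
]

index = build_index(photos)
print(index["brassiere"])  # -> ['beach.jpg']
```

The point of the sketch is the architecture, not the model: because both the labeling and the lookup happen locally, a search term like "brassiere" never has to leave the device.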
This doesn't mean, however, that your sexy photos are in any way accessible to a third party. Upon discovering the feature, many people feared that Apple was uploading their photos to a cloud server somewhere to have the brassieres identified, but this is not the case. As the Guardian notes, "the actual recognition is carried out entirely on the iPhone itself, with a unique version of the AI running on each device, meaning your brassiere pictures remain entirely private – a secret between you and Siri." Phew.
The confusion is understandable, because this is actually different from how many of Apple's competitors do image recognition. Google, for example, does upload the images stored in its Google Photos app to its servers, and uses them, albeit anonymously, to train its AI to be better at recognizing different aspects of images. But, again, it's just a robot viewing them, not some person.
It's worth noting, however, that Apple Photos doesn't appear to make little folders for the equivalent male undergarments, like "boxers," making the feature (and by extension the person who engineered it) not just weird but kind of sexist. As The Fader noted, terms like "shower," "bathtub," and "sunbathing" are all searchable. Thankfully, the technology does not appear to be categorizing full-on nudes. Unless, I guess, they happen to take place in a bathtub.
Also? The technology isn't even that good. Pull up your own brassiere folder and you're likely to find, yes, any photos you've taken in lingerie or a bikini, but also any pictures where you're wearing a top with thin straps, no straps, or even just lace detailing. So if any trolls manage to steal your phone and type in "brassiere," they're likely to be pretty disappointed.
Apple has yet to publicly comment on the controversy, but I think it's safe to say we'd probably all appreciate the removal of the bra folder from the next iOS update.