The camera is the most used component on our phones. On average, we take several photos a day, some of us many more than that. We are encouraged to take photos: all of our social networks let us upload photos to share with our friends, and photos get more interaction than words alone. We are visual animals, after all. So it makes sense that we take a lot of photos.
We also save a lot of photos. Whether they're memorable wedding photos, vacation shots, or just a funny meme to share later, we have huge libraries of photos.
Did you know that Apple has access to anything you store in iCloud and send on iMessage? It has to, in order to hand over data when presented with a warrant for it. Apple also recently announced that it'll be scanning your photos and comparing them against a database of known child pornography. Apple will distribute a database of hashes of known child pornographic photos in an iOS update. A hash is a long string of characters that you get when you run a file through a complex mathematical function. Hashes are generally impossible to reverse, meaning that if you have the hash, you can't use it to reconstruct the file it was made from.
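To make the one-way property concrete, here's a minimal sketch using a standard cryptographic hash (SHA-256). Note this is a simplification: Apple's actual system uses a perceptual hash designed to match visually similar images even after resizing or re-encoding, but the basic idea, that the same input always yields the same digest and the digest can't be turned back into the input, is the same.

```python
import hashlib

def file_hash(data: bytes) -> str:
    """Return the SHA-256 digest of some bytes as a hex string."""
    return hashlib.sha256(data).hexdigest()

# The same input always produces the same digest,
# so two parties can compare digests without sharing the files themselves.
print(file_hash(b"hello"))

# A one-character change produces a completely different digest,
# and nothing about the digest reveals the original bytes.
print(file_hash(b"hello!"))
```

Comparing digests is how a device can check a photo against a list of known material without Apple ever shipping the illegal photos themselves, only their hashes.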
Your device will compare every photo on it against the hashes of known child pornography. If enough of your photos match, the matches will be sent to a human for review, and then on to law enforcement if that reviewer believes the photos are indeed child pornography.
For iMessage, Apple will use machine learning to try to determine whether a photo is explicit. At rollout, the iMessage system will only be active for children enrolled in a family account. If a child tries to send a photo that the machine-learning algorithm deems explicit, the phone will pop up a message saying that a parent will be notified if the child sends it. Likewise, if the child receives an image deemed explicit, they will see the same popup before viewing it, stating that a parent will be notified if they choose to view the image.
All of this is being done under the guise of protecting children. But so would installing security cameras in our bedrooms. The system Apple is building is ripe for governmental abuse. How long will it take China to tell Apple to add photos of Winnie the Pooh, a character banned in China, to its illicit database? How long will it take the U.S. government to tell Apple to add photos of people at protests to that same database?
I don’t mind Apple turning over data it has access to when presented with a warrant. But one of the main reasons I use Apple products is privacy. These features erode that privacy and create a backdoor into our devices and our digital lives. It’s a door that’s open just a crack right now, but it’ll be opened much wider soon enough. I’d rather the door not exist.