Oh, Apple. Can't you weigh in on anything without making a mess?
The latest: Apple wants to use its extensive powers to fight child pornography. As is typical, the company has good intentions and wants to advance a worthy goal, and then it overreaches so badly that it hands people dozens of reasons to oppose it. To paraphrase the old adage, the road to hell in this case starts at One Apple Park Way. Alternatively, think of Cupertino as the place where good ideas go to become monstrous executions.
This started last week with Apple announcing plans to do something to slow the spread of child pornography and the exploitation of children. Fine, so far. Its tactics include telling parents when their offspring download nude or otherwise erotic imagery. Before we get into the technology aspects of all of this, let's briefly consider the almost infinite number of ways this could go bad. (Maybe that's where the old Apple headquarters got its Infinite Loop name.)
Consider young teens who may be exploring their feelings, trying to understand their desires and thoughts. Now imagine those searches being immediately shared with their parents. Isn't it that child's right to discuss those feelings with whom they want, when they want? As others have noted, in some households those kids might face severe punishment. All of this from a search on their own phone, made to explore their own minds?
As a parent, I have serious doubts about whether this is necessarily the right move for the child. But whether it is or not, I do know that I don't want Apple engineers — and certainly not Apple algorithms — making that call. For other arguments about the privacy implications, here is an excellent open letter.
Don't forget that, as a matter of policy, Apple engages in a manner consistent with local laws and regulations. Then think about how some countries view these issues and let that sink in. As Apple phrased it, the changes "will enable parents to play a more informed role in helping their children navigate communication online. The Messages app will use on-device machine learning to warn about sensitive content…." And "as an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it."
But there is a potentially far worse issue for enterprise IT, and, like all bad things, it involves getting around encryption.
Let's start with Apple's announcement. Here is a longer passage from the statement to offer more context:
"Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image. Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images. Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC. If a user feels their account has been mistakenly flagged they can file an appeal to have their account reinstated."
Before getting into the technology issues, let's try to realistically envision just how fast, easy, and convenient Apple will make that appeal process. I think it's safe to say many of those flagged will be collecting Social Security long before they see an appeal decision, let alone an explanation.
Pay particular attention to this: "Using another technology called threshold secret sharing, the system ensures the contents of the safety vouchers cannot be interpreted by Apple unless the iCloud Photos account crosses a threshold of known CSAM content. The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account. Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents of the safety vouchers associated with the matching CSAM images."
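If "threshold secret sharing" sounds like jargon, the idea is simple: the material needed to read those vouchers is split into shares, and nobody can reconstruct it until enough shares exist. Here is a minimal Shamir-style sketch in Python to make the concept concrete. To be clear, this is my own illustration of the general technique, not Apple's code; the prime, the threshold value, and every name in it are my assumptions.

```python
# Minimal Shamir-style threshold secret sharing sketch (illustrative only,
# NOT Apple's implementation; prime, threshold, and names are assumptions).
import random

PRIME = 2**127 - 1          # a Mersenne prime, large enough for a demo secret
THRESHOLD = 3               # number of shares needed to reconstruct

def split_secret(secret: int, n_shares: int, k: int = THRESHOLD):
    """Split `secret` into n_shares points on a random degree-(k-1) polynomial."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n_shares + 1)]

def reconstruct(shares):
    """Lagrange-interpolate the polynomial at x=0 to recover the secret."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = (num * -xj) % PRIME
                den = (den * (xi - xj)) % PRIME
        secret = (secret + yi * num * pow(den, -1, PRIME)) % PRIME
    return secret

secret = 123456789          # stands in for whatever unlocks the vouchers
shares = split_secret(secret, n_shares=10)

# With fewer shares than the threshold, reconstruction yields garbage:
print(reconstruct(shares[:THRESHOLD - 1]) == secret)   # False (overwhelmingly likely)
# Once the threshold is met, the secret falls out:
print(reconstruct(shares[:THRESHOLD]) == secret)        # True
```

The mechanism itself is solid math. The question, as we'll see, is what gets to decide when the threshold has been crossed.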
There are two things going on in that passage that should freak out any CISO or cybersecurity staffer. For any cryptographers out there, this will likely make your head explode. First, Apple's system grabs images before they get encrypted. That's not defeating encryption so much as sidestepping it. From a cyberthief's perspective, the distinction hardly matters.
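To make that "before they get encrypted" point concrete, here is a deliberately oversimplified, hypothetical upload pipeline. Every name in it is mine, the "encryption" is a throwaway stand-in, and the hash check is a crude proxy for Apple's perceptual matching; the only point is that the scanner reads the plaintext, so whatever cipher runs afterward protects nothing from the party doing the scanning.

```python
# Hypothetical, simplified pipeline: names and logic are my own illustration,
# not Apple's code. The point: the scan happens on plaintext, so encryption
# applied afterward hides nothing from the scanning step.
import hashlib

KNOWN_BAD_HASHES = {"<hash of a known image>"}   # stand-in for the CSAM hash database

def scan(photo_bytes: bytes) -> bool:
    """Runs on the device BEFORE encryption, so it sees the raw photo."""
    digest = hashlib.sha256(photo_bytes).hexdigest()   # the real system uses a perceptual hash
    return digest in KNOWN_BAD_HASHES

def encrypt(photo_bytes: bytes, key: bytes) -> bytes:
    """Placeholder for whatever strong encryption protects the upload."""
    keystream = key * (len(photo_bytes) // len(key) + 1)
    return bytes(b ^ k for b, k in zip(photo_bytes, keystream))

def upload_photo(photo_bytes: bytes, key: bytes):
    matched = scan(photo_bytes)              # plaintext is inspected here
    ciphertext = encrypt(photo_bytes, key)   # encryption happens only after the scan
    return ciphertext, matched               # the verdict travels alongside the ciphertext

ciphertext, flagged = upload_photo(b"example photo bytes", key=b"sixteen byte key")
```

However strong that encrypt step is, it arrives too late to matter to the scanner.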
Second, consider this from the last quoted line: "Only when the threshold is exceeded does the cryptographic technology allow Apple to interpret the contents…."
That is begging for a nightmare. If Apple’s crypto controls can be opened “when the threshold is exceeded,” all a bad guy needs to do is trick the system into thinking its threshold has been exceeded. Forget porn. This could be a fine backdoor into viewing all manner of content on that phone.
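Put another way, the thing standing between an outsider and those voucher contents is a policy check, not a mathematical impossibility. Here's a toy sketch of the worry, again entirely hypothetical, with my own names and logic:

```python
# Toy sketch of the concern: if the only thing guarding the voucher contents is
# a match counter, then whoever can inflate that counter controls the unlock.
# Entirely hypothetical names and logic.
THRESHOLD = 30

def vouchers_become_readable(match_events: list) -> bool:
    """The gate: cross the threshold and the vouchers can be interpreted."""
    return len(match_events) >= THRESHOLD

# An attacker doesn't need to break the cryptography, only to manufacture
# enough events that look like matches.
forged_matches = ["looks-like-a-match"] * THRESHOLD
print(vouchers_become_readable(forged_matches))   # True: the gate opens
```

Whether forging matches in practice would mean engineering collisions against the hash database or compromising the counting logic, the gate is only as strong as the easiest way to trip it.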
The whole premise of phone-based cryptography is for it to be as close to absolute as practical. If it allows something to be accessed prior to encryption, or permits that encryption to be undone when some algorithm concludes that some criterion has been met, then it isn’t secure anymore. It simply draws a roadmap for attackers to get at all manner of data.
Is it a backdoor? Perhaps, but even if it isn't, it is far too close.