In other forums, Apple has claimed that if the U.S. requires it to cooperate in providing access to the phone, governments around the world will then expect the same sort of cooperation. That claim is bogus, more FUD. Do Apple’s lawyers really not know that the law of one country does not apply to another? A win for Apple in the U.S. would do nothing to stop another country from initiating a similar action, and a loss should have no bearing on whether other countries decide to pursue such matters.
So even if the Supreme Court decides that Apple does not have to modify the iPhone software for this case, China can make such a demand of Apple, and if Apple wanted to resist that demand, citing the U.S. precedent would be irrelevant. Russia can make such demands. Even Canada can do it, assuming its laws allow for such a thing. And frankly, no one can say for sure that Apple hasn’t already been required to provide Russia or China with the ability to backdoor its phones.
But of all of Apple’s arguments, the one that is most ludicrous, or perhaps the most damning of its much-touted security prowess, is revealed in this response to the government’s request for a key that could unlock one phone:
“Of course, Apple would do our best to protect that key, but in a world where all of our data is under constant threat, it would be relentlessly attacked by hackers and cybercriminals. As recent attacks on the IRS systems and countless other data breaches have shown, no one is immune to cyberattacks.”
First, Apple is already relentlessly attacked by hackers and criminals. I would like to hope that Apple has better security practices than the IRS. But when you unpack this statement, you are left with the impression that we should not trust any of Apple’s software or products. You have to assume that, should Apple write the software that the FBI wants, it would be among the most protected software in the company. If Apple is concerned about this software being compromised, what does that say about all of its other software?
So if you believe that Apple fears that this software is at risk of compromise and would create a serious security risk, you have to believe that all of its iOS software is at serious risk.
Even assuming that a bad guy gets hold of just the software that law enforcement wants created, that software would have to be signed with Apple’s signing certificate to load on any phone. If the criminal gets a copy of the software that has already been signed, Apple can revoke the certificate. But if a bad guy gets hold of Apple’s digital certificate itself, then the whole Apple software base is at risk, and the feature the FBI wants bypassed is irrelevant. After all, Apple has stated that it is not immune from attack, and it has implied that a compromise of even its most protected software is a reasonable concern.
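The load-time check described above can be sketched in miniature. This is a toy model, not Apple’s actual implementation: real iOS verification uses asymmetric signatures (ECDSA/RSA) chained to Apple’s root certificate, whereas the HMAC key, key IDs, and revocation set below are hypothetical simplifications to show why a stolen signed build can be neutralized by revocation, while a stolen signing key cannot.

```python
import hashlib
import hmac

# Hypothetical stand-in for Apple's private signing key (real signing is asymmetric).
APPLE_SIGNING_KEY = b"hypothetical-private-signing-key"

# Key IDs Apple has revoked; anything signed with a revoked key stops loading.
REVOKED_KEY_IDS = set()

def sign_firmware(image: bytes, key: bytes = APPLE_SIGNING_KEY):
    """Return (key_id, signature) for a firmware image."""
    key_id = hashlib.sha256(key).hexdigest()[:16]
    sig = hmac.new(key, image, hashlib.sha256).hexdigest()
    return key_id, sig

def phone_will_load(image: bytes, key_id: str, sig: str,
                    key: bytes = APPLE_SIGNING_KEY) -> bool:
    """A phone loads the image only if the signature verifies AND the
    signing key has not been revoked."""
    if key_id in REVOKED_KEY_IDS:
        return False
    expected = hmac.new(key, image, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, sig)

firmware = b"court-ordered unlock build"
key_id, sig = sign_firmware(firmware)
print(phone_will_load(firmware, key_id, sig))           # genuine signed build loads
print(phone_will_load(b"tampered build", key_id, sig))  # tampered image is rejected

REVOKED_KEY_IDS.add(key_id)  # Apple revokes the certificate after a leak
print(phone_will_load(firmware, key_id, sig))           # even the genuine copy now fails
```

Note what the sketch cannot fix: revocation works only while the verifier can distinguish Apple’s key from an attacker’s. If the signing key itself leaks, an attacker signs whatever they like, which is exactly the broader risk the paragraph above describes.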