
Why an Apple victory against the FBI would be a win for all of us

Steven J. Vaughan-Nichols | Feb. 22, 2016
Putting back doors into any software, even once, is just asking for trouble.

[Photo: Apple headquarters, Cupertino, California. Credit: Andrew Hitchcock, CC-BY-1.0, via Wikimedia Commons]

We could argue endlessly over the legal, political and technical fine points of the FBI getting a court order requiring Apple to assist it in cracking open a locked iPhone 5c. That’s not really the point.

True, Apple has cheerfully helped the government look into customers' data before. And no, the FBI isn't asking for an iPhone backdoor, at least not this time. All Magistrate Judge Sheri Pym of the U.S. District Court for the Central District of California really wants is for Apple to provide a one-off, signed iOS image that would let the FBI try iOS 9 passcodes rapidly without triggering the iPhone's auto-erase feature, which wipes the device after 10 failed attempts.
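To see why that auto-erase feature matters, here is a toy sketch of the retry policy in Python. The class name, the passcode, and the details are all illustrative, not Apple's actual code; the point is only that a brute-force attack hits the wipe threshold long before it finds the passcode.

```python
# Toy model of the passcode retry policy described above.
# All names and values are illustrative, not Apple's implementation.

WIPE_THRESHOLD = 10  # auto-erase after this many failed attempts


class PasscodeLock:
    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failed_attempts = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("device erased")
        if guess == self._passcode:
            self.failed_attempts = 0
            return True
        self.failed_attempts += 1
        if self.failed_attempts >= WIPE_THRESHOLD:
            self.wiped = True  # data rendered unrecoverable
        return False


# Brute-forcing a 4-digit passcode fails fast against this policy:
lock = PasscodeLock("7391")
for guess in (f"{i:04d}" for i in range(10000)):
    try:
        if lock.try_unlock(guess):
            break
    except RuntimeError:
        break

print(lock.wiped)  # → True: the device wipes after 10 guesses
```

A signed firmware image that removes the wipe threshold (and any delays between attempts) is exactly what turns this losing game into a trivially winnable one.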

Some people think Apple can't do this. Nonsense!

As Dan Guido, CEO of Trail of Bits, an information security startup, points out, “As many jail-breakers are familiar, firmware can be loaded via Device Firmware Upgrade (DFU) Mode. Once an iPhone enters DFU mode, it will accept a new firmware image over a USB cable. Before any firmware image is loaded by an iPhone, the device first checks whether the firmware has a valid signature from Apple. This signature check is why the FBI cannot load new software onto an iPhone on their own — the FBI does not have the secret keys that Apple uses to sign firmware.”

Give me those keys and I could crack your iPhone! And, while I may know more tech than the average bear — or the average IT pro for that matter — I am far from being a top-notch programmer.

But giving the FBI those keys is still a lousy idea for reasons that have nothing to do with the law or the politics of the issue.


Because, as Rich Mogull, a security analyst at Securosis, pointed out, the real issue is whether companies should be "required to build security circumvention technologies to expose their own customers." Or, as I put it: Should companies be required to put back doors in their software?

To that question, I answer: “Hell no.”

My reasoning for this position is very simple. First, if a business or a government can crack open our records for a good reason, how long will it be before they can do it for a bad one? Answer? No time at all.

And it’s not just a slippery slope because other government agencies could (will?) get their noses into our private business. It’s a slippery slope because, once there’s a back door of any sort, it’s only a matter of time before it’s misused by hackers.
