For those people, it can take months for an Android release to become available for their devices, and it might not happen at all. Many product vendors just stop supporting older devices -- and by "older" I mean six months or more. That's right: You might be locked into a two-year contract with your service provider, but there's a good chance that you will never see any of Google's updates made available for your device. Seriously, at the rate Google releases new versions, you could find yourself two or three releases behind by the time that contract expires.
This matters most in the security area. The fragmentation of the Android market means that app developers can't start using the new security (and functional) improvements in an OS release until long after Google ships it -- at least, not without abandoning much of their potential market. So things like keychain storage for authentication credentials just aren't widely used by Android apps yet. App developers instead have to invent their own ways of securely storing such data, with predictably uneven results.
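To make the "invent their own way" problem concrete, here is a minimal sketch of the kind of home-grown credential protection an app developer might roll without platform keychain support: stretching a user passcode into an encryption key with PBKDF2. All names here are hypothetical, and this is illustrative only -- in practice, ad-hoc schemes like this often go wrong in the details (hard-coded salts, too few iterations, bad cipher modes).

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stretch a (possibly weak) passcode into a 256-bit key.

    PBKDF2-HMAC-SHA256 from the standard library; the salt must be
    random per-user and stored alongside the encrypted data.
    """
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

# A per-install random salt, saved with the app's data.
salt = os.urandom(16)
key = derive_key("1234", salt)
print(len(key))  # -> 32 (a 256-bit key)
```

The security of the result depends entirely on parameters the developer must pick correctly (salt randomness, iteration count) and, as the next section shows, on the strength of the passcode itself.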
Apple's Security Pros and Cons
OK, so how about Apple's iOS? What's the best and worst there?
Apple's closed and regulated App Store has served its users well so far, despite the many bitter complaints from the tech community. The App Store has a vast selection, and the all-encompassing digital signature hierarchy ensures that only signed and approved apps can be installed. (Again, I'm referring to white-bread consumers here, not the niche market of jailbreakers who bypass the App Store.)
Between Apple's App Store curation and the strict enforcement of signed apps, iOS users have enjoyed a largely malware-and-Trojan-free environment -- something that Android can't even approach. Despite its restrictive nature, the App Store process gives the consumer a pretty darned safe (vs. secure) mobile app world to work and play in.
On the other hand, data protection via encryption has been a near-complete failure for iOS. Back when the iPhone 3GS came out, we got hardware AES-256 encryption. Hooray, said the security world -- until we saw the details. Indeed, today's iOS devices all do whole-disk encryption in addition to encrypting every file on the file system, with AES-256 performed in hardware. On paper, that sounds military grade, but in practice it's not even comic-book grade, because Apple stores the primary encryption keys (the EMF! and Dkey keys) in plaintext in Block 1 of the device's solid-state storage.
When an attacker has possession of a lost or stolen iOS device, the only files with any real protection are those additionally encrypted with keys derived from the device's unique hardware ID and the user's passcode -- email, keychain items and a few others. Even those AES keys can be broken in most cases, because most consumers protect their devices with nothing more than a four-digit PIN -- and many use no passcode at all.
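The arithmetic behind that claim is simple: a four-digit PIN gives only 10,000 possible values, so an attacker who can run the key-derivation function offline just tries them all. A hedged sketch (generic PBKDF2 standing in for the real, device-specific derivation; salt, iteration count and PIN are all hypothetical):

```python
import hashlib

SALT = b"device-salt"  # hypothetical stand-in for per-device data
ITER = 100             # kept low so this demo runs quickly

def key_for(pin: str) -> bytes:
    """Derive a key from a PIN, as a stand-in for passcode key derivation."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), SALT, ITER)

# The key derived from the user's (unknown) PIN, e.g. recovered ciphertext.
target = key_for("4821")

# Exhaustive search over all 10,000 four-digit PINs.
recovered = next(pin for pin in (f"{i:04d}" for i in range(10_000))
                 if key_for(pin) == target)
print(recovered)  # -> 4821
```

Even with a deliberately slow derivation function, 10,000 candidates is a trivial workload; only a longer alphanumeric passcode pushes the keyspace beyond practical brute force.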