Five Years Later, the iPhone’s Legacy is Secure

It’s nearly impossible to remember now, but there was a time when the iPhone didn’t exist. That time was five years and one day ago, and up to that point the idea of standing in line overnight for a mobile phone was almost as ridiculous as the notion of Apple being thought of as an innovator in security. But the former is now commonplace and the latter is straight fact, if only in discussions about the iPhone.

When the first iPhone hit the market on June 29, 2007, Apple was a company in flux. Lightly regarded in the enterprise, Apple was seen mainly as a maker of shiny, pricey baubles that were gobbled up eagerly by its millions of devoted fans. The iPod was a worldwide phenomenon, having completely turned the music business on its head, and Apple was selling tens of millions of the devices every year, swelling the company’s coffers and helping lead it back from the brink of financial disaster.

And yet there was trouble on the horizon in 2007. Steve Jobs, the company’s co-founder and guiding force, was ill. Investors and customers worried that if Jobs was forced to pull back from his day-to-day duties running the company, Apple would falter again, as it had in the 1980s and 1990s after he was forced out. They needn’t have worried, as it turned out, because Jobs had an ace up his sleeve in the form of an oddly shaped black and silver collection of silicon and glass called the iPhone.

At the time of its introduction, the iPhone intrigued many people in the tech community, and not just because of its groundbreaking design and intuitive user interface. For those in the security industry, the iPhone was a black box, a tiny little computer with a proprietary operating system.

The iPhone was a challenge, and there’s nothing that security researchers and reverse engineers love more.

Researchers and hardware hackers immediately set about looking for ways to get at the iPhone’s guts. Some were looking for ways to jailbreak the device, others just wanted to see what made it tick, while still others were poking and prodding the iPhone, hoping to find a bug or vulnerability. Apple had done its level best to make all of this difficult, not necessarily for security reasons, but mostly because that’s how Apple operates. No company is more opaque and tight-lipped about its products and engineering than Apple, and that naturally extended to the iPhone, with the company providing very little in the way of documentation or explanation of how the iPhone OS worked or what happened under the covers.

It didn’t take long, though, for researchers to crack the iPhone. Within a few weeks of its release, Charlie Miller, then of Independent Security Evaluators, discovered the first bug in the iPhone OS and developed an exploit that could be delivered through a drive-by download against the mobile version of Safari. Because Miller had no real visibility into the iPhone’s inner workings, he had to repeatedly crash the mobile browser and then read the resultant crash reports to look for interesting data. There was no jailbreak bug to help him out.

“It was really hard because we couldn’t jailbreak the phone so couldn’t, for example, install a debugger. All we could do is crash the browser, plug in the device, retrieve system generated crash reports, and repeat. It was agonizing!” said Miller, now a principal research consultant at Accuvant and one of the top Apple security researchers in the industry.
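The workflow Miller describes boils down to a crude but effective loop: feed MobileSafari a slightly malformed page, see whether the browser crashes, pull the crash log off the device, and repeat. The sketch below is illustrative only and is not Miller’s actual tooling; the file names, the web-root path and the single-byte-flip mutation strategy are assumptions, and the phone-side steps (loading the page, syncing the CrashReporter logs back through iTunes) were manual in 2007.

```c
/* fuzz_page.c -- a minimal sketch of the "crash, collect, repeat" loop.
 *
 * Build: cc -o fuzz_page fuzz_page.c
 * Usage: ./fuzz_page seed.html /var/www/page.html
 *
 * Each run flips a few random bits in a seed HTML file and writes the result
 * somewhere MobileSafari can load it. The rest of the loop -- load the page on
 * the phone, sync the CrashReporter logs, read them for interesting faults --
 * happens off-device and by hand.
 */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(int argc, char **argv) {
    if (argc != 3) {
        fprintf(stderr, "usage: %s <seed.html> <output.html>\n", argv[0]);
        return 1;
    }

    FILE *in = fopen(argv[1], "rb");
    if (!in) { perror("seed"); return 1; }
    fseek(in, 0, SEEK_END);
    long len = ftell(in);
    rewind(in);

    unsigned char *buf = malloc((size_t)len);
    if (!buf || fread(buf, 1, (size_t)len, in) != (size_t)len) {
        perror("read");
        return 1;
    }
    fclose(in);

    /* Flip a handful of random bits -- a deliberately dumb mutation strategy. */
    srand((unsigned)time(NULL));
    for (int i = 0; i < 4; i++)
        buf[rand() % len] ^= (unsigned char)(1 << (rand() % 8));

    FILE *out = fopen(argv[2], "wb");
    if (!out) { perror("output"); return 1; }
    fwrite(buf, 1, (size_t)len, out);
    fclose(out);
    free(buf);

    puts("Mutated page written; load it in MobileSafari and check for new crash logs.");
    return 0;
}
```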

That first-generation iPhone mostly relied on security through obscurity to defend itself. Apps could only be installed from the Apple-controlled iTunes App Store, giving the company total control of the software ecosystem, a dream for any vendor. But the iPhone of that era had none of the built-in defenses we now associate with iOS and, in fact, was poorly designed when it came to security. The early versions of what came to be called iOS had nothing in the way of exploit mitigations or process restrictions. As Miller discovered, if you got code running on the phone, you owned it completely.

“Of the things that make current iOS versions secure: mandatory code signing, ASLR, DEP, sandboxing, etc, the original iPhone had none of them. In fact all processes ran as root, including MobileSafari,” Miller said.

Since that inauspicious beginning, a steady stream of researchers has taken swings at iOS, finding more bugs and vulnerabilities and helping to spark the interest of the jailbreak community. Apple, which was not used to that kind of attention from researchers and attackers, eventually took the hint and began adding significant security features to iOS. Exploit mitigations such as ASLR and DEP are now standard in iOS. But if there’s one change that’s made the biggest mark, it’s the addition of mandatory code signing.

“The differentiator is definitely mandatory code signing. This helps make both ways malicious code gets onto devices more difficult. First, it makes malware harder to download, because instead of being able to download any app from anywhere, apps can only be downloaded from the app store. Apps from the app store must be approved by Apple, which has the opportunity to spot malware and prevent it from showing up in the store, although it’s not clear how they actually do this,” Miller said.

“The other way mandatory code signing makes it hard for bad guys is that it makes exploitation (drive-by downloads) harder too. This is because normally exploits would like to either run some payload in memory, or even better download and execute some payload, like say a trojan. Mandatory code signing prevents either of these scenarios because the attacker payloads will not be signed and so cannot be executed, even in memory. Therefore, the entire payload must reuse existing signed code (i.e. use return oriented programming) which is a total pain, and if the process dies or the phone reboots, the attacker is completely gone from the device.”
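Miller’s point reduces to a simple property: memory an attacker has written to is never allowed to become executable. The program below is a minimal illustration of that property, not Apple’s enforcement code; it asks the kernel for the two things a traditional shellcode payload needs and reports what it gets. On a stock iOS device both requests are refused, which is what pushes attackers toward return-oriented programming; on most desktop systems of the era, both succeed.

```c
/* no_unsigned_code.c -- why an exploit payload can't simply be dropped into
 * memory and run once mandatory code signing is enforced.
 */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void) {
    size_t len = 4096;
    unsigned char payload[] = { 0xC3 };   /* x86 "ret" -- stand-in for shellcode */

    /* Step 1: ask for a page that is writable and executable at the same time. */
    void *rwx = mmap(NULL, len, PROT_READ | PROT_WRITE | PROT_EXEC,
                     MAP_PRIVATE | MAP_ANON, -1, 0);
    printf("RWX mapping:   %s\n", rwx == MAP_FAILED ? "denied" : "granted");

    /* Step 2: write the payload into writable memory, then try to flip it executable. */
    void *rw = mmap(NULL, len, PROT_READ | PROT_WRITE,
                    MAP_PRIVATE | MAP_ANON, -1, 0);
    if (rw != MAP_FAILED) {
        memcpy(rw, payload, sizeof(payload));
        int rc = mprotect(rw, len, PROT_READ | PROT_EXEC);
        printf("W->X mprotect: %s\n", rc == 0 ? "granted" : "denied");
    }

    /* If both are denied, the only executable bytes available are ones Apple
     * already signed -- hence return-oriented programming, and hence nothing
     * for the attacker to keep once the process dies or the phone reboots. */
    return 0;
}
```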

And what’s been the result of all of these security improvements? In the five years since its debut, the iPhone has yet to be the target of a major piece of malware. Bugs? Sure. Vulnerabilities? Yup. But nothing at all in the way of real attacks or malware. The iPhone has been the most attack-resistant mobile platform developed thus far.

As Miller said: “They’ve come a long way, baby!”
