Happy 10th birthday iPhone, the nearest thing to a secure pocket computer

Keith Martin, Professor, Information Security Group, Royal Holloway and Kenny Paterson, Professor of Information Security, Royal Holloway

It’s common for security experts to regard themselves as necessary critics, guardians against malpractice, and raisers of worst-case scenarios. While there is a very present fear of insecurity these days, it’s rare that we celebrate security. But on the tenth anniversary of a revolutionary technology, we’d like to do just that: happy birthday to the iPhone, first released in June 2007.

Ten years ago, a computer was something that hurt your foot if you accidentally dropped it. Mobile phones were devices chiefly used for making phone calls. Today, it is almost unthinkable that we might not be able to use these palm-sized pocket computers to command all our digital communications, and also as a camera, games console, torch, and a hundred other things.

There is no such thing as complete security, and the iPhone is not perfect. Like many other technologies, the iPhone’s security relies on a user’s ability to choose and protect a strong password, which is a pragmatic rather than ideal basis for security. Researchers have also uncovered weaknesses in the protection of messages stored on the iPhone. Nonetheless, in an era when the rush to market has resulted in far too many insecure technologies, the iPhone stands out as an exemplar for how it’s possible to do things right.

A benevolent dictatorship

The internet, in case you hadn’t noticed yet, can be a dangerous place. Apple has often been criticised for its restrictions on what programs its users can and cannot load onto an iPhone. Users are required to download apps from the well-marshalled Apple App Store, which provides a secure gated compound within which software has been scrutinised by Apple before being made available for download.

While this may be seen as nannying, in a world of ruthless ransomware and untold other malicious programs that can ruin our computers, our bank accounts, and even our lives, what's wrong with a benign governess? Android, by comparison, allows users to install apps from outside its official store, not all of which have been closely inspected for vulnerabilities or malicious intent.

Getting cryptography right

The iPhone makes extensive use of state-of-the-art cryptography to protect data on the device. Cryptography provides mathematical tools to keep secret data secret, to ensure data is not maliciously altered or deleted, and to identify the source of data. Cryptography is easy to get wrong when used in a computer, but the iPhone mostly gets cryptography right. Everything from photos, messages, email and app data is protected using strong cryptography. The iPhone also supports innovative applications of cryptography, such as the contactless payment system Apple Pay.
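To make one of those services concrete, the sketch below uses Python's standard library to show how a cryptographic tag can detect tampering with data. This is a toy illustration of the principle only, not the iPhone's actual algorithms or Apple's design, which rely on dedicated hardware and AES-based schemes.

```python
import hmac
import hashlib
import os

# A randomly generated secret key, known only to the data's owner.
key = os.urandom(32)

def protect(message: bytes, key: bytes) -> bytes:
    """Return an authentication tag binding the message to the key."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes, key: bytes) -> bool:
    """Check the tag; any change to the message makes this fail."""
    return hmac.compare_digest(tag, protect(message, key))

msg = b"Meet at noon"
tag = protect(msg, key)

assert verify(msg, tag, key)                  # untouched message verifies
assert not verify(b"Meet at one", tag, key)   # altered message is detected
```

Without knowing the key, an attacker who alters the message cannot produce a matching tag, which is the "not maliciously altered" guarantee in miniature.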

Cryptography relies on cryptographic keys, the secret components that are critical to providing secure services. Many of the spectacular past failures of security technology, for example the infamous DigiNotar hack, have resulted from careless management of keys. There is no point, after all, in fitting the finest lock to your front door, only to leave the key under the doormat. The iPhone has a secure hardware vault known as the Secure Enclave within which its critical keys are safely stored. In fact the keys are so safe that they are inaccessible even to Apple or any other companies involved in manufacturing iPhones.
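The principle behind this design can be sketched as follows: keys are derived by entangling the user's passcode with a secret bound to the device, so that guessing attacks cannot be run off the device. The code below is an illustrative sketch only, using a hypothetical stand-in for the hardware-bound secret; real iOS key derivation is different and happens inside the Secure Enclave.

```python
import hashlib
import os

# Hypothetical stand-in for a secret fused into the device's hardware.
# On a real iPhone this value never leaves the Secure Enclave.
device_secret = os.urandom(32)

def derive_key(passcode: str, device_secret: bytes) -> bytes:
    # PBKDF2 deliberately makes each guess expensive, slowing
    # brute-force attacks on short passcodes.
    return hashlib.pbkdf2_hmac(
        "sha256", passcode.encode(), device_secret, 100_000
    )

k1 = derive_key("1234", device_secret)
k2 = derive_key("1235", device_secret)

assert k1 != k2       # a different passcode yields a different key
assert len(k1) == 32  # a 256-bit key
```

Because the derivation depends on the device secret, an attacker who copies the encrypted data off the phone cannot test passcode guesses elsewhere.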

Standing up for privacy

Which brings us to the matter of Apple’s skirmish with the FBI. Apple has been at the forefront of a much wider and more fundamental debate about security and privacy on the internet.

In one corner stand national security agencies and law enforcement. They have been demanding the means to access data secured on mobile phones, including encrypted messaging services like WhatsApp and emails, in order to defend the realm. In the other corner stand proponents of digital freedom. They argue that building “backdoors” into strong encryption even for legitimate use by investigators would become a potential weakness for cybercriminals to exploit.

Apple has not shied away from taking a strong stance in favour of privacy. Apple does not know the keys on your iPhone, or the PIN needed to unlock it, by design. That protects you from Apple, just as much as it prevents Apple handing them over to law enforcement. The iPhone was designed to be secure, so why make it insecure just because bad guys sometimes use it?

Apple’s security design decisions haven’t always made the company popular, especially among its community of developers or with government agencies. But, unlike many of its competitors’ products, the iPhone is a personal device that is just as secure for children and grandparents to use as it is for the few these days who really understand how the technology works. That’s something to celebrate, not bemoan. So, many happy returns to the iPhone, perhaps the closest we’ve come to having a secure computer in our pocket.

This article was originally published on The Conversation. Read the original article.

Keith Martin receives funding from the EPSRC and the European Commission.

Kenny Paterson receives funding from EPSRC and the European Commission. He is co-chair of the Crypto Forum Research Group of the Internet Research Task Force. He serves as an advisor to Huawei Technologies, SkyHigh Networks and CYBERCRYPT ApS.
