Many iOS Encryption Measures ‘Unused’, Say Cryptographers

iOS doesn’t use built-in encryption measures as much as it could, allowing for potentially unnecessary security vulnerabilities, according to cryptographers at Johns Hopkins University (via Wired).


Using publicly available documentation from Apple and Google, law enforcement reports on how to bypass mobile security features, and their own analysis, the cryptographers assessed the robustness of iOS and Android encryption. The research found that while the encryption infrastructure on iOS “sounds really good,” it is largely left unused:

“On iOS in particular, the infrastructure is in place for this hierarchical encryption that sounds really good,” said Maximilian Zinkus, lead iOS researcher. “But I was definitely surprised to see how much of it is unused.”

When an iPhone boots, all stored data is in a state of “Complete Protection,” and the user must unlock the device before anything can be decrypted. While this is extremely secure, the researchers highlighted that once the device has been unlocked for the first time after a reboot, a large amount of data moves into a state Apple calls “Protected Until First User Authentication.”

Since devices are rarely restarted, most data sits in “Protected Until First User Authentication” rather than “Complete Protection” most of the time. The advantage of this less secure state is that decryption keys are kept in quick-access memory, where applications can reach them swiftly.
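For developers, these two states correspond to Data Protection classes that can be assigned per file, for example through Foundation's Data.WritingOptions. A minimal Swift sketch, with file names and contents that are purely illustrative:

```swift
import Foundation

// Sketch: write two files under different iOS Data Protection classes.
func writeNotes() throws {
    let docs = try FileManager.default.url(
        for: .documentDirectory, in: .userDomainMask,
        appropriateFor: nil, create: true
    )

    // "Complete Protection": the per-file key is evicted whenever the
    // device locks, so the contents are decryptable only while unlocked.
    try Data("sensitive note".utf8).write(
        to: docs.appendingPathComponent("locked.txt"),
        options: .completeFileProtection
    )

    // "Protected Until First User Authentication": after the first unlock
    // following a reboot, the class key stays in quick-access memory, so
    // the file remains readable even while the device is later locked.
    try Data("convenient note".utf8).write(
        to: docs.appendingPathComponent("afu.txt"),
        options: .completeFileProtectionUntilFirstUserAuthentication
    )
}
```

Apple's documentation lists the weaker “until first user authentication” class as the default for most app files, which matches the researchers' observation about where data ends up in practice.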

In theory, an attacker could find and exploit certain kinds of security vulnerabilities in iOS to obtain the encryption keys held in quick-access memory, enabling them to decrypt large amounts of data from the device. This is believed to be how many smartphone access tools work, such as those from the forensic access company Grayshift.

While it is true that attackers need a specific operating system vulnerability to access the keys, and Apple and Google patch many of these flaws as they are discovered, the exposure could be reduced by hiding the encryption keys more deeply.
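One mechanism iOS already offers for hiding keys more deeply is the Secure Enclave, where a private key can be generated and used without its raw bytes ever entering the application processor's memory. A hedged Swift sketch using the Security framework; the application tag below is a made-up example, not anything from the researchers' work:

```swift
import Foundation
import Security

// Sketch: generate a P-256 key whose private half lives only inside the
// Secure Enclave; the app receives an opaque reference, never key bytes.
func makeSecureEnclaveKey() throws -> SecKey {
    var error: Unmanaged<CFError>?

    // Usable only while the device is unlocked, and never backed up
    // or migrated to another device.
    guard let access = SecAccessControlCreateWithFlags(
        kCFAllocatorDefault,
        kSecAttrAccessibleWhenUnlockedThisDeviceOnly,
        .privateKeyUsage,
        &error
    ) else { throw error!.takeRetainedValue() as Error }

    let attributes: [String: Any] = [
        kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
        kSecAttrKeySizeInBits as String: 256,
        kSecAttrTokenID as String: kSecAttrTokenIDSecureEnclave,
        kSecPrivateKeyAttrs as String: [
            kSecAttrIsPermanent as String: true,
            // Hypothetical tag, for illustration only.
            kSecAttrApplicationTag as String: Data("com.example.enclave-key".utf8),
            kSecAttrAccessControl as String: access,
        ],
    ]

    guard let key = SecKeyCreateRandomKey(attributes as CFDictionary, &error) else {
        throw error!.takeRetainedValue() as Error
    }
    return key
}
```

Because the private key never leaves the enclave, a memory-scraping exploit of the kind described above would find only a key reference, not material it could use to decrypt data offline.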

“It just really shocked me, because I came into this project thinking that these phones are really protecting user data well,” said Johns Hopkins cryptographer Matthew Green. “Now I’ve come out of the project thinking almost nothing is protected as much as it could be. So why do we need a backdoor for law enforcement when the protections these phones actually offer are so bad?”

The researchers also shared their findings and a number of technical recommendations directly with Apple. An Apple spokesperson offered a public statement in response:

“Apple devices are designed with multiple layers of security to protect against a wide range of potential threats, and we work constantly to add new protections for our users’ data. As customers continue to increase the amount of sensitive information they store on their devices, we will continue to develop additional protections in both hardware and software to protect their data.”

The spokesperson also told Wired that Apple’s security work focuses primarily on protecting users from hackers, thieves, and criminals looking to steal personal information. They also noted that the kinds of attacks the researchers highlighted are very costly to develop, require physical access to the target device, and only work until Apple patches the flaw. Apple also stressed that its goal with iOS is to balance security and convenience.
