Wednesday, February 17, 2016

Apple vs. the FBI - What you need to know about the fight over your privacy


The ongoing San Bernardino case, which turns on whether Apple can legally be compelled to help unlock Syed Rizwan Farook's iPhone 5c, has brought the company's stance on strong encryption to the forefront.

Since this debate isn’t going away any time soon, here’s what you need to know about it so far — and why it’s a much, much bigger issue than just one legal case.

What has happened so far?

In December 2015, shooters Syed Rizwan Farook and Tashfeen Malik murdered 14 people and injured 22 more after opening fire at an office party in San Bernardino, in an apparent terrorist attack. After the shooting, the FBI recovered an iPhone 5c belonging to Farook, but it has been unable to unlock the device because of Apple's encryption.

On iOS devices, important files are encrypted in such a way that the phone must be unlocked with a manually entered passcode, and user data is wiped if too many incorrect attempts are made. (Because the device is an iPhone 5c, it has no Touch ID, so unlocking it with the dead gunman's fingerprint is not an option.) Yesterday, United States magistrate judge Sheri Pym ordered Apple to give the FBI a custom firmware file that would allow it to unlock the iPhone 5c in question.
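
To make that failure mode concrete, here is a minimal sketch of the behaviour in Python. It is purely illustrative: the PasscodeGuard class and its names are invented for this post, and Apple's real implementation lives in low-level, hardware-assisted code.

# Illustrative model of the passcode guard described above. This is not
# Apple's code; the class and constants are invented for this sketch.

MAX_ATTEMPTS = 10  # with "Erase Data" on, iOS wipes after ten failures

class PasscodeGuard:
    def __init__(self, correct_passcode, erase_enabled=True):
        self.correct = correct_passcode
        self.erase_enabled = erase_enabled
        self.failures = 0
        self.wiped = False

    def try_unlock(self, attempt):
        if self.wiped:
            raise RuntimeError("user data has been erased")
        if attempt == self.correct:
            self.failures = 0
            return True
        self.failures += 1
        if self.erase_enabled and self.failures >= MAX_ATTEMPTS:
            self.wiped = True  # key material destroyed, data unrecoverable
        return False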

Exactly what is being asked for?

The FBI wants Apple to build a special version of iOS that works only on the recovered iPhone. This version would differ from regular iOS in three major ways.

Firstly, Apple would bypass or disable the auto-erase function on the device in question.

Secondly, Apple would enable the FBI to submit passcodes to the iPhone electronically, via the physical device port, Bluetooth, Wi-Fi, or another protocol, rather than entering each attempt manually on the screen.

Finally, Apple would stop iOS from deliberately introducing additional delays between passcode attempts, as it does by default. These delays grow as more wrong codes are entered, until there is a one-hour wait between attempts.
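
Taken together, and continuing the toy PasscodeGuard model from earlier (which, again, is an invention of this post rather than anything resembling Apple's firmware), the three requested changes would behave roughly like this:

# Hypothetical sketch of the three requested changes, applied to the toy
# model above. Real iOS firmware looks nothing like this.

class WeakenedGuard(PasscodeGuard):
    def __init__(self, correct_passcode):
        # Change 1: auto-erase is bypassed, so failures never trigger a wipe.
        super().__init__(correct_passcode, erase_enabled=False)

    def try_unlock_electronically(self, attempt):
        # Change 2: attempts arrive over the device port, Bluetooth or
        # Wi-Fi instead of being tapped in by hand on the touchscreen.
        # Change 3: no artificial delay is inserted between attempts.
        return self.try_unlock(attempt)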

Because the iPhone 5c was a pre-Touch ID handset, there’s no need to argue about the legal difference between passcodes and biometrics.

Can Apple do this?

A blog entry from Trail of Bits suggests that Apple has the power to do this, despite iOS's strong encryption. Security expert Dan Guido writes:

“Apple has allegedly cooperated with law enforcement in the past by using a custom firmware image that bypassed the passcode lock screen. This simple UI hack was sufficient in earlier versions of iOS since most files were unencrypted. However, since iOS 8, it has become the default for nearly all applications to encrypt their data with a combination of the phone passcode and the hardware key. This change necessitates guessing the passcode and has led directly to this request for technical assistance from the FBI.

I believe it is technically feasible for Apple to comply with all of the FBI’s requests in this case. On the iPhone 5C, the passcode delay and device erasure are implemented in software and Apple can add support for peripheral devices that facilitate PIN code entry. In order to limit the risk of abuse, Apple can lock the customized version of iOS to only work on the specific recovered iPhone and perform all recovery on their own, without sharing the firmware image with the FBI.”


If you’re interested in the specifics, Guido goes into far more detail on his blog about how Apple could overwrite the iPhone’s firmware with a version that meets all of the requested specifications, allowing the FBI to brute-force its way into the handset.
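
The crucial detail in Guido's analysis is that the encryption key is not derived from the passcode alone, but from the passcode entangled with a secret key fused into the phone's hardware. The sketch below illustrates the idea only: PBKDF2 here is a stand-in for Apple's actual tangling function, and the hard-coded UID value is made up for the example.

import hashlib

# The device-unique hardware key (UID) never leaves the phone's silicon;
# this hex value is a placeholder invented for the example.
HARDWARE_UID = bytes.fromhex("00112233445566778899aabbccddeeff")

def derive_unlock_key(passcode):
    # Real iOS tangles the passcode with the UID inside the AES engine;
    # PBKDF2 merely stands in for "a slow derivation mixing both secrets".
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), HARDWARE_UID, 100_000)

Because the real UID cannot be read out of the chip, every passcode guess has to run on the handset itself rather than on a rack of password-cracking GPUs, which is what makes the on-device delay and erase protections so effective.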

It’s already possible to hook up an iPhone to a device which tries to brute-force the passcode by simply starting at 0000 and working through to 9999. The problem for the FBI is that iOS has a couple of security systems designed to defeat this.
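
In code, the brute-force part really is that simple, as this sketch against the toy guard from earlier shows:

def brute_force(guard):
    # Try every four-digit PIN in order: "0000" through "9999".
    for pin in range(10_000):
        candidate = "%04d" % pin
        if guard.try_unlock(candidate):
            return candidate
    return None

Run against the PasscodeGuard sketch with erase enabled, this loop gets ten tries before the data is gone, which is precisely the FBI's problem.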

First, you can set your iPhone to automatically erase all data after 10 failed passcode attempts (Settings > Touch ID & Passcode > Erase Data). Any tech-savvy terrorist or criminal is going to have this turned on.

Second, iOS enforces increasing delays between failed passcode attempts:
  • 1-4 attempts: no delay
  • 5 attempts: 1 minute
  • 6 attempts: 5 minutes
  • 7-8 attempts: 15 minutes
  • 9 attempts: 1 hour

This explains why the FBI’s attempts to gain access in this way have still not succeeded some two months after they began.
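
A back-of-the-envelope calculation shows why, assuming the delay simply stays at one hour for every attempt after the ninth (and that auto-erase is somehow out of the picture):

# Worst case: all 10,000 four-digit PINs under the schedule above.
delays_min = [0, 0, 0, 0, 1, 5, 15, 15, 60]  # delays before attempts 1-9
remaining = 10_000 - len(delays_min)          # attempts 10 through 10,000
total_minutes = sum(delays_min) + remaining * 60

print(total_minutes / 60 / 24)  # roughly 416 days

Even with the erase setting off, a straight march through the PIN space at one attempt per hour would take well over a year.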

So what is the problem?

Right from the start, Apple has cast user privacy as a moral issue every bit as much as a technical one. In other words, just because Apple could conceivably hack an iPhone doesn’t mean that it should. In an open letter published today, Tim Cook explained his position:

“When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”

At the end of the letter, Cook suggests that, good intentions aside, the FBI may end up undermining “the very freedoms and liberty our government is meant to protect.”

While there’s no doubt that Apple is no supporter of terrorism, the company is also a firm proponent of strong encryption as a way of keeping users safe. That clash of ideological stances is why this is a far bigger issue than one single case: it’s about your right to privacy versus the government’s duty to save lives.
