The FBI wants a legal backdoor from Apple, not a technical one

A lot of media outlets appear quite confused about a story in which Apple is refusing to comply with a US judge's order requiring the company to assist the FBI in accessing encrypted data on an iPhone 5c belonging to a man who killed 14 people.

This is understandable; after all, encryption is a complicated subject even for those who work in and around the security business. So let's try and clear some of that confusion up, shall we?

That law enforcement would want to access data on the iPhone belonging to Syed Rizwan Farook, who killed 14 people in a suspected terrorist attack in San Bernardino last December, is a given. That they should be able to is more open to debate, which is why this particular judicial request is so important.

Apple's CEO, Tim Cook, says the US government is asking "for something we simply do not have" and "something we consider too dangerous to create", that something being a backdoor into the encryption used by iPhones.

Indeed, Cook himself states that the FBI "have asked us to build a backdoor to the iPhone".

Cook goes on to suggest that the FBI wants Apple to "make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation".

Obviously, this would be 'a bad thing' because, as I have stated repeatedly, you cannot introduce a weakness into a security product and expect it to be exploitable only by yourself.

However, would it actually be a backdoor in the accepted sense? That's a little less clear. As I understand it, what is being requested by the order the judge signed is for Apple to disable a security barrier that wipes all data from a phone after 10 incorrect passcode attempts.

Currently, so we are told, FBI data forensics specialists have been unable, over the course of nearly three months, to get into the iPhone. Brute-forcing the passcode would trip the failed-attempt limit and delete the very data the FBI is after.

Here's what the judge wants: 'reasonable technical assistance' to bypass or disable the auto-erase function, and to enable the FBI to submit passcodes to the device without the software introducing additional delays between attempts.

Not a backdoor as such, but certainly a path that leads towards one, by making brute-forcing of the passcode relatively simple. Assuming, of course, that the passcode is a straightforward PIN and not a hugely complex string instead.
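How simple? Here's a back-of-envelope sketch in Python, assuming a purely numeric PIN and the widely reported figure of roughly 80 milliseconds per attempt for the on-device passcode key derivation; both are my assumptions, not figures from the court order.

```python
# Back-of-envelope estimate: how long a brute-force run takes once the
# auto-erase limit and the artificial inter-attempt delays are removed.
# Assumptions, not figures from the court order: a purely numeric PIN,
# and ~80 ms per attempt for the on-device passcode key derivation.

ATTEMPT_TIME_S = 0.080  # assumed key-derivation time per passcode try

def worst_case_hours(keyspace: int, per_attempt_s: float = ATTEMPT_TIME_S) -> float:
    """Hours needed to exhaust the entire passcode keyspace."""
    return keyspace * per_attempt_s / 3600

for digits in (4, 6):
    space = 10 ** digits
    print(f"{digits}-digit PIN: {space:,} codes, "
          f"worst case {worst_case_hours(space):.2f} hours")

# Output:
# 4-digit PIN: 10,000 codes, worst case 0.22 hours
# 6-digit PIN: 1,000,000 codes, worst case 22.22 hours
```

Under those assumptions, a four-digit PIN falls in under a quarter of an hour once the wipe limit and the delays are gone; the encryption itself never has to be broken at all.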

For Apple to obey the court order it would, so it seems, have to rewrite parts of iOS to enable the unlocking of the device by someone other than the device owner.

If it did, and did so within time limits set by the court, then the chances are the resulting code would be buggy. That in itself would open up even more risk for the future of encrypted iOS devices.

Indeed, the court order states that as part of the 'reasonable technical assistance' remit, Apple would provide a signed iPhone software file or recovery bundle. It suggests a Software Image File coded by Apple so as to run only on the particular phone in question.
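Tying a build to a single handset is plausible in principle, since Apple's firmware signing already personalises signatures against a device's unique chip ID (ECID). Here's a minimal illustrative sketch of the idea in Python, using an HMAC as a stand-in for Apple's real asymmetric signature; every name and key in it is hypothetical, not Apple's actual code.

```python
# Illustrative sketch only: one way a signed image can be tied to a
# single handset. Apple's real mechanism personalises the firmware
# signature over the device's unique chip ID (ECID); the HMAC, key
# and identifiers below are hypothetical stand-ins.

import hashlib
import hmac

SIGNING_KEY = b"stand-in-for-apple-private-key"  # hypothetical

def sign_image(image: bytes, device_ecid: str) -> bytes:
    """Bind the signature to both the image and one specific ECID."""
    return hmac.new(SIGNING_KEY, image + device_ecid.encode(), hashlib.sha256).digest()

def boot_check(image: bytes, sig: bytes, my_ecid: str) -> bool:
    """A device refuses any image that wasn't signed for its own ECID."""
    return hmac.compare_digest(sig, sign_image(image, my_ecid))

sif = b"custom iOS build with auto-erase and delays disabled"
sig = sign_image(sif, device_ecid="TARGET-PHONE-ECID")

print(boot_check(sif, sig, "TARGET-PHONE-ECID"))  # True: runs on the target
print(boot_check(sif, sig, "ANY-OTHER-ECID"))     # False: rejected elsewhere
```

The catch is that the dangerous artefact isn't the per-device signature but the weakened build itself: once that exists, only the signing step separates one phone from every phone.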

At which point I have to refer back to my 'you cannot introduce a weakness that benefits only one side' argument.

There are folk within the security industry, and the forensic data recovery business, who would insist (off the record) that it's already feasible to recover data from an encrypted iPhone, thanks very much.

That may well be so, and given that this is a 5c model without Touch ID and, therefore, without the hardware protection of the Secure Enclave, getting at the data shouldn't be too problematic with or without the help of Apple. On later hardware the escalating attempt delays are enforced by the Secure Enclave itself rather than by iOS, which is precisely why this older model is the softer target.

If it were really that easy, you might imagine, the FBI would have done it by now. Unless you go all X-Files on this one: how about if this were less to do with whether it is possible to comply with the FBI request, and more to do with forcing the removal of security functionality without having to bother with all that democratic process nonsense?

I mean, why bother with debates and votes to try and change the law when you can just haul something out of the dusty statute books from 1789 (the All Writs Act) and get a lowly magistrate judge to sign off on that instead?

Now that is where the real backdoor in this whole thing comes in, a backdoor that enables political and democratic processes to be side-stepped.

That is what we should be concerned about. Apple is caught between a rock and a hard place, where it will find it very difficult indeed to argue that it cannot comply because the request is technically unreasonable.

Instead it might have to stick with the 'because it is wrong' argument, and that could prove altogether much harder to succeed with in a court of law.

Or maybe it could just introduce another Error 53 that bricks the phone as soon as it detects any brute-forcing by the FBI...

Davey Winder
