The FBI has won a court order demanding Apple help the bureau access the data on the iPhone 5c of one of the San Bernardino gunmen.
The judge ruled Tuesday that the Cupertino-based company had to provide “reasonable technical assistance” to the government in recovering data from the iPhone 5c, including bypassing the auto-erase function and allowing investigators to submit an unlimited number of passwords in their attempts to unlock the phone. Apple has five days to respond to the court if it believes that compliance would be “unreasonably burdensome.”
In response, Apple’s CEO Tim Cook has published an open letter opposing the court order.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software – which does not exist today – would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
It should come as no surprise that I strongly, deeply, and vehemently agree with Tim Cook, and I applaud the company for trying to fight this court order every step of the way. It would be great if other technology companies – Microsoft, Google, whatever – publicly joined Apple in trying to fight this court order. Strength in numbers.
That being said, it will be in vain. Apple – and thus, all of us – will lose this war. They might win this particular battle, but they won’t win all the battles to come. All it takes is one important country demanding a backdoor and Apple caving – due to financial pressure, sales stops, etc. – for the whole house of cards to come tumbling down.
This is a hard fight, and one we will lose. Get ready.
If we want to crack this problem – where your device isn’t used against you – is there any other way than open hardware, open software, open firmware?
I’m not saying it’s easy, but I don’t see any approach that involves closed tech being part of the solution.
And how, precisely, would that help? Open or closed, the government would demand, and get, access. A completely open stack would help in certain cases, but it’s no cure-all and of no help under these circumstances where the FBI already has physical possession of a device.
So this is really a misnomer to start with, as the FBI already has the device. From the one article I read, they were complaining because they only had 10 attempts to decrypt it before it would self-wipe; however, anyone who works with that kind of stuff knows that the way you get around that is by cloning it before you do anything and then only working on the cloned materials – thereby (a) preserving evidence and (b) keeping yourself from destroying it before you can decrypt it.
So if the FBI is not employing that process on the iPhone, then they have other issues in their chain of investigation.
Now, maybe Apple made it really hard to clone that way, but if you have the physical device then it’s just a matter of time until they figure out how to do so.
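For what it’s worth, the clone-first-then-work-on-copies pattern looks roughly like this. A toy Python sketch over an ordinary image file – the paths are made up, and an iPhone’s encrypted flash is of course not exposed this simply:

import hashlib
import shutil

ORIGINAL = "evidence/device.img"        # hypothetical acquired image
WORKING = "workspace/device-copy.img"   # the copy we actually poke at

def sha256(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

original_hash = sha256(ORIGINAL)          # record a fingerprint before touching anything
shutil.copyfile(ORIGINAL, WORKING)        # all brute-forcing happens on the copy
assert sha256(WORKING) == original_hash   # copy is bit-identical; the original stays pristine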
You make no sense to me. Why would the government demand access to something that was already OPEN?
It’s this kind of thinking that leads people to believe that open source voting machines are a good idea…
If it’s open, you can (theoretically) audit the code (it’s too big – you really can’t in reality) or choose trusted, community-vetted source code (better-ish, but you’ll probably then get a reputation like anyone who runs Tor) to make sure there is no back door in the software running on your device.
If you are really careful you could sign everything with protected encryption keys (this is more viable than the above), and keep everything secure that way.
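As a rough sketch of that signing idea – using Ed25519 via the third-party Python ‘cryptography’ package; the key handling and firmware blob here are invented for illustration, not any vendor’s actual scheme:

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

signing_key = ed25519.Ed25519PrivateKey.generate()   # in practice kept offline / in an HSM
verify_key = signing_key.public_key()                # baked into the device's boot chain

firmware = b"reproducibly built, community-vetted image"
signature = signing_key.sign(firmware)

try:
    verify_key.verify(signature, firmware)           # device only runs code that verifies
    print("signature OK, boot it")
except InvalidSignature:
    print("tampered image, refuse to run")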
In reality, there are dozens of ways to sneak all sorts of things in, in all sorts of places, before, during and after compilation, though it’s at least harder with properly signed binaries.
As far as voting machines go, it’s even worse, because there is absolutely no way to know that the software running in the black box was compiled from the supposedly vetted open source software, though currently no one makes any such claim anyway.
Exactly!
Funny, does someone really believe that there is no such option/feature now?
“government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone”
Bullshit.
Ah, a fellow cynic. The same thought went through my mind. However, given how much attention is put into jailbreaking or otherwise hacking iPhones, if there were such a feature I suspect we’d know by now. Of course, that doesn’t cover the data held on Apple’s servers, independent of your physical device. There, my cynicism remains high.
If such a program can be written, then it already exists, regardless of whether Apple themselves create it or not.
It’s easy to make a claim based on no data, just a speculative hunch.
Given that neither you nor I have any actual information about what Apple may or may not have hidden inside its organisation, the best we can do is try to figure out what’s more and what’s less likely.
The first question is why would Apple have gone to the trouble of building a secret backdoor in iOS?
What advantage would it bring to the company? I can’t think of any.
There are two things we do know:
One is that Apple thinks it has a competitive advantage over Google/Android in the arena of privacy, because (whether it’s accurate or not is immaterial) Google/Android appear vulnerable to worries about privacy, since Google’s business model is based on gathering data about users.
Secondly, we know that Apple is a customer experience company – it drives sales by winning customers and then retaining them based on the customer experience of using its products. In that context Apple has played the privacy and security card very hard; it clearly believes its integrated product design system can deliver a competitive customer experience in relation to privacy.
Finally there is the risk part of the equation. If Apple very publicly takes the stand it has, fronted by the CEO personally, and it were then discovered that it had been lying because it had already built/installed a backdoor in secret, it would be a big and damaging blow to the company’s reputation and brand.
So given that there is no competitive advantage for Apple of building a secret backdoor but a lot of possible downsides and risks I would say that on balance I think the probability they have done so is pretty low.
If you have ever developed secure devices, you will immediately understand that such a backdoor is extremely useful when developing and testing a device. If you had to throw away a device every time the limit on faulty password attempts was reached, testing and development would be extremely expensive. And you cannot avoid testing the faulty-password-attempt path.
So yes, Apple most probably has a backdoor. Such a backdoor is generally speaking a combination of special hardware and software, not just software as Tim Cook suggested, so it can only be used internally at Apple, where the special hardware is present.
But of course, the details are very confidential and not to be disclosed.
Reaching the end of passcode attempts doesn’t mean throwing away the device; it simply means that user data on the device is deleted. The device can then be set up as a new device unless it is locked by Find My iPhone. So there’s no expense inherent in developmental testing of passcodes.
This is almost certainly the very thing they want to get around. If the automatic erasure triggers and Find my iPhone has not been shut off deliberately, the device will be locked and require the Apple ID and password of the primary account on the device to unlock it again.
cropr,
They do have a back door; it just doesn’t go by that name. Operating system update mechanisms are backdoors into the operating system. We tend not to think of these as backdoors because they also have a legitimate purpose, but the only difference is intent. Make no mistake that a hacker could do nefarious things with the same access: keyloggers/data capture/crypto-bypassing/etc. Clearly Apple possesses these capabilities, even if Tim Cook is ignorant of that fact. The real question is not whether or not Apple has back door access (they do), but whether or not Apple can be trusted not to exploit it. And also whether or not Apple and its employees can be trusted not to leak the signing keys that would enable the NSA to exploit it themselves (i.e. under a secret court order).
I’m glad that apple has taken a pro-privacy stance publicly, but it could all be security theater to placate the masses. If a company really wants to make a genuine commitment to security, it should put it all on the line and open source all of the security critical parts of the stack. Otherwise there’s a good chance it’s just lip service.
Thanks Cropr. That goes a long way toward explaining Microsoft’s UEFI effort: reinforcing the stack as low as they can go [are they putting it to ethical use?]. The unfortunate part is that not even Microsoft can go lower than that. It has to be a concerted effort across the totality of the ecosystem.
The main reason for the hard/firm/soft in-house back door is always – believe it or not – security!
Damage Control Mode On.
Because it wasn’t guns that killed all those people at the Inland Regional Center, it was the iPhone.
Give the FBI that back door and there won’t be any more shootings. Right?
If Apple were to conspire with the government to add a backdoor, what would be the best way forward such that “bad guys” continue to trust it and the backdoor continues to be useful?
Well, Apple and the Government would stage a big feud in public, with Apple “refusing” wink wink nod nod know what I mean, to add the back door.
So I take all of this with a giant grain of salt on the popcorn I’m eating here on the sidelines.
How would installing a malicious version of iOS open up a device they already have possession of? Surely another version of iOS, no matter how malicious, would be unable to access the data without the encryption keys (which the FBI clearly do not have)…
If that is not the case, and installing a malicious version of iOS is indeed sufficient to gain access to the data, then the data was never stored securely in the first place.
iOS obviously holds decryption capabilities for the hardware used, with the passcode, password, or fingerprint being an essential element of the decryption process. That being said, unless there’s some sort of master key built into the hardware, I don’t see how bypassing the passcode will help them decrypt it: if, as we are told, such a passcode/fingerprint/whatever is the key, then simply bypassing it isn’t going to allow decryption.
The FBI probably has many decryption tools already. What they’re demanding from Apple is a way to disarm the “self destruct” security feature.
The iPhone 5C (the phone in question) uses the A6 and doesn’t have Touch ID or the Secure Enclave hardware that iPhones using the A7 and later have.
For the 5C, iOS prompts the user for a passcode and hands it off to a chip that does several tens of thousands of rounds of PBKDF2 to generate a cryptographic key – the original passcode is combined with a 256-bit identifier that is unique to that individual piece of silicon (no other A6 has the same UID). There is also no way to extract the UID from the chip via software. It takes about 5 seconds for the chip to generate an encryption key based on the passcode. This encryption key is what protects user data.
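In the same spirit (but emphatically not Apple’s real construction – the actual UID never leaves the silicon and the tangling is done in hardware; the round count below is made up), a slow, device-bound key derivation in Python looks like this:

import hashlib
import os

device_uid = os.urandom(32)   # stand-in for the 256-bit per-chip UID
passcode = b"1234"            # the user's passcode

# Many PBKDF2 rounds make each guess expensive; salting with the UID ties
# the derived key to this one physical device.
key = hashlib.pbkdf2_hmac("sha256", passcode, device_uid, 50_000, 32)
print(key.hex())   # this is the kind of key that protects the user data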
So, what the FBI is asking Apple to do is build a custom version of iOS that can be installed without wiping user data, that will not wipe user data after too many failed attempts, that will not introduce additional delay in generating keys, and can accept passcode inputs electronically.
That way, they can plug the phone in and crack a 4-digit code in about 14 hours, six days for a 5-digit passcode, 57 days for a 6-digit code, or a couple of years for a 7-digit code, without having to type each code in one at a time manually.
Of course, there is a 50% chance of breaking the code in half the time.
EDIT: I’ve seen it written that the security features of the A6 may not take 5 seconds per key but more like 80 ms, so a 4-digit code might be broken in well under half an hour.
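A quick back-of-the-envelope script for those figures, assuming one key derivation per guess and no other overhead:

def fmt(seconds):
    if seconds < 3600:
        return f"{seconds / 60:.0f} minutes"
    if seconds < 2 * 86400:
        return f"{seconds / 3600:.1f} hours"
    if seconds < 365 * 86400:
        return f"{seconds / 86400:.1f} days"
    return f"{seconds / (365 * 86400):.1f} years"

for per_guess in (5.0, 0.08):         # ~5 s per attempt vs. the ~80 ms figure
    for digits in (4, 5, 6, 7):
        worst = (10 ** digits) * per_guess
        print(f"{digits}-digit @ {per_guess}s/guess: "
              f"worst {fmt(worst)}, expected {fmt(worst / 2)}")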
All these unfocused efforts will achieve in the near term is to create a new market of ‘pre-digesting’ gadgets.
Long term, those gadgets will grow in power and functions.
Anonymizers, GPS scramblers and un-profilers are just around the corner…
And that is not good for any party with good will at stake.
Self-generated info for self-consumption is not to be seen, heard, read or touched by anybody but the individual. Under ANY conditions. Eternal rule.
Self-generated info TRANSFERRED to intimate family for their self-consumption is to be requested only under a Federal Tribunal Order. And is not prosecutable in any way. And is not re-distributable in any way. Under ANY conditions. A yet-to-be-created International Tribunal on Privacy would resolve cases involving correspondents.
Every device should have two separate containers: one unable to be TRANSFERRED to another device, and the other able to be TRANSFERRED to another device.
Only the second container should be accessible.
And build from here upwards…
—————————————–
Let’s start with a fictional example:
THE CONEHEADS SAGA.
There are known cells of extremist ConeHeads haters.
One member of this cell handwrites a detailed plan to get rid of a particular group of ConeHeads, the SuperConics. The plan needs the help of a lot of ConeHeads haters, to serve as distractors.
1) The master of the devilish plan takes a photo of this paper, goes to where subject hater B is, shows him the photo and asks him to take a photo of his photo. In doing this they are showing a will to extend hate. But both have their photos in the personal zone of their devices. This is not to be known by anybody else.
2) Subject hater B goes to his sister, hater C, and asks her to take a photo of the photo of the photo: ‘photo III’. This goes into the personal zone of her device, too. This is not to be known by anybody else.
3) Sister hater C goes to her UNRELATED friend, hater D, and asks him to take a ‘photo IV’. This goes into the personal zone of his device, too. This is not to be known by anybody else.
4) Unrelated friend hater D is too far across the county from his own sister, hater E, and puts a copy, ‘photo IV diffusive’, in the second memory section of his device, to USB-courier or e-mail to her later. This document should be reachable by a Federal Tribunal. Not prosecutable. Not re-distributable in any way.
And build from here upwards…
A personal note on my cell shouldn’t be reachable, at all. Not even by me, if I lose that piece of paper with the annotated key. Closed-loop encryption.
If I put a copy of that note in the diffusive area of memory, with the intent of SMSing my wife, that one could be reachable. If I send it to her, that one shouldn’t be prosecutable. [Prosecutable in the extended meaning: it shouldn’t be meta-data-able, for either of us, as an example.]
If I upload it to the Internet, well, that could be another story [and just for me]. My Internet readers are not responsible for my thoughts. [I could be meta-data’d, but my readers shouldn’t be.]
The second, diffusive area of memory on my cell should not be, in any way, ‘cloud’-able, network-able, link-able or back-up-able. HARD local, with a HARD key provided only by Federal Tribunal order. HARD-coded only for my personal cell phone.
These are just first thoughts, quite rough wanderings.
My UEFI/BIOS, stack, software, switches, routers, proxies, address servers, etc. Everything is so full of holes. I don’t even know if me is I, or whether I’m inside an ‘Inception’:
http://www.imdb.com/title/tt1375666/
Bye now.
Thom, I’m sure you’d feel differently if the massacre committed by these terrorists had occurred on Dutch soil. Their iPhones were obviously used to plan and execute the attack, and the information contained in those phones could help thwart more terrorism. I’m sure Apple can crack the phones and provide the contents to the government without providing the government with the tools or expertise to crack other phones, so please stop all the grandstanding and faux moral outrage. It’s very transparent.
I’m an American, and I agree wholeheartedly with Thom, Tim Cook, and the rest of the privacy advocates on this.
Do I really have to drag out the old quote?
“Those who would give up essential Liberty, to purchase a little temporary Safety, deserve neither Liberty nor Safety.”
While I agree with this, the trouble with said quote is that it has lost all meaning to most people. Historical mumbo-jumbo they had to hear in class and forget about after they’d finished their exam. Advocate for privacy these days and you get called an extreme paranoid.
And what security are you referring to that we’d be trading? The security that allowed an illegal immigrant to enter the US and enter into a sham marriage to escape deportation and then plan a massacre that kills 14 people using off-the-shelf tech?
So yes, go ahead and drag it out since you THINK it applies here, but it obviously doesn’t. Apple already has the tools and expertise to easily circumvent iPhone security. Apple should do the right thing and provide investigators with whatever they (Apple) can find on the phones without providing the government investigators the tools, techniques or expertise that would enable the government to get into other phones.
It’s no different than asking a company that produces safes to unlock one of their safes that’s been used by terrorists to hide information.
are you sure of that?
As it stands, the government does not need to bother with due process any more, and the analogies by its supporters don’t need to make any sense.
Ain’t terrorism grand?
Where was due process skipped here? We are talking about a court order after months of investigation.
That is certainly not obvious. Not at all. In fact the suspects physically destroyed their personal phones to the point that no data could be recovered.
This phone, the iPhone in the ongoing court case, was a business phone that belongs to San Bernardino County where the suspect worked.
Personally, I think it’s unlikely that the County’s phone was used for planning the terrorist attack or communicating with other terrorists. Or else the suspects would have destroyed it too.
But it also makes the court case more complicated. Courts have already ruled about employee privacy when using company-owned computers and cell phones: The employee has no right to privacy when using his employer’s phone, and any data on the phone is the property of the County.
So in legal terms the court case isn’t directly about privacy at all, it’s about whether a court or government agency can force a corporation (Apple) to create new software and/or hardware to the government’s specifications.
Having a third party that isn’t a sworn law enforcement officer handle such evidence causes issues with chain of custody. To preserve the chain of custody for evidence, the FBI should be the one to extract the information.
… but enough about yourself.
Say yes just once and they will never have a choice ever again. Apple have no choice but to go all the way with this. They will lose this battle, but with a bit of luck it increases the profile of this struggle & attracts the attention & resources needed to create better encryption techniques.
To me, having a destructive back door is a compromise I’d be happy with.
Simply, if they want the key, they need physical access, and they need to destroy the device to get it (and the data).
Crack open the back, put the “key chip” on a grinder, reveal the “secret code” like scratching off a lotto ticket – one that just so happens to destroy the rest of the phone – then yank the epoxied flash and key off, put them in a special reader, and dump them all you want.
In the US at least, we are effectively NOT secure in our papers and artifacts given the proper circumstances (i.e. a court order). Your safe, your safety deposit box, your medical records, your financial records, etc. About the only thing that’s “safe” is conversations with your spouse or lawyer.
Other than that, the rest is fair game with proper procedure. They will torch your safe, they will subpoena your doctor, they will drill your box at the bank.
The real concern about the back door is remote exploit, and a silent, unauthorized exploit.
Physical destruction of the device mitigates both of those problems, and gives the authorities the access they need.
Send the phone, a warrant, and an officer to Apple to maintain chain of custody and supervise the process; get back a stack of DVDs and a bag of phone parts.
That doesn’t bother me at all.
I’m not sure that remote or silent exploits are the only issue.
Let’s say that Apple lets the FBI have this access; the next enemy will be apps with encrypted data, like 1Password. We’ll also need a key for those apps. In your scenario, this puts me in a situation where a lost iPhone is destroyed, but the attacker has potential access to every one of hundreds of passwords that provide access to valuable data (that doesn’t even consider storing other things like credit cards or obscure data like IRS PINs; you may feel that storing such things is a mistake, but that’s another argument). This puts me in a situation where I have to be prepared to either cancel and change all of those passwords, or stop using secure passwords. Both scenarios expose every consumer to more risk, and most of the risk is put on careful consumers. I think that’s unreasonably burdensome for Apple and their customers.
The problem with 1Password is the same eternal problem with any encryption not managed directly by the phone.
All of those other concerns you have are identical to when your filing cabinet at the house is jimmied open with a butter knife. But the key point is that you are AWARE that this has happened, so that you CAN take those steps in an expeditious manner, rather than finding out through a 3rd party (such as the car dealer hunting you down because “you” didn’t send in “your” next car payment when the thief bought a car in your name — this exact thing happened to a friend of mine.)
The fundamental issue (which is actually not an issue at all legislatively) is whether or not the State has a “right” to access the data given due process. Historically, this is clearly settled. They do have that right, save for a select few carve-outs (as I mentioned above, plus self-incrimination).
They have a right to the information, you have a right to be notified that it has happened.
We don’t worry so much about “back doors” to the firesafe in the house, which are legion (not all of them are actually good safes). It’s a modicum of security against a burglar – it’s a deterrent. But we also have all the other deterrents (locked doors, alarmed houses, etc.) to prevent access. However, it takes a very determined and skilled intruder to defeat those quietly and leave no evidence.
Knowing you’ve been burgled has value of its own, since you can act on that knowledge that you’ve been compromised.
Having my credit card stolen is a pain, for sure, but I have recourse to mitigate the damage once discovered. If someone breaks into my home, I can get the locks repaired, get the safe fixed, change my passwords, create new accounts, etc.
But if someone silently slides in, steals all of that information and has unfettered access while I am unaware, that’s a different issue.
This doesn’t take into account the loss that could be incurred. Saying that there are vulnerabilities doesn’t justify creating new ones. In fact, just the opposite.
They have a right to the information. They don’t have a right for it to be easy, with or without due process.
The idea that they might comes down to your philosophy of governance. I will always disagree with making it easier for thieves or anyone else, regardless of the inherent security of the system.
The hope is to diminish the need for paid or coerced hacks under it.
This may be unpopular, but I think Apple should unlock and decrypt the device… if they can. That’s what the court ordered – and we certainly don’t want our companies being allowed to skirt court orders without due process.
Not to mention just how valuable that data may be in stopping future attacks (both at home and abroad).
Now, that said, they shouldn’t create a backdoor on shipping devices. By no means would I support that. Apple should create a solution they can use in-house and guard access to it like they’ve never guarded anything before.
There is, most likely, a way around auto-erase and the time lockouts. In fact, there’s certainly a way to image the phone’s data… it is just stored on chips, after all…
looncraz,
But that’s questionable. Unless you plan to snoop on people who have yet to commit a crime, then this data is going to be of little use in prevention. It’s mostly for “after the fact” evidence.
Due process also includes the right of Apple to appeal the earlier court ruling.
———-
One alternative outcome of Apple’s resistance to unlock the phone, or their continued insistence that it’s not possible, could be a court order to release the iOS source code to the FBI.
An open source OS would keep back-doors at bay. If iOS were open source, we wouldn’t need to trust Apple or the FBI.
No reaction (yet) from the Android camp. I wonder if the FBI asked Google the same and if Google (silently) obliged, or if the FBI doesn’t bother because hacking an Android phone is easy…
Like most around here, I don’t believe they don’t have access. The issue here is the open disobedience.
This has to do with a specific case, not a general request. Everyone is getting their panties in a bunch because they don’t seem to know the whole story. I was horrified by the first headline I read but after reading the whole story I think Apple should comply.
So let’s summarise this: Apple built a system that is so secure even they can’t snoop, and now the FBI asks Apple to break into its own security system, making it less secure and setting a precedent, so that the FBI can read the contents of 1 iPhone out of the 300+ million active iPhones around?
Please, no Sir.
They’ve actually made the device so secure that not even the legal owners of the phone can access their own data.
Do the iOS security measures prevent anyone from accessing their own phone data should they forget their password? Apple can’t unlock an iPhone for a registered and lawful owner?
As to how this is relevant: the iPhone in this terrorism case is owned by San Bernardino County, so the records and data associated with the phone are County property.
Is the County locked out of their own phone?!?
As usual the U.S. government is overreaching in trying to compel Apple to assist it in hacking an encrypted iPhone to retrieve personal information and data. Apple’s Tim Cook was absolutely right in refusing to abide by such an unreasonable and ridiculous request. If the government wants to retrieve personal data from an encrypted device, let it build its own software tools to do so. Frankly, the U.S. government is out of control and it is willing to do just about anything to maintain its grip on power and its control of the American people. Who, by the way, have succumbed to idiocy by way of social media and a multitude of mindless distractions such as sports and Hollywood. But I digress. I sincerely hope that Apple stands its ground and abjectly refuses to give in to government pressure.
These markets: Russia, China, Europe and ASEAN nations will definitely not buy an iPhone product if it is known to have a backdoor.
Good move, Apple. But in any case, how can the public know for sure that Apple is true to its word?
allanregistos,
Open sourcing security components and conducting public audits is part of it. One also needs to know that the code running on one’s phone matches the publicly audited code. For this to work, Apple would have to return device control to the owner. Otherwise we’re back at square one, relying on the word of the manufacturers.
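As a toy illustration of that last point – assuming reproducible builds, a published audited hash, and a way to dump the installed image (big assumptions; every path and value here is hypothetical):

import hashlib

PUBLISHED_AUDITED_SHA256 = "0" * 64   # hypothetical hash published alongside the audit

def sha256_of(path):
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

installed = sha256_of("extracted/os-image.bin")   # hypothetical dump of the device's OS
if installed == PUBLISHED_AUDITED_SHA256:
    print("running image matches the audited build")
else:
    print("mismatch: what's on the device is not what was audited")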
There’s an XKCD out there that I can’t find about how hackers focus on the technical too much, while in real life technical security is easily overridden with a bit of brute force…
What point is fighting against backdoors when perfectly legal torture, ahem, I mean interrogation, will get the password?
Hard to brute-force the password out of someone who’s dead though.
If Apple won’t comply, the FBI can always ask the Chinese government to do it. After all, iPhones are products of China, and it’s hard to believe the ever-spying Chinese government isn’t sneaking spyware into everything electronic they export. It wasn’t that long ago that we found out various firmwares had been infected for many years.
We are talking about a court order concerning a terrorist’s phone, not giving free rein to snoop on anyone and everyone. If Apple doesn’t want ANYONE to be able to get into the phone, then prevent it from taking firmware updates. If you are paranoid about your security, then set a sufficiently long random passcode that will prevent even the firmware trick from cracking it.
I understand that with this new firmware image authorities and others can then crack short passcodes on some models but you would have to have physical possession of the phone AND time, possibly lots of it.
The backlash against the request by the FBI seems completely unfounded to me. Sure, I don’t want technology companies just giving open-ended access to my stuff, but this is the exact opposite of that. This is a court order concerning a known terrorist’s phone.
This firmware that the FBI is asking Apple to create will NOT allow the government to access anyone’s iPhone. A sufficiently long random passcode will prevent the government from cracking a phone even with the modified firmware installed.
Either you’re impossibly naive, or you’re just not thinking clearly.