Apple rejecting calls from FBI to unlock gunman's iPhone
Discussion
RobDickinson said:
But don't you only get 6 attempts before the phone is locked and needs deleting?
Yes, but then the FBI can make you write a custom version of iOS that doesn't. Which is what is happening. If they hadn't weakened the encryption as much as they did, then it would have been moot.
RobDickinson said:
No one is going to live with a 256 key pin for their phone.
Practical difference between 4 and 6 digits is negligible.
It's 256 bits, not characters.
below is a sample 256 bit key
5854256c7b422d22474a3a4f74516f5e794b3a50455656263e5140623ca3a4f7
so only 64 characters
In all seriousness, this is the problem, everyone keeps shouting for strong encryption, but they want short easy to remember passcodes. Can't have both.
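To put numbers on the gap between a memorable PIN and a full key, here's a minimal sketch comparing the search spaces in bits of entropy (the 4- and 6-digit figures are exact; the point is how far both fall short of 256 bits):

```python
import math

# Compare the search space of short numeric PINs against a full
# 256-bit AES key. Entropy in bits = log2(number of possibilities).
def entropy_bits(combinations: int) -> float:
    return math.log2(combinations)

pin4 = 10 ** 4          # 10,000 possible 4-digit PINs
pin6 = 10 ** 6          # 1,000,000 possible 6-digit PINs
aes256 = 2 ** 256       # full 256-bit key space

print(f"4-digit PIN : {entropy_bits(pin4):6.1f} bits")   # ~13.3
print(f"6-digit PIN : {entropy_bits(pin6):6.1f} bits")   # ~19.9
print(f"256-bit key : {entropy_bits(aes256):6.1f} bits") # 256.0
```

Going from 4 to 6 digits adds under 7 bits, which is why the difference is negligible next to a 256-bit key.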
RobDickinson said:
if only the iphone had another way of validating the user and adding to the key..
I assume you're talking about fingerprint recognition? So, two problems with that.
First, most biometric systems I have seen don't produce a number that is in any way consistent. Usually they just give you a percentage similarity to a reference print; anything north of 80% (or whatever the threshold for the system is) you'd call a match. Multiple scans will produce different numbers. Nothing you could generate a key from.
I would imagine the biometrics just produce a yes/no and then the PIN is used to generate the key.
Secondly, cuts and abrasions will mean recognition failures, and reference prints of other fingers are required, which would clearly generate different values.
RobDickinson said:
So long as its a match (yes/no) you can use a pre stored secondary key part. The user can know what this is and have an option to enter it in case they had their finger chopped off etc.
but you can't store your key on the phone with the encrypted data; that's like taping your front door key to the door.
zippy3x said:
I assume you're talking about fingerprint recognition?
So two problems with that,
First, most biometric systems I have seen, don't produce a number that is any way consistent. Usually they just give you a percentage similarity to a reference print. anything north of 80% (or whatever threshold for the system) you'd call a match. Multiple scans will provide different numbers. Nothing you could generate a key from.
I would imagine the biometrics just produces a yes/no and then uses the pin to generate the key.
Secondly cuts and abrasions will mean recognition failures and reference prints of other fingers are required, which would clearly generate different values.
It's been a while (decades rather than years now) since I've done any biometrics, but ISTR it being perfectly possible to do some processing (Hough transforms, rubber-sheet mapping, etc.) then Principal Component Analysis to distil a repeatable key. The reason many systems don't is that it throws away some information that you're better off keeping for a binary yes/no, but it then means you have to have a similarity threshold. Even if there's a yes/no match based on a percentage, they should still be able to lengthen the key a bit in a repeatable fashion. I've no idea what Apple are doing though.
I was looking at iris recognition so cuts and abrasions were less of an issue.
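A toy illustration of the idea above (entirely hypothetical, not any real system's scheme): coarsely quantise each feature from the dimensionality reduction, so small scan-to-scan noise leaves the derived bits unchanged. Real designs need error-correcting "fuzzy extractors" on top, but this shows why a repeatable value isn't impossible in principle:

```python
# Hypothetical sketch: turn a noisy biometric feature vector into a
# repeatable bit-string by thresholding each component against zero.
# Small scan-to-scan variation in the values leaves the bits unchanged.
def features_to_bits(features):
    return ''.join('1' if f >= 0.0 else '0' for f in features)

scan_a = [0.91, -0.42, 0.15, -0.77, 0.33]   # reference scan
scan_b = [0.88, -0.40, 0.19, -0.80, 0.30]   # later scan, slightly noisy

print(features_to_bits(scan_a))  # 10101
print(features_to_bits(scan_b))  # 10101 -- same bits despite the noise
```

The trade-off is exactly as described: throwing away the fine-grained values loses information you'd want for a high-confidence yes/no match, and a feature that sits near a threshold can still flip between scans.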
zippy3x said:
It's 256 bits not characters
below is a sample 256 bit key
5854256c7b422d22474a3a4f74516f5e794b3a50455656263e5140623ca3a4f7
so only 64 characters
In all seriousness, this is the problem, everyone keeps shouting for strong encryption, but they want short easy to remember passcodes. Can't have both.
This is pretty much the crux of it - the encryption on an iPhone is 256-bit AES (so if you just took the encrypted data off it, you'd be brute-forcing for longer than the age of the universe to date). AES is technically "broken" in a pure cryptographic sense of the word, but this only reduces the effective key space by about two bits, which means you'd still be waiting about a billion billion (not exact - but you get the picture) years for a current supercomputer to crack it, so for all real-world purposes it's "secure".
Unfortunately, asking someone to remember a 64-character passcode for their phone, let alone asking them to type it in every time they want to unlock it to check on their latest Facebook updates or call their Aunt Tilly, simply isn't going to happen. As anyone who has ever worked in IT support knows all too well, most ordinary people struggle with passwords of 8 characters, and the UI of a phone makes entering even that enough of a ballache that most people just wouldn't use it. Which is where the 4-6 digit PIN comes in: it's easy to remember and quick to enter. On its own, though, it's a truly trivial size of encryption key, so what Apple have done is take the PIN and entangle it with a 64-character key that's hard-coded into the processor of the phone to produce the "real" 64-character key, which is then used to decrypt the data.
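A minimal sketch of that entanglement idea, using a standard key-derivation function (this is illustrative, not Apple's actual scheme; the device secret shown is the sample hex key from earlier in the thread, reused here as a stand-in):

```python
import hashlib

# Illustrative sketch (not Apple's actual scheme): derive the data key by
# mixing the short user PIN with a per-device secret baked into hardware.
# An attacker with only the ciphertext faces the full 256-bit key; an
# attacker with the device still has to try PINs through the hardware.
DEVICE_UID = bytes.fromhex(
    "5854256c7b422d22474a3a4f74516f5e"
    "794b3a50455656263e5140623ca3a4f7")  # hypothetical hardware secret

def derive_key(pin: str) -> bytes:
    # PBKDF2 also makes each guess slow; 100,000 iterations is illustrative.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 100_000)

key = derive_key("1234")
print(len(key) * 8, "bit key:", key.hex())
```

Because the derivation runs through the device secret, guesses have to be made on (or with) the phone itself, which is what makes the on-device attempt limits matter.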
So for anyone wanting to decrypt your stuff: if they only have the data, they face breaking the full 64 characters (too long to be practical); if they have the data AND the phone itself, they only need to break the PIN - so you are only as secure as those four to six digits (i.e. you can brute-force that over a lunch break). To Apple's credit, they do support longer, more complex passwords on iOS, and using a 10-12 character one would make brute-forcing take longer than would be practical in most circumstances.
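A back-of-envelope check on the "lunch break" claim, assuming a hypothetical on-device rate of 10 guesses per second (the real rate depends on the key-derivation cost and ignores lockout delays):

```python
# Worst-case brute-force times for numeric PINs, assuming a hypothetical
# on-device rate of 10 guesses per second (ignoring lockout delays).
RATE = 10  # guesses per second -- an assumption for illustration

for digits in (4, 6, 10):
    seconds = 10 ** digits / RATE
    print(f"{digits}-digit PIN: worst case {seconds / 3600:,.1f} hours")
```

At that rate a 4-digit PIN falls in well under an hour, a 6-digit PIN in a day or so, and a 10-digit one takes decades, which is why the longer passcodes are the practical defence.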
Apple aren't stupid, so they realised they would have to put measures in place to mitigate that - which is where the increasing delay between failed attempts and the option of erasing the device after 10 failed attempts come in.
All so far, so standard practice really. Where Apple fked up, though, was enabling the device to boot into a version of the operating system that isn't the one already on the device, without requiring the PIN to put it into that mode. It's not an uncommon feature in computing; anyone who has ever used something like Hirens to fix a borked computer has done basically the same thing.
Obviously this is great from a "recovering a borked device" point of view but less than ideal from a security point of view. They clearly realised that relying on some pretty ephemeral defences, in the form of settings/code in the operating system, to prevent someone brute-forcing the PIN wasn't the smartest plan, which is why the Secure Enclave takes over the slow-down etc. on those devices that have one. However, they left themselves the ability to update the software of the SE without erasing the secured data, bringing us right back to where we started. Oops.
So basically anyone who isn't interested in behaving legally (be they "good" or "bad" guys) could, if they can get their hands on the appropriate version of the iOS source code (and suitable apparatus to sign it), carry out this exploit. All Apple's talk of not wanting to create a "whole new operating system" is bks - I haven't seen the iOS source code obviously, but I imagine the changes would probably amount to less than 50 lines of code, and changing 50 lines of War and Peace does not make it a "whole new book". Also, any modded version of iOS is not the most dangerous thing - the real danger is the weakness of the overall design, and all the smoke and mirrors about the modded version itself being so goddamn dangerous is, IMO, an attempt to distract from that. Apple aren't being asked to create a backdoor - they are being asked to help exploit one that is already there, as part of a perfectly legal process.
KaraK said:
...they are being asked to help exploit one that is already there as part of a perfectly legal process.
The legality of the process is the most immediate question - the FBI have been forced to use a very old piece of law to try and make Apple do as they wish. The background appears to be a failed attempt over the past few years to get Congress to introduce and pass legislation that would specifically neuter encryption on such devices.
0000 said:
It's been a while (decades rather than years now) since I've done any biometrics, but ISTR it being perfectly possible to do some processing (Hough transforms, rubber sheet mapping, etc) then Principal Component Analysis to distill a repeatable key. The reason many systems don't is it throws some information away that you're better off keeping for a binary yes/no but it then means you have to have a similarity threshold. Even if there's a yes/no match based on a percentage they should still be able to lengthen the key a bit in a repeatable fashion, I've no idea what Apple are doing though.
I was looking at iris recognition so cuts and abrasions were less of an issue.
Fingerprints can't be hashed, which is what I assume people have in mind when talking about adding them to the key. They are also not secret and cannot be changed, which is why they make terrible passwords.
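A quick demonstration of why hashing doesn't work for fingerprints: a cryptographic hash maps even a tiny input difference to a completely different digest, while two scans of the same finger are never bit-identical (the scan byte-strings below are made-up stand-ins for sensor output):

```python
import hashlib

# Why fingerprints can't simply be hashed: SHA-256 turns a one-character
# difference in input into an unrecognisably different digest, but two
# scans of the same finger always differ slightly at the bit level.
scan1 = b"ridge-pattern-v1.00"
scan2 = b"ridge-pattern-v1.01"   # "same" finger, tiny sensor noise

h1 = hashlib.sha256(scan1).hexdigest()
h2 = hashlib.sha256(scan2).hexdigest()
print(h1)
print(h2)
print("match:", h1 == h2)  # False -- exact matching is useless here
```

This is exactly why biometric systems compare similarity scores against a threshold instead of comparing hashes.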