Apple rejecting calls from FBI to unlock gunman's iPhone
Discussion
KaraK said:
Leithen said:
KaraK said:
...they are being asked to help exploit one that is already there as part of a perfectly legal process.
The legality of the process is the most immediate question - the FBI have been forced to use a very old piece of law to try and make Apple do as they wish. The background appears to be a failed attempt over the past few years to get Congress to introduce and pass legislation that would specifically neuter encryption on such devices. It would be useful if the Supreme Court had a full complement of judges by the time it makes it there....
silverous said:
ewenm said:
Which is great, but how do you restrict such tools to law enforcement?
I'm sure a company that is capable of restricting phones to avoid law enforcement interference could figure it out.... Over the last 30 years we've seen the capabilities and rights of law enforcement ratcheted ever up, particularly in the US. We've also seen those abilities abused and used for activities law enforcement said they'd never use them for. You can't argue for "law enforcement to have any tool available that allows them to swiftly gather evidence" without placing controls on how this mythical technology is used, or abdicating responsibility for securing it to somebody else.
The raw materials of encryption are mathematics and, regardless of your intentions, you can't legislate against maths. Once you've created a mathematical hole in encryption for law enforcement, it exists for anyone to take advantage of. It simply isn't possible to restrict such tools to law enforcement.
silverous said:
They have the ability to dip into our icloud backups by the sound of it but I don't see that getting leaked out to all and sundry.
iCloud backups are encrypted using an Apple key and, because Apple have the key, they can decrypt those backups. iCloud backups are not getting "leaked out to all and sundry" because even if you get access to the backups you can't decrypt them. Strong encryption keeps them safe; weaken that encryption and you make them a target. Desire to give law enforcement every tool they desire, and hope that technology can protect those tools, are two very frightening ideas for any individual to believe in.
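The structural point above - that whoever holds the key can decrypt, and nobody else can - can be sketched with a deliberately toy symmetric scheme (a one-time pad via XOR; not Apple's actual iCloud scheme, and the variable names here are purely illustrative):

```python
import secrets

# Toy one-time pad: XOR the backup with a random key of equal length.
# This is NOT Apple's real scheme - it only illustrates that the party
# holding the key (Apple, in this analogy) can recover the plaintext,
# while an attacker holding only the ciphertext sees random noise.
backup = b"icloud backup contents"
apple_key = secrets.token_bytes(len(backup))  # held by the provider

ciphertext = bytes(a ^ b for a, b in zip(backup, apple_key))

# The key holder reverses the XOR to recover the backup:
plaintext = bytes(a ^ b for a, b in zip(ciphertext, apple_key))
assert plaintext == backup
```

Without `apple_key`, the ciphertext is information-theoretically useless; with it, decryption is trivial - which is exactly why the key itself, not the ciphertext store, is the real target.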
The idea of an encryption algorithm with a backdoor in is imho a bad thing - and one of the reasons that the selection process of the algorithm for AES was done in such an exceptionally transparent way was to prevent such a backdoor and also allay fears of one.
Enough cryptographers who seriously know their stuff have declared it sound for me to feel more than comfortable believing that, if correctly implemented, AES-256 is not feasible to crack - and I would point out that Apple are not being asked to do this here. To those worried that if Apple agree to the writ such a backdoor will be "requested", I would say that such a request is actually more likely to be the result of a refusal: "see, we tried to play nice but Apple wouldn't carry out our request, so we had no choice but to play hardball". If Apple agreed to the current case then that would set a precedent that searches deemed legally reasonable would be possible, so that would IMO make any request for a backdoor seem very much like they were only wanting it for carrying out unreasonable / illegal searches. The warrantless surveillance carried out under the FISA and PATRIOT powers are the real threats to privacy and civil liberties in my book.
But really I'm still of the opinion that the NSA don't want such a backdoor, not for any altruistic reasons but because such a backdoor would be far too easy for rival governments and criminals to exploit. They would much rather have it require a whopping great supercomputer to break, ideally the more expensive and specialist the better since they would a) be more difficult for a "bad guy" to get and b) they would stand a very good chance of knowing that the bad guy had such a computer. Hence the supercomputer research programs the NSA is conducting. If AES were "too easy" to crack they would drop the standard like a hot potato just like they did with DES.
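The "not feasible to crack" claim above is easy to sanity-check with back-of-the-envelope arithmetic. Assuming a hypothetical attacker testing a trillion keys per second (a generous, made-up figure for illustration):

```python
# Rough estimate of time to exhaust the AES-256 keyspace by brute force.
# The attack rate is an assumption for illustration, not a real benchmark.
KEYSPACE = 2 ** 256            # number of possible AES-256 keys
RATE = 10 ** 12                # keys tested per second (assumed)
SECONDS_PER_YEAR = 31_557_600  # Julian year

years = KEYSPACE // (RATE * SECONDS_PER_YEAR)
print(f"~10^{len(str(years)) - 1} years to exhaust the keyspace")
# → ~10^57 years to exhaust the keyspace
```

Even granting the attacker many orders of magnitude more hardware, the number dwarfs the age of the universe - which is why the pressure falls on implementations, key handling, and passcodes rather than on the cipher itself.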
It's a massive conspiracy between the FBI and Apple. There is a standard 4 digit code (6969) which will automatically unlock any iOS device. The reason for the 'case' is to bump sales for apple and to sink all the terrorists into a false sense of security that all their devices are watertight*
*Dramatisation, may not have happened.
GoodDoc said:
silverous said:
ewenm said:
Which is great, but how do you restrict such tools to law enforcement?
I'm sure a company that is capable of restricting phones to avoid law enforcement interference could figure it out.... Over the last 30 years we've seen the capabilities and rights of law enforcement ratcheted ever up, particularly in the US. We've also seen those abilities abused and used for activities law enforcement said they'd never use them for. You can't argue for "law enforcement to have any tool available that allows them to swiftly gather evidence" without placing controls on how this mythical technology is used, or abdicating responsibility for securing it to somebody else.
The raw materials of encryption are mathematics and, regardless of your intentions, you can't legislate against maths. Once you've created a mathematical hole in encryption for law enforcement, it exists for anyone to take advantage of. It simply isn't possible to restrict such tools to law enforcement.
silverous said:
They have the ability to dip into our icloud backups by the sound of it but I don't see that getting leaked out to all and sundry.
iCloud backups are encrypted using an Apple key and, because Apple have the key, they can decrypt those backups. iCloud backups are not getting "leaked out to all and sundry" because even if you get access to the backups you can't decrypt them. Strong encryption keeps them safe; weaken that encryption and you make them a target. Desire to give law enforcement every tool they desire, and hope that technology can protect those tools, are two very frightening ideas for any individual to believe in.
So Apple has the keys for iCloud, but you and others suggest that any backdoor that exists isn't safe - so how come they've managed to keep the iCloud encryption key(s) safe?
I understand why a porn filter is difficult and don't consider myself to be general public.
I'm not arguing for law enforcement to have tools without controls - the FBI are going down a route to achieve this access and as long as there are sufficient safeguards and controls, as I've said previously on this thread, that should be sufficient.
People keep suggesting what the FBI have asked for is impossible - is it or isn't it? You've talked about how difficult some things are and then referred to the impossible - are you saying Apple *cannot* do what is being requested by the FBI?
Apple are not being asked to break the encryption.
They are being asked to create a custom firmware and update the phone so it doesn't lock itself after a limited number of tries, so the FBI can then brute-force the passcode.
Apple's argument here is that they are 'far removed' from the device: they built it and sold it, and there ends the connection with it.
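To see why removing the retry limit matters so much, here is a minimal sketch of the brute-force step, assuming (hypothetically - this is not Apple's actual key-derivation scheme) a key derived from a 4-digit passcode with PBKDF2. The salt, iteration count, and the passcode "6969" (borrowed from the joke above) are all made up for illustration:

```python
import hashlib

# Hypothetical key derivation from a 4-digit passcode. With no retry
# limit and no escalating delays, all 10,000 candidates fall quickly.
SALT = b"device-unique-salt"  # assumed per-device salt
ITERATIONS = 1_000            # assumed work factor

def derive(passcode: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, ITERATIONS)

target = derive("6969")  # the "secret" passcode, unknown to the attacker

# Try every 4-digit code in order until the derived key matches.
found = next(code for code in (f"{i:04d}" for i in range(10_000))
             if derive(code) == target)
print(found)  # → 6969
```

The real defence on the device is not the size of the passcode space but the escalating delays and the wipe-after-ten-failures setting - which is precisely the protection the requested firmware would strip out.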
anonymous said:
[redacted]
The implications of the government's position are so far-reaching - both in terms of what it would allow today and what it implies about Congressional intent in 1789 - as to produce impermissibly absurd results.
anonymous said:
[redacted]
then that would still work, and you wouldn't look like you're making a target of someone. It generally works better when opinions are accepted as people's own, without the need to attack them. Second point - this is a different case to the San Bernardino one. This one's in New York and has a different interpretation applied by the courts, the government agency and the offender.