tennyson wrote:
TheLagerLad wrote:
tennyson wrote:
What about the impact if they do not co-operate in ANY way? The general public is really paranoid, especially about Islamic terrorism, and IF Apple winds up doing nothing to support the effort, I was thinking this could hurt them. Like I said, time will tell.
I think there is a very small set of people who are paranoid or overly worried about Islamic terrorism. And I think there are even fewer who would consider an iOS hack the solution to said paranoia.
Let me put it this way.
If the government forced Apple/Google/Microsoft to create backdoors into their phones for national security or law enforcement purposes, I would be 100,000 times more concerned about getting my identity stolen by a Russian or Chinese hacker who got a copy of the software than I would feel relieved that the government had a minor tool (in the overall scope of things) in fighting terror.
I probably overstated in using "paranoia" (although there certainly are some), but I do believe that a lot of people do want the government to do all it can to track down terrorists (especially when it involves the US), and Apple's refusal to do anything MIGHT hurt them.
I think the perfect solution would be for Apple (assuming it is possible, which I believe IS the case) to just provide the FBI with the information that is on the phone, without giving them a tool to decrypt any phone or device or automatically get around the built-in security.
Now we're going down a legal rabbit hole, but in terms of evidence processes, I'm not sure law enforcement could just give the phone to Apple and say, "here, decrypt this." In the case of the San Bernardino shooter, it doesn't matter, but in a case where someone was being charged with a crime, the police and prosecutors wouldn't want to give up a key piece of evidence to a third party.
Here's a really good article from Gizmodo that explains far better than I can why Apple is doing the right thing....
The FBI wants Apple’s help to investigate a terrorist attack. Apple says providing this help is the real danger. We’ve reached a boiling point in the battle between tech companies and the government over encryption. And what happens will affect anyone who uses a smartphone, including you.
After the San Bernardino shootings, the FBI seized the iPhone used by shooter Syed Rizwan Farook. The FBI has a warrant to search the phone’s contents, and because it was Farook’s work phone, the FBI also has permission from the shooter’s employer, the San Bernardino County Department of Public Health, to search the device. Legally, the FBI can and should search this phone. That’s not up for debate. If the FBI gets a warrant to search a house and the people who own it say okay, there’s no ambiguity about whether it can search the house.
But if the FBI comes across a safe in that house, the warrant and permission do not mean it can force the company that manufactures the safe to create a special tool for opening its safes, especially a tool that would make other safes completely useless as secure storage. That’s the situation that Apple’s dealing with here.
The FBI obtained an order from a California district court asking Apple for assistance cracking Farook’s passcode. The court order doesn’t flat-out demand that Apple unlock the phone, which is an iPhone 5C running iOS 9. Instead, the judge is asking Apple to create a new, custom, terrorist-phone-specific version of its iOS software to help the FBI unlock the phone. Security researcher Dan Guido has a great analysis of why it is technically possible for Apple to comply and create this software. (It would not be if Farook had used an iPhone 6, because Apple created a special security protection called the Secure Enclave for its newer phones that cannot be manipulated by customizing iOS.)
The fight isn’t over whether Apple can comply in this case. It’s whether it should.
If Apple makes this software, it will allow the FBI to bypass security measures, including an auto-delete function that erases the key needed to decrypt data once a passcode is entered incorrectly after ten tries as well as a timed delay after each wrong password guess. Since the FBI wants to use the brute force cracking method—basically, trying every possible password—both of those protections need to go to crack Farook’s passcode. (Of course, if he used a shitty password like 1234, the delay wouldn’t be as big a problem, since the FBI could quickly guess.)
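(Not from the article, but for a sense of scale: a back-of-the-envelope sketch of why the FBI needs both of those protections removed before brute force becomes practical. It assumes each passcode guess costs roughly 80 ms of hardware-bound key derivation, the figure cited in Apple's iOS security guide; treat the exact number as an assumption.)

```python
# Back-of-the-envelope estimate of the FBI's brute-force approach, assuming
# the retry delays and auto-erase are disabled and each guess costs ~80 ms
# of on-device key derivation (an assumption taken from public descriptions
# of Apple's passcode key derivation; not Apple's actual numbers).

GUESS_COST_SECONDS = 0.08  # assumed per-attempt key-derivation time

def worst_case_seconds(digits: int) -> float:
    """Time to try every numeric passcode of the given length."""
    return (10 ** digits) * GUESS_COST_SECONDS

for digits in (4, 6):
    hours = worst_case_seconds(digits) / 3600
    print(f"{digits}-digit passcode: at most {hours:.1f} hours to exhaust")
```

Under those assumptions a 4-digit passcode falls in minutes and a 6-digit one in about a day; with the escalating delays and the ten-try erase left in place, the same search is effectively impossible.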
The security measures that the FBI wants to get around are crucial privacy features in iOS 9, because they safeguard your phone against criminals and spies using the brute force attack. So it’s not surprising that Apple is opposing the court order. There is more than one person’s privacy at stake here!
Apple equates building a new version of iOS with building an encryption backdoor. CEO Tim Cook published a message emphasizing that the company can’t build a backdoor for one iPhone without screwing over security for the rest.
Apple will be writing its own malware if it complies with this order. It would be creating the best tool to break into its own (older) devices.
“Essentially, the government is asking Apple to create a master key so that it can open a single phone,” the Electronic Frontier Foundation wrote in a statement supporting Apple. “And once that master key is created, we’re certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security.”
Don’t sit there chuckling if you use an Android, by the way. If Apple is compelled to create this malware, it will affect anyone who uses technology to communicate, to bank, to shop, to do pretty much anything. The legal basis for requesting this assistance is the All Writs Act of 1789, an 18th century law that is becoming a favorite for government agencies trying to get tech companies to turn over user data. The AWA is not really as obscure as Apple suggests, but it is a very broad statute that allows courts established by Congress to “issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law.”
The Department of Justice has even tried to use it to force Apple to turn over suspects’ messages before. I know 18th century law sounds boring, but this is an 18th century law that could fuck you big time.
The All Writs Act can only force a company to do something if it’s not an “undue burden.” Seems like making Apple create malware that will fundamentally undermine its core security features is an enormous burden. And if it’s not deemed “undue” in this case, that sets a horrible precedent. After all, if compelling Apple to maim itself is allowed, compelling Google and Facebook and Microsoft to write security backdoors would also be allowed.
TheLagerLad wrote:
tennyson wrote:
TheLagerLad wrote:
I think there is a very small set of people who are paranoid or overly worried about Islamic terrorism. And I think there are even fewer who would consider an iOS hack the solution to said paranoia.
Let me put it this way.
If the government forced Apple/Google/Microsoft to create backdoors into their phones for national security or law enforcement purposes, I would be 100,000 times more concerned about getting my identity stolen by a Russian or Chinese hacker who got a copy of the software than I would feel relieved that the government had a minor tool (in the overall scope of things) in fighting terror.
I probably overstated in using "paranoia" (although there certainly are some), but I do believe that a lot of people do want the government to do all it can to track down terrorists (especially when it involves the US), and Apple's refusal to do anything MIGHT hurt them.
I think the perfect solution would be for Apple (assuming it is possible, which I believe IS the case) to just provide the FBI with the information that is on the phone, without giving them a tool to decrypt any phone or device or automatically get around the built-in security.
Now we're going down a legal rabbit hole, but in terms of evidence processes, I'm not sure law enforcement could just give the phone to Apple and say, "here, decrypt this." In the case of the San Bernardino shooter, it doesn't matter, but in a case where someone was being charged with a crime, the police and prosecutors wouldn't want to give up a key piece of evidence to a third party.
I agree, except for the San Bernardino shooting incident. There are real issues when the subject is still alive, and especially IF the subject is an American citizen. In this case there still are legal issues, but I would really be interested to see how a question on this specific circumstance would poll with the public at large. IF (and that is a BIG IF) I am right, Apple might be making a business mistake in not appearing to co-operate.
BTW, I DO believe that Apple should not reveal how to decrypt or how to circumvent anything. IF they have the technology to unlock the phone without losing the data, I would hope the government lets them do it on their own and then return the phone to the authorities to do the rest of their mission. Do you see that approach as a feasible solution, if possible?
Last edited by tennyson (2/17/2016 3:47 pm)
Mark Cuban: Apple did the exact right thing
Mark Cuban weighed in on Thursday on the Apple versus FBI debate, applauding the tech giant for resisting a federal court order to unlock a terrorist's iPhone. In a blog post, the Dallas Mavericks' owner said Tim Cook did the "exact right thing by not complying with the order." "Every tool that protects our privacy and liberties against oppression, tyranny, madmen and worse can often be used to take those very precious rights from us," Cuban wrote. "We must stand up for our rights to free speech and liberty."
Last edited by Common Sense (2/18/2016 10:15 am)
tennyson wrote:
TheLagerLad wrote:
tennyson wrote:
I probably overstated in using "paranoia" (although there certainly are some), but I do believe that a lot of people do want the government to do all it can to track down terrorists (especially when it involves the US), and Apple's refusal to do anything MIGHT hurt them.
I think the perfect solution would be for Apple (assuming it is possible, which I believe IS the case) to just provide the FBI with the information that is on the phone, without giving them a tool to decrypt any phone or device or automatically get around the built-in security.
Now we're going down a legal rabbit hole, but in terms of evidence processes, I'm not sure law enforcement could just give the phone to Apple and say, "here, decrypt this." In the case of the San Bernardino shooter, it doesn't matter, but in a case where someone was being charged with a crime, the police and prosecutors wouldn't want to give up a key piece of evidence to a third party.
I agree, except for the San Bernardino shooting incident. There are real issues when the subject is still alive, and especially IF the subject is an American citizen. In this case there still are legal issues, but I would really be interested to see how a question on this specific circumstance would poll with the public at large. IF (and that is a BIG IF) I am right, Apple might be making a business mistake in not appearing to co-operate.
BTW, I DO believe that Apple should not reveal how to decrypt or how to circumvent anything. IF they have the technology to unlock the phone without losing the data, I would hope the government lets them do it on their own and then return the phone to the authorities to do the rest of their mission. Do you see that approach as a feasible solution, if possible?
Yes, I do.
If Apple can unlock the phone of this terrorist and give it to the government, that's fine.
What Apple should NOT do is to turn over technology that would allow the government to go about unlocking other phones as they see fit.
If Apple can unlock the phone of this terrorist and give it to the government, that's fine.
What Apple should NOT do is to turn over technology that would allow the government to go about unlocking other phones as they see fit.
As Cook said yesterday, the hacking of its own phone for one investigation will set the precedent that the government can show up at Apple's door day after day after day with a new court order for some other investigation.
And the government will show up at Google's door and Microsoft's door and DropBox's door, and every cloud data service that offers encryption technology and demand that they break their products.
It is reasonable to assume (and almost certain to happen) that these hacks will land in the hands of people who will use them to steal data from unsuspecting users.
The things that used to take up filing cabinets or safes in our houses are now stored securely in our personal devices. This is a benefit to individuals and society as a whole. Apple is right in saying that they should not be responsible for knocking down the first brick in the data security wall.
A curious question: is it possible for Apple to come up with what is necessary to unlock that particular phone to satisfy the FBI request, and then destroy the unlocking process they used to keep it out of the hands of evildoers?
I think, as far as Apple is concerned, it would be akin to unlocking Pandora's Box.
flowergirl wrote:
A curious question: is it possible for Apple to come up with what is necessary to unlock that particular phone to satisfy the FBI request, and then destroy the unlocking process they used to keep it out of the hands of evildoers?
THAT is the Million Dollar question.
From what I have read, the only thing (at least known at this time) preventing the authorities from accessing the data is that IF a password is entered incorrectly more than a certain limited number of times, either the data or the device becomes unusable. That protection is built into their software. How hard it would be to disable that OR find the password is something that Apple would have to figure out.
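To make that concrete, here is a toy model of the lockout behavior being described. The delay schedule is an assumption drawn from public descriptions of iOS 9's escalating delays and the optional erase-after-ten setting; it illustrates the policy, it is not Apple's actual code.

```python
# Toy model of the iOS 9 passcode lockout policy: escalating delays after
# repeated failures, and (if "Erase Data" is enabled) destruction of the
# encryption key after ten wrong guesses. The schedule below is an assumed
# approximation from public descriptions, not Apple's implementation.

DELAY_AFTER_FAILURES = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}  # seconds
ERASE_THRESHOLD = 10  # with the "Erase Data" setting turned on

def response_to_failure(failed_attempts: int) -> str:
    """What the device does after the Nth consecutive wrong passcode."""
    if failed_attempts >= ERASE_THRESHOLD:
        return "erase the encryption key; the data becomes unrecoverable"
    delay = DELAY_AFTER_FAILURES.get(failed_attempts, 0)
    return f"allow another attempt after {delay} s"

for n in range(1, 11):
    print(f"attempt {n} fails -> {response_to_failure(n)}")
```

The court order effectively asks Apple to ship firmware in which both the delay table and the erase threshold are gone, which is exactly what makes brute-force guessing viable.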
The DOJ just took it up another notch, filing a motion to compel Apple to assist the FBI.