iPhone Encryption and the Return of the Crypto Wars

Last week, Apple announced that it is closing a serious security vulnerability in the iPhone. It used to be that the phone’s encryption only protected a small amount of the data, and Apple had the ability to bypass security on the rest of it.

From now on, all the phone’s data is protected. It can no longer be accessed by criminals, governments, or rogue employees. Access to it can no longer be demanded by totalitarian governments. A user’s iPhone data is now more secure.

To hear US law enforcement respond, you’d think Apple’s move heralded an unstoppable crime wave. See, the FBI had been using that vulnerability to get into people’s iPhones. In the words of cyberlaw professor Orin Kerr, “How is the public interest served by a policy that only thwarts lawful search warrants?”

Ah, but that’s the thing: You can’t build a backdoor that only the good guys can walk through. Encryption protects against cybercriminals, industrial competitors, the Chinese secret police, and the FBI. You’re either vulnerable to eavesdropping by any of them, or you’re secure from eavesdropping by all of them.

Backdoor access built for the good guys is routinely used by the bad guys. In 2005, some unknown group surreptitiously used the lawful-intercept capabilities built into the Greek cell phone system. The same thing happened in Italy in 2006.

In 2010, Chinese hackers subverted an intercept system Google had put into Gmail to comply with US government surveillance requests. Back doors in our cell phone system are currently being exploited by the FBI and unknown others.

This doesn’t stop the FBI and Justice Department from pumping up the fear. Attorney General Eric Holder threatened us with kidnappers and sexual predators.

The former head of the FBI’s criminal investigative division went even further, conjuring up kidnappers who are also sexual predators. And, of course, terrorists.

FBI Director James Comey claimed that Apple’s move allows people to “place themselves beyond the law” and also invoked that now overworked “child kidnapper.” John J. Escalante, chief of detectives for the Chicago police department, now holds the title of most hysterical: “Apple will become the phone of choice for the pedophile.”

It’s all bluster. Of the 3,576 major offenses for which warrants were granted for communications interception in 2013, exactly one involved kidnapping. And, more importantly, there’s no evidence that encryption hampers criminal investigations in any serious way. In 2013, encryption foiled the police nine times, up from four in 2012, and the investigations proceeded in some other way.

This is why the FBI’s scare stories tend to wither after public scrutiny. A former FBI assistant director wrote about a kidnapped man who would never have been found without the ability of the FBI to decrypt an iPhone, only to retract the point hours later because it wasn’t true.

We’ve seen this game before. During the crypto wars of the 1990s, FBI Director Louis Freeh and others would repeatedly use the example of mobster John Gotti to illustrate why the ability to tap telephones was so vital. But the Gotti evidence was collected using a room bug, not a telephone tap. And those same scary criminal tropes were trotted out then, too. Back then we called them the Four Horsemen of the Infocalypse: pedophiles, kidnappers, drug dealers, and terrorists. Nothing has changed.

Strong encryption has been around for years. Both Apple’s FileVault and Microsoft’s BitLocker encrypt the data on computer hard drives. PGP encrypts e-mail. Off-the-Record encrypts chat sessions. HTTPS Everywhere encrypts your browsing. Android phones already come with encryption built-in. There are literally thousands of encryption products without back doors for sale, and some have been around for decades. Even if the US bans the stuff, foreign companies will corner the market because many of us have legitimate needs for security.

Law enforcement has been complaining about “going dark” for decades now. In the 1990s, they convinced Congress to pass a law requiring phone companies to ensure that phone calls would remain tappable even as they became digital. They tried and failed to ban strong encryption and mandate back doors for their use. The FBI tried and failed again to ban strong encryption in 2010. Now, in the post-Snowden era, they’re about to try again.

We need to fight this. Strong encryption protects us from a panoply of threats. It protects us from hackers and criminals. It protects our businesses from competitors and foreign spies. It protects people in totalitarian governments from arrest and detention. This isn’t just me talking: The FBI also recommends you encrypt your data for security.

As for law enforcement? The recent decades have given them an unprecedented ability to put us under surveillance and access our data. Our cell phones provide them with a detailed history of our movements. Our call records, e-mail history, buddy lists, and Facebook pages tell them who we associate with. The hundreds of companies that track us on the Internet tell them what we’re thinking about. Ubiquitous cameras capture our faces everywhere. And most of us back up our iPhone data on iCloud, which the FBI can still get a warrant for. It truly is the golden age of surveillance.

After considering the issue, Orin Kerr rethought his position, looking at this in terms of a technological-legal trade-off. I think he’s right.

Given everything that has made it easier for governments and others to intrude on our private lives, we need both technological security and legal restrictions to restore the traditional balance between government access and our security/privacy. More companies should follow Apple’s lead and make encryption the easy-to-use default. And let’s wait for some actual evidence of harm before we acquiesce to police demands for reduced security.

This essay previously appeared on CNN.com.

EDITED TO ADD (10/6): Three more essays are worth reading, as is this piece on all the other ways Apple and the government have to get at your iPhone data.

And a Washington Post editorial manages to say this:

How to resolve this? A police “back door” for all smartphones is undesirable—a back door can and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant.

Because a “secure golden key” is completely different from a “back door.”

EDITED TO ADD (10/7): Another essay.

EDITED TO ADD (10/9): Three more essays that are worth reading.

EDITED TO ADD (10/12): Another essay.

Posted on October 6, 2014 at 6:50 AM • 101 Comments

Comments

Appeos October 6, 2014 7:27 AM

Thanks for posting this. I found the recent Washington Post piece about this particularly embarrassing and naive. They rightly rule out having a “back door”, then immediately call for a magic “golden key”, as if that’s somehow any better/different.

uh, Mike October 6, 2014 8:00 AM

My short version: They’re going to sell secure phones overseas. Is there some reason that Americans can’t have them, too?

Sam October 6, 2014 8:07 AM

Since Apple products are closed platforms, how can we really be sure that this “fix” is effective? How can independent experts like Schneier confirm that Apple has not closed this backdoor simply to open a smaller backdoor somewhere else?

good article October 6, 2014 8:11 AM

Good article! Thanks.

I am amused at the irony of law enforcement (rightly) advising people that encryption is good to protect their data, while simultaneously saying encryption is bad because law enforcement will be left powerless against criminals.

Sounds like the left hand doesn’t know what the right hand is doing.

uh, Mike October 6, 2014 8:17 AM

@Sam, if there’s a backdoor, someone will leak it, or it will be exposed through careless use.

Don’t read too much into government complaints. If they still had the backdoor, their best move would be to complain even louder.

sceptical schneier blog reader October 6, 2014 8:18 AM

Oh, God… all this is just an obvious lie. None of the big corporations belonging to the true owners of the USA (and the Western world in general) is ever going to cut off access for its own intelligence groups and/or “law” enforcement.

The only outcome of this obvious ruse will be an improvement in their annual crime-solving reports – if the duped criminals start to trust their i-devices enough to put really sensitive stuff on them.

None of the large IT corporations so tightly integrated into the New World Order and the plans for its furtherance (like the Total Information Awareness initiative, split and renamed but still implemented, or the concerted drive to push stupid users onto more and more “on-line services” and storage, etc.) will ever close its products to the New World Order operatives.
Ever.
Period.

We’ve been fed a whopper.

uh, Mike October 6, 2014 8:21 AM

@sceptical, I’ll play along. Big corporations are just governments in disguise. If they traffic in flawed security, they fall into the same traps as governments; they just get measured in a different system of units.

parrot October 6, 2014 8:35 AM

I can’t help noticing that Comey’s lament about the new iPhone seems to be about the “proles” getting encryption, whereas Shawn Henry’s endorsement of encryption is more for corporate consumption.

feds October 6, 2014 8:35 AM

I recall the FBI was peeved at Apple before because there was a huge backlog of requests to decrypt/unlock devices. The police state flooded them with requests, so Apple rightfully put a stop to it with this.

Watch them go after Apple hard now with lawsuits and investigations over every random thing, because Apple defied the Stasi. The fact is, decrypting a phone isn’t needed for their investigations: metadata galore still exists to prove who talked to whom, and there’s always police-state malware like FinFisher they can target devices with.

randall October 6, 2014 9:03 AM

The next phase will be easier encryption of mobile communications, which is now such a pain that hardly anyone does it. I’ve tried — you really have to want to encrypt your email before wading through the weeds of explaining the intricacies of S/MIME and certificate creation/installation/signing to your own tech-unsavvy attorney.

The current mess could be fixed if a signed TLS certificate came with the purchase of every iDevice, and TLS were integrated by default into mail, messages, and everything else.

paul October 6, 2014 9:09 AM

The rapidity of the amnesia really is pretty amazing.

But I wonder how much of this is just pro forma by law’n’order types. The information that’s being lost isn’t (as Bruce notes) all that important or useful, but the sense of power relations that the old system fostered certainly was.

Laser Lotus October 6, 2014 9:15 AM

The problem is that, this time around, the people making the pro-crypto argument are arguing for strong cryptography specifically to thwart search warrants. Apple itself is marketing iOS 8 encryption as a way to avoid cooperating with law enforcement and court orders. The ACLU’s “principal technologist” Christopher Soghoian summarized the iPhone encryption changes as Apple telling warrant-bearing police to “get lost.” It’s a lot harder to make the argument that encryption doesn’t really hamper police investigations when that’s exactly what some people are praising it for doing.

Whether this attitude towards encryption is justified or not, and some people would certainly argue that it is, it’s framing the debate in a really dumb way.

fajensen October 6, 2014 9:27 AM

Eric Holder … That would be the same Eric Holder “overseeing” the ATF gunwalking scandal? Thought so. Even though Mexicans murdered with ATF-supplied guns didn’t crimp Holder’s career, one should probably not follow his advice on anything.

aUKman October 6, 2014 10:25 AM

From my personal and very recent experience: if you are the victim of a serious assault in the UK, in a deserted place, and have never seen the attacker, the police will not even ask whether you had a mobile phone on you, or what kind.
The attacker was too young not to have had a mobile phone on him (and was probably on drugs).
If he and I had both been carrying Apple products, the two phones would probably have exchanged identifiers (there was such a story not long ago); that information could probably have identified the attacker from Apple’s data quite quickly.
It seems the police are not using all that “big data” being collected. There could be many reasons – maybe it is the fact that I am a victim and not a fatality, so the case is not considered that important…

iOnuS October 6, 2014 10:30 AM

Who in their right mind would trust Apple now? I wound up on Linux after years of ripping spyware out of OS X taught me to use the terminal. Apple can’t change. They blew it once and for all, and now they’re restricted to the passive and compliant consumer-sheep market. While open source can be sabotaged, it isn’t subject to Apple’s sort of organized corporate treachery.

carol October 6, 2014 10:32 AM

If you want to read some really dumb arguments against encryption, head over to the NYT‘s debate, wherein Stewart Baker compares AAPL to “a teenager getting Edward Snowden’s name tattooed up her arm” and says that iOS encryption means that employers can’t access employee communications.

Is Baker stupid because he actually believes this stupid argument, or because he believes that others will? I can’t decide myself.

justme October 6, 2014 10:46 AM

Bruce, imho, a great essay up until the final sentence: “And let’s wait for some actual evidence of harm before we acquiesce to police demands for reduced security.”

As you point out, the technology has already tipped the balance much more in favor of government (and corporate) access over individual privacy than has ever been the case anywhere before in history. To imply that some “evidence” could somehow justify tipping that balance even further against the individual seems counterproductive.

squarooticus October 6, 2014 11:35 AM

Laser Lotus: LE shot themselves in the foot by overstepping their authority. If the Snowden revelations had never come to light, it’s likely Apple would not have taken this step. I think most people are not opposed to criminal investigations with targeted warrants revealing private information with probable cause, but they are opposed to dragnets and ubiquitous surveillance. It’s impossible to get one without the other except as a matter of policy, and the government has made it very clear their policy is to use as much power as we the people don’t explicitly deny them.

allan October 6, 2014 11:46 AM

Given that we know the government is instructing police forces in how to falsify the sources of evidence, how do we know how much effect encryption actually has on investigations?

Khavren October 6, 2014 12:36 PM

It would have the effect of making it harder to cover up illicit or illegal methods, since the legal methods might be blocked by the encryption.

Friday October 6, 2014 12:52 PM

Under “Strong encryption has been around for years.” you mention Apple & Microsoft. The big three are Apple’s OSX, Microsoft’s Windows, and Linux. (No matter how I might personally feel about Net/Free/Open-BSD, Linux is what businesses use.)

Linux has had strong encryption for ages. I’ve used Twofish under Linux for pocket filesystems (a filesystem in a file, mounted via loopback) and encrypted disk partitions. The nice part is, you can send the encrypted filesystem as a file back and forth via email. Everyone needs to know the password in advance, of course, but unless it’s written down (and compromised) or poorly chosen, that password can be hard to crack.

Security is out there, and ubiquitously available to anyone competent enough to use it. All that’s changing is security becoming available to those less skilled, who ironically need it more…

tz October 6, 2014 12:57 PM

A “golden key” that required acetone and fuming nitric acid to burn away the chip casing…

They might be able to do so today.

Encryption and security are about economics.

What LE wants is a solution that costs almost nothing.

So, kidnappers? Get the phone, take it to a lab, and carefully expose the chips, use microprobes… It might cost tens of thousands of dollars, but are they saying it isn’t worth it for a kidnapping? Apple might even tell you where to look on the chip.

Also, often the question is whether the criminal is among 100 innocents: do they decrypt all 100 to find out?

There are screen doors, steel doors, security doors, and vault doors. Each can be opened given time and effort and cash. Security is the same way (beyond strict mathematics).

We don’t torture for three of the categories of the Infocalypse. Do we start making more exceptions for that?

Daniel October 6, 2014 2:28 PM

The fundamental strength of encryption is not its technological impact but its psychological impact. For the last twenty years the people of the USA have been subjected to, and ruled by, a narrative of fear. If people can be protected via encryption, they feel more secure, and if they feel more secure they are less easily manipulated by fear. The problem for the elites and their media supporters is that they have no narrative other than fear – so the irony is that they fear letting go of fear. They no longer have the power to “make the world safe for democracy” and they no longer have the generosity to offer a “chicken in every pot,” so the best Obama can do is rail against “genocide” and keep rattling the skeleton of fear. Encryption is bad because it is empowering. That’s the reason it must be stopped.

Wyatt Storches October 6, 2014 3:03 PM

Presumably the snoops have read Sun Tzu. If they are whining about this, it’s almost certainly because the devices are already compromised six ways from Sunday and the encryption serves as a pacifier for the user.

The communications into and out of the devices are already remotely owned and controlled – the radio chip, the network pipe, and possibly the machine at the other end. How much content on a phone isn’t sent or received over a network, with opportunities for man-in-the-middle decryption and a trail of metadata slime left along the way? Some data might be secured … from some … I feel so much better now.

The mobile phone is a two-way radio that transmits and receives, and it is a computer that runs processes. The transceiver and the processor operate in ways that the user cannot observe. Therefore it is as secure as a megaphone in the public square. I treat it as such.

Anonymous Coward October 6, 2014 3:19 PM

No.

We have to assume that there are still backdoors available for law enforcement. Remember that Apple’s warranty canary was just recently removed and we know from the Snowden Documents that the NSA and friends have multiple methods for inserting backdoors into products, including getting NSA employees hired into companies where they can influence target systems.

Furthermore intelligence agencies do not publicly announce what they cannot get access to. That’s not how it works.

It’s far more consistent to understand this unusual media blitz about “how great Apple is” as a coordinated effort to rebuild their image after it was tarnished by PRISM and other Snowden docs and reporting.

Arraywalker October 6, 2014 3:35 PM

@Bruce

Love the Star Wars reference in the title.

Looking forward to seeing what happens to the “Sith”.

Anura October 6, 2014 3:58 PM

I was reading up on iOS 8 security, and I find one thing odd:

https://www.apple.com/iphone/business/docs/iOS_Security_Feb14.pdf

On page 9, it states that the file metadata is protected with the filesystem key, which is not dependent on the user’s passcode. The metadata contains the per-file key, wrapped with a key that does require the user’s passcode. However, in a case involving someone charged with child pornography, the mere existence of the file could be enough, and the metadata may reveal that.
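The hierarchy described above can be sketched as a toy model. To be clear, this assumes nothing about Apple’s actual implementation: the PBKDF2 call stands in for the device’s hardware-entangled key derivation, the XOR “wrap” stands in for real AES key wrapping, and the passcode and key sizes are illustrative.

```python
import hashlib
import os

def derive_passcode_key(passcode: bytes, salt: bytes) -> bytes:
    # Stand-in for the device's hardware-entangled key derivation.
    return hashlib.pbkdf2_hmac("sha256", passcode, salt, 100_000)

# Filesystem key: random, NOT derived from the passcode.
# It protects the metadata, so metadata is readable without the passcode.
filesystem_key = os.urandom(32)

# Per-file key: random, stored in the file's metadata...
per_file_key = os.urandom(32)

# ...but wrapped under the passcode-derived key (XOR as a toy wrap).
salt = os.urandom(16)
passcode_key = derive_passcode_key(b"1234", salt)
wrapped_file_key = bytes(a ^ b for a, b in zip(per_file_key, passcode_key))

# Without the passcode, an examiner holding the filesystem key can read the
# metadata (the file's existence, name, size, timestamps) and sees only
# wrapped_file_key -- not per_file_key, so not the file contents.

# With the passcode, unwrapping recovers the per-file key.
unwrapped = bytes(a ^ b for a, b in
                  zip(wrapped_file_key, derive_passcode_key(b"1234", salt)))
assert unwrapped == per_file_key
```

The point of the two-tier design is visible even in the toy: metadata protection and content protection hang off different keys, which is exactly why the existence of a file can leak while its contents stay sealed.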

Anura October 6, 2014 4:36 PM

@Shannon

Well, when they invent encryption that is completely secure except that it doesn’t work on child porn, I’ll endorse it. I expect that to be around the same time they find the phrase you need to speak to transmute charcoal into gold. Until then, you are either advocating security for everyone or for no one – I’ll go with the former.

Lance Cottrell October 6, 2014 5:05 PM

The echoes of the crypto war really take me back. A backdoor at Apple makes no more sense than the Clipper chip did back in the 1990s. This is the reason I made sure that Anonymizer.com never kept any logs. If they can be compromised to stop the bad guys, then they will be compromised BY the bad guys (including bad governments).

I just did a blog post and video on this on ThePrivacyBlog.

Shannon October 6, 2014 5:46 PM

@Anura – that’s fine, but one of the problems with encryption is that you don’t know whether someone decrypted your files or not. People use encryption, they choose what they believe is a strong passphrase, and they see the ciphertext, but after that it’s all trust. If someone has the key, you don’t know whether they got your stuff or not. There is no separate output that proves your files are safe. It’s not like your tires, where you can measure the tread; there is nothing to measure against. It’s just a box with a handle, and you turn the handle. You really don’t know anything about your information.

Anyone can run the algorithm, and you can’t stop someone from decrypting your files. When they do, you won’t know it. And who uses strong keys? People choose a passphrase they THINK is good enough, and then they use it to encrypt all their stuff. Once those files are encrypted with the key from that passphrase, it’s not as if you can change it. And that is basically all there is between you and oblivion. As they say, all encryption does is trade a lot of little secrets for one big secret.

Before that, the paradigm was access controls: just choose a strong password, then nobody can get your stuff. What a joke.
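The “one big secret” point can be made concrete with a short sketch. Everything here is hypothetical – the passphrase, the attacker’s dictionary, and the PBKDF2 parameters are made up, standing in for whatever derivation a real product uses – but the asymmetry it shows is real: the attacker can guess offline, and nothing signals the owner that it is happening.

```python
import hashlib
import os

salt = os.urandom(16)  # stored next to the ciphertext, so the attacker has it

# The user chooses a passphrase they THINK is good enough, and everything
# they encrypt now hangs off this one derived key.
key = hashlib.pbkdf2_hmac("sha256", b"Fluffy2014!", salt, 100_000)

# An attacker with a copy of the ciphertext and salt simply tries candidate
# phrases offline. No lockout, no log entry, no signal to the owner.
dictionary = [b"letmein", b"Fluffy2014!", b"correct horse battery staple"]
recovered = next(
    (p for p in dictionary
     if hashlib.pbkdf2_hmac("sha256", p, salt, 100_000) == key),
    None,
)
print(recovered)  # b'Fluffy2014!'
```

Slow derivation functions raise the per-guess cost, but they cannot rescue a passphrase that sits in an attacker’s dictionary – which is the “between you and oblivion” part.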

Walt French October 6, 2014 8:03 PM

“…exactly one involved kidnapping…”

Yes, but what about all the upskirt photos and kiddie porn that we heard about?

Odd that the last refuge of challenged law-enforcement officers is trying to evoke the populace’s fear of others’ sexual fantasies. I guess they don’t think the citizenry realizes that anybody downloading kiddie porn onto their computers got it over the network, which the cops CAN track with warrants (or can and do without, I’d guess).

This is a tragic example of our law enforcement having lost sight of its raison d’être: to serve the citizenry, which it is instead now advertising it can no longer control.

Another commenter suggests that without Snowden, this never would’ve come to pass. I’d shift that to, “without Snowden, this might have been longer in coming.” Apple has pursued multiple security initiatives for many years now: blocking malware; keeping app developers from tracking more than they need to know about our use of their apps; preventing websites from aggregating personal information that could be sold to insurance, bank, and other parties we do business with; making it much harder for thieves to profit from stolen phones; and so much more.

Seen in the light of those initiatives, the protection from government spying seems like a noteworthy, but relatively minor, enhancement.

Skeptical October 6, 2014 8:28 PM

This is a powerful essay, and I appreciate the particular evidence that it cites in support of its arguments.

The question of the tradeoffs we would face were communication devices immune to even lawful warrants is confronted more directly here than anywhere I’ve read previously – and it’s that feature which I appreciate the most.

On the one hand, there’s a strong argument made that, at least so far as court-ordered wiretaps are concerned, encryption has not foiled many government investigations.

On the other hand, I think the argument that imposing access requirements for lawful surveillance will necessarily render us quite vulnerable is weak.

Somewhat ironically, I see somewhat of a mirror between the rhetoric used by law enforcement to justify tighter restrictions on encryption, and the rhetoric used here (and elsewhere) to justify less tolerance for access mechanisms designed for law enforcement.

For instance,

Backdoor access built for the good guys is routinely used by the bad guys.

“Routinely” is a very strong claim. But the cases you cite are highly unusual. No statistics are adduced to demonstrate that access mechanisms designed to be restricted to government actors are often, or frequently, or routinely, exploited by the bad guys.

In this respect, that part of the argument in the essay resembles the anecdotes law enforcement will give to justify tighter restrictions on encryption.

And then there’s this claim:

You’re either vulnerable to eavesdropping by any of them, or you’re secure from eavesdropping from all of them.

Plug some other threat/security complements into that framework.

“You’re either vulnerable to being kidnapped by anyone, or you’re secure from kidnapping by all of them.”

“You’re either vulnerable to being shot by anyone, or you’re secure from being shot by all of them.”

“You’re either vulnerable to anyone hijacking an aircraft, or you’re secure from hijackings against everyone.”

These all seem plainly unsound, as does the original formulation. But then I thought that perhaps there was a meaning to “vulnerability” that I’m not taking into account – perhaps it has usage as a term of art. So, I glanced at a few sources, and was pleasantly surprised by Wikipedia’s entry on vulnerability.

I did not see a definition that would save the formulation.

But maybe I’m missing something.

How are you defining “vulnerability” in the above?

Is “vulnerability” a binary characteristic, in that one either is vulnerable to X or one is not, or are there degrees of vulnerability?

Why does a vulnerability to one threat imply vulnerability to all threats?

Daniel October 6, 2014 9:03 PM

@skeptical

First, I don’t think you know what the word “example” means. By definition an example is an illustration, not conclusive evidence. Your claim that Bruce’s evidence is based upon “highly unusual” cases is itself not backed up by any evidence. Where is your data? As I understand it, Bruce’s point is that criminals can and in fact do exploit legal means to get to their objectives – that seems unassailable to me.

“Why does a vulnerability to one threat imply vulnerability to all threats?”

Again, your examples disprove your point. Bruce is not making a generalized security claim but a specific claim about the nature of the internet. The internet isn’t like kidnapping, or shooting, or hijacking an aircraft. The internet is the internet and by definition of the term “inter-net” it is a communal endeavor. Kidnapping, shooting, and hijacking are not communal endeavors.

Thoth October 6, 2014 9:03 PM

TO: ALL TLAs and US Govt “not very brilliant ones but full of power” people.
CC: @all

For the “not very brilliant ones but full of power” who keep asking for key escrow and backdoors: they should realize the DUAL-USE nature of ANY SOFTWARE OR HARDWARE. A military officer can be using an iPhone or an Android, and so can a police officer. Politicians (like Obama) rely heavily on smartphones to communicate and tweet as well.

A backdoor or escrow would make all the politicians, government officials, and so forth as vulnerable as anyone else. No backdoor or escrow system has ever been proven secure, because you are deliberately weakening the system.

Due to the widespread use of common technologies and phones, a backdoor could mean life or death as well. What happens if a US TLA agent carries one of those backdoored common phones into a hostile theater, trying to pass as a normal person using a normal phone, and somehow the local agencies find a way into that backdoor or escrow and learn, via the phone’s backdoored mic, that a US TLA agent is nearby while he is on an encrypted secure phone talking to his handlers? I remember there are pictures of Obama using his BlackBerry next to a secure phone terminal, and that’s yucky… talk about OPSEC.

It is known that militaries and government agencies have decided to build secure phone technologies on top of common phone platforms, which makes adding a backdoor purely disingenuous. Of course the agencies will remove the backdoors, but that adds a time-consuming effort that could have been spent on something else. If the militaries and government agencies around the world are using modified versions of Android and Apple devices, it would be more appropriate to make security audits and operations easier by not introducing vulnerabilities in the first place. Sure, you know where to find all your own backdoors, but the dual-use nature of these products makes them very dangerous to backdoor.

Last but not least, you have been sitting in Washington too comfortably, totally ignorant, with your NSA-modified BlackBerries, thinking that the BND, NSA, GCHQ and so on cannot get at you. Be assured that your presumption is wrong: they can and will tap you as well.

Why don’t you have a field day using an insecure phone, letting everyone tap you, and see how it feels when your privacy has been invaded and your security breached?

Clive Robinson October 6, 2014 10:11 PM

@ Skeptical,

You have a problem with your reasoning, due to a false assumption.

The false assumption is “the equivalence of physical and information objects”.

You are not the first and you surely won’t be the last to make this mistake, and it’s something I occasionally bang on about on this blog.

Thus, when you see a statement of,

    You’re either vulnerable to eavesdropping by any of them, or you’re secure from eavesdropping from all of them.

And then you say,

    “You’re either vulnerable to being kidnapped by anyone, or you’re secure from kidnapping by all of them.”

You are making the false assumption that the information object involved in “eavesdropping” is equivalent to the physical object in “kidnapping”; they are not, as can clearly be seen after a moment’s thought.

To see this, look at what is actually happening: “eavesdropping” is the process of “observing and recording”, which is the equivalent of “copying” the information object, whilst “kidnapping” is the process of “stealing” the physical object. That is, the act of “copying” leaves the original information object intact with the owner/originator, whilst “stealing” takes the original physical object away from the owner/originator. The former is non-obvious; the latter, very obvious.

Thus “encryption” is incorrectly associated in people’s heads with being the equivalent of a “safe”, and it is actually very, very far from this. It’s not possible to take a safe away and work tirelessly on it for years or decades without the owner noticing. Not so with an information object such as a password file, as we have seen with several decades of such attacks.

It is this failure of assuming an information object is equivalent to a physical object that is causing most of the bad problems we are seeing, especially among legislators and the judiciary.

It’s a point that is not lost on the LEO prosecutors, who quite deliberately misrepresent the facts to legislators and judiciary alike. The fact that they very clearly do this “knowingly” is a form of perjury, for which they should be dealt with like any other person making an intentional and knowingly false representation to a court for personal gain.

Skeptical October 6, 2014 10:18 PM

@Daniel: First, I don’t think you know what the word example means. By definition an example is an illustration and not intended to be conclusive evidence.

I appreciate the heads up on what the word example can mean, but the word “example” wasn’t used in the portion of the essay we’re talking about. That’s not to detract from the value of the lesson, of course.

Your claim that Bruce’s evidence is based upon “highly unusual” cases is itself not backed up by any evidence. Where is your data?

As far as reporting goes, they’re pretty unusual. I’m not going to run a search over Lexis for you and post the results to prove the point though – and in fact, for the sake of argument, I’ll leave aside any claim about whether the cases provided are unusual or not. The burden of proof here is on Bruce.

As I understand it Bruce’s point is that criminals can and in fact do exploit legal means to get to their objectives–that seems unassailable to me.

Then you’ve missed his point. He said that bad actors routinely exploit mechanisms designed to permit use only by authorized law enforcement.

Routinely matters because we are weighing tradeoffs. If criminals routinely break mechanisms designed to permit only law enforcement, then clearly these mechanisms carry enormous cost to our security. However, if those mechanisms are broken rarely, and in some cases extremely rarely, then the cost to our security is much lower.

Again, your examples disprove your point. Bruce is not making a generalized security claim but a specific claim about the nature of the internet. The internet isn’t like kidnapping, or shooting, or hijacking an aircraft. The internet is the internet and by definition of the term “inter-net” it is a communal endeavor. Kidnapping, shooting, and hijacking are not communal endeavors.

Okay, so now we have:
1) The internet is the internet.
2) ???
3) Any vulnerability (in a system connected to the internet?) to anyone is a vulnerability to everyone.

Were Iranian centrifuges equally vulnerable to everyone? Greenpeace? Were Soviet undersea cables in certain areas equally vulnerable to everyone – or just to nations with incredibly advanced submarine technology?

Or do Iranian centrifuges and Soviet undersea cables not count because they’re not closely enough tied to the internet?

How about features on telecommunications equipment that enable government to effect a lawful wiretap? Vulnerable to everyone, including low-level drug dealers and bored folks surfing the internet looking for something interesting to listen to? How about script kiddies? How about foreign governments?

The only way for me to make sense of this argument is to give “vulnerability” a meaning that is almost completely divorced from the risk of exploitation that it presents.

So law enforcement access features on telecom equipment might constitute a vulnerability, but one with a very low risk of exploitation.

But then it’s not a question of whether we’re vulnerable, but rather a question of what risk we’re running.

Daniel October 6, 2014 10:38 PM

@Skeptical.

I don’t think the burden of proof is on Bruce. Bruce has made a claim and then provided some examples as evidence to back up that claim. If someone wants to contradict that claim, the burden of proof is on them. However, one of the major problems with deciding the claim–viz., whether it’s routine or not–is that hacking by definition is an underground activity. I make this point all the time in another context. Crime statistics don’t measure the actual incidence of crime. Crime statistics measure only the crime that got reported. And legal cases only measure crimes that got reported and where the person was caught. It may be that hackers using legal backdoors is in fact routine but they rarely get caught doing so. So I’m not sure that statistics are useful in settling the debate.

As for the nature of the internet itself, you are once again confusing the ability to exploit with the fact of exploitation. The fact that the Iranian centrifuges were not equally vulnerable to everyone is not material. The fact is that they were vulnerable. Whether that vulnerability is more or less “risky” is also not material, because risk is always a matter of subjective opinion. In fact, one of the things I appreciate most about Bruce’s essay is that it doesn’t go down the rabbit hole of “risk”. Everyone is vulnerable or no one is–risk has nothing to do with it.

Buck October 6, 2014 10:58 PM

@Skeptical

Why does a vulnerability to one threat imply vulnerability to all threats?

It might become more clear if one were to properly consider some of the major obstacles.
Think seriously about the implications of contact-chaining, large-scale interdiction networks, PEBCAK, poorly audited mass surveillance institutions, vulnerable humans & personal communication systems…
On the other hand, why think so much? What could possibly go wrong when giving a hand-selection of mortals all the keys to the kingdom..? (Or leverage over the other key-holders)

drouse October 6, 2014 11:15 PM

First thought: why, even though they made it, should Apple have the ability to defeat the security on a device that I own? How is that any different from the key escrow system that got the Clipper chip ridiculed out of existence?

Second, my personal opinion is that this is the start of a battle over the abuse of the third-party doctrine. The tech companies have been catching a lot of crap because of compliance with government demands that use it as legal justification. They’re getting blamed for not resisting enough, even though we can’t know whether they did because the proceedings are secret. Look how long it took for information on Yahoo’s efforts to seep out into public view. Secrecy favors the government, and this might be Apple’s attempt to force the fight into the open.

drouse October 6, 2014 11:39 PM

I should have said abuse of the doctrine in the national security context. Especially since a secret court issuing classified opinions has allowed the IC to demolish any effective legal barriers between intelligence and law enforcement.

Пиздец October 7, 2014 12:10 AM

Skeptical, happy as a pig in shit chopping logic with that puerile good guy/bad guy dichotomy, and burbling happily about lawful surveillance in a regime that preemptively pardoned millions of government felonies and annulled its residual law.

It’s tempting to dismiss skeptical as performance art. He was last seen chirping about getting their ass kicked in yet another war. Nobody’s that stupid, right? But remember, this is a state that decorated the deck ape who shot down a civilian airliner. And that was before Iraq got too embarrassing and the services had to scrape the bottom of the barrel for felons, white supremacists, dual-diagnosis basket cases and the cream of the special-ed crop. This is a mafia state. And the whole steaming load of them are caught up in rationalizing each cascading failure and each new crime until everybody has skeptical’s affectation of dim-witted insouciance.

OldFish October 7, 2014 12:25 AM

From whence comes the concept that our communications must be in a form that can be archived and reviewed at will? Isn’t PFS, and the analogy with a face-to-face chat, far better for all of us? What is the characteristic that justifies treating telecommunications differently from face-to-face private chats? Physical distance? I guess technology is empowering, and I would prefer the power be in the hands of all citizens rather than a small, despotic ruling class.

keiner October 7, 2014 1:44 AM

“There are literally thousands of encryption products without back doors for sale, and some have been around for decades. ”

Hmmm, any proof for this claim? I’m not sure nowadays. Apple and M$ as examples of safe crypto, that’s a long shot…

Winter October 7, 2014 2:10 AM

“Law enforcement has been complaining about “going dark” for decades now. ”

We know that law enforcement has had historically unprecedented access to detailed information about everything we do. Did it reduce crime rates or increase the rate of cases solved?

If not, why would going dark make a difference?

The point is that 9/11 would not have been stopped by a panopticon on every communication. We can say this because everything the TLAs needed to know was already known. They just could not bring themselves to share it, or even interpret it.

The same goes for law enforcement. Crime rates did not change in lockstep with FBI & friends’ access to all our digital deeds. Whatever they do with the information they collect, it is hardly preventing or even solving crimes.

QnJ1Y2U October 7, 2014 6:01 AM

@Skeptical

Is “vulnerability” a binary characteristic, in that one either is vulnerable to X or one is not, or are there degrees of vulnerability?

From some folks, that would look like a legitimate question. From you, it just looks like a disingenuous strawman. Unlike Bruce’s article here, we’re not addressing a general audience, so we can re-phrase the general principle:

If a system has a vulnerability, then that system is vulnerable to anyone with the resources to exploit it.

This should look familiar. Even your silly kidnapping example fits the formulation. The problem, as Clive notes, is that the nature of information objects and the internet expands the number of people with the resources to exploit a vulnerability by several orders of magnitude.

For an example, let’s suppose that Apple creates a key-escrow system for the iPhone, and then builds a database of ‘golden keys’ that can be used to decrypt those iPhones. They are now vulnerable to:
– Anyone with a court order
– Anyone who can fake a court order
– Anyone whose job is to execute the court orders
– Anyone that Apple decides can have access outside a court order (NSLs)
– Anyone at an agency that gets a copy of the database (NSA)
– Any sysadmin at Apple who can access the database
– Anyone at the factory in China who is feeding the golden keys into the database
– Anyone with access via a FOREIGN GOVERNMENT which requires Apple to supply it the golden keys as a requirement for entry into their market
– Anyone that can bribe or extort anyone else in the list
– Anyone who can break into a system that holds the golden keys
– Anyone who can find an exploit in the key-escrow system
– Anyone who can follow an exploit found by someone else

And so on. And thanks to the expanding nature of attacks, leaks, and laws, the list will grow over time.

At some point, the effective difference between ‘anyone on the list’ and ‘anyone’ becomes immaterial – vulnerable systems are vulnerable.

Bob S. October 7, 2014 6:03 AM

Re: “Policies”

We’ve all been concerned with privacy policies and have come to understand that corporate fine print is intentionally written in such a way as to make sure the intended targets have none.

Lately, we’ve come to know that security policies are likewise no more than gibberish, except to exonerate the corporation when something inevitably goes wrong, like losing the credit data of a hundred million people at a time.

Meanwhile, the various government agencies, eager to join in the data-feast orgy, implicitly and explicitly go blind when it comes to enforcing any kind of laws that might pertain to these egregious losses as they occur.

Last, fat pig elected politicians only take action when their pockets are filled with bribes in the name of donations.

Dudes, we are on our own.

Deal with it.

mike~acker October 7, 2014 6:37 AM

Surveillance is used by governments to suppress dissent.

Suggested Reading: No Place to Hide (Glenn Greenwald)

Examples of Dissent:

American Revolution 1776
Women’s Suffrage
Civil Rights — 1950-1968
anti-Vietnam war

*Excerpt from _No Place to Hide_ *

“No matter the specific techniques involved, historically mass surveillance has had several constant attributes. Initially, it is always the country’s dissidents and marginalized who bear the brunt of surveillance, leading those who support the government or are merely apathetic to mistakenly believe they are immune. And history shows that the mere existence of a mass surveillance apparatus, regardless of how it is used, is in itself sufficient to stifle dissent. A citizenry that is aware of always being watched quickly becomes a compliant and fearful one.”
NO PLACE TO HIDE Glenn Greenwald, p.3

Skeptical October 7, 2014 8:29 AM

@Daniel: I don’t think the burden of proof is on Bruce. Bruce has made a claim and then provides some examples as evidence to back up that claim.

So now the examples are not simply illustrations, but should be viewed as evidence for his claim?

Then the problem remains that they’re no more sufficient than any other anecdote – they don’t persuade me that “bad guys” are routinely accessing, e.g., telecommunications equipment designed to permit only authorized access by the government.

If someone wants to contradict that claim the burden of proof is on them.

No, the burden of persuasion is on the individual who introduces the positive claim. In this case, that’s the claim I just described above. The examples aren’t sufficient evidence for the truth of the claim.

However, one of the major problems with deciding the claim–viz., whether it’s routine or not–is that hacking by definition is an underground activity. I make this point all the time in another context. Crime statistics don’t measure the actual incidence of crime. Crime statistics measure only the crime that got reported. And legal cases only measure crimes that got reported and where the person was caught. It may be that hackers using legal backdoors is in fact routine but they rarely get caught doing so. So I’m not sure that statistics are useful in settling the debate.

Sure, it could be something difficult to show.

As for the nature of the internet itself, you are once again confusing the ability to exploit with the fact of exploitation. The fact that the Iranian centrifuges were not equally vulnerable to everyone is not material. The fact is that they were vulnerable. Whether that vulnerability is more or less “risky” is also not material, because risk is always a matter of subjective opinion. In fact, one of the things I appreciate most about Bruce’s essay is that it doesn’t go down the rabbit hole of “risk”. Everyone is vulnerable or no one is–risk has nothing to do with it.

I strongly disagree. Because we are deciding between tradeoffs, i.e. the benefits of having government be able to access communications with proper authorization and the costs of ensuring that access via a feature enabling it, the risk posed by that feature is very much central to the issue.

drouse October 7, 2014 9:23 AM

@Henry Ford

That is somewhat of a leap in logic. I don’t see where I implied trust in these companies. I think that these companies are suffering damage to both the bottom line and to their reputations(whether these reputations are deserved or not) and the abuse of the third party doctrine is at the root of it.

Shannon October 7, 2014 10:41 AM

@Clive Robinson
re: “Thus “encryption” is incorrectly associated in people’s heads with being the equivalent of a “safe”, and it is actually very, very far from this. It’s not possible to take a safe away and work tirelessly for years or decades on it without the owner noticing. Not so with an information object such as a password file, as we have seen with several decades of such attacks.”

Exactly, and it is very true that people incorrectly associate encryption with a type of “safe.” Well put.

Incredulous October 7, 2014 11:00 AM

To the list:

There is really no point in arguing with Skeptical. Anything that supports his predetermined pro-government, pro-surveillance, pro-war position is unquestionable; anything that doesn’t, he just flatly says is untrue. His generalizations and suppositions are unquestionable and yours are invalid.

Besides that, his arguments glance around the edges of most of the questions. Diversions. I think it is pointless to follow him there. Why allow him to set the terms of the discussion? His long ponderous equivocations leave no long term impressions in anybody’s mind. We only give them credence by acknowledging them.

This is not to say that it isn’t fun to dissect his absurdities and rolling contradictions from one post to the next. But it is pointless to address these observations to him because HE HAS NEVER ONCE acknowledged one.

If we think that the US doesn’t learn from experience, perhaps we should try to learn from our experience as well. Skeptical is the ISIS of this group, chopping the head off of what we know is true simply to get us to engage.

Clive Robinson October 7, 2014 12:29 PM

@ Incredulous,

This is not to say that it isn’t fun to dissect his absurdities and rolling contradictions from one post to the next.

Tut tut, you left out his propensity for arguing from a position he arrived at by false assumption 😉

It’s not the first time, and I’m sure not the last, that he will do this, which makes me wonder about his real scientific or technical ability. He can always, as the old army joke has it, “Watch and learn, Watch and learn” down on the “range” of life.

Overall it’s a shame, as although I often don’t agree with his position, he does fight his corner with a polite tenacity, which at times raises interesting points to contemplate to see if one’s own position is valid.

SmokingHot October 7, 2014 3:06 PM

In regards to this:
https://www.techdirt.com/articles/20141006/01082128740/washington-posts-braindead-editorial-phone-encryption-no-backdoors-how-about-magical-golden-key.shtml

How to resolve this? A police “back door” for all smartphones is undesirable — a back door can and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key they would retain and use only when a court has approved a search warrant. Ultimately, Congress could act and force the issue, but we’d rather see it resolved in law enforcement collaboration with the manufacturers and in a way that protects all three of the forces at work: technology, privacy and rule of law. — WP

Did you get that? No “back door,” but rather a “golden key.” Now, I’m not sure which members of the Washington Post editorial board are engaged in mythical “golden key” cryptography studies, but most folks who have even the slightest understanding of technology ought to recognize that what they basically said is: “a back door is a bad idea, so how about creating a magic back door?” A “golden key” is a backdoor and a “backdoor” is a “golden key.” The two are indistinguishable, and the Post’s first point is the only accurate one: it “can and will be exploited by bad guys, too.” That’s why Apple and Google are doing this. To protect users from bad guys.

  1. a “golden key” can also be a known software vulnerability, complex enough that it is unlikely to be found independently; and when that vulnerability needs to be closed, they can have someone “find” it and so get it closed

  2. the commenter is correct, ultimately this is a backdoor, regardless of what any of them actually truly meant

  3. the governments are the bad guys

Nick P October 7, 2014 9:09 PM

@ Bruce

Great article with plenty of good links. I still disagree, though, on the backdoor. I previously pointed out that we build secure backdoors all the time: they’re called remote administration tools. SSH, properly configured, does a pretty good job of letting in only authorized parties. I also proposed how to do it even better, in a way that works for both parties. Having no intercept capability when the courts all agree one is legal is a non-starter. So, we need to work on ways to give courts something without giving them everything. That’s my contribution, anyway.
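Concretely, a minimal sketch of the kind of locked-down configuration I mean (these are standard OpenSSH sshd_config directives; the usernames are placeholders, and real deployments need more than this):

```
# /etc/ssh/sshd_config (excerpt) -- narrow the "backdoor" to named key holders
PubkeyAuthentication yes
PasswordAuthentication no   # keys only; nothing guessable
PermitRootLogin no          # no direct root access
AllowUsers alice bob        # explicit allow-list of accounts (placeholder names)
MaxAuthTries 3              # few attempts per connection
LoginGraceTime 30           # drop unauthenticated connections quickly
```

The point isn’t that these six lines are sufficient, only that the mechanism is explicitly designed to admit a named few and nobody else.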

re Apple Security

This is a joke. There are plenty of ways in for the FBI or NSA. If anything, I think they’re just posturing to avoid being inconvenienced, and to misdirect from the fact that they already have ways in. We also know, thanks to leaks, that the NSA hacks targets like iPhones with a 100% success rate, shares with the FBI, and that they use parallel construction. So, if they could still get in, they’d lie and try to make it look like the person was busted another way.

The whole thing is marketing as far as domestic TLAs are concerned. The change might be good against foreign TLAs or general black hats, though. More protection by default is always a good thing.

Anura October 7, 2014 9:33 PM

@Nick P

There have been major improvements in the iPhone 6 with regard to encryption, and they do deserve credit. They use an encryption key that is written directly into the processor and can only be used with encrypt and decrypt commands, and then they combine AES and PBKDF2 so that you have to use the phone itself to attempt passcode guesses. They also encrypt files by default. That is a lot of effort, but they also deserve some criticism:

1) They haven’t published how they combined AES and PBKDF2.

2) While the key is stored in the processor, it is unlikely that will stop anyone who has access to something like an electron microscope. It will, however, protect you if your phone is stolen and you have a strong-ish passcode.

3) The hardware is not open, and we have no way to verify that there are no backdoors that would allow a TLA to access the AES key

4) While file contents are encrypted using a key that is encrypted with a key that is derived from the passcode and the master key, metadata is only dependent on the master key and can be decrypted without the passcode.

More importantly, they failed to solve the biggest problem: passcodes on phones are too inconvenient to make secure. The real focus (well, alongside making devices more open) is finding an alternative to passcodes that offers much greater security with a decent amount of convenience.
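As a rough illustration of the design described above, here is a Python sketch (the `DEVICE_UID` constant, the HMAC mixing step, and the iteration count are placeholder assumptions for illustration, not Apple’s published construction): the passcode is stretched with PBKDF2 and then tangled with a secret that never leaves the device, so brute-force guesses are only useful on the phone itself.

```python
import hashlib
import hmac

# Illustrative only: DEVICE_UID stands in for the key fused into the
# processor. In the real design it never leaves the hardware AES engine;
# here it is a fixed placeholder so the sketch runs.
DEVICE_UID = bytes.fromhex("ab" * 32)

def derive_key(passcode: str, salt: bytes, iterations: int = 100_000) -> bytes:
    # Step 1: slow, iterated stretch of the passcode (PBKDF2-HMAC-SHA256).
    stretched = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)
    # Step 2: tangle the stretched passcode with the device-bound secret,
    # so the derived key is useless off-device even if the passcode leaks.
    return hmac.new(DEVICE_UID, stretched, hashlib.sha256).digest()

key = derive_key("1234", b"per-file-salt")
assert len(key) == 32
```

An attacker who extracts the encrypted files but not the device secret cannot run this derivation offline, which is the whole point of the tangling step.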

Nick P October 7, 2014 9:52 PM

@ Anura

Ahh. I had heard they put the crypto on-chip. That is a great improvement to security, so long as the crypto is implemented properly and there are no bypasses. We know through leaks and Appelbaum that the NSA has attacks on wireless stacks, so there could still be bypasses black hats can hit through such external I/O. As far as physical theft goes, it would be an improvement. However, most data theft seems to happen through subverted apps or to be Internet-related (e.g. cloud attacks).

To be honest, the security of Apple devices has so many unknowns for me that I usually say “Who knows. They’ve done some solid improvements but they have a horrible track record for patching and security engineering. This is the company that designed an admin login that asked for username & password, but accepted any password. Among other things. They don’t seem to really care about security or their users: just selling high priced, flashy, fun things.” I still stand by it while giving them extra credit for what you mentioned.

Thoth October 7, 2014 10:16 PM

@Nick P
SSH is actually a bad way of securing access. The reason is simply that people don’t know how to manage their SSH installations. The idea of SSH has come to be associated with just that black command line with some text, where you key in a password or use your SSH keys to log in. People are bad at configuring access control on SSH; they do not understand how to configure it, or the concepts behind it, or why they use SSH at all other than that it provides remote access to a server on the other side of the world. People leave their keys lying around, and the visibility of key ownership and access is a huge headache.

All in all, it’s a human/management issue.

SSH Communications has released a product called Universal Key Manager for visibility of SSH keys and limited SSH key control, as long as the machines are registered with the UKM server.

But still, the problem is not technology but human/policies/management issues.

Giving remote access capability is not impossible (imagine a modified DUAL_EC_DRBG that has been “cleaned up” with fixed values as “The Golden Key”, which is simply the asymmetric key of whoever holds it). This would give an SSH-like remote access capability into any system using an improved DUAL_EC_DRBG v2, or whatever version or name they want to give it.

To go another step further, they could mandate that all cryptography or crypto-capable systems have this Golden Key value baked right in, or be considered illegal and dealt with as a serious, heinous offence (an aiding-the-enemy verdict, or something of that seriousness, up to and including capital punishment, i.e. state execution).

The problem has never been the technology, but the people and the policies.

There is a saying (in Chinese) that if you give them an inch, they take a ruler. It is best not to give an inch, or they will take a meter.

But after all these are just our personal views and our standpoint influenced by our experience of our own realities.

@Anura
PBKDF2 sounds like a bad idea. They should have gone with a mix of bcrypt and scrypt. The security of their encrypted filesystem and encryption system should not be taken seriously. If they (Android/Apple) are to be serious about their crypto, they would have to employ a baked-in HSM like an on-board Secusmart chip (https://www.secusmart.com/en/) that, ironically, passes FIPS 140-2 Level 3 testing…

Security should be done at the highest assurance level, and also at the most practical level available. Human interference, whether due to malice or accident, should likewise be kept to a minimum in a high-assurance setting.
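For illustration, a small Python sketch of the PBKDF2-versus-scrypt difference (the password, salt, and parameter values are illustrative only, not a recommendation): PBKDF2 is purely CPU-bound, so an attacker with many GPU cores parallelizes guesses cheaply, while scrypt’s cost parameters force each guess to touch a configurable amount of memory.

```python
import hashlib

password = b"correct horse battery staple"
salt = b"unique-per-user-salt"

# PBKDF2: CPU-bound only; cheap to parallelize on GPUs/ASICs.
k1 = hashlib.pbkdf2_hmac("sha256", password, salt, 100_000)

# scrypt: n (CPU/memory cost) and r (block size) force each guess to
# touch a large working set of RAM, which is what raises attacker cost.
k2 = hashlib.scrypt(password, salt=salt, n=2**14, r=8, p=1, dklen=32)

assert len(k1) == len(k2) == 32
```

With n=2**14 and r=8, each scrypt derivation touches roughly 16 MB of RAM (128 × r × n bytes), which is exactly the property that makes large-scale offline guessing expensive.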

Nick P October 7, 2014 10:31 PM

@ Thoth

SSH may or may not be a great example. My overall point was that the INFOSEC industry has been building protected backdoors for a long time. SSH is one common tool; there are others. It’s strange to say we must be vulnerable to none or vulnerable to all when many common security tools are designed specifically to make us vulnerable only to the one or few with credentials. That this often works, even at the low end of assurance, means there’s probably a solution at the high end.

I can make it even simpler, though. There are proven methods for building computers secure against remote attacks, especially if applied at every layer. The methods essentially create a mechanism enforcing certain policies. A number were created that NSA pentesters couldn’t beat. A government backdoor is also a mechanism enforcing certain security policies. Likewise, we should be able to build a backdoor that does its job of allowing only specific parties access, and only to specific things. They’re essentially the same problem.

The simplest would be something enabling read access to the data, but not write access. That alone would ensure that, at best, they could pull information off the device after initiating the backdoor; they couldn’t write to it to modify its behavior (e.g. plant evidence). The next step, as I mentioned in my high-assurance L.I. scheme, is to organize the system so user data appears in a specific spot that the user can’t affect. This allows us to design an L.I. system that is read-only for that specific spot and just the relevant data, rather than e.g. system keys. The access becomes even narrower.

So, I think a high assurance backdoor can be created. I’ve built prototypes of such things for remote commands sent to systems I’ve designed. A backdoor for one isn’t necessarily a backdoor for all so long as a high assurance scheme is used on each end and physical/personnel aspects of each are managed well.

Thoth October 7, 2014 11:28 PM

@Nick P
Indeed, technology can be used to create high-assurance remote access control, but that does not prevent scope creep.

As we already know, Binney’s THINTHREAD provided privacy-assurance mechanisms for analysts to prevent misuse, but that did not stop Congress and the NSA from throwing it out in favor of their own failed, half-baked programs that compromise privacy (e.g. TRAILBLAZER), which cost billions of dollars and did not catch anywhere near the threats they were supposed to.

I feel that humans are still not responsible enough with the technology they operate, and we see machines and code as just tools, which explains my boolean notion of either a fully assured and secured environment from the ground up, or nothing.

01 October 7, 2014 11:46 PM

@skeptical

actually, most of your ‘examples’ support Bruce’s argument, not yours.
A plane vulnerable to hijacking, for one, is definitely vulnerable to hijacking by an arbitrarily large set of actors (that’s why we are so damn preoccupied with making planes harder to hijack).
Iranian centrifuges were vulnerable to any actor capable of making airgap-jumping malware (the claims that the malware in question was designed by a nation-state are merely plausible conjecture, as far as I recall).
And even the underwater cables aren’t by nature vulnerable only to nation states (at most, you could say that at this moment no non-government actor has the capability to covertly deploy the underwater payloads needed to tap those cables; but as science marches on, such capabilities will become available to an unpredictable number of actors due to improvements in unmanned submarine technology and reductions in cost).

So yes, either a security system is fundamentally flawed (and can be subverted by an ever-increasing number of actors) or it is fundamentally sound (and then neither a golden key nor a golden shower will help a third party get the data, even if it is a totally nice third party who only wants to save the kids or something).

You might be in favor of a flawed system because you believe it serves your interests better.

I am confident that my interests are best served by a fundamentally sound system.

Skeptical October 8, 2014 12:03 AM

@QnJ1Y2U: The problem, as Clive notes, is that the nature of information objects and the internet expands the number of people with the resources to exploit a vulnerability by several orders of magnitude.

Well, Clive put forward the classic set of conditions under which a given encryption might be tested.

However, that classic set is not always applicable.

For an example, let’s suppose that Apple creates a key-escrow system for the iPhone, and then builds a database of ‘golden keys’ that can be used to decrypt those iPhones. They are now vulnerable to:

– Anyone with a court order

Given your next example, I assume here you mean a valid court order. The answer is no, unless you include the support of the personnel and means needed to authenticate and effect that order. But in any case, this is a misuse of the term vulnerability, unless it’s proper to say that my email accounts are vulnerable to being accessed by an authorized user with the proper credentials.

– Anyone who can fake a court order

No, because the order itself isn’t enough. There are procedures and means of effecting the order and authenticating it – so one would need to fake all of those as well.

– Anyone whose job is to execute the court orders

Is your picture of how this works that of a guy with a piece of paper who walks up to a technician, shows him the paper, and then waits for his connection?

– Anyone that Apple decides can have access outside a court order (NSLs)

NSLs aren’t a good example, but that’s extraneous to your point. As to whether Apple would have the capability to unlock unilaterally, that’s true of particular designs, but it’s not true of all designs.

– Anyone at an agency that gets a copy of the database (NSA)

The database of keys? This is a possibility under certain types of designs. Other designs might include features that require factors additional to the key.

– Any sysadmin at Apple who can access the database

And if there were none who could do so without additional authorization? And if the particular key obtained could only be used once other factors were satisfied?

– Anyone at the factory in China who is feeding the golden keys into the database

Seems easy to avoid.

– Anyone with access via a FOREIGN GOVERNMENT which requires Apple to supply them the golden keys as a requirement for entry into the their market

If the keys were designed in such a fashion, and if Apple complied, yes.

– Anyone that can bribe or extort anyone else in the list

Making lots of assumptions about the design of the system in question, yes.

And so on. And thanks to the expanding nature of attacks, leaks, and laws, the list will grow over time.

It’s a list highly contingent on the specifics of a given system.

The kind of list you’re producing here consists of a series of potential single points of failure for a given security structure. However, the structure need not be designed in such a fashion, and the resources needed to place any one of those points in jeopardy can be placed well out of the reach of most.

So my point again is simply that the risk posed by having authorized access features depends on the design of the system in which the feature exists.

Indeed, moreover and equally importantly, the risk depends on the nature of the political, cultural, and legal systems and institutions in which the feature exists as well.

My other concern is that in simply listing vulnerabilities we’re not really coming to grips with the risks inherent in each. But it is risk that enables us to assess cost, and we need cost to compare to benefits, and thereby arrive at a determination (this oversimplifies greatly, obviously, since factors being considered here are not necessarily commensurable).

Figureitout October 8, 2014 12:10 AM

Incredulous RE: “the Skeptical one”
–The main frustration for me is that he/she/it ignores evidence being shoved in its face, then asks for more evidence, backs up its own claims with nothing but PR statements, then acts like it is actually smart while posting little but quotes and politics/law nothingness. Then there’s the insincerity and straight ass-kissing, and being so sure of things that aren’t possible to prove. He/she/it is very likely a public affairs person who thinks he/she/it is actually smart. Let me tell you, from 1st ‘degree’ experience, it’s a worthless degree and I will tell everyone I meet to go into science or don’t go to school; otherwise it’s a scam, don’t fall for it.

I was waiting for Clive Robinson to come in and yet again intellectually own the dumb comparisons he/she/it made between the physical and digital worlds when questioning whether backdoors would eventually be compromised, as Clive has “banged on” about this a lot here. He/she/it just doesn’t have a clue how technology (i.e. a global network) can fail in so many ways, nor that all data can be observed and copied.

It is pointless to engage, except to call him/her/it the idiot him/her/it is. Maybe he/she/it needs an actual demonstration of what we’re saying…? Maybe it’s too late; I recall when his/her/its posts got caught up in the spam filter..? lol..that would be hilarious…

Figureitout October 8, 2014 12:22 AM

Incredulous
ADD-ON
–Simple physics: determining what’s going on depends on your “frame of reference”. So LEOs operating under the assumption that there aren’t any foreign agents, inside leakers, or amateur attackers compromising a facility or network can’t guarantee that a given “frame of reference” is visible from a vector in even just 3 dimensions. The simplest compromise is just observing the protocols/key gen/crypto in real time!

Anura October 8, 2014 12:33 AM

@Thoth

RE: PBKDF2

Well, the point is that you are supposedly forced to crack it on the phone (although, as I stated before, that’s probably not the case for TLAs), so they chose an execution-time target of 80ms; provided you are forced to use the phone, the algorithm doesn’t matter much. Which brings up another question: why only 80ms? It’s a phone lock screen, it’s not like you need high throughput, so why not 1 second?

I agree regarding PBKDF2 overall, though, since your assumption should be that the encryption key will be recoverable with physical access. They should use bcrypt, make it take 1 second (and preferably implement as much in hardware as possible), and figure out how to get more secure lock screens. A 6-digit PIN at 80ms, even if chosen completely at random, means 11 hours to crack on average (22.22 hours maximum).

The thing that needs to be asked is how much entropy is needed, and how do we get that much entropy in a convenient and memorable way?
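Anura’s crack-time arithmetic above is easy to sanity-check with a few lines of Python (a sketch; the 80 ms-per-guess figure is the one quoted in this thread):

```python
# Brute-force time for an n-digit PIN when each guess costs a fixed
# key-derivation delay (e.g. the 80 ms discussed above).

def crack_time_hours(digits: int, seconds_per_guess: float) -> tuple[float, float]:
    """Return (average, maximum) hours to exhaust an n-digit PIN space."""
    keyspace = 10 ** digits
    max_seconds = keyspace * seconds_per_guess
    return (max_seconds / 2) / 3600, max_seconds / 3600

avg, worst = crack_time_hours(6, 0.080)   # 6-digit PIN, 80 ms per guess
print(f"80 ms/guess: avg {avg:.2f} h, max {worst:.2f} h")

avg1, worst1 = crack_time_hours(6, 1.0)   # same PIN, 1 s per guess
print(f"1 s/guess:   avg {avg1:.2f} h, max {worst1:.2f} h")
```

At 80 ms per guess the 6-digit space falls in about 11 hours on average (22.22 hours worst case), matching the numbers above; raising the target to 1 second pushes the worst case to roughly 278 hours.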

jdgalt October 8, 2014 12:48 AM

We are all on our own, but even if Apple’s iOS can’t be trustworthy because it’s closed-source, it’s still useful as a test of government’s tolerance. Will FCC or some other three-letter agency revoke the iPhone’s type-approval under CALEA? Will anybody go to jail? Most important, will there be a court case that can set a precedent giving us all the civil right to use whatever crypto we want on our communications?

Even if the courts are prevented from considering this question, or are unwilling to give a straight answer (just as SCOTUS managed to refrain from hearing any challenge to gun control laws for 70+ years!) we can expect to see a practical answer in the form of whether this gadget, or good crypto apps for Android or other phones, appear on the market and stay available there. It’s important that the law allow such things, both in theory and practice, if good crypto is to become used widely enough that using it will no longer be reason for the feds to investigate people.

In the meantime I suggest technical folks continue to push the envelope by solving the gaping holes in common “security” protocols such as SSL and DNS. Good peer-to-peer versions of both would lock snoops out of a lot more places, including many that non-technical users now believe are secure.

Thoth October 8, 2014 12:53 AM

@Anura
I would assume that once you have physical access to a phone, you would copy its content out and crack its encryption like any other offline attack does. Trying to attack a system on itself is dangerous. What if that system has login traps like … wipe the phone after 5 wrong attempts? My presumption is the attacker has access to the phone and can copy/read content. If they can read the content, they would likely propagate it across some form of networked server system to attempt to crack it, and this is where a mixed use of SCRYPT (to inconvenience the use of GPUs) and BCRYPT (to inconvenience the use of CPUs and FPGAs) comes in. Let’s say we alternate 2000 rounds of SCRYPT and BCRYPT 5 times over, which makes 10000 rounds of each; this would provide significant stopping power against GPUs, CPUs and FPGAs.

Entropy is difficult to achieve on a small device, but the Tinfoil Chat project has noise generator schematics which are useful for generating entropy. As Nick P loves to say, a dice throw combined with a CSPRNG would suffice. I guess you could also keep a few entropy pools and use them at mixed rates (like Fortuna PRNG does).
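Thoth’s alternating-KDF idea can be sketched in a few lines. Python’s standard library has scrypt but no bcrypt, so PBKDF2-HMAC-SHA256 stands in for the BCRYPT stage below; the round counts and cost parameters are illustrative, not tuned recommendations:

```python
import hashlib

def chained_kdf(password: bytes, salt: bytes, stages: int = 5) -> bytes:
    """Alternate a memory-hard stage (scrypt, hurts GPUs/ASICs) with a
    CPU-bound stage (PBKDF2 here, standing in for bcrypt)."""
    key = password
    for _ in range(stages):
        key = hashlib.scrypt(key, salt=salt, n=2**14, r=8, p=1,
                             maxmem=2**26, dklen=32)        # ~16 MiB per call
        key = hashlib.pbkdf2_hmac("sha256", key, salt, 2000, dklen=32)
    return key

k = chained_kdf(b"correct horse", b"per-device-salt")
print(k.hex())
```

Because the stages are chained serially, an attacker must pay both the memory cost and the CPU cost for every guess; neither a GPU farm nor an FPGA alone is well suited to the whole pipeline.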

Anura October 8, 2014 1:01 AM

@Thoth

You can’t just copy the data off and crack it from there, the key is physically written to the crypto processor and can’t be read (supposedly); this is to force you to use the phone to crack it. Also, I would think you would install custom firmware to avoid any software mechanisms, as well as kernel access restrictions.

As for entropy, I was referring to passcodes for lockscreens, not the phones themselves. If you want a CSPRNG, you can use the phone camera, microphone, radio antenna, accelerometer, touchscreen sensor, and combine it with various software metrics.
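The mixing Anura describes boils down to hashing several independent sources into one seed. A minimal sketch; the phone sensors are simulated with `os.urandom` here, since real camera/microphone APIs are outside the scope of this example:

```python
import hashlib
import os
import time

def gather_seed(sources) -> bytes:
    """Fold (label, sample) pairs plus timing jitter into a 32-byte seed."""
    h = hashlib.sha256()
    for label, sample in sources:
        h.update(label.encode())
        h.update(sample)
    h.update(time.perf_counter_ns().to_bytes(8, "little"))  # scheduling jitter
    return h.digest()

simulated_sensors = [
    ("camera-noise", os.urandom(64)),   # stand-ins for real sensor reads
    ("mic-noise", os.urandom(64)),
    ("accelerometer", os.urandom(16)),
]
seed = gather_seed(simulated_sensors)
print(len(seed))  # 32 bytes, enough to key a CSPRNG
```

Hashing the concatenation means the seed is at least as unpredictable as the strongest single source, so a few weak sensors plus one good one still yield a usable seed.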

Thoth October 8, 2014 2:50 AM

@Anura
We would assume that Apple uses the de facto common approach of most disk encryption, which is to stretch a low-entropy password (CIK) and use it to encrypt a hopefully high-entropy crypto key (the AES key in the chip); this high-entropy crypto key (DEK) is what actually encrypts the data.

If it is a common hacker, they could simply copy ciphertext data out with some “cheap hacks” like jailbreaking and that sort of thing. A more sophisticated attack vector would be to access the chip’s DEK. It is known that Apple has some “unofficial fun stuff”, and who knows if there might be an undocumented way to directly interact with the chipset before the option of prying open the phone is considered. Even if the option of prying open a handset to retrieve the chipset DEK (who knows if Apple might actually have a record of all the DEKs too?) is used, there is nothing stopping the DEK from being copied.

As Bruce has always ranted about how attacks get more sophisticated, and about TLA access, we can’t simply assume that the DEK cannot be copied, or at least not easily. We should aim for security to the extent of inconveniencing TLAs (though not totally excluding them), since attacks only get better.

Let’s assume somehow the DEK gets extracted by some sidechannel, unofficial call functions, TLA style toolkits that can read chips and disk data… etc…

With the DEK and ciphertext in hand, the first step is to decrypt the CIK-protected DEK. If the password stretching is badly configured, it can be brute-forced, which will reveal the plaintext DEK (DEKs are keys, so they are pretty small, and you can just keep brute-forcing). With the DEK, it’s over.

I have been suggesting building HSM modules into phones (in some of my comments) to prevent such a thing from happening. I am not sure how an Apple product would react if you somehow went for the DEK. Will it have an HSM-like tamper detection feature to zeroize the DEK? Who knows.

Your lock screen passcode (effectively your password/passcode/pattern/picture/etc.) is going to be stretched into the CIK that protects your DEK. I would expect it to be low entropy, for human memory purposes. A pattern-based login only has 9 nodes to draw your pattern over; how much entropy can that afford a lock screen? Even if it is a text-based password or numerical PIN, the entropy is still very low. Try typing a 24-character strong password into a login field on a phone screen and you would simply give up on it and change it to an 8-character one.
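The CIK/DEK layering described above can be modelled in a few lines. This is a toy sketch, not Apple’s actual design: PBKDF2 plays the password stretch, and an HMAC-keystream XOR stands in for a proper (authenticated) key-wrap mode:

```python
import hashlib
import hmac
import os
import secrets

def stretch(passcode: str, salt: bytes) -> bytes:
    """Low-entropy passcode -> key-encryption key (the 'CIK')."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def xor_wrap(kek: bytes, dek: bytes) -> bytes:
    """Toy wrap: XOR the DEK with an HMAC-derived keystream (symmetric)."""
    stream = hmac.new(kek, b"wrap", hashlib.sha256).digest()
    return bytes(a ^ b for a, b in zip(dek, stream))

salt = os.urandom(16)
dek = secrets.token_bytes(32)      # high-entropy key that encrypts the data
kek = stretch("1234", salt)        # what the user's passcode becomes
wrapped = xor_wrap(kek, dek)       # what actually sits in storage

recovered = xor_wrap(stretch("1234", salt), wrapped)
print(recovered == dek)  # True: the right passcode unwraps the DEK
```

The weak link is exactly the one Thoth points out: an attacker holding `wrapped` only has to brute-force the passcode space, not the 256-bit DEK, which is why the stretch cost and the passcode’s entropy matter so much.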

Anon October 8, 2014 3:22 AM

@Skeptical • October 8, 2014 12:03 AM

“– Anyone with access via a FOREIGN GOVERNMENT which requires Apple to supply them the golden keys as a requirement for entry into their market”

“If the keys were designed in such a fashion, and if Apple complied, yes.”

And then China and Saudi Arabia would have the power to demand from any company that you either give us the golden key, or your product is banned from our market.

Apple acts as an international corporation and can’t selectively dish out golden keys only to American law enforcement.

So any backdoor must be available to China, Saudi Arabia and Russia.

If the Chinese government has arrested a Tibetan activist, and is unable to crack his phone containing names of human rights activists, Apple will get blood on its hands for assisting a foreign government in human rights violations.

Thoth October 8, 2014 6:08 AM

@Anon
And that’s why it is always a bad idea to have Golden Keys, Special Tunnels or Unknown Doors of sorts in the first place.

If there is a need for security, do it properly in the first place. A half baked effort is a waste of time, effort, resources and …. blood ….

So many security products are not even secure in the first place, or are made by people who do not know much about security. They simply see crypto as the magic key to all problems, and that is a sad state of affairs.

Clive Robinson October 8, 2014 6:40 AM

@ jdgault,

Will FCC or some other three-letter agency revoke the iPhone’s type-approval under CALEA?

That is actually an interesting question about the “Demarc”.

In theory CALEA is limited to access to the “communications equipment” and thus does not reach past the demarc. That is, your home PC is not considered communications equipment because its purpose is to be used as a stand-alone data processing unit. The fact that it can optionally be connected by another device –a modem– to a communications network does not of necessity make it “communications equipment”; the demarc falls at the modem connection.

In a mobile phone you have a “baseband processor” for basic phone functionality under the control of the SIM, which has its own processor etc. In a smartphone you additionally have another processor that gives the phone its smarts. This processor is the equivalent of the PC, and the baseband processor the equivalent of the modem.

The question is: will a judge see it this way, or conclude that, as the two processors are on the same PCB or in the same case, the demarc actually falls past the PC-equivalent processor?

If they do then this brings up a whole can of worms about pads and laptops, and even USB thumb drives.

It’s probably the next “secret court” battle ground which will lead to adverse decisions as the likes of FISA appear to accept without question a TLA legal argument…

The UK has already decided this with RIPA: as far as it’s concerned there is no demarc for snooping; it always falls on the other side of wherever the interesting data is, even in that stand-alone air-gapped system in a security vault, because at some point it will have been, or will be, connected (albeit indirectly) back to a system that in turn connects to a data network connected to a UK network…

Anon October 8, 2014 6:42 AM

@Thoth • October 8, 2014 6:08 AM

Yes, golden keys or backdoors are a very bad idea even assuming that their use is limited to complying with ‘legal’ requests.

They are dangerous for the individual who may get tortured or killed for owning a backdoored product, and tortious for the company handing over the information to a government requester.

If the Chinese, Saudi or Russian government know that there is a backdoor open for the US government, Apple can’t treat any legal foreign government differently but must hand over the key provided the request complies with local law.

If local law states that blasphemy or criticism of the head of state is a death penalty worthy crime, and a Chinese, Saudi or Russian dissident is refusing to hand over his key, a US or western company will be in a quandary.

If it complies, and information found on the decrypted phone helps the police to locate victims who get tortured or killed, the western corporation may well be sued or at least held morally responsible.

Apple can never afford to be put in a situation wherein any backdoor can be seen as aiding a foreign government’s oppression of its people.

A backdoor or golden key is an all or nothing policy and there is no way to limit its use to legal requests depending on civilized or democratic law enforcement.

For a Chinese, Saudi or Russian dissident, a golden key or backdoor in any encryption product is a warrant for death or torture at the hand of the secret police.

What if Apple or Google had been around in the 1930s, while we were still at peace with Nazi Germany?

Would a golden key have been available to Gestapo?

A foreign corporation could not be selective in its cooperation with the local government while still being permitted to sell its products.

Clive Robinson October 8, 2014 7:05 AM

@ Thoth,

I am not sure of the reaction of an Apple product if you somehow go for the DEK. Will it has HSM-like tamper detection feature to zeroize the DEK ? Who knows.

I very much doubt it will zeroize…

The reason is ‘(ab)users’ and keeping development, tech support and other related costs to a minimum.

They will tell users it’s not possible but there actually will be a way to get at the DEK from cold without the user authenticating.

The oldest real-world example of this mentality I can think of at short notice is the *nix root password.

If you installed *nix and for some reason you lost the root password, when you phoned tech support they would tell you correctly “sorry we cannot help you get it back” and then offer incorrect advice such as “you will need to reinstall the operating system”.

The real advice, depending on the *nix, was to start the installation process, and when the kernel was up, escape out into a root command shell, mount the /etc disk, and edit the root line in the password or shadow file using either ed or vi; depending on the vintage, either remove the existing password hash or make a new user hash and copy it over the root hash. Then reboot the system, change the root password to something new, and tidy up any other remaining bits and bobs.

This “never lock yourself out permanently” mentality is hard-baked into nearly all system / OS developers’ minds, due to the grief of not being able to do it during the development process when things go wrong.

So no, I don’t think the DEK will get zeroized, or any other authentication tokens / keys, unless a full factory reset is done, and maybe not even then.

Coward October 8, 2014 9:48 AM

Hi guys, does anyone actually understand how it is possible for one to host files on one’s servers and not be able to decrypt them? If you provide a service like iCloud, then when hosting the files don’t you always host the key as well? Or are they saying that data on a physical device is different from data in the cloud? That is a premise that is true today, but not tomorrow, when all data is in the cloud. Even if physical device data is secure, the government can force you to give your fingerprint, can’t they – unlike a password – so there are ways around that über-security that Apple is proposing for data on a physical device.

Is this the reality? Because it appears to me that it is:
– use any iPhone up to an iPhone 5, because government can force you to authenticate via your fingerprint.
– change PIN to an alphanumeric code. Really difficult to do. Most of us can remember only so little.
– once something is on iCloud or any server provider the Chinese, the Americans and whoever else has their hands on it. Create fake data so that when they compile a profile from you they get it wrong – from innocent lies like that you like to listen to country music when you are a death metal dude to finding ways to delude Google and Apple Maps. Make the profile data chaotic and unreliable.

A look at all the secure e-mail providers leads to the conclusion that none can give a 100% guarantee that they can’t access one’s e-mail. How can Apple with iCloud avoid that? Or is e-mail data different from phone and iCloud data?

Is the premise that the key is tied to the physical device and never leaves it? Then we have to trust Apple’s word that it is indeed tied to a physical device and not logged and transmitted to a 3rd party via their software, which neither we nor the ACLU can audit. Back to square one. All tech companies say that no backdoor is possible, until, oops, it is.

Additionally, I don’t understand how it is that this article states the key is your PIN / alphanumeric lock code, yet you can also unlock your iCloud data in some forms with your Apple ID. Therefore there are already 2 keys.

Maybe it’s a shell game, where Apple says that the PIN lock code is safe, and indeed it is, but allows NSA access via the Apple ID? So technically the data is safe but in reality the NSA can get a bird’s-eye view of it.

Sancho_P October 8, 2014 1:05 PM

On iPhone encryption and backdoors,

@ Anon wrote:

…, a US or western company will be in a quandary.
If it complies, and information found on the decrypted phone helps the police to locate victims who get tortured or killed, the western corporation may well be sued or at least held morally responsible.
… For a Chinese, Saudi or Russian dissident, a golden key or backdoor in any encryption product is a warrant for death or torture at the hand of the secret police.

Not sure if I got that right, you say the backdoor would lead to torture or death?

  • On the contrary, a backdoor would probably prevent torture, because they’d have access without the need for physical pressure.

So that’s not the real concern we must have.

The existence of a golden key == a backdoor for the powers means there is official access.

Therefore the powers can claim there is incriminating “evidence” on your device – and no one but God (if you believe in one) could prove the opposite, regardless of whether they placed the evidence there or it was never there at all.

This is no problem for you and me, but probably for a higher ranking official, judge, politician or any foreign (this will never happen in the U.S., granted !!!) dissident.

This is why the simple existence of an intentional backdoor is mad, it transforms a gadget into a weapon.

Think of vetting your brain using a special scanner, e.g. the kind they have at Gitmo. [1]

If there is no backdoor, then there is no such possibly tampered “evidence”.
That’s OK, let’s face it:
Criminal behavior is still behavior, not the content of personal “thoughts” (true or not).

It might be convenient for the police to find images of thugs at the crime scene.
This is a post crime convenience.

Society needs means to prevent crime in the first place!
This is a political duty.

[1] See, criminals have their legal backdoor anyway, especially in undeveloped societies.

@ Moderator: Thanks!

Anura October 8, 2014 1:27 PM

Can we stop calling it a “golden key”? Because I don’t think we should legitimize the WaPost article. It’s just key escrow.

Sancho_P October 8, 2014 2:09 PM

“Can we stop calling it …”
No probs, but what the WaPo wrote drives the wet dreams of skeptical national-capitalists.
😉

Anura October 8, 2014 3:08 PM

@Dan Hough

Here’s a detailed, point by point analysis:

“And even if the conversation is encrypted — in principle — it is still possible to decrypt [AES] provided you have sufficient computer power,” he says. This is in no small part due to the fact that the vast majority of telecommunications operators use the same encryption algorithm — the so-called AES, the outcome of a competition launched by the US government in 1997.

o_O

“This is where my invention comes in,” he says. It expands the AES algorithm with several layers which are never the same.

“When my phone calls you up, it selects a system on which to encrypt the conversation. Technically speaking, it adds more components to the known algorithm. The next time I call you, it chooses a different system and some new components. The clever thing about it is that your phone can decrypt the information without knowing which system you have chosen. It is as if the person you are communicating with is continually changing language and yet you still understand,” he says.

😐


tl;dr: basically, they don’t attempt to understand or address the real problem (keys are stored with the service provider, not the phones themselves); instead they say “AES can be broken if you have enough computing power, so we rolled our own system.” The only way to address the problem is for each phone to have a certificate secured in a way that the NSA can’t forge (probably notification that the key changed is the best you can do), and for each phone to generate ephemeral keys for each call.
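The per-call ephemeral keys Anura mentions are classic Diffie-Hellman. A toy sketch: the prime 2**255 - 19 and generator 2 are illustrative parameters rather than a vetted DH group, and authentication of the exchanged public values is omitted, which is exactly the certificate problem noted above:

```python
import hashlib
import secrets

P = 2**255 - 19   # a convenient large prime (toy parameter, not a vetted group)
G = 2

def ephemeral_keypair() -> tuple[int, int]:
    """Fresh private exponent and public value for one call."""
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def session_key(my_priv: int, their_pub: int) -> bytes:
    """Hash the shared DH secret down to a 256-bit session key."""
    shared = pow(their_pub, my_priv, P)
    return hashlib.sha256(shared.to_bytes(32, "big")).digest()

# Call 1: both ends derive the same key from fresh ephemerals.
a_priv, a_pub = ephemeral_keypair()
b_priv, b_pub = ephemeral_keypair()
k1 = session_key(a_priv, b_pub)

# Call 2: new ephemerals, therefore a new, unrelated key.
c_priv, c_pub = ephemeral_keypair()
d_priv, d_pub = ephemeral_keypair()
k2 = session_key(c_priv, d_pub)

print(k1 == session_key(b_priv, a_pub), k1 != k2)  # True True
```

Because the private exponents are discarded after each call, recording the ciphertext today and seizing the phone tomorrow yields nothing: the per-call keys no longer exist anywhere.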

Thoth October 8, 2014 11:42 PM

@Key Escrow Advocates
It is quite naive to think that handing over key escrow data would reduce torture. It may reduce torture if the data contains nothing they can use. It may increase torture because they become suspicious that something is there but they can’t see anything, despite using key escrow methods.

We should not just consider scenarios involving criminals and lowlifes. There are many other legitimate users of such crypto-systems: activists, health workers, peacekeepers, government agents, corporate members … ordinary people … all of whom may be working in hostile environments and need them.

Let’s take an example of a legitimate user – a journalist/activist/health worker – in a hostile environment (Middle East/China/Russia/N. Korea). If these hostile governments demand key escrow usage (like what Blackberry did to remain in business, providing Middle Eastern governments access to its secure messaging), that legitimate user would be in trouble, and if the legitimate user has friends and contact lists and messages with other journalists/activists/health workers, it pretty much turns into a witch hunt.

Andrew_K October 9, 2014 1:06 AM

There seem to be three constants in technology discussions over time.

  1. It is dangerous
    Whenever new and not intuitively understandable technology has been introduced to public, it is assumed to be dangerous for outsiders. Note that those in power and their representatives are by definition insiders on everything.
  2. Therefore, we must limit distribution
    Because those not in power or representing it cannot handle the dangers associated with the new technology (by definition), they are prevented from getting exposed to it. Keeping it secret will especially minimize the number of criminals who can abuse the new technology, right?
  3. Those who break the rule are either working for the power or have to be criminals
    Those using the new technology on behalf of power are by definition doing it for the good of the public. Other persons interested in this technology receive special attention and can subsequently be recruited or prosecuted; probably depending on whether they are willing to keep secrets secret and work for the power – or decide to make them publicly available.

There are of course criminals, who want to and who will abuse the technology for their personal benefit. They partially legitimize the actions of those in power, since the public will fear the new technology, as it now poses a real danger to them. Power may fight them, especially if it shines a bright light on otherwise secret technology.

Enlightening the masses usually takes place several (i.e. 10-100) years after a new technology has been developed.

Andrew_K October 9, 2014 2:35 AM

@ Thoth

I generally agree with your position against key escrow but I beg to differ regarding the use of torture.

The main difference is that the moment a state applies torture, it exposes its hostile nature – brutally and without any misunderstanding. Compare this to a state which happens to be able to take a look at a device from time to time. One which just wants to look whether everything is “still ok”, e.g. when you are processed at an airport.

There are persons who can be tortured and persons who cannot be tortured. Everyone already feeling the state’s hostility, and not in a position where he or she may hurt the state by speaking out, can be tortured – as it will basically not change anything aside from them doing what the state wants them to do. But those not feeling the state’s hostility should not be tortured, as it would make them doubt their belief in the state being the good guy. These people can only be subjected to subtle surveillance. And they are the vast majority in almost every country.

And also don’t forget: using torture for information retrieval is a misunderstanding of torture.
Torture is about destroying a person’s personality. It is not usable for reliable information retrieval since, when tortured, people will say and do anything to get out of it. If you want to stage a tribunal and be sure the suspect will confess, fine, use torture and put sadistic smiling torturer(s) in the first row of the audience. But do not expect the suspect to tell you anything reliable. He will just say and do whatever he assumes will make the beasts go away, up to and including suicide.

Axel Cooper October 9, 2014 11:36 AM

No one who values security should buy a closed-source product anyhow. This includes all “smartphones” (AKA programmable multi-sensory pocket bugs) shamelessly locked down against the user (AKA fish, if he buys such trash without the intention to reverse-engineer it instead of mindlessly clicking apps). It really applies to all computers available today, which is a tragedy in and of itself.

But this piece of insulting drivel from the “washington post” (AKA “pravda”?) really stands out, even in the midst of the Snowden backlash, a time when fear is hammered into the population at an increased pace and new police state measures are paradoxically being implemented (you’d think it impossible right now, but the majority truly ARE sheeple). It made me blow my fuse. It even drove me to register an account just so I could post my opinion about it (luckily they allow this).

Encryption is a very basic civil right in the digital age and must be enshrined in the UN charter.

Sancho_P October 9, 2014 6:47 PM

@Thoth, Re: Key Escrow Advocates

I think there is a huge misunderstanding. You wrote:

”It may reduce torture if the data contains nothing they can use. It may increase torture …”

When it comes to torture it is completely irrelevant whether they find something or not; worse yet, “contains nothing” is just more suspicious, triggering the real trouble for that poor guy.

With rogue states all over the world, be it in the U.S., Europe or Asia, you are always doomed if they pick you, because they do not need evidence.

There is just one difference with a key escrow system and small crime suspects:
When it comes to court they could say “We had official backdoor access and found xyz”, but nobody, not even the best lawyer, could reply “Wrong, it is / was not there!”, because they have access to the device’s data but he has not.

Therefore: No key escrow, no access, no (correct / false) data, no evidence.

”Lets put an example of a legitimate user – journalist/activist/health-worker – in a hostile environment (Middle East/China/Russia/N.Korea). If these hostile environment government demands key escrow usage (like what Blackberry did to remain in business by providing Middle Eastern Govt access to it’s secure messaging), that legitimate user would be in trouble and if the legitimate user has friends and contact lists and messages with other journalist/activist/health-worker, it is pretty much turn into a witch hunt.”
[emphasis added]

You didn’t include the U.S. in your list of hostile environments, though for foreigners it certainly is one; but ask Skeptical where he would “draw a line” between a legitimate user and Gitmo.

In fact, your example points at another, much more serious issue:
Even with perfectly secured devices they know all your contacts, including time, duration and location, without asking for access to your gadget.

Witch hunting season is open, no limit.

.
@Andrew_K:

”Who just wants to look whether everything is “still ok”. E.g. when you are processed at an airport.”

This is not only a slippery slope but is extremely incriminating, as it reflects that (bad) parents / kids relation “we are the good and you are suspect”.
No. I disagree. This is not the society I’d want.

”There are persons who can be tortured and persons who cannot be tortured.”

I did not understand that.
They have tortured a teenager in Gitmo. Everybody can be kidnapped and tortured, by mistake or not.

I fully second your last paragraph. Unfortunately the powers think differently.

Thoth October 10, 2014 3:33 AM

@Sancho_P
Probably we are all finished anyway, due to pervasive, invasive manipulation and surveillance techniques that are simply too hard to debug, unless you somehow slow down your digital footprint or disappear?

My statement of:
“may reduce torture if the data contains nothing they can use. It may increase torture ”

It is not absolute. Nothing to agree or disagree about, I guess. We can never be absolutely sure unless we have tasted it and lived to tell the tale.

We can assume with or without backdoors, it is up to those who have powers to decide the fate of their prisoners.

I mentioned it in full; perhaps you did not notice the last sentence of that paragraph:

” It may reduce torture if the data contains nothing they can use. It may increase torture because they become suspicious that something is there but they can’t see anything despite using key escrow methods.”

As I said, and as you later reiterated, a device containing nothing may or may not bring more trouble, but again, it is not absolute. Probably I did not make myself clear enough.

Well, I can add the US to the list of hostile environments. I think it is better to add “*” to the list, since it’s fair to say you have no idea what the true undercurrent is behind all that political theatre.

@Axel Cooper
I don’t think the UN or its Charter has any use at all. It is simply there for the powers that be to play out their international show. It’s all a political game, sadly.

Andrew_K October 10, 2014 4:21 AM

@ Sancho_P

Don’t get me wrong, I do not want the state to be parent-like. I just see it coming, and we cannot do anything about it. We will just have to find a way to deal with it.

Regarding persons that can and that cannot be tortured, I would like to elaborate.

I did not mean it in a technical way. Everyone can technically be tortured — everywhere.
A mock execution takes two or three strong men and fifteen seconds in an elevator without camera observation.

My statement was addressing the prior consideration of whether a person should be subjected to torture.
There are lots of persons out there who will gladly hand their computers to Uncle Sam when kindly asked by a cop. And there are persons who may be a little suspicious, but whose suspicion can be calmed easily, making them cooperate, too. These people “cannot” be tortured, since it would be counterproductive to make them doubt their belief in the state being the good guy.
Basically, everything boils down to understanding the suspect. It’s about making persons believe that handing over information is the right thing to do. Most people are susceptible to simple manipulation (which may include elements of torture, without the intent of destroying more of the personality than is necessary for information retrieval).
Those seemingly immune (not revealing information either because they are trained to withstand torture or because they have no knowledge to reveal, which is impossible to differentiate) pose a threat in the state’s perception. Therefore, destroying them through torture will in the best case lead to new evidence. In the worst case, they’re simply no longer a risk, since they may be tortured to a state beyond their training (which is possible, since training must not completely destroy the trainee). Win-win. They “can” be tortured, especially if the state manages to get the suspect out of the legal system, which would otherwise restrict the possibilities (a super fun surprise vacation on a tropical island).

I think this is also the answer to your last statement. The state knows exactly what torture is and when to apply it. They do not think differently about torture; they think differently about their people than we would expect them to.

Anon October 10, 2014 6:13 AM

@Sancho_P • October 8, 2014 1:05 PM

“Not sure if I got that right, you say the backdoor would lead to torture or death?

  • On the contrary, a backdoor would probably prevent torture, because they’d have access without the need for physical pressure.”

Only if the contents of the phone did not violate local law.

It’s safe to assume that any government willing to kill or torture to get the key would always be willing to kill the dissident for the contents of the phone.

However, if the contents of the phone remain locked, the police don’t know who is on the dissident’s contact list.

I think that phone encryption in such a case is not likely to save the owner, but if the information stays encrypted and there is no backdoor, other individuals may avoid being tortured or killed.

Sancho_P October 10, 2014 4:55 PM

@ Thoth:

So we are close to the same line, thanks.

@ Andrew_K:

I think now I got your “can / cannot”. You are a good fellow; you believe in your state’s powers.
I do not.
When it comes to torture, any credibility is lost; there is no responsible brain involved, from President down to officer.

They cannot think (= “understand the suspect”) because they do not have enough intelligence.

“Give me the box you will allow me to operate in. I’m going to play to the very edges of that box.” [M. Hayden]

Actually, I have 6 goats in my backyard. When I “give them the box,” that’s exactly what they do.
This mindset is useful for beltway grunts but wouldn’t suffice to drive a Humvee through Baghdad.

A goat on top of the NSA. The appropriate box would be 3 by 4?

Sancho_P October 10, 2014 5:21 PM

@ Anon:

Nope, encryption may bring more trouble, just imagine:

They kidnap you in the darkness, push you into a van, strip your pants, drug you, and drive out to the private (VIP) part of the international airport. You lose consciousness, but with the escrow key to your phone they learn they got the wrong guy. You end up out of town in the gutter, but not at Gitmo.

This is the only (very rare) scenario where a key escrow may help.
Skeptical people may use it to argue for key escrow (“officers can easily identify you even when your phone is locked and you are unconscious, they will learn about your medication in minutes and save your life”).

… But when the phone remains locked (no key escrow) and they believe you are the right one, at Gitmo you will open all the phones they hand you within minutes, granted. You will sing out your “secret” manifesto loudly, day and night.

  • I have no idea in which case encryption could be helpful to avoid “torture”.
    Remember, in most states a judge will send you to jail until you open your phone.

But make no mistake: I am for full device encryption without key escrow,
because my sophisticated device is part of my personality, my brain, my thoughts.
I may grant access, but I want to be asked and given a reason, together with a fair chance for a lawyer.

.
And again:

The dissident’s contact list from the phone is not worth a penny.

They don’t need it.
They have all contacts from the provider, together with connection details and locations of your phone and your contacts during conversation.

They do not need your phone to know your contacts. You can dump the phone – they already have them.
You do not even notice when they check your contacts; the provider will not and cannot tell you about it, and they don’t even know themselves.
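As an aside, the point that the provider’s records alone reveal your contacts can be sketched in a few lines. This is a toy illustration, not how any real carrier system works; every number, timestamp, and tower name here is invented:

```python
# Hypothetical sketch: a provider's call-detail records (CDRs) alone
# are enough to reconstruct a contact graph; the handset itself is
# never needed. All records below are made up for illustration.
from collections import defaultdict

cdrs = [
    # (caller, callee, timestamp, cell tower)
    ("555-0101", "555-0199", "2014-10-01T09:12", "tower-17"),
    ("555-0101", "555-0142", "2014-10-01T21:40", "tower-03"),
    ("555-0142", "555-0199", "2014-10-02T08:05", "tower-17"),
]

# Build an undirected contact graph from the metadata alone.
contacts = defaultdict(set)
for caller, callee, _ts, _tower in cdrs:
    contacts[caller].add(callee)
    contacts[callee].add(caller)

print(sorted(contacts["555-0101"]))  # ['555-0142', '555-0199']
```

Encrypting the phone does nothing to this data; it lives on the carrier’s side, along with the timestamps and tower locations that place each party during each call.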

See, here is a dangerous and crazy idea, uncritically popularized in an otherwise serious magazine:
http://www.scientificamerican.com/article/a-phone-that-lies-for-you-an-android-hack-allows-users-to-put-decoy-data-on-a-smartphone/
I guess I know what will happen when the provider’s connection log does not correspond with the phone data: “Lying to an officer” (adds 4 years for stupidity).

Thoth October 11, 2014 12:22 AM

I guess if the only way to communicate secretly is to leave no provable evidence, and mental conversation without additional media is the only safe way to communicate ephemerally, then we should all learn telepathic communication and communicate via metaphysics, since such comms cannot be proven anyway, nor can they be intercepted.

(The last sentence is just a joke, since it is not provable in the first place.)

PJ October 12, 2014 10:55 AM

“You can’t build a backdoor that only the good guys can walk through. Encryption protects against cybercriminals, industrial competitors, the Chinese secret police and the FBI. You’re either vulnerable to eavesdropping by any of them, or you’re secure from eavesdropping from all of them.”

There are no good guys. There are 2.3 million Americans in jail, political prisoners, put there by the supposed “good guys”. I don’t worry about Chinese secret police, I worry about American police of all varieties.

However, I don’t worry too much. The main harvest here for the police state is not information, but fear in the minds of the peons. The remedy for that fear is to stop fooling with computer technology and go out and buy a battle rifle and a case of ammo. Somehow it helps a lot with the paranoia about back doors…

someone October 14, 2014 1:37 AM

Hmmmm? We are talking about backdoors, and about how we cannot really trust the government or corporations, so why have ANY faith that closed-source commercial encryption such as BitLocker is “strong,” with no built-in weaknesses?

Even if Microsoft, Apple, or anyone else has their code “audited,” closed source means no possibility to trust it completely.

Sidebar photo of Bruce Schneier by Joe MacInnis.