Inventor of BitTorrent Explains Why Manufacturers Will Stop Making Your Products Hackable

These questions originally appeared on Quora - the knowledge sharing network where compelling questions are answered by people with unique insights.

Answers by Bram Cohen, Inventor of BitTorrent, on Quora.

A: First, it should be pointed out that the central issue in the iPhone case isn't one of cryptography, it's tamper resistance. If you use a randomly generated 20-digit PIN, then nobody will be able to forcibly unlock your iPhone, even with the help of Apple, the NSA, and the Illuminati. When you have a short PIN, the issue is one of physical tamper resistance. It's analogous to having a safe made out of super-hard iridium with a very difficult-to-pick lock, and a general argument about how much effort it will take the safe's manufacturer versus law enforcement to open it, and who's going to have to foot the bill.
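To put rough numbers on that, here's a back-of-the-envelope sketch in Python. The guess rate is a hypothetical assumption picked for illustration (real key derivation is deliberately much slower), but the arithmetic shows why a short PIN is worthless without tamper resistance while a 20-digit one holds up on keyspace alone:

```python
# Back-of-the-envelope brute-force arithmetic for PIN guessing.
# Assumption (illustrative, not from the original answer): an attacker
# who has defeated the tamper resistance can test guesses offline at
# a billion per second on a single machine.
GUESSES_PER_SECOND = 1e9
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for digits in (4, 6, 20):
    keyspace = 10 ** digits
    years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{digits}-digit PIN: {keyspace:.0e} guesses, "
          f"~{years:.1e} years to exhaust")

# 4 digits fall instantly; 20 digits take thousands of years even at
# this optimistic rate, before any deliberate slowdown from a KDF.
```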

I'm not sure of all the facts in the Apple case so I'll give separate answers to various hypothetical scenarios.

If Apple did things 'the right way', then there's a physically separate part of the iPhone which returns the key to decrypt the contents of the phone if given the right PIN, and that key can only be extracted by supplying the right PIN or by physically prying the chip open and inspecting its internals. If it's given too many wrong guesses, it either self-destructs or starts forcing increasingly long wait times before responding to further guesses. In this scenario Apple has no special ability to extract information from the device, and should tell law enforcement to go pay for its own damn forensics when it wants to forcibly unlock one. According to some online sources, newer iPhones are built this way (thankfully!), but the particular phone in question is an older one which is not.
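As a rough illustration of that design (the class name, attempt limit, and delay schedule below are all assumptions made up for the example, not Apple's actual implementation), the logic of such a chip looks something like this:

```python
import time

class SecureElementSketch:
    """Toy model of the 'right way': a separate chip that holds the
    decryption key and rations PIN guesses. The limits here are
    illustrative assumptions, not Apple's actual design."""

    MAX_ATTEMPTS = 10

    def __init__(self, pin, key):
        self._pin = pin
        self._key = key
        self._failures = 0
        self._destroyed = False

    def unlock(self, guess):
        if self._destroyed:
            raise RuntimeError("key material has been erased")
        if self._failures:
            # Escalating delay: each failure doubles the enforced wait.
            time.sleep(min(2 ** self._failures, 3600))
        if guess == self._pin:
            self._failures = 0
            return self._key
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._destroyed = True   # self-destruct: wipe the key
            self._key = None
        raise ValueError("wrong PIN")
```

The point of the design is that guessing is rationed by hardware the attacker can't patch, so even a short PIN buys real protection.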

If Apple did things in a completely busted way, then there isn't any special physical security on the device, it's all done in software, and unlocking a phone is mostly a matter of having the right connectors and hardware. In this scenario Apple would absolutely be compelled to cooperate with the FBI, but most likely the FBI could do that sort of forensics itself and not have to bother with a court order, so this scenario is unlikely.
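A toy model of why this case is so weak (the key derivation here is a deliberately bad assumption, chosen to show the failure mode): if the disk key is just a fast software hash of the PIN, anyone with a raw image of the flash can brute-force it offline in a fraction of a second.

```python
import hashlib

# Toy model of the 'completely busted' case: the disk key is a fast
# hash of the PIN, so possession of a flash image is enough to
# brute-force the key offline. Illustrative assumption only.

def derive_key(pin):
    return hashlib.sha256(pin.encode()).digest()

def crack(target_key, digits=4):
    for n in range(10 ** digits):
        pin = str(n).zfill(digits)
        if derive_key(pin) == target_key:
            return pin
    return None

stolen = derive_key("4821")   # what the attacker reads off the flash
print(crack(stolen))          # -> '4821', near-instantly
```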

If Apple did things in a semi-busted way, then there's a physically separate part of the iPhone which checks the PIN and holds onto the decryption key, but it has a 'reasonably secure' back door, in that a certain code or private key can be used to override its usual behavior and get it to return the decryption key for the device even without the PIN. This is the most legally interesting scenario. I'm not a lawyer, but I think in this scenario Apple would be legally compelled to unlock any single device which a court told it to, but it would most likely be unconstitutional for the FBI to demand to be given the secret key to unlock everything, so it could unlock anything it felt like in the future without having to bother collecting evidence and getting court orders. That sounds an awful lot like demanding the ability to do warrantless searches in the future, but it's a legally interesting case of a private company auditing the behavior of law enforcement, and I don't know for sure how it would play out in the courts. Given the amount of public saber rattling over the current case, it sounds like this last scenario is what's really going on.
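To make the back-door scenario concrete (everything below, down to the method names, is an assumption about how such a design might look, not a claim about Apple's hardware), the secure element would have two unlock paths:

```python
import hmac

class BackdooredElement:
    """Toy model of the 'semi-busted' case: the chip releases the key
    for the right PIN, or for a manufacturer override secret."""

    def __init__(self, pin, key, override_secret):
        self._pin = pin
        self._key = key
        self._override = override_secret

    def unlock(self, guess):
        if hmac.compare_digest(guess, self._pin):
            return self._key
        raise ValueError("wrong PIN")

    def unlock_with_override(self, secret):
        # The legally fraught path: whoever holds this one secret can
        # open every device ever shipped, warrant or no warrant.
        if hmac.compare_digest(secret, self._override):
            return self._key
        raise ValueError("bad override secret")
```

Handing over one device unlocked through the second path is very different from handing over the override secret itself, which is exactly the distinction the legal fight turns on.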

...

A: I think the main result of Apple's confrontation with the FBI is that manufacturers are going to stop putting anything vaguely resembling a back door into their products. As I explained in my related answer, if the manufacturer does things the 'right way' and doesn't add any special back doors to the system whatsoever, then when law enforcement comes asking for help breaking things open the manufacturer can tell them to pay for their own forensics work. It's now clear that if a manufacturer includes any sort of back door, or even diagnostic and testing features, in its deployed products, there's a very real chance that law enforcement will get a court order for the entirety of the development tools and private keys to be handed over, to then be used with impunity on the manufacturer's customers, with no ability for the company to make sure that proper warrants are being issued. And this is only in the United States. Law enforcement (or what passes for it) in other countries might pull the same stunt, likely with more underhanded approaches to making their demands and carrying out their investigations.

Most manufacturers, faced with these very real threats, will opt out of including any back doors whatsoever.

...

A: Encryption security, unlike physical security, is very much an either/or thing. Either the system is put together the right way and no amount of resources can break it, or the system is put together wrong and no resources are needed to break it. Of course I'm simplifying a little, but the trend in real-world systems is that the weak ones get weaker over time, and the system design should take that into account.
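A classic example of 'put together wrong' (the scenario below is assumed for illustration, but this exact mistake has sunk real systems): deriving a key from a predictable seed, such as the clock, makes it recoverable with almost no resources, no matter how strong the cipher using it is.

```python
import random

# Toy illustration: a 256-bit key generated from a predictable seed
# (the clock) has almost no effective entropy. The timestamps are
# made up for the example.

def weak_key(timestamp):
    rng = random.Random(timestamp)    # predictable seed
    return bytes(rng.randrange(256) for _ in range(32))

key = weak_key(1_700_000_000)         # the 'secret' key

# An attacker who knows roughly when the key was generated just
# replays the candidate seconds:
for t in range(1_699_999_995, 1_700_000_005):
    if weak_key(t) == key:
        print("recovered seed:", t)
        break
```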

These questions originally appeared on Quora - the knowledge sharing network where compelling questions are answered by people with unique insights. You can follow Quora on Twitter, Facebook, and Google+.
