New York tries to force phone makers to put in crypto backdoors


The sport of holding Apple, Google and other tech companies over a barrel to demand backdoors now has a new player: New York.
The state assembly has come up with a proposed bill that would ban encrypted mobile phones and slap manufacturers with a $2,500 fine per phone sold in the state of New York without a backdoor.
In a nutshell, backdoors are security holes – for example, an undocumented master decryption key – knowingly added to software.
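To make that concrete, here is a deliberately simplified Python sketch – not modelled on any real phone's design, and using the third-party cryptography package – of what an undocumented master key means in practice: the key that encrypts the data is wrapped for the vendor's master key as well as for the owner's, so whoever holds the master key can read everything.

    # Simplified illustration of a "master decryption key" backdoor.
    # Hypothetical design, for illustration only; uses the third-party
    # 'cryptography' package (pip install cryptography).
    from cryptography.fernet import Fernet

    owner_key = Fernet.generate_key()    # key held by (or derived for) the device owner
    master_key = Fernet.generate_key()   # the hypothetical vendor "backdoor" key

    data_key = Fernet.generate_key()     # key that actually encrypts the data
    ciphertext = Fernet(data_key).encrypt(b"messages, photos, business secrets")

    # The data key is stored wrapped under BOTH keys...
    wrapped_for_owner = Fernet(owner_key).encrypt(data_key)
    wrapped_for_master = Fernet(master_key).encrypt(data_key)

    # ...so the master key alone is enough to unlock the lot,
    # whether it is held by the vendor, a court, or a crook who stole it:
    recovered_key = Fernet(master_key).decrypt(wrapped_for_master)
    print(Fernet(recovered_key).decrypt(ciphertext))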
The bill (PDF), introduced earlier this month, demands that smartphones be capable of being decrypted, starting as of two weeks ago:
Any smartphone that is manufactured on or after January 1, 2016, and sold or leased in New York, shall be capable of being decrypted and unlocked by its manufacturer or its operating system provider.
The state assembly would impose a $2,500 fine for each infringing phone sold in the state.
That’s a lot of phones and a lot of potential fines. New York’s a big state with a big appetite for mobile gadgets.
The rationale, from notes on the bill:
The fact is that, although the new software may enhance privacy for some users, it severely hampers law enforcement’s ability to aid victims.
All of the evidence contained in smartphones and similar devices will be lost to law enforcement, so long as the criminals take the precaution of protecting their devices with passcodes. Of course they will do so. Simply stated, passcode-protected devices render lawful court orders meaningless and encourage criminals to act with impunity.
The proposed bill is similar to the Investigatory Powers Bill in the UK, which has the support of Prime Minister David Cameron.
If it passes – the next step would be for the bill to move to the floor calendar, followed by votes in the assembly and senate – manufacturers and operating system providers would have to decrypt and unlock phones for law enforcement and other authorities, creating a backdoor to bypass the encryption.
We’re hearing plenty of similar calls to poke holes in encryption from other countries, including the UK, with its Investigatory Powers Bill, and China, which was poised to require internet companies and other technology suppliers to hand over encryption codes and other sensitive data for official vetting before they went into use.
Those demands were dropped in the law’s final draft, but China’s new law still requires that companies help with decryption when the law deems it necessary for investigating or preventing terrorist cases.
The Netherlands, on the other hand, came out against backdoors last week, but the assault on the technology is still raging, as the New York bill clearly shows.
Apple CEO Tim Cook has been strenuously arguing that backdoors weaken security.
Here’s how he explained it to 60 Minutes last month:
Here’s the situation… on your smartphone today, on your iPhone. There’s likely health information, there’s financial information. There are intimate conversations with your family, or your co-workers. There’s probably business secrets and you should have the ability to protect it. And the only way we know how to do that, is to encrypt it.
Why is that? It’s because if there’s a way to get in, then somebody will find the way in. There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys.
Cook and other Silicon Valley execs last week met with White House officials to discuss the use of social media and technology in the fight against terrorism, radicalization, and propaganda.
Apple has stated that it’s “impossible” to unlock most iPhones, given an iOS 8 feature that prevents anyone without the device’s passcode from accessing the device’s encrypted data – including Apple itself.
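As a rough illustration of why that is – this is a simplified sketch, not Apple's actual scheme, which also entangles a per-device hardware key – the decryption key can be derived from the passcode itself, so without the passcode there simply is no key to hand over:

    # Simplified sketch of passcode-bound encryption: the key is derived
    # from the passcode, so nobody -- vendor included -- can decrypt without it.
    # Uses the third-party 'cryptography' package.
    import base64, hashlib, os
    from cryptography.fernet import Fernet, InvalidToken

    def key_from_passcode(passcode: str, salt: bytes) -> bytes:
        raw = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000)
        return base64.urlsafe_b64encode(raw)  # Fernet expects a urlsafe-base64 32-byte key

    salt = os.urandom(16)
    secret = Fernet(key_from_passcode("1234-5678", salt)).encrypt(b"health and financial data")

    print(Fernet(key_from_passcode("1234-5678", salt)).decrypt(secret))  # right passcode: works
    try:
        Fernet(key_from_passcode("0000-0000", salt)).decrypt(secret)     # wrong passcode
    except InvalidToken:
        print("No passcode, no plaintext - not even for the manufacturer")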
Cook has said that a backdoor wouldn’t be such an issue if it were used only for catching “bad people,” but he doubts that crooks would fail to figure out how to exploit a backdoor, even one intended only to help law enforcement.
Naked Security’s take?
Paul Ducklin put it pretty bluntly: “Tim Cook is right: if you put in cryptographic backdoors, the good guys lose for sure, while the bad guys only lose if they are careless.”
And, as Paul recently reminded us, the US has gone down this road before, and it didn’t turn out well.
In the 1990s, the US required American software companies to use deliberately weakened encryption algorithms in software for export, in an attempt to make it safe to sell cryptographic software even to potential enemies because their traffic would always be crackable.
The results:
  • International customers simply bought non-US products instead, hurting US encryption vendors.
  • EXPORT_GRADE ciphers lived on long after they were no longer legally required, leaving behind weaknesses – later exploited by attacks such as FREAK and Logjam – that potentially put all of us at risk (see the sketch below).
Doubleplusungood.
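For years after the export rules were relaxed, defending against those leftovers meant explicitly refusing export-grade suites. Here’s a rough sketch of how a Python TLS client can do that via OpenSSL cipher strings (modern OpenSSL builds have removed export ciphers entirely, so treat this as belt and braces; example.com is just a placeholder host):

    # Sketch: tell OpenSSL (via Python's ssl module) to refuse export-grade,
    # low-strength and unauthenticated cipher suites when making a TLS connection.
    import socket, ssl

    ctx = ssl.create_default_context()
    ctx.set_ciphers("HIGH:!EXPORT:!LOW:!aNULL:!eNULL")

    with socket.create_connection(("example.com", 443)) as sock:
        with ctx.wrap_socket(sock, server_hostname="example.com") as tls:
            print(tls.version(), tls.cipher())  # negotiated protocol and cipher suite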
Backdoors tend to be forgotten about, soon end up widely known, often live far longer than anyone imagined, and can be widely misused: all good reasons to avoid them.
