Should tech giants slam the encryption door on the government?

Reuters reported yesterday, citing six sources familiar with the matter, that the FBI pressured Apple into dropping a feature that would allow users to encrypt iPhone backups stored in Apple’s cloud.

The decision to abandon plans to end-to-end encrypt iCloud-stored backups was reportedly made about two years ago. The feature, if rolled out, would have locked out anyone other than the device owner — including Apple — from accessing a user’s data. In doing so, it would have made it more difficult for law enforcement and federal investigators, warrant in hand, to access a user’s device data stored on Apple’s servers.

Reuters said it “could not determine exactly” why the decision to drop the feature was made, but one source said “legal killed it,” referring to the company’s lawyers. One of the reasons that Apple’s lawyers gave, per the report, was a fear that the government would use the move as “an excuse for new legislation against encryption.”

It’s the latest in a back and forth between Apple and the FBI since a high-profile legal battle four years ago, in which the FBI invoked a little-known law dating back to 1789, the All Writs Act, to demand the company create a backdoor to access the iPhone belonging to the San Bernardino shooter. The FBI’s case against Apple never made it to court, after the bureau found hackers who were able to break into the device, leaving in legal limbo the question of whether the government can compel a company to backdoor its own products.

The case has reignited the debate over whether companies should build technologies that lock law enforcement out of data, even when investigators have a warrant.

TechCrunch managing editor Danny Crichton says companies shouldn’t make it impossible for law enforcement to access their customers’ data with a warrant. Security editor Zack Whittaker disagrees, and says companies are entirely within their rights to protect customer data.


Zack: Tech companies are within their rights — both legally and morally — to protect their customers’ data from any and all adversaries, using any legal methods at their disposal.

Apple is a great example of a company that doesn’t just sell products or services, but one that tries to sell you trust: trust in a device’s ability to keep your data private. Without that trust, companies cannot profit. Companies have found end-to-end encryption to be one of the best, most efficient, and most practical ways of ensuring that their customers’ data is secured from anyone, including the tech companies themselves, so that nobody other than the owner can access it. That means even if hackers break into Apple’s servers and steal a user’s data, all they have is an indecipherable cache of bytes.
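To illustrate the mechanism, here is a minimal sketch in Python, not Apple’s actual design, assuming the third-party “cryptography” package; the names backup_blob and cloud_copy are hypothetical. The key is generated and kept on the user’s device, so the cloud provider only ever stores ciphertext it cannot decrypt.

```python
# Minimal sketch of client-side ("end-to-end") encryption.
# Assumes: pip install cryptography. Illustrative only.
from cryptography.fernet import Fernet

# The key is generated and kept on the user's device; it never
# reaches the cloud provider.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

backup_blob = b"contacts, photos, messages..."  # hypothetical backup payload

# What the provider stores: an opaque ciphertext.
cloud_copy = cipher.encrypt(backup_blob)

# Anyone who steals cloud_copy from the server but lacks device_key
# holds only indecipherable bytes. Only the key holder can reverse it:
assert cipher.decrypt(cloud_copy) == backup_blob
```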

But the leaks of the last decade, chief among them the 2013 Snowden disclosures, which revealed the government’s vast surveillance access to customer data, prompted tech companies to start seeing the government as an adversary: one that will use any and all means to acquire the data it wants. Companies are taking the utilitarian approach of giving their customers as much security as they can. That is how you build trust: by putting that trust directly in the hands of the customer.


Danny: Zack is right that trust is critical between technology companies and users; certainly the plight of Facebook over the past few years bears that out. But there also has to be two-way trust between people and their government, a goal thwarted by end-to-end encryption.

No one wants the government poking its head into our private data willy-nilly, scanning our interior lives for future crimes à la Minority Report. But as citizens, we also want to empower our government with certain tools to make us safer, including mechanisms such as search warrants, which legally violate a citizen’s privacy, with the judiciary’s authorization, to investigate and prosecute suspected crimes.

In the past, the physical nature of most data made such checks and balances easy to enforce. You could store your private written notebooks in a physical safe, and if an appropriate judge issued a warrant, the police could track down that safe and, if necessary, drill it open to access its contents. Police had no way to scan every private safe in the country, so people had privacy over their data, while the police had reasonable access to seize that data when circumstances authorized them to do so.

Today, end-to-end encryption completely undermines this necessary judicial process. A warrant may be issued for data stored on, say, iCloud, but without a suspect’s cooperation, the police and authorities may have no recourse to seize data they are legally allowed to acquire as part of their investigation. And it’s not just law enforcement: the evidentiary discovery process at the start of any trial could be similarly undermined. A judiciary without access to evidence will be neither fair nor just.

I don’t like the sound or idea of a backdoor any more than Zack does, not least because the technical mechanisms of a backdoor seem ripe for hacking and other nefarious activities. However, completely closing off legitimate access to law enforcement could make entire categories of crime almost impossible to prosecute. We have to find a way to get the best of both worlds.


Zack: Yes, I want the government to be able to find, investigate and prosecute criminals. But not at the expense of our privacy or by violating our rights.

The burden to prosecute an individual is on the government, and the Fourth Amendment is clear. Police need a warrant, based on probable cause, to search and seize your property. But a warrant is only an authority to access and obtain information connected to a suspected crime. It’s not a golden key that says the data has to be in a readable format.

If it’s really as difficult for the feds to gain access to encrypted phones as they say it is, the government needs to show us evidence that stands up to scrutiny. So far it has shown it can’t act in good faith on this issue, nor can it be trusted. For years the government vastly inflated the number of encrypted devices it said it couldn’t access. It has also claimed it needs device makers, like Apple, to help unlock devices, when it has long had the means and the technologies capable of breaking into encrypted devices. And the government has refused to say how many investigations are actively harmed by encrypted devices that can’t be unlocked, giving watchdogs no tangible way to measure how big a problem the feds claim it is.

But above all else, the government has repeatedly failed to rebut a core criticism from security engineers and cryptography experts: that a “backdoor” designed only for law enforcement access would inevitably be misused, lost, or stolen and exploited by nefarious actors, like hackers.

Encryption is already out there; there’s no way the genie will ever float its way back into the bottle. If the government doesn’t like the law, it has to come up with a convincing argument to change it.


Danny: I go back to both of our comments about trust: ultimately, we want to design systems built on that foundation. That means knowing that our data is not being used by tech companies for ulterior, pecuniary interests, that it isn’t being ingested into a massive government database for broad-based population surveillance, and that we ultimately have reasonable control over our own privacy.

I agree with you that a warrant simply says that the authorities have access to what’s “there.” In my physical safe example, if a suspect has written their notes in a coded language and stored them in the safe, and the police drill it open and extract the papers, they are no more able to read those notes than they are the encrypted binary files coming out of an end-to-end encrypted iCloud.

That said, technology does allow scaling up that “coded language” to everyone, all the time. Few people consistently encoded their notes thirty years ago, but now your phone can do it on your behalf, every single time. Every investigation, again with a reasonable search warrant, could become a multi-step process just to get basic information that we would otherwise want law enforcement to have in the normal and expected course of its duties.

What I’m calling for, then, is a deeper and more pragmatic conversation about how to protect the core of our system of justice. How do we ensure privacy from unlawful search and seizure, while also allowing police, with a legal warrant, access to data stored on servers, and to the meaning of that data, i.e. its unencrypted form? Short of a literal backdoor prone to malicious hacking, are there technological solutions that might balance these two competing interests? In my mind, we can’t have, and ultimately don’t want, a system where fair justice is impossible to obtain.

Now, as an aside on the comments about data: the reality is that all justice-related data is complicated. I agree these data points would be nice to have and would help make the argument, but at the same time, the U.S. has a decentralized justice system with thousands of overlapping jurisdictions. This is a country that can barely count the number of murders, let alone other crimes, let alone the evidentiary standards around smartphones tied to crimes. We are never going to have this data, and so in my view, insisting that we wait until we do is unfair.


Zack: The view from the security side is that there’s no flexibility. The technological solutions you have in mind have been considered for decades, even longer. The idea that the government can dip into your data whenever it wants is no different from a backdoor. Even key escrow, where a third party holds the encryption keys for safekeeping, is no different from a backdoor. There is no such thing as a secure backdoor. Something has to give: either the government stands down, or ordinary privacy-minded folk give up their rights.
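To see why escrow collapses into a backdoor, consider this hypothetical sketch, again in Python with the “cryptography” package; escrow_db and enroll_user are illustrative names, not any real system. Whoever obtains the escrow store, whether by court order, insider abuse, or theft, can decrypt every user’s data without their cooperation.

```python
# Hypothetical sketch: key escrow as a de facto backdoor.
# Assumes: pip install cryptography. Illustrative names only.
from cryptography.fernet import Fernet

escrow_db = {}  # the third party's store of everyone's keys

def enroll_user(user_id: str) -> Fernet:
    """Generate a key on the user's device and deposit a mandated copy."""
    key = Fernet.generate_key()
    escrow_db[user_id] = key  # the escrowed copy
    return Fernet(key)

alice = enroll_user("alice")
ciphertext = alice.encrypt(b"private notes")

# Whoever compromises escrow_db can read Alice's data without her key
# or her consent -- functionally the same as a backdoor.
stolen_key = Fernet(escrow_db["alice"])
assert stolen_key.decrypt(ciphertext) == b"private notes"
```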

The government says it needs to catch pedophiles and serious criminals, like terrorists and murderers. But there’s no evidence that pedophiles, criminals, and terrorists use encryption any more than the average person does.

We have as much right to be safe in our own homes, towns and cities as we do to privacy. But it’s not a trade-off. We shouldn’t all have to give up our privacy because of a few bad people.

Encryption is vital to our individual security and our collective national security. Encryption can’t be banned or outlawed. Like the many who have debated these same points before us, we may just have to agree to disagree.

Written by Zack Whittaker
This article first appeared on TechCrunch under the title “Should tech giants slam the encryption door on the government?”: https://techcrunch.com/2020/01/22/should-tech-giants-slam-the-encryption-door-on-the-government/