There should be no doubt about it. Privacy is necessary for markets to function and innovation to thrive. Without it, nobody would use online banking, email, location services, and more. With encrypted data, people feel safe knowing someone isn’t currently stealing their identity or looking at their most personal information. Unfortunately, some politicians care more about the government than their constituents, as evidenced by Dianne Feinstein’s (D-CA) and Richard Burr’s (R-NC) “Compliance With Court Orders Act of 2016.”
Aside from the Orwellian overtones of the first section declaring that “no person or entity is above the law,” Feinstein and Burr manage to alienate technologists and scientists alike with this naïve, simple-minded dumpster fire of a bill.
In its text, the bill requires companies “receiving an authorized judicial order for information or data” to provide “intelligible information or data, or appropriate technical assistance to obtain such information or data.” It defines “intelligible” as data that has either been decrypted back to its original form or has never been encrypted at all. The requirement to comply with a court-ordered warrant is overshadowed by the fact that the government treats handing over unencrypted data as a viable option. It would be more practical to install cameras inside every American home.
The hastily drafted Feinstein-Burr bill is a sad caricature of our government, and its reactionary nature, drawing a straight line from encryption to terrorist activity, is bad for consumers and the economy. Even Michael Hayden, former head of the NSA and CIA, admits that while backdoors make surveillance easier, people would take advantage of them, including the government itself. On a larger scale, encryption represents one of the few options consumers have to protect their digital information from prying eyes.
Surveillance has always been imperative to government counterterrorism policy and at first glance this would seem to be another way for the NSA or FBI to protect American citizens from those who would use encrypted communications to the worst ends. As evidenced by the FBI’s battle with Apple to unlock an iPhone used in the San Bernardino shooting, we are nearing a crossroads that can no longer be avoided. So how do we balance privacy and security in the digital age?
Luckily, we have history to prove that these methods don’t work. This isn’t the first bill to mandate access mechanisms in encrypted data. In 1993, the NSA developed the Clipper Chip, an encryption device for phones with a backdoor straight to the agency. It was meant to be adopted by telecommunications companies for voice transmissions but was defunct by 1996. Two decades ago, the Electronic Frontier Foundation (EFF) and the Electronic Privacy Information Center (EPIC) challenged the program on the grounds that while American companies would be forced to adopt the new standard in their encryption products, foreign companies would not, negating the program’s efficacy. There is also the issue of compromising Americans’ existing data and privacy if someone were to break through the weakened security measures. Reports of criminals simply switching to foreign technologies not subject to U.S. mandates render encryption legislation moot as well. Believe it or not, American companies aren’t the only ones who can encrypt data.
A similar argument applies here: hobbling American companies’ encryption capabilities would simply send those opportunities overseas. Hypothetically, if you were a consumer choosing between two apps and learned that one’s data was largely unprotected while the other had robust encryption, which would you choose? More broadly, who would you rather trust to ensure your data privacy: a Congress lacking technical background on the issue, or the companies tasked with bringing a viable, secure product to market (not to mention the aforementioned market incentive to protect it)?
Considering the FTC has recently sued companies over data breaches, how could companies comply with two contradictory forces? On one hand, the government requires strong protections of consumer data; on the other, mandated access mechanisms inevitably weaken those same protections.
The resulting fallout would be greater today than 20 years ago, now that markets are maturing and innovative technologies are well underway to changing the social and economic fabric of society. End-users and private enterprise would incur great costs to their privacy, while tech companies would need to overhaul security protocols to allow for government surveillance. Aggressive enforcement would do nothing to stem terrorism, and everything to halt innovation and industry growth.
The crossroads of security and privacy may require concessions from both the government and tech companies. From a policy perspective, we are inclined to give law enforcement agencies the tools to do their jobs. But in order for this to happen, public policy needs to be subject to the technical realities of data security. With a nearly $1 trillion internet economy on the line, this 9-page bill is indefensible. Feinstein and Burr would do well to realize the massive costs, risks, and operational complexities that would stem from their misinformed bill.