President Obama recently told Americans that we have to give up some privacy in the name of security. Is the government’s demand reasonable?
Last month, I took up the issue of privacy from the perspective of the end user. This time I will try to shed some light on the government’s position, to see if it’s reasonable.
From the advent of telecommunications systems, the government has maintained the ability to tap calls and collect our communications. However, thanks to the Fourth Amendment, there has always been a pretty robust set of checks and balances in place. To tap a call, the (executive branch) investigators needed to get (judicial branch) approval in the form of a warrant. Further, if they collected data outside of the scope of a warrant, that information was inadmissible in court. If they happened to learn things from illegally obtained information, that information was deemed “fruit of the poisonous tree,” and it too was inadmissible.
For a long time, this arrangement provided a pretty good balance between privacy and security. But things changed.
First, encryption entered the equation. Later, the Snowden revelations put everything into “ludicrous speed” (with apologies to the movie Spaceballs).
As computers became more powerful, scientists built encryption systems using software as well as hardware. These systems were used to safeguard our privacy, among other things. From the beginning, though, the government has done all it could to throttle the rate of progress. Because encryption is math, and therefore knowledge, it couldn’t be stopped. Its adoption can be slowed down, however, and that is what the government has attempted, by limiting exports, key lengths and other things. By doing this, the government was presumably able to stay one step ahead of the bad guys.
In the 1990s, we started to see commerce arrive on the Internet, and with it, a higher need for communications privacy. So we saw SSL and then, later, TLS. And now we have secure communications, right?
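To see what that secure-communications layer looks like from an application’s point of view today, here is a minimal Python sketch (the host name in the usage example is a placeholder, not taken from the column): it builds a default client-side TLS context, which in modern Python verifies the server’s certificate and host name before any data flows.

```python
import socket
import ssl

# A client-side TLS context with secure defaults: certificate
# verification and host-name checking are both enabled.
context = ssl.create_default_context()
print(context.check_hostname)                     # True
print(context.verify_mode == ssl.CERT_REQUIRED)   # True

def negotiated_tls_version(host: str, port: int = 443) -> str:
    """Open a TLS-wrapped socket to an HTTPS server and return the
    protocol version that was negotiated (e.g. 'TLSv1.3').

    The host is illustrative; any HTTPS endpoint would do.
    """
    with socket.create_connection((host, port)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname=host) as tls:
            return tls.version()
```

The point of the sketch is that the hard parts — certificate validation, cipher negotiation, downgrade protection — are handled by the library’s defaults rather than by application authors rolling their own protocol, which is exactly the “rule number one” the professor quoted below was warning about.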
Turns out that our trust in SSL was largely unsupported. A wise professor of cryptography once told me, “Rule number one for developing a security protocol is: don’t!” I thought he was more than a little paranoid. That is, until the Snowden revelations of a couple of years ago. Now his paranoia seems quite reasonable.
The Snowden revelations taught the public that breaking our “secure” communications was child’s play. SSL and other security protocols were developed by us crypto amateurs, and the pros found our efforts downright laughable. So most of the world was blissfully ignorant of the fact that its secure communications were still trivially easy to intercept.
But here’s where things again changed. Whereas before there was a pretty solid system of checks and balances, the public learned from Snowden that mass surveillance was now in play. This meant that pretty much all communications were being collected and were potentially analyzable by the government.
In response, product vendors have been stepping up their game. One need only look at the security evolution of the iPhone from 2007 to today to see that Apple and others are getting serious about privacy.
What the government fears the most in this regard is that it will lose the ability to (lawfully or otherwise) collect information. It claims that it won’t be able to catch terrorists, kidnappers and so forth. But are those fears reasonable?
I find it curious that the government chose the case of the San Bernardino terrorist’s iPhone to push this issue. For one thing, the phone the government is seeking to unlock is owned by his employer, which should have had the ability to manage that device and unlock it as it saw fit. For the executive branch investigators to turn to the judicial branch to force a vendor to unlock its own product seems quite bizarre when you consider that the employer could have retained the means to unlock it from day one.
Clearly, the government couldn’t compel the terrorist to reveal the key, since he is dead. So instead, it had to reveal its own blundering by taking this to court.
I can only assume that, by choosing this case to push its agenda, the government is either desperate, feigning desperation or just staggeringly inept. I don’t find comfort in any of those scenarios.
If the government is truly desperate, it has to know that it is losing the “crypto wars” and this is a last-ditch attempt to try to extract victory from the jaws of defeat. If it is merely feigning desperation, it is trying to lull us back into thinking our systems are more secure than they really are — meanwhile, it has developed or is developing some post-Snowden means of obtaining our data. And if it’s just staggeringly inept — then God save us.
So do we really need to give up our privacy? Being a fan of the U.S. Constitution and the Bill of Rights, I feel the onus should be on the government to unlock our secrets, not the other way around. I doubt any reasonable person would object to the government having the ability to do so, as long as solid oversight is in place and enforced. But we civilians certainly shouldn’t be forced to build products that are deliberately weakened. The bar needs to be high, and in the name of liberty, we should be allowed to build our systems as strongly as we’re able to.
With more than 20 years in the information security field, Kenneth van Wyk has worked at Carnegie Mellon University's CERT/CC, the U.S. Department of Defense, Para-Protect and others. He has published two books on information security and is working on a third. He is the president and principal consultant at KRvW Associates LLC in Alexandria, Va.