"The unleashed power of the atom has changed everything save our modes of thinking and we thus drift toward unparalleled catastrophe." Albert Einstein
"The unleashed power of the Internet has changed everything, and presented us with an unparalleled opportunity." Martin Hellman
Recently, on a glorious afternoon, under an azure sky, I drove from my office at the Carnegie Mellon Silicon Valley campus in NASA Research Park up to Stanford University to discuss the great challenges of our time with Martin Hellman, Professor Emeritus of Electrical Engineering and co-inventor of the legendary Diffie-Hellman key exchange (which opened the door to the world of public-key cryptography).
Hellman had seized my attention during the annual Cryptographers' Panel at the 2009 RSA Conference earlier this year. In the midst of a discussion about "cloud computing" with fellow luminaries Whit Diffie, Ron Rivest, Adi Shamir and Bruce Schneier, Hellman started talking about the dangers of a very different type of cloud, i.e., the mushroom cloud.
Here is my blog post from that panel session:
Hellman asked, "How risky is nuclear deterrence?" "1100 times riskier than having a nuclear power plant near your home," he posited. He encouraged the audience to do a Google search on "Hellman cryptography nuclear" to drill down into his current work, and also gave out the URL for his site, nuclearrisk.org. He characterized the human race as possessing the physical powers of a god with the psyche of a 16-year-old boy. If we do not "grow up really fast and pay attention to risks before they become obvious," we face calamity beyond comprehension. "Trial and error are not enough, we have to rely on forecasting ability." Hellman drew from the example of the current global financial crisis. There were repeated warnings about derivatives, he recounted: Sen. Byron Dorgan (D-ND) in 1994, Brooksley Born of the CFTC in 1998, and Warren Buffett, who sounded the alarm about "financial weapons of mass destruction" in 2002. Society, Hellman noted, never seems able to recognize risks until it is too late, and he cited nuclear weapons proliferation, the economic crisis and data security as prime examples. "We risk being called Cassandras," he acknowledged, but he exhorted the audience not to be dissuaded by this inevitability, because "Cassandra was always right." (CyLab CyBlog, 4-21-09)
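The blog post does not show the arithmetic behind that "1100 times" figure (Hellman's full analysis lives at nuclearrisk.org), but the structure of the comparison is simple: estimate an annual probability that deterrence fails, convert it into an individual fatality risk, and divide by the corresponding risk of living near a reactor. Here is a back-of-envelope sketch in Python; every number in it is an illustrative assumption of mine, not a figure from Hellman's analysis.

```python
# Back-of-envelope relative-risk comparison in the spirit of Hellman's
# argument. All inputs are ILLUSTRATIVE assumptions, not his figures.

p_deterrence_fails = 0.01   # assumed annual probability deterrence fails
p_death_given_war = 0.5     # assumed chance an individual dies if it does
p_reactor_death = 5e-6      # assumed annual fatality risk near a reactor

annual_war_risk = p_deterrence_fails * p_death_given_war
relative_risk = annual_war_risk / p_reactor_death

print(f"Assumed annual individual risk from failed deterrence: {annual_war_risk:.2%}")
print(f"Relative to living near a nuclear plant: {relative_risk:,.0f}x")
```

The point of the exercise is not the exact ratio, but that even a seemingly small annual failure probability, compounded year after year, dwarfs risks we normally consider unacceptable.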
On my way to the interview, I stopped by the Rodin Sculpture Garden at the university's Cantor Arts Center to stand before the great artist's Gates of Hell, and to allow my mind to move above the writhing bodies that rise and fall like tumultuous waves within the masterpiece's imposing bronze frame.
As I stood there, a factoid over a decade old bubbled up in my psyche.
In 1997, I had come across an item in Peter Neumann's invaluable Risks Digest, quoting a San Francisco Examiner story about Caging the Nuclear Genie, a new book from Admiral Stansfield Turner. In the book, Admiral Turner described an incident that happened in the pre-dawn darkness of June 3, 1980, while he was serving as President Jimmy Carter's CIA Director. "Colonel William Odom alerted Zbigniew Brzezinski at 2:26 a.m. that the warning system was predicting a 220-missile nuclear attack on the U.S. It was revised shortly thereafter to be an all-out attack of 2200 missiles. Just before Brzezinski was about to wake up the President, it was learned that the 'attack' was an illusion which Turner says was caused by 'a computer error in the system.'" (Risks Digest, Vol. 19, Issue 43, 10-29-97)
Turner went on to say that this incident was not the only one. "We have had thousands of false alarms of impending missile attacks on the United States," he wrote, "and a few could have spun out of control." (ibid.)
Turner's anecdote (the term hardly seems adequate for something of such significance) and his admonition that this particular false alarm was not a once-in-a-lifetime event have stayed with me over the years. The story offers one of those extraordinary teaching moments. I reference it when trying to raise the consciousness of those who pooh-pooh the threat of cyber-terrorism. If such a computer malfunction could happen accidentally, I argue, even if such incidents are rare, then it could also be triggered intentionally. Either way, the consequences are unthinkable.
We are, in a way, the victims of our own good fortune. We have been lulled into distraction by a dangerously disarming miracle. The "Cold War" came and went, and because the threat of nuclear annihilation had become inextricably bound up with it, when the "Cold War" ended, the unconscious assumption was that the worst of the nuclear threat ended with it. Nothing could be more untrue. Indeed, the polarized geopolitical structure of the "Cold War" allowed for both clarity and insanity: clarity about the consequences, and about what needed to be done (or not done) to avoid those consequences, and insanity in regard to even considering the use of nuclear weapons as an option. Having missed the opportunity that presented itself around the Millennium, the post-"Cold War" world offers even more insanity (i.e., at least three new nuclear-armed nations: Pakistan, India and North Korea) but none of the clarity.
In a CSO Magazine piece last year, I offered "A Corporate Security Strategy for Coping with the Climate Crisis," because I feel strongly that no security, intelligence or military professional can properly assess risks and threats without not only placing climate change at or near the top of the list, but also factoring in its impact on all the other risks and threats spread along the high end of the spectrum. Likewise, I cannot, in good conscience, write about risks and threats to organizations, communities, nations or the planet as a whole without addressing nuclear war in the same way.
Reasonable people might ask, "What is the point? What could any of us do?"
"This issue has nothing to do with my professional life," you might say.
Well, there is something that anyone, anywhere can do, and it is both meaningful and powerful; and I argue it is something that risk, security and intelligence professionals can do with much more persuasiveness and gravitas than others: speak out, educate, and raise awareness within the boardroom and throughout the workforce. When the populace itself moves forcefully on an issue, government and industry fall into line.
There is a great secret in such profound change: it begins one-on-one, from mind to mind and hand to hand. As with many other risk and security challenges, awareness and education are more than simply vital elements of any real solution; they are the magic ingredients.
I wanted to talk to Hellman, you see, because he is swinging for another fool's home run.
"My wife started studying Tarot, because she was afraid of it. The church of course likened Tarot to witchcraft. And even though we are modern people, she had picked up those prejudices. She said, 'I had a fear of it, so I felt I had to learn what it was.' So she did a reading for me, and I ended up being the 'Fool.' And my first reaction was 'I am a Stanford professor, I am a smart guy, I have won all of these awards.' But then she pointed to me the positive aspects of the 'Fool,' he goes where no one else has gone, with one foot on the ground, and the other stepping off the cliff.
"My whole life I have been fundamentally a fool, which is often very wise, because you go against conventional wisdom, which is often wrong. And yet, because as a kid I suffered for that, at a conscious level I denied it, but at an unconscious level I actually reveled in it. It made me who I was."
As Hellman further elucidated his thinking around the "fool's home run," I kept thinking back to a baseball player named Dave Kingman. In his 16 years in the Major Leagues, Kingman struck out 1,816 times in 6,677 at bats. But he hit 442 home runs, and walked 608 times (mostly because pitchers worked around him). Every time Kingman stepped to the plate, you expected to see a home run. The strike-outs didn't matter. The low batting average didn't matter. Once you had seen one of his monstrous home runs (they often left the stadium completely, in a long, high arc), you just wanted to see another.
"Fool home runs don't come often. You swing at a lot of wild pitches, and you have to be foolish enough, after you have swung at ten or twenty of these pitches and each time ended up with egg on your face, to get just as excited at swinging at the tenth or the twentieth one, because if you are not excited you have no hope of taking it to its conclusion, and yet a priori, when you are confronted with it, it looks no better than all those that went nowhere."
He identifies two fool's home runs he has hit in his life so far.
The first of them led to the birth of public-key cryptography.
"When I first started working in cryptography in a serious way, around 1970, my colleagues uniformly told me I was crazy, foolish, to do so. 'The National Security Agency had a huge budget, we didn't know how big it was in those days, but it was multi-billion dollar budget, and has been working on it for a decade, even in 1970; how can you hope to discover something they do not already know?' The second argument was, 'If you do anything good, they will classify it?' I had an answer to the first question, I said, 'I don't care what they know. It is not available for commercial exploitation. Also, it is well-established who gets credit for discovering something, it is the first to publish, not the first to discover and keep secret.' Both arguments were valid. It was foolish to work on cryptography in 1970, and yet, in hind sight, you would have to say it was very wise to be foolish."
The second involved the great push toward nuclear disarmament and world peace at the end of the "Cold War."
"My wife and I had become involved with a group working on the nuclear weapons issue, which was in sharp focus at the time. There was a palpable concern about the world going up in smoke. & My wife and I had the privilege having some very deep relationships with Russian information theorists, who had been here on exchange visits, and I had gone there in 1973 and 1976. We had very honest political discussions. So in 1984 we went to the Soviet Union to try to get a dialogue going between the scientific communities on new equations for survival in the nuclear age. We knew it would be impossible to have them at a public level. So it was again a foolish thing to do. And yet, there must have been some guidance from a guardian angel or a muse to send on this mission. A year later, in 1985, Gorbachev came to power, which I never would have predicted, and at first he did not seem that amazing, but within a year he had lifted censorship and encouraged free debate. So then it made sense to do the project, but if we had waited until that point to start it, it would have taken two more years to build the trust, relationships and understanding. By starting two years earlier when it made no sense, we were in the perfect position and we were able to get the book out in six months time, a book that called for radical change in our approach to national security. Gorbachev endorsed the book. We were in on history."
Now Hellman believes he has a third "fool's home run" in him; he says his nuclear risk analysis project has "the same feeling" as the others.
Here are some more highlights from our conversation.
Power: Let's start in the space of cyber security. What would you like to say about the role of cryptography and encryption in cyber security? Looking back over the last 15 years, what has it done for us? What has it not done for us? What are the lessons learned?