Leonard Kleinrock is emeritus professor of computer science at the University of California, Los Angeles. He created the basic principles of packet switching, the foundation of the Internet, while a graduate student at MIT, where he earned a Ph.D. in 1963. The Los Angeles Times in 1999 called him one of the "50 people who most influenced business this century." Computerworld's Gary H. Anthes interviewed Kleinrock in 1994 as part of the Internet's 25th anniversary celebration. Recently, Anthes asked Kleinrock for an update.
You told Computerworld 11 years ago that the Internet needed, among other things, "a proper security framework." What about today? In the past 11 years, things have gotten far worse, so much so that there are parts of the population that are beginning to question whether the pain they are encountering with spam, viruses and so on is worth the benefit. I don't think there's a silver bullet. We need systemwide solutions. Strong authentication will help. IPv6 will help. Identifying the source of information—a networking issue—to make sure it's not being spoofed will help.
You called for better multimedia capabilities in 1994 as well. One of the major changes related to multimedia in these 11 years has been the explosion of what we call the "mobile Internet." There's this ability now to travel from one location to another and gain access to a rich set of services as easily as you can from your office. The digitization of nearly all content and the convergence of function and content on really smart handheld devices are beginning to enable the anytime, anywhere, by anyone Internet: the mobile Internet. But there is a lot more to be done.
Such as? We have to make it easier for people to move from place to place and get access. What's missing is the billing and authentication interface that allows one to identify oneself easily in a global, mobile, roaming fashion. We [will] see this change to an alternate pricing model where people can subscribe to a Wi-Fi roaming service offered by their company or their home ISP. As these roaming agreements are forged between the subscription provider and the owners/operators of today's disparate public-access networks, the effective number of locations where a subscriber will be able to connect at no or low fee will grow. A key component in this environment is internetwork interoperability, not only for data traffic but for authentication and billing. The benefits will be ease of use and predictable cost.
You mentioned smart handheld devices. Where are they going? We are seeing your phone, PDA, GPS, camera, e-mail, pager, walkie-talkie, TV, radio, all converging on this handheld device, which you carry around in addition to your laptop. It will [alter the properties of] a lot of content (video, images, music) to match the capabilities of the particular device you have. For example, you may be using your handheld cell phone as a passthrough device to receive an image or video that you wish to display on some other output device, say, your PC or your TV. The handheld may need to "dumb down" the image for itself but pass the high-quality stream to the TV, which will render the stream to match its own display capability.
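The passthrough behavior Kleinrock describes can be sketched in a few lines. This is purely illustrative (not from the interview): a handheld scales media down to fit its own small screen while forwarding the full-quality stream to a more capable display. All names here (`DeviceProfile`, `adapt_for`, the specific resolutions) are hypothetical.

```python
# Hypothetical sketch of capability-aware media adaptation: the handheld
# "dumbs down" a stream for itself but passes full quality to the TV.
from dataclasses import dataclass

@dataclass
class DeviceProfile:
    name: str
    max_width: int    # largest frame width the display can render
    max_height: int

@dataclass
class Stream:
    width: int
    height: int

def adapt_for(stream: Stream, device: DeviceProfile) -> Stream:
    """Scale the stream down to fit the device's display; never scale up."""
    scale = min(device.max_width / stream.width,
                device.max_height / stream.height,
                1.0)
    return Stream(round(stream.width * scale), round(stream.height * scale))

handheld = DeviceProfile("handheld", 320, 240)
tv = DeviceProfile("tv", 1920, 1080)
source = Stream(1920, 1080)

print(adapt_for(source, handheld))  # Stream(width=320, height=180)
print(adapt_for(source, tv))        # Stream(width=1920, height=1080)
```

The key design point is that the same source stream is adapted per destination: the handheld renders only what it can display, while the forwarded copy stays untouched for the TV to render at its own capability.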
Is that capability of interest to corporate IT? Absolutely. We see e-mail already on the handheld, as well as the ability to download business documents such as spreadsheets and PowerPoint presentations. We'll see the ability to handle the occasional videoconference on a handheld, as well as other media-rich communications. We are right on the threshold of seeing these multifunction devices. Of course, the human-computer interface is always a problem.
How might that improve? Voice recognition is going to be really important. And there will be flexible devices where you actually pull out keyboards and screens and expand what you are carrying with you. Haptic technologies—based on touch and force feedback—are not yet here, but there's a lot of research going on. For example, with a handheld, you could project a virtual keyboard onto a piece of paper and just touch that.
You have warned that we are "hitting a wall of complexity." What do you mean? We once arrogantly thought that any man-made system could be completely understood, because we created it. But we have reached the point where we can't predict how the systems we design will perform, and it's inhibiting our ability to do some really interesting system designs. We are allowing distributed control and intelligent agents to govern the way these systems behave. But that has its own dangers; there are cascading failures and dependencies we don't understand in these automatic protective mechanisms.
Will we see catastrophic failures of complex systems, like the Internet or power grid? Yes. The better you design a system, the more likely it is to fail catastrophically. It's designed to perform very well up to some limit, and if you can't tell how close it is to this limit, the collapse will occur suddenly and surprisingly. If a system erodes slowly, on the other hand, you can tell when it's weakening, but a well-designed system typically doesn't expose that erosion.
So, how can complex systems be made safer and more reliable? Put the protective control functions in one portion of the design, one portion of the code, so you can see them. People, in an ad hoc fashion, add a little control here, a little protocol there, and they can't see the big picture of how these things interact. When you are willy-nilly patching new controls on top of old ones, that's one way you get unpredictable behavior.
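The design principle above can be made concrete with a small sketch. This is not Kleinrock's design, just one illustration of gathering a system's protective controls into a single visible module (rather than scattering ad hoc checks through the code) so their interactions can be inspected together. All names (`ControlPolicy`, the rate and queue limits) are hypothetical.

```python
# Hypothetical sketch: every protective rule lives in one module, so the
# full set of controls and their interactions is visible in one place.
class ControlPolicy:
    """Single home for protective controls, instead of checks patched
    willy-nilly throughout the codebase."""

    def __init__(self, max_rate: int, max_queue: int):
        # Each rule is a (name, predicate) pair over observed system state.
        self.rules = [
            ("rate limit", lambda s: s["requests_per_sec"] <= max_rate),
            ("queue bound", lambda s: s["queue_depth"] <= max_queue),
        ]

    def check(self, state: dict) -> list:
        """Return the names of every rule the current state violates."""
        return [name for name, ok in self.rules if not ok(state)]

policy = ControlPolicy(max_rate=100, max_queue=50)
print(policy.check({"requests_per_sec": 120, "queue_depth": 10}))
# ['rate limit']
```

Because every rule is registered in one list, adding a new control means adding one entry here, where its interaction with existing controls can be reviewed, rather than burying another check deep in unrelated code.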