I recently sat through a presentation where an analyst said that 802.11n wouldn't be that successful because it offers on the order of 100Mbit/sec., and "no one needs that kind of performance." I found this statement irritating, not so much because it's just plain wrong, but because it reflects a fundamental lack of understanding of an important intrinsic element of wireless communications.
Let's deal with the obvious issue first. Any networking practitioner knows that there is no such thing as too much throughput. I remember when the LAN first appeared -- 3Mbit/sec. Ethernet, and Datapoint's (anyone remember them?) 2.5Mbit/sec. ARCnet, which is apparently still in use. The question most people asked when exposed to a LAN for the first time was, "Who needs that kind of bandwidth?" or some such thing. Little did they know that we were entering the era of distributed and client/server computing, and network performance would ultimately mean everything to the success of applications deployed in this way.
It was, I'll agree, a little hard to see how more throughput would matter in an era of alphanumeric, green-screen terminals, but we've moved on. Gigabit Ethernet is pretty much standard on today's PCs and costs no more (and usually a lot less) than the slower versions did when they were introduced.
Well, if we're going wireless, it stands to reason that higher throughput will be desirable when we're mobile as well. We are, after all, asking the same of wireless as we do of wire -- reliable, high-performance connectivity capable of supporting the same applications as those on wired LANs. That's reason enough to embrace higher-performance WLANs (or any wireless connectivity, for that matter), but there's one that's even more important, bringing me to that intrinsic element I alluded to above.
And that's the issue of capacity. I've touched on this before, but let's herein define capacity as the ability to move a given amount of data -- for all users -- per unit of time. Wireless obviously differs in one big way here. If we need more capacity in the wired world, we install more wire. This ability has really spoiled us, because adding capacity in the wireless world is an entirely different challenge. Each wire (or cable or fiber) we add is its own nice, neat, electromagnetic world. Signals on individual physical carriers don't interfere with one another, and there's a potentially vast amount of capacity available on each wire by itself.
But on wireless, we have the opposite problem. There is only a limited amount of spectrum available in any given location, and the frequencies we use may not propagate exactly how we'd like them to. Add in interference, fading and the numerous other radio artifacts that serve to boost the salaries of talented RF engineers, and we have to look at bandwidth and throughput in a whole new light.
The idea in wireless, then, is to make the best use of the scarce resource that is the radio spectrum at any given moment. Sure, raw throughput matters, but even more important is getting one's packets through the air as reliably, efficiently and expeditiously as possible so as to leave time and room for other packets -- that's capacity in the wireless world. Higher signaling rates mean each packet occupies the air for less time, so the channel is free sooner for everyone else, no matter how much throughput any individual application might need; the sketch below makes this concrete. So, as with wire, faster is always better, but for a different reason.
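To put some rough numbers behind that, here's a minimal, hypothetical sketch in Python. It assumes a fixed 1,500-byte frame and deliberately ignores real-world factors such as contention, preambles, acknowledgments and retries; the rates and labels are illustrative, not measurements of any particular product. The point is simply that a faster signaling rate shrinks the airtime each frame consumes on the shared channel, which is exactly what frees capacity for other users.

```python
# Illustrative sketch only: how signaling rate affects airtime per frame
# and the aggregate frame capacity of a single shared wireless channel.
# Ignores all protocol overhead, contention and retransmissions.

FRAME_BITS = 1500 * 8  # one Ethernet-sized 1,500-byte frame

def airtime_seconds(signaling_rate_bps: float) -> float:
    """Time one frame occupies the shared medium at a given rate."""
    return FRAME_BITS / signaling_rate_bps

def frames_per_second(signaling_rate_bps: float) -> float:
    """Aggregate frames the channel can carry per second -- capacity
    shared by all users, since only one station transmits at a time."""
    return 1.0 / airtime_seconds(signaling_rate_bps)

for label, rate in [("11 Mbit/s (802.11b-class)", 11e6),
                    ("54 Mbit/s (802.11a/g-class)", 54e6),
                    ("100 Mbit/s (802.11n-class)", 100e6)]:
    print(f"{label}: {airtime_seconds(rate) * 1e6:7.1f} us per frame, "
          f"{frames_per_second(rate):8.0f} frames/sec for all users combined")
```

Even in this oversimplified model, tripling or doubling the signaling rate frees the air proportionally sooner for the next station's traffic -- and that freed airtime, not any one user's throughput, is what capacity really buys.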
Note that the issue of capacity extends to any wireless network. We have only a limited amount of bandwidth (spectrum) available no matter what. Indeed, as we move into licensed bandwidth, spectrum gets not only relatively scarce but very expensive as well. It thus behooves cellular carriers and other licensed-spectrum operators to make the absolute best use of that spectrum at any given moment, if for no other reason than that their shareholders will thank them later.
Think of spectrum as a perishable commodity, like airline seats. It's use it or lose it; there's no second chance to recoup the opportunity lost when time goes by and no traffic occupies those expensive airwaves. And if each user takes less time to get their packets through, so much the better. Not only do they get higher throughput, but there's more overall capacity available to keep a potentially large number of ever-more-demanding users happy simultaneously. And happy and productive (OK, at least productive) users are at the top of any IT manager's list of objectives.
Craig J. Mathias is a principal with Farpoint Group, an advisory firm specializing in wireless networking and mobile computing. He can be reached at craig@farpointgroup.com.