Interference is a fact of life in wireless. Indeed, I still run across those who simply refuse to use wireless LANs, especially in business applications, because of the fear that they might be subject to interference.
Interference can occur on licensed frequencies under a variety of circumstances. Of course, it also occurs on wire, except then we call the result a collision (and, by the way, I don’t see anyone ripping out Ethernet cables because of those). We’ll therefore define interference as the unintentional degradation of a given wireless signal by another signal at the same frequency and at the same time, one that is close enough (or at least of sufficient amplitude) to damage, perhaps beyond repair, the original signal.
This can clearly happen in the unlicensed bands where wireless LANs operate, and particularly on channels 1, 6 and 11, which everyone oddly seems to use, but more on that below. By the way, intentional interference is the basis of electronic warfare systems, which jam the enemy’s signals, the very signals used to guide missiles and detect aircraft.
In the commercial world, jamming is seldom seen, but in theory it could form the basis of a major denial-of-service attack. However, since such an attack could occur at any frequency, we’ll save that subject for another day. So, the question is whether nearby WLAN (and other unlicensed-band) traffic can cause degradation serious enough to imperil the integrity of a given WLAN. The answer is: Well, yes, that’s certainly possible, but it’s very, very unlikely. Let me explain why.
First of all, let’s further consider the definition of interference I presented above. In order for one signal to harm another, it must be close enough, or at least of sufficient amplitude (transmit power), that one radio wave degrades the other. Since radio signals attenuate rapidly with distance (with the square of the distance in free space, and faster still indoors), and since most WLANs operate at the same default transmit power (about 100 mW), only a transmitter relatively nearby, on the same frequency and transmitting at a comparable power level, has the potential to cause destructive interference.
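To make the distance argument concrete, here’s a quick back-of-the-envelope sketch (mine, not part of the original analysis) that plugs a typical 100-mW (20-dBm), 2.4-GHz transmitter into the standard free-space path-loss formula; real indoor losses are even steeper.

```python
import math

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB for an isotropic link."""
    c = 3.0e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def received_power_dbm(tx_dbm: float, distance_m: float, freq_hz: float = 2.4e9) -> float:
    """Received power in dBm, ignoring antenna gains and obstructions."""
    return tx_dbm - free_space_path_loss_db(distance_m, freq_hz)

tx_dbm = 20.0  # ~100 mW, the typical WLAN default
for d in (1, 10, 100):
    print(f"{d:>4} m: {received_power_dbm(tx_dbm, d):6.1f} dBm")
# Roughly -20 dBm at 1 m, -40 dBm at 10 m and -60 dBm at 100 m, with no
# walls in the way -- a far-off transmitter simply can't compete with one
# in the same room.
```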
It’s also possible for the cumulative noise floor of the overall environment to be raised by enough stations operating in close proximity, but it would take a lot of stations transmitting all the time for this to happen, and that isn’t often the case: even where many nodes are nearby, most of them are idle most of the time. Don’t believe me? Go look at the statistics gathered by your company’s network management system. Very few nodes on your entire network are ever really busy, and those tend to be critical servers and network infrastructure equipment, like core routers. Traffic gets pretty sparse and infrequent (if bursty) near the edge of the network, which means that opportunities for interference get rather sparse as well.
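Here’s the arithmetic behind that claim, as a small sketch; the ten stations and the -90-dBm level are illustrative assumptions, not measurements. Received powers add in linear units (milliwatts), not in dB, so even a crowd of weak, distant transmitters raises the floor only modestly.

```python
import math

def dbm_to_mw(dbm: float) -> float:
    return 10 ** (dbm / 10)

def mw_to_dbm(mw: float) -> float:
    return 10 * math.log10(mw)

def aggregate_dbm(levels_dbm):
    """Combine several received power levels by summing in linear units."""
    return mw_to_dbm(sum(dbm_to_mw(p) for p in levels_dbm))

# Ten distant stations, each heard at -90 dBm and (unrealistically) all
# transmitting at once, raise the noise floor by only 10 dB:
print(aggregate_dbm([-90.0] * 10))  # -80.0 dBm
# That's still some 30 dB below a nearby access point heard at, say, -50 dBm,
# and in practice those stations are rarely transmitting simultaneously.
```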
In the case of wireless, we can manage transmit power levels and frequency assignments on access points automatically (via centralized architectures, or at least centralized management) to minimize interference. As a result, interference matters much less than, for example, contention due to an underprovisioned infrastructure.
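As a rough illustration of what centralized channel management looks like, here’s a toy greedy assigner in the spirit of graph coloring; the access-point names, neighbor lists and channel set are invented for the example, and real controllers use far more sophisticated, measurement-driven logic.

```python
NON_OVERLAPPING = [1, 6, 11]  # the mutually clear 2.4-GHz channels

# Which APs hear each other strongly enough to interfere (assumed topology).
neighbors = {
    "AP-1": {"AP-2", "AP-3"},
    "AP-2": {"AP-1", "AP-3"},
    "AP-3": {"AP-1", "AP-2", "AP-4"},
    "AP-4": {"AP-3"},
}

def assign_channels(neighbors, channels=NON_OVERLAPPING):
    """Greedy graph coloring: give each AP a channel no close neighbor uses."""
    assignment = {}
    # Handle the most-constrained APs (most neighbors) first.
    for ap in sorted(neighbors, key=lambda a: len(neighbors[a]), reverse=True):
        used = {assignment[n] for n in neighbors[ap] if n in assignment}
        free = [c for c in channels if c not in used]
        assignment[ap] = free[0] if free else channels[0]  # reuse only if forced
    return assignment

print(assign_channels(neighbors))
# e.g. {'AP-3': 1, 'AP-1': 6, 'AP-2': 11, 'AP-4': 6}
```

A controller would also turn down transmit power on any APs forced to share a channel, which is the other half of keeping interference in check.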
In the home, I suggest setting your WLAN channel to something other than 1, 6 or 11 (I like 3, 4, 8 and 9) to stay away from those who take rote advice aimed at enterprise installations where roaming is important. Or do what I do – use 802.11a, with a lot more channels and almost no users. I never see interference using the 5-GHz bands, and I suspect it will be quite some time before anyone does.
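For readers wondering where those channel numbers come from, here’s a quick sketch of the arithmetic: 2.4-GHz channel n is centered at 2407 + 5n MHz, and a legacy 802.11b/g signal occupies roughly 22 MHz, so only channels spaced five or more apart stay completely clear of each other.

```python
def center_mhz(channel: int) -> int:
    """Center frequency of a 2.4-GHz Wi-Fi channel (1-13)."""
    return 2407 + 5 * channel

def overlaps(ch_a: int, ch_b: int, bandwidth_mhz: float = 22.0) -> bool:
    """True if two ~22-MHz-wide signals on these channels share spectrum."""
    return abs(center_mhz(ch_a) - center_mhz(ch_b)) < bandwidth_mhz

print(overlaps(1, 6), overlaps(6, 11), overlaps(1, 11))  # False False False
print(overlaps(3, 1), overlaps(3, 6))                    # True True
# Channels 1, 6 and 11 are mutually clear; an in-between channel like 3
# partially overlaps both of its busier neighbors -- that's the trade-off
# in choosing it over the crowded defaults.
```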
This leaves high-density public spaces as the places where interference is most likely to be a problem. And yet, even with dozens of WLAN networks in range, I’ve never had a problem getting reasonable (1Mbit/sec.-plus) throughput on a public-access network, along with excellent Skype call quality. This leads to two conclusions.
First, interference is likely not a problem today, and it’s easy to manage in most cases when it is. Second, we should study the problem further, because I expect many more WLANs to be deployed over the next few years, and we’re likely to need better methods for automatically dealing with interference when it does become an issue. We’re about to begin a project here to do just that: quantify interference and explore techniques for dealing with it. I’ll let you know what happens in another column later this year.
In the meantime, next week I’ll review a counterintuitive approach to getting more capacity out of wireless LANs, one you’d think would itself be a source of interference: something we call dense deployments. I’ll also look at another controversial technique for managing interference that a couple of vendors have built into products.
A final note: Last week’s column on site surveys generated the predictable response from a few readers, mostly those who earn at least some income from doing site surveys, um, questioning my conclusion, which I stand by. One of these days, we’ll have to do a bake-off: my method vs. the traditionalists’, in a live test. Since I’ve already verified my conclusions, I’m sure I’ll win. Nonetheless, scientific results are always best.
Craig J. Mathias is a principal at Farpoint Group, an advisory firm specializing in wireless networking and mobile computing. He can be reached at craig@farpointgroup.com.