It's not about "full bars", stupid

This past weekend, I attended an Arizona Cardinals football game, and I want to talk about the environment during the game. No, I'm not talking about the environment created by the thousands of screaming fans, the cheerleaders, and the mascot. I'm talking about the wireless environment.

During the game, I couldn't place or receive calls, send or receive text messages, surf the Internet, or do anything else that makes a smartphone worth the investment. What was particularly troublesome was looking down and seeing that I had "full bars" and a 3G connection. My phone was telling me that I had a strong signal and the highest data rates, yet it simply didn't function.

I thought about this for some time, and the only explanation I could come up with was that the cellular network was congested. The University of Phoenix Stadium seats over 64,000 people for football games, and I believe thousands of them were taxing the limited number of cell towers in the area. This was confirmed for me when the game ended and the fans dispersed: my phone began to operate normally again.

Time and again, I have seen similar congestion issues with Wi-Fi: end users have "full bars" and their computer shows the signal strength as "Excellent", but the connection is unusable. What causes this scenario in Wi-Fi? There are several contributing factors. First, 802.11 wireless is half-duplex, meaning that any given transceiver cannot "talk" and "listen" at the same time.

Second, a certain amount of bandwidth is consumed as overhead: all the management and control frames needed to establish and maintain connections. As a result, wireless throughput is generally less than half of the nominal bandwidth. For example, 802.11g provides 54 Mbps of bandwidth, but due to overhead and the half-duplex nature of Wi-Fi, it usually delivers only about 20 Mbps of actual throughput.
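To put rough numbers on that bandwidth-versus-throughput gap, here's a minimal sketch in Python. The overhead fraction is an illustrative assumption back-derived from the 54-to-20 Mbps example above, not a protocol constant:

```python
# Back-of-the-envelope sketch of the bandwidth-vs-throughput gap.
# The overhead fraction below is an illustrative assumption
# reverse-engineered from the 802.11g example in the text
# (54 Mbps nominal -> ~20 Mbps usable); real overhead varies
# with frame sizes, data rates, and channel conditions.

def usable_throughput(nominal_mbps: float, overhead_fraction: float) -> float:
    """Nominal link rate minus the share lost to protocol overhead."""
    return nominal_mbps * (1.0 - overhead_fraction)

# Roughly 63% of the raw 802.11g rate disappears into overhead
# and half-duplex turnaround, landing near 20 Mbps of throughput.
print(f"{usable_throughput(54, 0.63):.0f} Mbps")  # -> 20 Mbps
```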

Third, Wi-Fi is a shared medium: if 10 users are on one radio, each gets only a portion of the total capacity. For simplicity's sake, let's assume every user gets an equal share of bandwidth (which is actually a gross oversimplification). Continuing with the example above, sharing the link between 10 users leaves each person only 2 Mbps of throughput. A shared medium is fine up to a point, but it doesn't scale well. What if there were 20 users per radio? What about 100? What about 1,000?
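Continuing the same back-of-the-envelope math, this sketch shows how the per-user share collapses as more clients join one radio (assuming, as above, a perfectly equal split; real contention makes it worse):

```python
# Per-user throughput on a shared medium, assuming (as the text
# does, for simplicity) a perfectly equal split of airtime.
# In practice, contention overhead grows with client count,
# so real per-user numbers are worse than this.

USABLE_MBPS = 20  # usable 802.11g throughput from the example above

for users in (1, 10, 20, 100, 1000):
    per_user = USABLE_MBPS / users
    print(f"{users:>4} users on one radio -> {per_user:6.2f} Mbps each")
```

At 10 users that's the 2 Mbps from the example; at 1,000 it's a barely usable 0.02 Mbps, which is exactly the scaling problem.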

This seemed to be what was happening at the football game. Even though the indicators on my phone showed a great connection, there were simply too many people trying to share the throughput of a single tower. Towers represent a huge capital investment for cellular carriers, so instead of building more towers, carriers began putting additional radios on the same tower, each on a slightly different frequency, allowing more users onto the network served by any given tower.

The Wi-Fi industry has followed suit. Many vendors now place three, four, or more radios into a single access point. For the reasons stated above, this allows the wireless network to handle the density demands created by the proliferation of client devices such as laptops, tablet PCs, and dual-mode phones. Personally, I have seen it work well in K-12 education environments where schools support one-laptop-per-student initiatives across the wireless network. In this case, as at the football game, a good end-user wireless experience is about more than having "full bars" on your device. It's about the network having enough *capacity* to adequately handle user demand.
