ISPs experimenting with new P2P controls

Peer-to-peer traffic management was a hot topic at this year's NXTcomm convention in Las Vegas, as keynote speakers and telecommunications industry panelists highlighted new methods for handling P2P traffic crunches.

Internet service providers' methods for managing P2P traffic have come under intense scrutiny in recent months, after the Associated Press reported last year that Comcast Corp. was interfering with P2P users' ability to upload files by injecting TCP reset (RST) packets that forced their connections to close.
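
For readers curious what that interference looks like at the packet level, here is a minimal sketch (using the Python scapy library) that passively counts inbound TCP RST segments on ports commonly associated with BitTorrent; the port range and everything else here are illustrative assumptions, not a description of Comcast's actual system.

```python
# Minimal sketch: passively count TCP RST segments seen on ports
# commonly used by BitTorrent clients. The port range is an illustrative
# assumption; a real study would correlate RSTs with active transfers.
from collections import Counter

from scapy.all import IP, TCP, sniff

BT_PORTS = range(6881, 6890)  # classic BitTorrent listening ports (assumption)
rst_counts = Counter()

def watch_for_resets(pkt):
    if IP in pkt and TCP in pkt and pkt[TCP].flags & 0x04:  # 0x04 = RST bit
        if pkt[TCP].dport in BT_PORTS or pkt[TCP].sport in BT_PORTS:
            rst_counts[pkt[IP].src] += 1
            print(f"RST from {pkt[IP].src}:{pkt[TCP].sport} -> "
                  f"port {pkt[TCP].dport} (total {rst_counts[pkt[IP].src]})")

# Sniffing requires root privileges; stop with Ctrl-C.
sniff(filter="tcp", prn=watch_for_resets, store=False)
```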

Because the RST packets were forged to appear as though they came from the other end of the connection rather than from the company, critics accused Comcast of deceiving its customers and actively blocking their ability to share files online. Although Comcast has said it doesn't actively block any P2P protocols and merely "delays" P2P uploads during times of heavy congestion, the company has agreed to change its P2P traffic management policies and stop targeting specific protocols such as BitTorrent.

NXTcomm panelists and keynote speakers agreed that heavy P2P traffic could cause network management problems for Internet service providers. Typically, P2P technology such as BitTorrent distributes large data files by breaking them into small pieces and retrieving those pieces from multiple sources. Once all the pieces are received, the file is reassembled as a whole. But while this method of file sharing is much faster and more efficient than relying on a single centralized server, it can also put significant strain on a network, because P2P protocols are designed mainly to pull large chunks of data from whatever sources they can find, with little regard for network efficiency.
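
As a rough illustration of that piece-based model, the following sketch splits a file into fixed-size pieces with per-piece hashes, much as a .torrent's metainfo does, and verifies them on reassembly; the piece size and helper names are invented for illustration rather than taken from the BitTorrent specification.

```python
# Minimal sketch of the piece-based model: split a file into fixed-size
# pieces with per-piece SHA-1 hashes (as a .torrent's metainfo does),
# then verify and reassemble. The piece size is an arbitrary example.
import hashlib

PIECE_SIZE = 256 * 1024  # 256 KiB, a common BitTorrent piece size

def split_into_pieces(data: bytes):
    """Return (pieces, hashes) so pieces can be fetched from any peer."""
    pieces = [data[i:i + PIECE_SIZE] for i in range(0, len(data), PIECE_SIZE)]
    hashes = [hashlib.sha1(p).digest() for p in pieces]
    return pieces, hashes

def reassemble(pieces, hashes) -> bytes:
    """Verify each piece against its expected hash, then concatenate."""
    for i, (piece, expected) in enumerate(zip(pieces, hashes)):
        if hashlib.sha1(piece).digest() != expected:
            raise ValueError(f"piece {i} failed verification; refetch it")
    return b"".join(pieces)

data = b"example payload " * 100_000
pieces, hashes = split_into_pieces(data)
# Pieces may arrive from many peers in any order; index restores the order.
assert reassemble(pieces, hashes) == data
```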

But despite their concerns about P2P's effects on networks, none of the speakers at NXTcomm endorsed the method of sending RST packets to all P2P users during peak hours. Rather, some said it was time to follow the lead of Comcast and begin implementing caps for individual users who are consuming disproportionately high amounts of bandwidth. This way, Internet service providers wouldn't be targeting individual P2P protocols, and casual P2P users wouldn't have to reset their connections every time they uploaded files during peak congestion hours.
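
To picture what such a protocol-agnostic, per-user cap might look like, consider the sketch below, which meters each subscriber's recent bytes and flags only the heaviest users for deprioritization during congestion; the window and threshold are invented numbers, not any carrier's actual policy.

```python
# Illustrative sketch of a protocol-agnostic per-user cap: meter each
# subscriber's bytes over a sliding window and deprioritize only heavy
# users while the network is congested. All numbers here are invented.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 15 * 60   # measurement window (assumption)
HEAVY_BYTES = 2 * 1024**3  # 2 GiB per window marks a heavy user (assumption)

usage = defaultdict(deque)  # subscriber_id -> deque of (timestamp, bytes)

def record(subscriber_id: str, nbytes: int) -> None:
    now = time.time()
    window = usage[subscriber_id]
    window.append((now, nbytes))
    while window and window[0][0] < now - WINDOW_SECONDS:
        window.popleft()  # drop samples that have aged out of the window

def is_heavy(subscriber_id: str) -> bool:
    """True if this subscriber should be deprioritized while congested."""
    return sum(n for _, n in usage[subscriber_id]) > HEAVY_BYTES

record("sub-001", 3 * 1024**3)
print(is_heavy("sub-001"))  # True: exceeds the window threshold
```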

"ISPs need to figure out how traffic is being distributed among their users," said Larry Roberts, the founder of Anagran Inc., which specializes in helping IP networks manage their P2P traffic. "You can't make money if you're giving the majority of your bandwidth to 5% of your users. ... The concept that has come forward is that there should be more equality for users based on what they pay for individual usage. I think this is a reasonable and appropriate approach rather than trying to look only at the applications themselves."

Roberts was also one of the designers and developers of the Arpanet, the packet-switched network that evolved into the Internet.
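
Roberts' 5% figure is straightforward to check against usage logs; the sketch below computes the share of total traffic carried by the heaviest 5% of users, using invented sample data.

```python
# Sketch: from per-user byte totals, compute what share of all traffic
# the heaviest 5% of users account for. The sample figures are invented.
def top_share(bytes_per_user: list[int], fraction: float = 0.05) -> float:
    ranked = sorted(bytes_per_user, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    return sum(ranked[:k]) / sum(ranked)

# 100 users: a handful of heavy uploaders, the rest light browsers.
sample = [500_000] * 5 + [5_000] * 95
print(f"top 5% carry {top_share(sample):.0%} of traffic")  # ~84%
```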

The two kinds of caps most discussed were the kind proposed by Comcast, which would slow individual users' P2P uploads during peak hours, and the kind recently floated by AT&T, which would charge users overage fees for downloading what the company considers a heavy amount of data per month. Both companies have made clear that the vast majority of users would not be affected and that a cap would apply only to users who upload large files, such as high-definition movies, on a near-constant basis.
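
The AT&T-style monthly cap reduces to simple metering arithmetic, as the sketch below shows; the allowance and per-gigabyte rate are invented for illustration, not the company's actual pricing.

```python
# Sketch of monthly overage billing. The cap and rate are invented
# illustrative numbers, not any carrier's actual pricing.
CAP_GB = 150          # monthly allowance (assumption)
OVERAGE_PER_GB = 1.0  # dollars per GB beyond the cap (assumption)

def overage_charge(used_gb: float) -> float:
    return max(0.0, used_gb - CAP_GB) * OVERAGE_PER_GB

print(overage_charge(40))   # 0.0: a typical user is unaffected
print(overage_charge(420))  # 270.0: a near-constant HD uploader pays
```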

Qwest Communications International Inc. Chief Technology Officer Pieter Poll said that while his company has been looking at ways to mitigate the worst effects of heavy P2P traffic, he also thinks that Internet service providers should look outside the realm of P2P traffic when identifying heavy users. Oftentimes, Poll said, some of the heaviest bandwidth users on a network aren't even aware of how much bandwidth they're consuming and are happy to change their practices in order to consume less.

"Like other carriers, Qwest has an acceptable-use policy, and folks who are on the extreme side of that policy are notified that they're on the extreme side," Poll said. "But what we've found when we look at the highest-end users is that a significant percentage of what they use is certainly P2P traffic, but also that their high bandwidth consumption could be the result of malware infections or of Web cameras running 24 hours a day. These are issues that they likely don't know about."

Another method for managing P2P traffic that frequently came up at NXTcomm was the experimental "smart" routing technology being developed by the P4P working group. Last March, Verizon Communications Inc., P2P software developer Pando Networks Inc. and researchers at Yale University conducted a field test of P4P technology, which lets P2P applications use network-provided information to choose more-local sources and optimize the delivery routes of large files, rather than inefficiently grabbing data from any source available across the globe. Speakers from Qwest, AT&T and Verizon all touted their companies' participation in the P4P Working Group, and some panelists said that if the technology proves successful, it could completely change the way P2P technology interacts with networks.
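
The core of the P4P idea is that the peer selector consults network-provided locality information instead of choosing sources at random. A toy version of that selection step might look like the following, with the AS numbers and distance table invented for illustration.

```python
# Toy sketch of P4P-style locality-aware peer selection: rank candidate
# peers by a network-supplied "distance" (same AS is cheapest) instead
# of choosing at random. AS numbers and distances here are invented.
MY_AS = 64500  # this client's autonomous system (assumption)

# (peer address, peer AS) pairs, as a tracker might return them.
candidates = [
    ("198.51.100.7", 64500),  # same ISP network
    ("203.0.113.12", 64501),  # neighboring/peering network
    ("192.0.2.200", 65010),   # distant intercontinental peer
]

# Hypothetical per-AS costs that an ISP could publish to P2P applications.
AS_DISTANCE = {64500: 0, 64501: 10, 65010: 100}

def select_peers(peers, k=2):
    """Prefer the k lowest-cost peers; unknown ASes get a high default."""
    return sorted(peers, key=lambda p: AS_DISTANCE.get(p[1], 1000))[:k]

print(select_peers(candidates))  # local and nearby peers chosen first
```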

"Currently, P2P protocols go after sources of video all over world," Roberts said. "The cost of doing that is much higher than if you do it with more-local sources. Doing it locally is tremendously less taxing for carriers."

James Glover, a senior product manager at management systems developer Redback Networks Inc., said another option for controlling P2P traffic on the network would be to cache popular P2P content such as movies at the edge of networks and make it available to distribute to users on a localized basis. Thus, when a popular new movie hits the Web for legal download, BitTorrent users would be able to get the content from nearby cache servers rather than pick up pieces of it from all over the world.
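
Glover's edge-caching idea amounts to checking a nearby cache before touching the wider swarm; a minimal sketch, with the cache contents and fetch logic invented for illustration, might look like this.

```python
# Minimal sketch of edge caching for P2P content: serve a requested
# piece from a nearby cache server when present, touching the wider
# swarm only on a miss. Cache contents and fetch logic are invented.
edge_cache: dict[str, bytes] = {}  # piece hash -> piece data at the edge

def fetch_piece(piece_hash: str) -> bytes:
    if piece_hash in edge_cache:
        return edge_cache[piece_hash]    # local hit: traffic stays on-net
    data = fetch_from_swarm(piece_hash)  # miss: go to distant peers
    edge_cache[piece_hash] = data        # populate for the next nearby user
    return data

def fetch_from_swarm(piece_hash: str) -> bytes:
    # Placeholder standing in for a real swarm download of this piece.
    return b"piece-data-for-" + piece_hash.encode()

print(fetch_piece("abc123"))  # miss: fetched from the swarm and cached
print(fetch_piece("abc123"))  # hit: served from the edge cache
```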

"The best part about caching is that it can be applied not only to P2P, but also to over-the-top content," Glover said, referring to companies such as Skype Technologies SA and Vonage Holdings Corp. that deliver voice services over the top of IP networks.

"If you can strike an agreement with the creator, you can put the cached content at edge of the network," he said. "This sort of distribution model could provide ISPs with revenue-generating hosting services for voice, video and data, which could then be customized to support localized marketing."

In the end, said Ericsson North America CTO Arun Bhikshesvaran, Internet service providers are likely to try a wide variety of methods for managing P2P traffic, and users shouldn't expect one method to instantly crop up that will satisfy both their demands and those of the network.

"It's really more of an evolution of the service-provider model than anything else," he said. "It's an evolution of the business model, and there will be more to come ... hopefully not to the detriment of the users."

This story, "ISPs experimenting with new P2P controls," was originally published by Network World.

Copyright © 2008 IDG Communications, Inc.
