
On its way: A new Internet

Researchers are building a new Net, one layer at a time.

December 8, 2003 12:00 PM ET

Computerworld - Researchers developing the next incarnation of the Internet say it will be faster, more reliable and more secure. Moreover, it will be self-aware and able to determine the best way to deliver data and services.
The most prominent next-generation Internet project is PlanetLab, a research testbed that's been in existence for about a year and a half. It consists of 160 servers hosted at 65 sites in 16 countries. The goal is for PlanetLab to grow to 1,000 widely distributed server nodes that connect into the majority of the current Internet's regional and long-haul backbones.
"It's a playground for new services. Depending on which service you're most excited about, that's what PlanetLab will look like," says Frans Kaashoek, a professor of computer science and engineering at MIT, a PlanetLab developer.
Kaashoek and other scientists are developing architectures that will automatically distribute data to multiple points around the globe to speed delivery and will maintain multiple network paths to ensure that data reaches its destination. The network will read each data request and direct it to the servers closest to the requester.
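To make the idea concrete, here is a minimal Python sketch of that kind of latency-aware request routing. The replica names and round-trip times are hypothetical; a real service on an overlay like PlanetLab would measure them on the live network and retry over another path on failure.

    # Toy latency-aware request routing (illustrative only).
    replica_rtt_ms = {
        "replica-us-east": 12.0,   # hypothetical measured round-trip times, in ms
        "replica-eu-west": 88.0,
        "replica-asia": 140.0,
    }

    def pick_closest_replica(rtts):
        """Return the replica with the lowest measured latency."""
        return min(rtts, key=rtts.get)

    def fetch(object_id, rtts):
        replica = pick_closest_replica(rtts)
        # A real service would open a connection here and fall back to
        # another replica if this one fails to respond.
        return f"GET {object_id} from {replica}"

    print(fetch("video-42", replica_rtt_ms))   # GET video-42 from replica-us-east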
The challenge for scientists is to put intelligence into the network itself so it can understand the information that's being transported across thousands of servers and millions of miles of cable.
PlanetLab, which is up and running for the research community, is a joint project being led by Intel Corp. and about 70 university scientists around the world.
Just as the Internet was an overlay network on top of the telephone network, PlanetLab provides an additional layer on top of the Internet. In turn, services such as streaming media, peer-to-peer file sharing and videoconferencing will be layered on top of PlanetLab.
One network layer atop PlanetLab is IRIS, or the Infrastructure for Resilient Internet Systems. IRIS promises to speed up searches and information transfers by using a self-organizing, peer-to-peer overlay network to position data closer to end users and thwart denial-of-service attacks by balancing loads among Web servers.
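One common way to build a self-organizing overlay of this kind is consistent hashing, in which nodes and data keys hash onto the same identifier ring and each key is handled by the node whose identifier follows it. The short Python sketch below is purely illustrative and uses made-up node names; it is not IRIS code, but it shows why nodes can join or leave with only neighboring keys needing to move.

    import hashlib
    from bisect import bisect_right

    def ring_id(name):
        # Hash names into a fixed identifier space (a 32-bit ring here).
        return int(hashlib.sha1(name.encode()).hexdigest(), 16) % 2**32

    class Ring:
        """Toy consistent-hashing ring: a key is owned by the first node
        whose identifier follows the key's hash around the ring."""
        def __init__(self, nodes):
            self.ring = sorted((ring_id(n), n) for n in nodes)

        def lookup(self, key):
            ids = [i for i, _ in self.ring]
            idx = bisect_right(ids, ring_id(key)) % len(self.ring)
            return self.ring[idx][1]

    overlay = Ring(["node-berkeley", "node-mit", "node-cambridge"])
    print(overlay.lookup("popular-page.html"))  # node responsible for this key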
John Kubiatowicz, an associate professor at the University of California, Berkeley, says IRIS is a radical departure from the client/server model and application-specific environment of today's Internet because of its ability to spread data and rebuild it using sophisticated algorithms.
Kubiatowicz is also working on another layer to ride on top of PlanetLab, called OceanStore, which is a utility-type service for storing data across millions of servers.
Backup Plan
In OceanStore, Internet service providers and others would be paid to act as repositories for the world's information, which would be kept as multiple copies, protected by encryption and automatically rebuilt should any single storage point fail.
"If you think about the classic problem with archival storage, data resides on tape in some basement, and 10 years later you can't read the tape," Kubiatowicz says.
"The only way data can be preserved over the long haul is if it's separated from the physical media it's originally stored on. That means the places where it is stored must change over time," he explains.
OceanStore's software does that by breaking data into many tiny, encrypted parts and spreading them across a vast array of Web servers; policy engines can then periodically resave the data or migrate it to new formats over time.
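The sketch below is a toy stand-in for that fragmentation step, written in Python for illustration. It splits a byte string into fragments plus one XOR parity fragment so that any single lost fragment can be rebuilt; OceanStore itself relies on stronger erasure codes and encrypts the fragments, which this example omits.

    def xor_bytes(a, b):
        return bytes(x ^ y for x, y in zip(a, b))

    def split_with_parity(data, n=4):
        """Cut data into n equal fragments plus one XOR parity fragment."""
        frag_len = -(-len(data) // n)                 # ceiling division
        padded = data.ljust(frag_len * n, b"\0")
        frags = [padded[i * frag_len:(i + 1) * frag_len] for i in range(n)]
        parity = frags[0]
        for f in frags[1:]:
            parity = xor_bytes(parity, f)
        return frags, parity

    def rebuild(frags, parity, missing):
        # XORing the parity with the surviving fragments recovers the lost one.
        recovered = parity
        for i, f in enumerate(frags):
            if i != missing:
                recovered = xor_bytes(recovered, f)
        return recovered

    frags, parity = split_with_parity(b"an archival record meant to outlive its media")
    assert rebuild(frags, parity, 2) == frags[2]      # fragment 2 "lost" and rebuilt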
"You'd pay a monthly fee to a company to provide a storage service, and in turn, that data would be kept secure for hundreds of years, protected via encryption, and it could be accessed from anywhere in the world quickly because it would be cached locally," Kubiatowicz says. "Basically, you'd be able to plug into the wall and get storage."
Kubiatowicz says companies could use OceanStore for routing data in-house to servers across their entire infrastructures for greater redundancy and resiliency.
Netbait is another layer running on the PlanetLab testbed. Like a doctor tracking a new virus in the body in order to discover how to fight it, Netbait will be able to track worms and viruses as they appear and watch how they propagate, developing profiles to help stop them in their tracks.
"It'll look at the way [a virus] is trying to penetrate a Web site. That would allow you to have an early warning of worm or virus behavior, allowing for faster diagnostic analysis and the ability to warn people about how to protect themselves from it," says Kevin Teixeira, a spokesman in Intel's research division.
While scientists are currently using PlanetLab to disseminate research information, one of the most promising aspects of the network for everyday users is its ability to provide multiple copies of data or video on servers throughout the world, closer to those requesting it.
"There are more servers and more clever algorithms that know how to send data to the closest computer and cache it there," Kaashoek says.
The new Internet will unfold over many years, he says.
"Just as the telephone [network] emerged, this overlay of intelligent networks will grow and populate, and there'll be certain versions of it that people will eventually standardize on," Kaashoek says. "In an evolutionary way, the Internet will upgrade itself over time."

[Sidebar graphic: PlanetLab's Layers]

