
Researchers explore rebuilding Internet to improve security

April 15, 2007

Leonard Kleinrock demonstrates how the first Internet communication was made with the help of an Interface Message Processor machine at his office at the UCLA Computer Science Department.

Although it has already taken nearly four decades to get this far in building the Internet, some university researchers with the federal government's blessing want to scrap all that and start over.

The idea may seem unthinkable, even absurd, but many believe a "clean slate" approach is the only way to truly address security, mobility and other challenges that have cropped up since UCLA professor Leonard Kleinrock helped supervise the first exchange of meaningless test data between two machines on Sept. 2, 1969.

The Internet "works well in many situations but was designed for completely different assumptions," said Dipankar Raychaudhuri, a Rutgers University professor overseeing three clean-slate projects. "It's sort of a miracle that it continues to work well today."

No longer constrained by slow connections, weak processors and high storage costs, researchers say the time has come to rethink the Internet's underlying architecture - a move that could mean replacing networking equipment and rewriting software on computers to better channel future traffic over the existing pipes.

Balancing various interests

Even Vinton Cerf, one of the Internet's founding fathers as co-developer of the TCP/IP communications protocols, said the exercise was "generally healthy" because the current technology "does not satisfy all needs."

One challenge in any reconstruction, though, will be balancing the interests of various constituencies. The first time around, researchers were able to toil away in their labs quietly. Industry is playing a bigger role this time, and law enforcement is bound to make its needs for wiretapping known.

There's no evidence they are meddling yet, but once any research looks promising, "a number of people (will) want to be in the drawing room," said Jonathan Zittrain, a law professor affiliated with Oxford and Harvard universities. "They'll be wearing coats and ties and spilling out of the venue."

The National Science Foundation wants to build an experimental research network known as the Global Environment for Network Innovations, or GENI, and is funding several projects at universities and elsewhere through Future Internet Network Design, or FIND.

Rutgers, Stanford, Princeton, Carnegie Mellon and the Massachusetts Institute of Technology are among the universities pursuing individual projects. Other government agencies, including the Defense Department, have also been exploring the concept.

Still in planning stages

The European Union also has backed research on such initiatives, through a program known as Future Internet Research and Experimentation, or FIRE. Government officials and researchers met last month in Zurich to discuss early findings and goals.

A new network could run parallel with the current Internet and eventually replace it, or perhaps aspects of the research could go into a major overhaul of the existing architecture.

These clean-slate efforts are still in their early stages, though, and aren't expected to bear fruit for another 10 or 15 years - assuming Congress comes through with funding.

Guru Parulkar, who will become executive director of Stanford's initiative after heading NSF's clean-slate programs, estimated that GENI alone could cost $350 million, while government, university and industry spending on the individual projects could collectively reach $300 million. Spending so far has been in the tens of millions of dollars.

And it could take billions of dollars to replace all the software and hardware deep in the legacy systems.

Clean-slate advocates say the cozy world of researchers in the 1970s and 1980s doesn't necessarily mesh with the realities and needs of the commercial Internet.

'Mission critical'

"The network is now mission critical for too many people, when in the (early days) it was just experimental," Zittrain said.

The Internet's early architects built the system on the principle of trust. Researchers largely knew one another, so they kept the shared network open and flexible - qualities that proved key to its rapid growth.

The Internet will continue to face new challenges as applications require guaranteed transmissions - not the "best effort" approach that works better for e-mail and other tasks with less time sensitivity.
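The distinction between "best effort" and guaranteed delivery is visible in today's transport protocols: UDP sends datagrams with no acknowledgement, while TCP retransmits and orders data into a reliable stream. A minimal loopback sketch using Python's standard socket API illustrates the contrast (the addresses and messages here are illustrative, not part of any clean-slate design):

```python
import socket
import threading

# Best effort (UDP): the sender gets no acknowledgement, and the network
# may drop, duplicate or reorder datagrams. It arrives here only because
# this is loopback; nothing in the API promises delivery.
udp_recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_recv.bind(("127.0.0.1", 0))
udp_addr = udp_recv.getsockname()

udp_send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
udp_send.sendto(b"best effort", udp_addr)   # fire and forget
data, _ = udp_recv.recvfrom(1024)
print(data)                                 # b'best effort'

# Guaranteed (TCP): the transport acknowledges, retransmits and orders
# segments, so the application sees a reliable byte stream - at the cost
# of extra latency, which is why time-sensitive applications care.
tcp_srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp_srv.bind(("127.0.0.1", 0))
tcp_srv.listen(1)
srv_addr = tcp_srv.getsockname()

def echo_once():
    conn, _ = tcp_srv.accept()
    conn.sendall(conn.recv(1024))           # echo the request back
    conn.close()

t = threading.Thread(target=echo_once)
t.start()

tcp_cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
tcp_cli.connect(srv_addr)
tcp_cli.sendall(b"guaranteed stream")
echoed = tcp_cli.recv(1024)
print(echoed)                               # b'guaranteed stream'

t.join()
for s in (udp_send, udp_recv, tcp_cli, tcp_srv):
    s.close()
```

E-mail tolerates the retries and delays of this model; the streaming and real-time applications the researchers have in mind do not, which is one motivation for rethinking the architecture.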

"We made decisions based on a very different technical landscape," said Bruce Davie, a fellow with network-equipment maker Cisco Systems Inc., which stands to gain from selling new products and incorporating research findings into its existing line.

"Now, we have the ability to do all sorts of things at very high speeds," he said. "Why don't we start thinking about how we take advantage of those things and not be constrained by the current legacy we have?"
