What is the next generation?
by Aliza Earnshaw, Business Journal Staff Writer
The internet, as we know it, is a network of networks, a chaos of commerce, ideas, random blurtings and sheer hucksterism.
Stay tuned: the next-generation internet is coming, and it promises to be faster, bigger, better.
The internet is already growing faster than most of us can imagine. Google, the premier search engine of the internet, adds 30 servers per day to keep up with all the new material appearing on the web. Demand for internet access to all this information is growing, too.
A Gartner Dataquest survey conducted in June showed that over seven months, the number of U.S. households actively using the internet grew by 15 percent, to 65 million.
Demand for broadband is ascending an even steeper curve. Almost 25 percent of U.S. households are connected to the internet via high-speed connection, whether cable modem, some variant of DSL or satellite. In the same seven-month period that saw active internet use increase by 15 percent, adoption of high-speed access increased 82 percent.
High-speed networking and the applications to run over it are driving current research and development in what is commonly referred to as Next-Generation Internet (NGI). The new, improved version of the internet aims to be faster, capable of delivering ever-larger data files in real time with a higher quality of service.
Several major areas occupy the time of government, industry and academic researchers in next-generation internet technologies. These are:
The physical network itself, a web of optical-fiber lines, routers, hubs and switches, and the technologies that support and enable it.
Applications that run over high-speed networks, including transfer of large data files, real-time streaming of video and audio, and collaborative work in real time.
“Middleware,” the layer of software that sits between the network and the applications that run over it.
Security and privacy, which are partly a middleware matter (authentication, encryption and digital watermarking technologies; a toy authentication sketch follows this list) but also have to do with the sociology of the internet, and how to establish guidelines for responsible use of such a powerful information transfer engine.
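To give a flavor of what authentication middleware does, here is a minimal sketch in Python. It is purely illustrative, not any actual Internet 2 middleware: it uses the standard hmac library to tag a message with a code that only someone holding a shared secret could have produced, so the receiver can check that the message is genuine and unaltered. The key and messages are hypothetical values.

    import hashlib
    import hmac

    # Shared secret known to both parties; real middleware would obtain
    # this from a key-management service, not a hard-coded string.
    SECRET_KEY = b"demo-shared-secret"  # hypothetical value

    def sign(message: bytes) -> str:
        # Attach an HMAC tag so the receiver can verify the sender.
        return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

    def verify(message: bytes, tag: str) -> bool:
        # Recompute the tag and compare in constant time.
        return hmac.compare_digest(sign(message), tag)

    msg = b"request: transfer research data set"
    tag = sign(msg)
    print(verify(msg, tag))          # True: message is authentic
    print(verify(b"tampered", tag))  # False: tag no longer matches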
Of the next-generation internet efforts under way, the best known is Internet 2, the consortium of nearly 200 U.S. universities that are researching advanced internet technology.
The federal government also has its Next-Generation Internet initiative, involving several federal agencies. Most of these are also working with Internet 2 committees and research groups.
This month will also see the launch in Europe of Géant (French for “giant” or “gigantic”), a network similar to Internet 2, but much larger.
Partially funded by the European Commission, it will cover more than 3,000 European academic and research institutions in 32 countries, and link up the national research networks of each participating country.
Géant will also link directly to Abilene, the 10,000-mile network on which Internet 2 runs.
Today’s public internet has several major problems that are being worked out in NGI and Internet 2 research. One of these is quality of service, a term that has long been used in the telecommunications industry.
Ironically, it is that industry’s success over its 100-year history in establishing a very high standard of service (we always expect to hear a dial tone when we pick up our phones) that drives the expectations for quality of service over the internet.
The internet is often described as a “best-effort” service. When a server sends data over the internet, it breaks the data into packets, and the network makes no promise that every packet will arrive; if some packets in a transmission do not make it, the receiving computer keeps requesting the missing pieces until it has them all.
While this works perfectly well for e-mail and for simple files, such as documents and ordinary digital pictures, it is not always successful for much larger files, nor for real-time or even cached video and audio transmissions.
Anyone who has tried to watch a movie clip on the internet knows that it’s a dodgy effort: sometimes it works acceptably, but sometimes the stream pops and crackles, and information is lost.
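To make the best-effort idea concrete, here is a minimal sketch in Python of the keep-requesting behavior described above. It illustrates the principle, not the internet’s actual transport machinery: the simulated “network” randomly drops packets, and the receiver keeps re-requesting until every piece has arrived. The loss rate and packet size are made-up values.

    import random

    def transmit(data: bytes, packet_size: int = 4) -> dict:
        # Split data into numbered packets, then "lose" some in transit:
        # each packet has a 30% chance of being dropped (made-up rate).
        packets = {seq: data[seq:seq + packet_size]
                   for seq in range(0, len(data), packet_size)}
        return {seq: pkt for seq, pkt in packets.items()
                if random.random() > 0.3}

    def receive(data: bytes, packet_size: int = 4) -> bytes:
        expected = range(0, len(data), packet_size)
        received = {}
        while len(received) < len(expected):
            # Re-request the transmission; keep whatever was missing.
            arrived = transmit(data, packet_size)
            for seq in expected:
                if seq not in received and seq in arrived:
                    received[seq] = arrived[seq]
        return b"".join(received[seq] for seq in sorted(received))

    print(receive(b"a movie clip that must arrive intact"))

For a document, a few extra rounds of re-requesting are invisible; for a live video or audio stream, every round is a pause or a crackle, which is why real-time traffic needs better than best effort.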
Quality of service is partly a technical issue, and partly a matter of assigning standards that all parties can follow.
Internet 2 is working to establish standards, something that the current internet, composed as it is of many different private and public networks, has not yet been able to do as an umbrella effort.
Creating a method and an organizational model for managing quality of service will be one welcome outcome of Internet 2 for the public internet.
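On the technical side, the internet protocol already carries a priority field in each packet header (the TOS byte, now used for DiffServ code points) that routers can use to treat traffic classes differently. A minimal sketch in Python, assuming a platform where the standard socket module exposes IP_TOS; the address and payload are placeholders:

    import socket

    # DiffServ "Expedited Forwarding" code point (46), shifted into the
    # high six bits of the TOS byte; routers that honor DiffServ give
    # packets marked this way priority over ordinary best-effort traffic.
    DSCP_EF = 46 << 2

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF)

    # Datagrams on this socket now carry the priority marking; whether
    # the networks along the path honor it is exactly the standards
    # question discussed above.
    sock.sendto(b"latency-sensitive audio frame", ("127.0.0.1", 9999))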
Research conducted on Internet 2 by academics, government agencies and industry has migrated, and will continue to migrate, to the much messier public internet, refining and improving it.
“Hopefully, we’ll never need an Internet 3,” said Mary Kratz, manager of Health Science Initiatives for Internet 2. “We’ll just keep evolving the current internet infrastructure over time.”