Weaknesses in the architecture of the Internet mean that browsing can sometimes be slow, with a tiresome wait for a video to load. Redeveloping the entire architecture of the Internet is an option that has recently been discussed even by the fathers of the Internet. A group of European engineers, however, decided to take the opposite approach: monitor traffic and tailor services to meet demand.
There is no single entity behind the Internet. It is made up of different networks managed by service providers. These service providers, or operators, manage what data is sent and monitor how much traffic is devoted to simple web browsing, multimedia streaming, or peer-to-peer file sharing. When data traffic on a network grows too dense, bottlenecks can occur, slowing the delivery of information to your computer. A European EUREKA-backed project entitled Traffic Measurements and Models in Multi-Service Networks (TRAMMS), incorporating teams from Sweden, Hungary, and Spain, aimed to solve this problem by gaining access to Internet networks run by operators in Sweden and Spain and monitoring their traffic over a period of three years. This gave the team an excellent insight into user behavior, enabling them to measure network traffic accurately so that, in the future, service providers will know how much capacity is needed and can avoid bottlenecks.
Taming the Internet beast
What sets this research project apart is that the team of experts taking part was given access to very sensitive Internet traffic measurements. Operators normally guard this information jealously, as it constitutes their core business. “Internet traffic measurements are very difficult to find if you are not an operator,” said Andreas Aurelius, coordinator of the project and senior scientist at Acreo AB, one of the project partners. Previous research in this field has typically been confined to campus networks and to a single geographical area. “That is one of the unique things about this project,” he says. “We were using data in access networks, not campus networks as most researchers do.”
The types of information the project monitored were chosen to give an overall view of the traffic passing through the networks. This included IP traffic (the flow of data on the Internet), routing decisions (the selection of the path along which network traffic is sent), quality of service (giving priority to certain applications, such as multimedia), and available bandwidth. The work was innovative in that the partners developed new measurement tools that give a complete picture of a network. These tools, already targeted for use by many operators, will make web browsing considerably faster.
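To give a feel for how traffic can be broken down into categories such as web browsing, streaming, and peer-to-peer file sharing, here is a minimal sketch of port-based flow classification. The flow-record format, the port-to-class mapping, and the function names are illustrative assumptions; real measurement tools such as those developed in TRAMMS use far richer methods.

```python
# Illustrative mapping of well-known ports to traffic classes
# (an assumption for this sketch, not an exhaustive registry).
PORT_CLASSES = {
    80: "web",             # HTTP
    443: "web",            # HTTPS
    1935: "streaming",     # RTMP
    6881: "peer-to-peer",  # common BitTorrent port
}

def classify_flow(src_port: int, dst_port: int) -> str:
    """Classify a flow by its well-known port, else 'unknown'."""
    for port in (src_port, dst_port):
        if port in PORT_CLASSES:
            return PORT_CLASSES[port]
    return "unknown"

def traffic_breakdown(flows):
    """Aggregate byte counts per traffic class across flow records
    given as (src_port, dst_port, byte_count) tuples."""
    totals = {}
    for src_port, dst_port, byte_count in flows:
        cls = classify_flow(src_port, dst_port)
        totals[cls] = totals.get(cls, 0) + byte_count
    return totals

flows = [(51234, 443, 120_000), (6881, 40001, 900_000), (51235, 80, 30_000)]
print(traffic_breakdown(flows))  # {'web': 150000, 'peer-to-peer': 900000}
```

Port-based classification is the simplest approach; operators typically combine it with payload inspection and behavioral heuristics, since many applications deliberately use non-standard ports.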
“For everyday users, this means better quality for multimedia services over the Internet, like streaming, for example,” said Aurelius.
Setting new standards to measure Internet traffic
Another question to be answered was how the team was able to acquire all of this information without flouting any privacy laws. The answer is that through agreements with the operators, the partners had access to certain information, but not all of it. “The information was post-processed, so it only contained data. It wasn't linked to any customers or IP addresses. We could see what type of application was being used, for example if it was peer-to-peer, but we couldn't see what file was downloaded,” said Aurelius.
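The kind of post-processing described here, stripping out anything that could identify a customer while keeping the application type and volume, can be sketched as follows. The record format, field names, and keyed-hash approach are assumptions for illustration, not the project's actual pipeline.

```python
# Sketch: pseudonymize identifying fields in a flow record before
# analysis, keeping only application type and traffic volume.
import hmac
import hashlib

SECRET_KEY = b"rotate-me-per-capture"  # hypothetical per-dataset key

def pseudonymize_ip(ip: str) -> str:
    """Replace an IP address with a keyed hash, so flows from the same
    host can still be correlated but the address cannot be recovered."""
    digest = hmac.new(SECRET_KEY, ip.encode(), hashlib.sha256)
    return digest.hexdigest()[:12]

def sanitize_record(record: dict) -> dict:
    """Keep only non-identifying fields; pseudonymize the source IP.
    URLs, filenames, and payloads are dropped entirely."""
    return {
        "host": pseudonymize_ip(record["src_ip"]),
        "application": record["application"],  # e.g. "peer-to-peer"
        "bytes": record["bytes"],
    }

raw = {"src_ip": "192.0.2.17", "application": "peer-to-peer",
       "bytes": 48_213, "url": "http://example.com/private-file"}
clean = sanitize_record(raw)
print("src_ip" in clean, "url" in clean)  # False False
```

A keyed hash (rather than a plain hash) matters here: without the secret key, an attacker cannot simply hash every possible IPv4 address and reverse the mapping.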
The team managed to collect an astonishing 3,000 terabytes of data over the three years of the project, which allowed them to study trends and changes over an extended period of time amid a continuous influx of information.
The project was also notable for the fact that a number of the processes that were carried out are under consideration by the International Telecommunication Union to become standardized forms of measurement. An example of these standards is Bandwidth Available in Real Time (BART), which monitors available bandwidth between a sender and a receiver in a network such as the Internet.
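The idea behind available-bandwidth estimators such as BART can be illustrated with a simplified self-induced-congestion sketch: probe the path at increasing rates and look for the point where queuing delay starts to grow. The real BART method uses packet trains and a Kalman filter; the simulated link, the strain model, and the threshold rule below are assumptions chosen for clarity.

```python
# Simplified illustration of self-induced-congestion probing
# (the principle behind estimators like BART, not BART itself).

LINK_CAPACITY = 100.0   # Mbit/s, simulated bottleneck link
CROSS_TRAFFIC = 35.0    # Mbit/s of competing traffic on the link

def probe_strain(probe_rate: float) -> float:
    """Simulated 'strain': relative growth of inter-packet gaps.
    Zero while the link still has headroom, positive once the
    probe rate pushes total load past capacity."""
    load = probe_rate + CROSS_TRAFFIC
    return max(0.0, (load - LINK_CAPACITY) / LINK_CAPACITY)

def estimate_available_bandwidth(step: float = 1.0) -> float:
    """Raise the probe rate until strain appears; the last rate with
    zero strain approximates the available bandwidth."""
    rate = step
    while probe_strain(rate) == 0.0 and rate < LINK_CAPACITY:
        rate += step
    return rate - step

print(estimate_available_bandwidth())  # 65.0 for this simulated link
```

In this toy model the estimate simply recovers capacity minus cross traffic (100 - 35 = 65 Mbit/s); on a real path, measurements are noisy, which is why BART filters repeated packet-train observations rather than trusting a single probe.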
Aurelius is now working on a follow-up project entitled IPNQSIS, which deals with quality of experience in network services such as voice over IP (VoIP), video on demand (VoD), and IPTV. These are sectors where network service providers expect huge revenue opportunities and need to improve the quality of their services as perceived by users, making the topic an ideal follow-up for the team behind TRAMMS.
The TRAMMS project concluded at the end of 2009, and the first results are impressive. No fewer than five companies have taken up its traffic-measurement methodology: Ericsson, Procera, Telnet-RI, NAUDIT, and GCM Communications Technology.