
Internet Evolution 2011

Reflections on the ways in which the Internet is changing on a global scale http://InternetSociety.org/evolution


© 2011 Internet Society



Introduction
The Internet is evolving. The majority of end-users perceive this evolution in the form of changes and updates to the software and networked applications that they are familiar with, or with the arrival of entirely new applications that change the way they communicate, do business, entertain themselves, and so on. But evolution is a constant feature throughout the network stack. Fundamental discoveries in optical networking that allow ever more bandwidth to be obtained from deployed fibre-optic cables, new standards for wired and wireless link technologies (such as 100 gigabit Ethernet and LTE), new congestion control algorithms, improved security infrastructures: these are all examples of the kind of evolution that most users don't see.

As the Internet becomes an ever more pervasive and critical infrastructure underpinning society and commerce around the globe, so understanding the ways in which the Internet is changing grows in importance for technologists and policymakers alike. Reflecting on what those changes mean for the likely future trajectory of the Internet is critical. Can we think about the Internet as an evolving whole? What form would such evolution take, and where could we look for data that offers insight into changes at the macro scale?

In an effort to tease out some answers to these questions, and to provoke thought and further consideration among a wider audience, the Internet Society invited a range of experts to reflect on some recent data-driven observations about the ways in which the Internet is changing on a global scale. We present the data here, along with some of the key findings that we sought reflections on, and then we let the contributors speak in their own voices about how they view the Internet evolving. We have not sought to drive consensus or to weight the responses we received in any way. Rather, we have tried to present a range of views about some of the most clearly discernible trends in how the Internet is changing, backed up by hard data, in the hope that this will cause the reader to think more about what this means for them and for the Internet as a whole.


Figure 1. The network stack (application layer, transport layer, network layer, data link layer, physical layer).

The Data
Measuring the Internet is hard. Really hard. The scale and diversity of the system make obtaining objective, comprehensive measurements of any aspect of the interconnected whole virtually impossible. Instead we must rely on sampling and educated guesswork, with all of the caveats that entails. Nevertheless, in the firm belief that some data are better than no data, we highlight a couple of pieces of interesting recent work that provide some hard data to underpin discussion and speculation about how the Internet is changing.
2007: thousands of networks contributed 50 percent of all observed Internet traffic. 2009: 150 networks contributed 50 percent of all observed Internet traffic.

2007: over 10 ports contributed 50 percent of all observed Internet traffic. 2009: 1 port contributed 50 percent of all observed Internet traffic.

The Atlas Internet Observatory Report, 2009 [1]

This study presents some findings from one of the world's largest Internet monitoring infrastructures. It is based on aggregated statistics from more than 100 Internet service providers (ISPs) and content providers. The authors estimate that approximately 25 percent of all inter-domain Internet traffic contributed to their results. The major findings are:

- Consolidation of content contributors: 50 percent of Internet traffic is contributed by 150 networks.
- Consolidation of applications: the Web browser is increasingly used as the application front end, and applications increasingly use Web protocols to communicate.
- Evolution of the Internet core: content is increasingly directly interconnected with consumer, or "eyeball", networks, and there is a great deal of economic innovation taking place in the commercial agreements between operators.

This last finding is described in more detail in The Complexity of Internet Interconnections [2], a study on which we invited comments from our contributors.
[1] Labovitz, C., Iekel-Johnson, S., McPherson, D., Oberheide, J. & Jahanian, F. (2010), Internet Inter-Domain Traffic, Proceedings of ACM SIGCOMM 2010, India. Paper: http://ccr.sigcomm.org/online/files/p75_0.pdf. Slides: http://www.nanog.org/meetings/nanog47/presentations/Monday/Labovitz_ObserveReport_N47_Mon.pdf
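The consolidation figures above are, at heart, statements about how concentrated the distribution of traffic across networks has become. As a rough illustration of the arithmetic involved (not the Atlas methodology itself), the following Python sketch counts how many of the largest contributors are needed to reach half of the total observed traffic; the per-network volumes are invented for the example.

```python
def networks_for_share(traffic_by_network, share=0.5):
    """Return how many of the largest contributors account for `share`
    of the total observed traffic volume."""
    volumes = sorted(traffic_by_network.values(), reverse=True)
    target = share * sum(volumes)
    running, count = 0.0, 0
    for v in volumes:
        running += v
        count += 1
        if running >= target:
            return count
    return count

# Hypothetical data: a few large content networks and a long tail (arbitrary units).
sample = {"content-A": 400, "content-B": 250, "cdn-C": 150,
          "isp-D": 60, "isp-E": 40, "tail-1": 30, "tail-2": 20,
          "tail-3": 20, "tail-4": 20, "tail-5": 10}

print(networks_for_share(sample))  # -> 2 in this toy data set
```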


Complexity of Internet Interconnections: Technology, Incentives and Implications for Policy [2]

This paper discusses the variety of networks that make up the Internet and the diverse ways in which they choose to interconnect with each other. The trend towards large content providers and overlay distribution networks playing an increasingly important role is clearly identified. It is an interesting question whether the increasing use of overlay networks means that visibility into network structure and content necessarily decreases. If it does, then that would have implications for the methodology adopted by the Atlas study. As stated at the outset, Internet measurement is hard. The traditional model of network interconnection in the Internet was a relatively simple hierarchical model (see Figure 2).

[2] Faratin, P., Clark, D., Gilmore, P., Bauer, S., Berger, A. & Lehr, W. (2007), Complexity of Internet Interconnections: Technology, Incentives and Implications for Policy, Proceedings of the 35th Annual Telecommunications Policy Research Conference, George Mason University, USA. Paper: http://people.csail.mit.edu/wlehr/Lehr-Papers_files/Clark%20Lehr%20Faratin%20Complexity%20Interconnection%20TPRC%202007.pdf

Figure 2. Traditional Internet hierarchical model (backbone networks, regional/wholesale networks, access networks, customer networks).


As the marketplace for Internet service has evolved, the interconnection landscape has matured and diversified (see Figure 3). These figures are based on similar figures from the Internet Inter-Domain Traffic paper cited earlier. The Complexity of Internet Interconnections paper provides insight into recent operational developments, explaining why interconnection in the Internet has become more complex, the nature of interconnection bargaining processes, the implications for cost/revenue allocation and hence interconnection incentives, and what this means for public policy.
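One way to picture the flattening described here and shown in Figures 2 and 3 is to compare delivery paths before and after a content network interconnects directly with an access network. The Python sketch below uses a toy AS-level adjacency map and a breadth-first search; the network names and links are illustrative assumptions, not measured topology.

```python
from collections import deque

def shortest_path(graph, src, dst):
    """Breadth-first search returning one shortest path from src to dst."""
    seen, queue = {src}, deque([[src]])
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# Hypothetical hierarchical topology (Figure 2 style).
hierarchy = {
    "content": ["backbone"],
    "backbone": ["content", "regional"],
    "regional": ["backbone", "access"],
    "access": ["regional", "customer"],
    "customer": ["access"],
}
print(shortest_path(hierarchy, "content", "customer"))
# ['content', 'backbone', 'regional', 'access', 'customer']

# Add a direct content-to-access interconnection (Figure 3 style).
flat = {k: list(v) for k, v in hierarchy.items()}
flat["content"].append("access")
flat["access"].append("content")
print(shortest_path(flat, "content", "customer"))
# ['content', 'access', 'customer']
```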

Figure 3. Flatter, more connected model (an Internet core of backbone networks and large content networks, Internet exchange points (IXPs), regional and access networks, and customer networks).


The Contributors
Kenjiro Cho is deputy research director at Internet Initiative Japan, Inc. He also serves as adjunct professor at Keio University and the Japan Advanced Institute of Science and Technology, and as a board member of the WIDE project. His current research interests include traffic measurement and management as well as operating-system support for networking.

Alissa Cooper is the chief computer scientist at the Center for Democracy and Technology. Her work focuses on a range of issues including consumer privacy, network neutrality, and technical standards. She conducts research into the inner workings of common and emerging Internet technologies and she seeks to explain complex technical concepts in understandable terms. She has testified before the U.S. Congress and the U.S. Federal Trade Commission and she writes regularly on a variety of technology policy topics. She currently co-chairs the Geographic Location/Privacy working group (Geopriv) within the Internet Engineering Task Force (IETF) and is a member of the Internet Architecture Board (IAB).

Jon Crowcroft is the Marconi Professor of Networked Systems in the Computer Laboratory of the University of Cambridge. Prior to that he was professor of networked systems at UCL in the Computer Science Department. He has supervised more than 45 doctoral students and more than 150 master's students. Jon is a fellow of the ACM, the British Computer Society, the Institution of Electrical Engineers, the Royal Academy of Engineering, and the IEEE. He was a member of the Internet Architecture Board from 1996 to 2002 and he attended the first 50 IETF meetings. He served as the general chair of ACM SIGCOMM from 1995 to 1999 and he is the 2009 recipient of the SIGCOMM Award.

Geoff Huston is the chief scientist at the Asia Pacific Network Information Centre (APNIC), where he undertakes research on topics associated with Internet infrastructure, IP technologies, and address distribution policies. Widely regarded as the preeminent researcher on IPv4 exhaustion, he is routinely referenced by international agencies and is frequently quoted by the ICT media. Geoff has also presented at a number of global technical and government forums, including the Asia-Pacific Economic Cooperation (APEC), the Internet Corporation for Assigned Names and Numbers (ICANN), the Internet Engineering Task Force (IETF), the International Telecommunication Union (ITU), and the Organisation for Economic Co-operation and Development (OECD).


Prior to APNIC, Geoff served as the chief Internet scientist at Telstra and as the technical manager of the Australian Academic and Research Network (AARNET). He was a leading figure in the development of Australia's academic and commercial Internet services.

Bill St. Arnaud is a Green IT consultant who works with clients on a variety of subjects, such as the next-generation Internet and practical solutions to reduce greenhouse gas emissions, for example through free broadband and electrical highways. Previously, Bill worked as director of network projects for CANARIE Inc., an industry-government consortium that promotes and develops information highway technologies in Canada. At CANARIE, Bill was responsible for the coordination and implementation of Canada's next-generation optical Internet initiative, called CA*net 3. Before that he was president and founder of a network and software engineering firm called TSA ProForma Inc. TSA was a LAN/WAN software company that developed wide area network client/server systems for use primarily in the financial and information business fields in the Far East and the United States. Bill is a frequent guest speaker at numerous conferences on the topic of the Internet and optical networking and he is a regular contributor to several networking magazines. He is a graduate of the School of Engineering at Carleton University, Ottawa.

Joe Touch is the Postel Center director at the University of Southern California's (USC's) Information Sciences Institute (ISI) and a Research Associate Professor in USC's computer science and EE/systems departments. He joined ISI in 1992 and his current projects include satellite networking, virtual networks, optical Internets, and high-performance zero-configuration network security. His interests include Internet protocols, network architecture, high-speed and low-latency networks, network device design, and experimental network analysis. He has four U.S. patents and more than 80 conference and journal publications. Joe is a member of Sigma Xi, a Distinguished Scientist of the ACM, and a senior member of the IEEE. He currently serves as chair of the IEEE TCCC and he is on the editorial board of IEEE Network and Elsevier's Journal of Computer and Systems Sciences. He is ACM SIGCOMM's conference coordinator emeritus, an active participant in the IETF, and a member of numerous conference steering and program committees.


Jonathan Zittrain is professor of law at Harvard Law School and the Harvard Kennedy School of Government, cofounder of the Berkman Center for Internet and Society, and professor of computer science in the Harvard School of Engineering and Applied Sciences. He is a member of the board of trustees of the Internet Society and he is on the board of advisors for Scientific American. Previously he was a professor of Internet governance and regulation at Oxford University. His research interests include battles for control of digital property and content, cryptography, electronic privacy, the roles of intermediaries within Internet architecture, and the useful and unobtrusive deployment of technology in education. He performed the first large-scale tests of Internet filtering in China and Saudi Arabia in 2002, and now, as part of the OpenNet Initiative, he has coedited a study of Internet filtering by national governments titled Access Denied: The Practice and Policy of Global Internet Filtering, and its sequel, Access Controlled: The Shaping of Power, Rights, and Rule in Cyberspace. His book, The Future of the Internet - And How to Stop It, is available from Yale University Press and Penguin UK, and under a Creative Commons license. Papers may be found at http://www.jz.org.


The Questions

In our questioning we were interested in understanding what our contributors thought the changing shape, structure, and composition of traffic on the Internet could tell us about its likely future. Is the current trajectory healthy for the future of the Internet? What is a healthy Internet anyway? Is the Internet as an open platform for innovation threatened by the trends being observed? What, if anything, can this data tell us about how the Internet is evolving in 2011?

Large Internet content providers are concerned about the availability and performance of the services they provide to their customers, which is one of the reasons for the trend underlying the first question. This so-called flattening of the Internet topology is a classic example of the disintermediation effect that the Internet has had on communications and commerce. Could this also signal the creation of a relatively small number of large-scale, highly resilient, high-performance content channels? Is this a step on the road to "Internet as television" as the infrastructural changes required to support massive online content providers become baked in?

One explanation for the changes we see happening is that innovation is naturally more difficult as complex systems mature. It's hard to think of a man-made system more complex than the Internet. Innovation has to move to new layers as the fundamental pieces start to get locked in place.

The Internet itself was an innovation, and it has served as an open platform for unprecedented innovations in networking, applications, and services for years. The data provide evidence that the Internet is becoming flatter (increasingly direct interconnection of content and consumer). Is this necessarily part of a trend towards a less-innovative platform?

Geoff Huston: It is true to note that the larger the Internet gets, the more the common elements of the Internet become cemented in place and fundamental innovation becomes stymied to the point of intractability. So, over time, the scope of innovation has narrowed. These days we talk about innovation in quite specific contexts. Innovation occurs at the level of the application space rather than in the underlying protocol space. As the Internet grows larger and as individual enterprises exercise increasing levels of over-arching control over ever-larger sectors within the Internet space, there are considerable pressures on the scope of innovation within the Internet. The result to date is that the number of available parameters that are accessible to competitive innovation and to new entrants into various market sectors in the Internet has declined.


So, in this explanation, there is no shortage of innovation; one just needs to know where to look for it. And the application layer is one area that continues to see enormous innovation.

Jon Crowcroft: How many apps are there in the iPhone and Android and Blackberry stores? We're looking in the wrong place. Network-centric measures completely ignore the simple observation that the operational network defends against the unknown by blocking packets that look funny. Hence innovation hides above the network layer. We're looking using the wrong eyes.

A more prosaic explanation may be that changes in the topological structure of the Internet are part of a technological survival of the fittest. In that light, the evolution we see reflects the flexibility of the underlying architecture, and the trend is, therefore, a positive one.

Kenjiro Cho: The innovation of the Internet is much broader. One of the key insights in the original Internet design was that diversity is essential to technical evolution. New technologies appear and survive as a result of natural selection amid technical and social environmental changes. The reported trend in the ISP hierarchy simply reflects the fact that the traditional connectivity business has become less attractive in the market.

Rather than viewing this flattening of the Internet topology as a threat to the potential for innovation, perhaps it is in itself proof that the architecture supports innovation.

Alissa Cooper: If innovation is viewed in a broad sense, encompassing developments at all layers of the Internet and in the business arrangements that support the Internet, the reports cited earlier appear to be pointing towards more innovation, not less. Both reports document how new and diverse entrants of different types have broken into what was previously a more consolidated inter-domain traffic market. This trend speaks directly to innovations related to both network capacity and business arrangements. But because CDNs [content delivery networks] and content networks are only as valuable as the content they carry, their growth is also representative of the immense application-layer innovation that has characterized the Internet in recent years, including everything from advances in video compression technology to the evolution of the Web browser to the explosion of mobile applications.


The flatness of the network topology may actually represent how vibrant the network has become in supporting innovative applications, content, and services.

Joe Touch: Changes in the Internet backbone structure do not necessarily have any correlation to innovation. In some sense, the flattening itself indicates that the Internet architecture supports innovation, one where data centres are no longer at the edge. In another sense, the data path has no relation to the functionality that can be supported. Today, presuming the reports cited earlier are correct, data are going more to these cores. Tomorrow, maybe they will go in the user-edge direction (finding YouTube videos on my home machine, rather than needing to upload them).

Bill St. Arnaud: No, I don't think it will mean less innovation. In fact this is another meta-innovation built on top of the original Internet that will probably speed up new services and applications. Just as most innovation in the Internet today is on top of the HTTP stack, as opposed to the original TCP/IP, I suspect the flatter Internet will enable a new round of innovation.

While this is, on balance, a positive outlook, some important concerns to keep in mind when considering this trend are the openness and interoperability of the actual content delivery networks. As users come to expect high availability and highly responsive services, new content providers will increasingly be driven to make popular content available through those new channels.

Jonathan Zittrain: There is certainly cause for concern if content-delivery networks routinely bypass the standard Internet cloud. It suggests that those with lots of bytes to move will have to contend with a new range of gatekeepers other than their own choice of ISP. That leads to the requirement of making more deals in order to set up a new site or manage its growth, and more points of control should governments seek to filter out undesirable material.

As the data indicate, streaming video is now the largest category of Internet traffic. Peer-to-peer (P2P) traffic is still increasing, but at a slower rate. The next question is concerned with whether a potentially superior technology for background bulk data transfer (P2P) is being overtaken by less-efficient tools for content delivery as a result of nontechnical concerns. Is alignment with existing business models trumping technical efficiency in the marketplace for high-bandwidth content delivery? In considering this question, the superiority of P2P technology for certain types of applications and file transfers is clear to some.


Is the relative decline in P2P traffic volume indicative of the triumph of business models over technology?

Geoff: The use of P2P models by the open-source community as their preferred method of distribution, as compared to the archaic and woefully inefficient models used by mainstream vendors for software upgrades, is a clear illustration of the effectiveness of P2P in performing the heavy lifting when widespread non-real-time content distribution is the goal.

The relative decline in usage is indicative of the effectiveness of the threat of overwhelming penalties applied on an almost random basis. The copyright-holding industry is staring down the barrel of irrelevance, and rather than adjust their business model to new technologies and utilize distribution channels that are overwhelmingly more efficient and cheaper than their existing hard-media-based distribution systems, they are consistently adopting a position of fortressing up anachronistic business models of bygone years and using lobby pressure against the political system to enact draconian penalties for anyone who would dare to transgress their copyright assets.

Bill: Yes. P2P decline is largely attributable to the Gestapo-like tactics of the record and movie industry to thwart piracy.

The film and music industries have certainly lobbied hard to restrict and reduce incidences of illegal file sharing over the Internet. However, the demonstrated shift in traffic patterns may also indicate that P2P tools are reaching a natural peak in their user base. As more convenient tools become available, users will tend to adopt them, regardless of the underlying mechanisms.

Kenjiro: I do not see it as business models versus technology. It shows the success of video content over the Internet; initially, P2P was used among technically savvy users, but usage is shifting to Web-based services that reach a much broader audience.

The data are also undoubtedly a reflection of the impact of ISP traffic-management practices, which seek to restrict the use of P2P applications during certain periods or for certain classes of subscriber. The extent to which these practices are the root cause of the observed data is an interesting question.

Alissa: It is likely that a combination of factors contributed to the findings: the growth of streaming video and direct download, network operator traffic-management practices that target P2P for throttling, P2P protocol innovation, and encryption of P2P application traffic.


These last two factors, and the wealth of continued innovation in P2P applications and protocols, would seem to indicate that technology continues to play an important role in the development of P2P. Recent developments in the IETF likewise signal continued interest in P2P innovation, most notably the formation of the ALTO, LEDBAT, and DECADE working groups in the time since the study. Because P2P technologies have proven to be immensely effective in transferring large volumes of data and localizing data transport, they are likely to remain a fruitful area of software development for many years to come.

The decline in the P2P share of traffic volume does hold an important lesson for the future of traffic management. Because the mix of applications on the network will continue to change, traffic-management policies that target particular applications will continue to require upgrades while potentially becoming less effective as the targeted applications decline in popularity. This constant game of cat-and-mouse with application developers, and its diminishing returns, may compel network operators to pursue traffic-management solutions based on generic traffic characteristics rather than application or protocol signatures.

Of course, as we discussed earlier, there is plenty of application-layer innovation going on, and that includes P2P applications. Given the pace of change on the Internet, maybe this shift in application preferences is a short-lived phenomenon and not enough to tell us anything about the overall direction of travel for Internet evolution.

Joe: A single report isn't a trend, and a single trend isn't a triumph. If they were, we would be talking about the triumph of Archie over FTP. Triumphs in the Internet are short-lived. There are many possible reasons for the data reported, not the least of which is the use of negotiated ports to avoid firewall filtering (e.g., to circumvent copyright enforcement). Another viable reason is that some of what P2P used to do is now being serviced by YouTube, iTunes, and Netflix, to name a few.

Jon: The range of relationships between service providers and customers is richer than ever; P2P is just one (network-centric) model.

Jonathan: I think it's hard to draw firm conclusions from raw traffic volumes. P2P changes could indicate a new generation of protocols that favour transmission over subnets, which would result in less traffic at network interconnection points. And a minute of video requires much more traffic than a minute of reading, so it's hard to say the Web is taking a back seat to a non-Web P2P protocol on that basis alone.


The right metric is mindshare, and meaningful opportunity for someone with a message to convey it using various modalities.

As we observed earlier, the Web browser is increasingly being used as the universal application front end, and the use of Web protocols, in particular HTTP over port 80, dominates use of the Internet. The next question focuses on whether such protocol dominance is indicative of a problem in the way the Internet is evolving. In the opinion of many, as the lower layers of the network stack become more and more established, and more and more restrictive, innovation moves upwards.

What is your perception of the import of the increasing dominance of a handful of application protocols: simplification? Ossification? Something else?

Bill: Simplification and innovation. The Internet used to be thought of as an hourglass where TCP/IP was the common layer between services and network infrastructure. Now HTTP is the second hourglass that has become the new common layer for most applications. The development of HTTP as the common layer was a result of the business decision to implement firewalls that block most traditional protocols based on TCP/IP. I think attempts to block P2P or other services will result in the same phenomenon, where more abstracted services will be deployed to route around these obstructions.

Jon: It's a result of network-layer restrictions and security measures. It hasn't resulted in true ossification, just in the inability of network-layer measures to detect innovation. I do a lot of work on mobile devices these days and I see a huge range of protocols on top of HTTP/XML-RPC.
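Jon's observation about protocols layered on top of HTTP and XML-RPC can be illustrated with Python's standard library, which ships an XML-RPC client and server that carry remote method calls inside ordinary HTTP requests. The service and method below are hypothetical, invented purely for illustration.

```python
import threading
from xmlrpc.server import SimpleXMLRPCServer
import xmlrpc.client

# A trivial application-level "protocol": one remote method, carried
# entirely inside HTTP POST requests, so it traverses middleboxes that
# only admit Web traffic.
server = SimpleXMLRPCServer(("127.0.0.1", 8000), logRequests=False)
server.register_function(lambda name: f"hello, {name}", "greet")
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client sees a method call; the wire sees HTTP.
proxy = xmlrpc.client.ServerProxy("http://127.0.0.1:8000/")
print(proxy.greet("Internet"))  # -> "hello, Internet"
server.shutdown()
```

From the network's point of view this exchange is just Web traffic on a single TCP port, which is exactly why network-layer measures see less protocol diversity than actually exists.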

We have already heard that the application layer is now the focus for innovation on the Internet. Clearly, widespread firewalling policies tend to drive application developers to use the protocols most likely to obtain end-to-end connectivity. So we see more applications, but fewer application protocols visible from the transport layer, as a consequence of network-layer security policies.

Kenjiro: It shows the success of Web-based service platforms, although one of the reasons behind it is widespread firewalls.

Alissa: The prominence of HTTP (due in part to restrictive default settings in firewalls, NAT devices, and other middleboxes) belies the explosion of protocols and technologies that continue to proliferate using HTTP or running over HTTP.


Applications from voice calling to geolocation to P2P file distribution, and many others, are making use of HTTP as a base protocol, while AJAX and related technologies continue to revolutionize the Web itself. Deployment of new protocols has always been more difficult at lower layers of the network stack than at upper layers; rather than becoming simplified or ossified, application development seems to continue to move up the stack.

Jonathan: There's a bit of a Turing machine available here: nearly any application protocol can be shoehorned into another, putting aside differences between lower-level protocols like UDP and TCP. So we would want to know more about whether the change in traffic is a response by app makers to firewall and other issues, masquerading as HTTP, or a genuine consolidation of everything into what appear to be Web pages.

The question is partly whether this trend is indicative of evolution towards some future steady state, or whether what we are observing now tells us very little about the likely future for the Internet. Change is a constant on the Internet, and it may be that what appears to be a trend today is merely a point in a continuum about to be swept away by the next big thing.

Geoff: Many years ago the Internet was (a) email, (b) FTP, (c) the DNS, and (d) nothing more. These days it is (a) Web, (b) more Web, (c) P2P, and (d) not much else. It's hard to see that the dominance of a handful of protocols is increasing. I would make the case that a small number of application protocols has always been a prominent feature of the Internet. On the other hand, unlike the telephone network, the number of application protocols has been greater than one throughout its history.

Joe: It's cyclical. One could say the same of how .ps files and FTP were dominant in the late 1980s. Things change and evolve. We're constantly seeing new protocols that overtake old ones. This is all just evolution supported by the current architecture.

The question that follows also posits the idea that reduced protocol diversity may lead to a desirable simplification of the network stack. In this sense, the trend may continue, and this may be positive.


Can the trends we have picked out from the data tell us anything about the likely future of the Internet? Are the data highlighting aspects of the Internet that are evolving in a constant, discernible direction towards some identifiable future networking paradigm? For some, the potential for further innovation at the network layer of the protocol stack is now over.

What do these observable trends in Internet evolution mean for the future of the Internet?

Jon: It's bad news for IPv6 and TCP changes and other transport protocol deployments, but it's not necessarily bad news for the big picture. You would need a different set of studies to determine that. Some study of mobile networks might be useful too.

Internet growth is a certainty, but expecting innovation to continue to be possible across the spectrum of networking technologies is, perhaps, unrealistic. Wireless and energy-efficient networking may be examples of important areas of future innovation.

Bill: I think the Internet will continue to expand and grow in resiliency. The wireless and green Internet will probably be the next major area of innovation.

As we have already observed, the most discernible Internet constant seems to be change itself. Accepting that, perhaps nothing we have observed and discussed earlier is a trend that can't or won't be swept away in time.

Joe: I don't think they mean anything other than that things are changing as they always have, converging as they always have, and diverging as they always have.

Kenjiro: We believe in diversity for evolution in the face of environmental changes. As the Internet matures, we see more diversity at higher levels than the network level. However, we also need to sustain diversity at the network level in order to keep open possibilities for innovation at that level. Technologies come and go; we should not overreact to individual trends. Most trends are not so harmful to diversity. After all, the Internet ecosystem is fairly robust against turbulence. However, some may lead to irreversible impacts on diversity. For example, widespread firewalls and NATs have considerably reduced diversity in port usage and in communication initiation styles. In general, any form of filtering is, once introduced, hard to remove. We have to consider its consequences seriously beforehand.


Despite all of this change, however, the Internet will remain a technological construct within a wider societal context. Therefore, as the Internet evolves and matures, getting the regulatory balance right is of growing and fundamental importance. Openness and neutrality are key to safeguarding the future of the Internet as a technology underpinning human development.

Geoff: Any large, engineered system ossifies over time. Incumbents attempt to entrench their position into one of dominance and attempt to erect barriers to new entrants and competition. Innovation is a critical lever in terms of balancing incumbents with competitive new entrants. Innovation is the critical factor that allows competition to compete on equal terms with high-volume incumbents. The critical attribute of the Internet that allows the continued entrance of innovation is the network's openness and neutrality. As long as consumer devices are sufficiently open that consumers can access novel applications, and the networks are sufficiently neutral that new applications run as well as any other application, then we can be confident that the future of the Internet is assured.

As a source of considerable concern, these assertions are not true of today's Internet. We are seeing the rise of the locked user device, the threats being placed on the neutrality of the carrier with respect to content, and the pressure being placed on the carrier to drop its neutral role and undertake the role of content inspector and copyright enforcer. These pressures will ossify the Internet and make it incredibly resistant to further innovation and change. This will not play out well. In the same way that the telephone operators suppressed innovation and change in their fiefdoms in the 1970s and 1980s while the rest of the industry was making gigantic strides in the use of computers to create innovative service offerings, the deliberate inability of the phone companies to undertake any realistic form of change to their comfortable monopoly meant that the pressure for innovative change simply increased, and when it finally crashed through in the late 1990s the resultant change was revolutionary rather than evolutionary. I suspect that it's taken us most of the last decade to sort out the resultant mess!

I would like to think that the Internet offers the ability to operate a more sustainable business model where the obvious advantages offered through sheer scale of operation and stability of technology are offset by the constant pressure of innovation and change. Ultimately in such an environment we see the end user being the beneficiary, due to both the competitive pressures driving efficient service delivery and innovation driving new service-delivery models.


However, to make such a model self-sustaining calls for a delicate touch on the regulatory levers. Too much in the direction of regulatory laissez-faire results in the formation of cartels and de facto monopolies in the industry, while an over-enthusiastic regulatory regime repels entrepreneurship and innovation and, ultimately, competition. I suspect that we have yet to learn the right settings in this space, and at the moment we are teetering towards the creation of a new set of industry monopolies. Efforts directed to their dismantling will absorb much of our attention in the coming years.

Any other observations or interpretations of these reports in terms of impact and import for the future of the Internet that you would like to share?

Geoff: IPv4 address exhaustion is occurring at a very interesting time. It's hard to expect the carriers to spend large sums of money supporting a dual-stack infrastructure if the net result is to cement their role as a commodity bit-carriage operator. I would not be surprised to see the carriers, particularly in the lucrative mobile-data sector, use this situation as an opportunity to create novel application-level gateways that, in effect, act as toll gates on the wire and allow the carriers to meter the applications and the users' traffic.

Jon: Time to study the end user properly.

Kenjiro: We also observe similar trends in Japan.

Bill: There is no question that the Internet is maturing in terms of protocols, technology advances, and deployment. Most of the innovation is coming from outside of the traditional Internet community, as, for example, in content networking, and is being driven by the huge leverage of a large installed base. A large part of this alternate innovation is being driven by the need to get around today's major obstacles blocking deployment of innovative applications, namely the incumbent telephone, cable, and wireless companies. If they had made sufficient investment in last-mile infrastructure, then many of the innovations like content networks, caching, and so on might not have been required. But as a result, only companies with deep pockets have the wherewithal to develop solutions that counter the regressive forces of the telcos and cablecos. Unfortunately that leaves smaller, innovative players out in the cold and makes life more difficult for them.


Glossary
CDN: Content Distribution Network. A network of servers that improves the timeliness and availability of Internet content.

HTTP: HyperText Transfer Protocol. A protocol to exchange hypertext requests and information between servers and browsers; the fundamental protocol of the World Wide Web.

LTE: Long Term Evolution. A project of the 3rd Generation Partnership Project (3GPP) to define the latest standard for mobile network technology.

NAT: Network Address Translation. The process (and the device that carries it out) of modifying address information in IP packet headers while they are in transit.

TCP/IP: Transmission Control Protocol / Internet Protocol. The fundamental protocols of the Internet.

P2P: Peer-to-peer. A decentralized file distribution mechanism that allows peers to collaborate directly in the distribution of data.

XML-RPC: eXtensible Markup Language Remote Procedure Call. XML-RPC provides a lightweight means by which one computer can execute a program on a cooperating machine across a network.

