
Bryan School of Business and Economics

University of North Carolina at Greensboro

The Economics of Net Neutrality: A Literature


Review

Public Economics — Semester 2: 2010

Student: Derek Tyler Mobley


ID: 885852363

Professor: Christopher A. Swann


Contents

1 Introduction
    1.1 Defining Network Neutrality
2 The Emergence of Broadband: Cable and DSL
3 Models of the Broadband Industry
    3.1 Vertical Structure
    3.2 Layered Structure
4 Vertical Integration in the Broadband Industry
5 Policy Approaches and Critiques
    5.1 The End-to-End Argument
    5.2 The Argument for Last-mile Competition
    5.3 The Broadband Discrimination Regime
        5.3.1 Endorsement by the FCC
6 Economic Experiments and Wi-Fi
7 Game Theory Applied to Broadband
    7.1 Simple Bertrand
    7.2 Cournot Duopoly
    7.3 Differentiated Bertrand
    7.4 Market Entry: Sequential Games
    7.5 A Bayesian Interpretation
8 Conclusion
9 Appendix: Terms and Concepts
References

1 Introduction

The explosive growth of the Internet over the past two decades has created an environment in which the quantity and speed of information that can be transmitted increase daily to historically unprecedented levels. Physically, the Internet is composed of a large set of local networks that connect globally to other networks. The Internet protocol suite, which includes the basic protocols for transferring data across the Internet, provides the logical structure for this web of networks (major terms related to the operation of the Internet are defined in the Appendix, section 9). Exponential growth in both network capacity and the number of users has contributed to an increasingly complex market for access to the Internet. The classical Internet, which for the public consisted primarily of the World Wide Web and email, is now only a subset of this market. The availability of broadband technologies has made VoIP, video streaming, and other forms of complex online entertainment increasingly available to consumers.

The combination of the widening use of these bandwidth-intensive applications and the related convergence of firms seeking to profit from both the broadband technology and the transmitted intellectual property has provided incentives for firms to exploit network resources. Because broadband providers effectively possess a natural monopoly over broadband service (especially for last-mile users), a market failure due to imperfect competition could arise in the market for online content. This would cause content markets to be inefficient and create deadweight losses to society. To support the argument of imperfect competition in the broadband industry, it should be noted that as of 2003, over 60 percent of the broadband industry was under the control of four firms: Comcast, Verizon, AT&T, and Time Warner [15, p.13]. Broadband providers could use their market power to raise prices for consumers, and to raise prices for, or simply purchase, Internet content providers. Thus, imperfect competition in the broadband market could have a spill-over effect.

Those concerned with this potential failure argue that regulation should be used to ensure that the design of the Internet retains its classical roots. Variations of this argument fall under the concept of Network or Net Neutrality. The classical Internet, using TCP/IP, treats all data packets sent across the network with equal priority. Net Neutrality advocates argue that by keeping this system in place, broadband providers will be unable to use their market power to drive up costs to potential competitors, and will be unable to strategically control content transmitted between users [14][9]. Under a Net Neutrality regime, consumer welfare would arguably be preserved, as would the incentive for content providers to innovate.

However, opponents of Net Neutrality assert that forbidding broadband providers to discriminate among their services will be counterproductive. Since VoIP, video streaming, and online entertainment are bandwidth intensive, providers should be permitted to charge more to consumers who use more bandwidth. Finally, opponents argue that the inability of providers to discriminate will actually be a barrier to innovation, weakening their incentive to improve quality of service (QoS).

Both sides agree that the broadband industry is imperfectly competitive. They disagree over whether this can spread to the market for Internet content. Opponents feel that the rapid technological change in Internet access technologies will undermine firms' attempts to use their networks strategically, and that federal regulation could only slow this process. Newer technologies, especially Wi-Fi, tend to support their claims. Yet Net Neutrality supporters counter that inaction will allow the Internet to devolve into an entrenched natural monopoly. The promised innovation could then be suppressed by monopolistic practices. It is also unwise, the supporters feel, to deviate from the architecture that should be credited with the initial success of the Internet. In order to examine the faults and merits of these arguments, a closer investigation of their economic reasoning is required.

1.1 Defining Network Neutrality

There are many versions and definitions of Net Neutrality, but they all cover the same basic principles. At its root, Net Neutrality specifies a set of rules for competition in all markets that deal with content distributed via the Internet. The implicit goal of these rules is to maximize social welfare by encouraging web content producers to innovate [14][9]. Tim Wu provides a description of this idealized form of competition and Net Neutrality's place in it:

    A communications network like the Internet can be seen as a platform for a competition among application developers. Email, the web, and streaming applications are in a battle for the attention and interest of end-users. It is therefore important that the platform be neutral to ensure the competition remains meritocratic [14, p.146].

The absence of a role for broadband companies in this definition is no coincidence; Wu treats the platform (the broadband providers) as passive in this context. Stanford and Berkeley law professors Lawrence Lessig and Mark Lemley have also described the benefits of Net Neutrality. They argue that:

By designing the network to be neutral among uses, the Internet has created a
competitive environment where innovators know that their inventions will be used
if useful. By keeping the cost of innovation low, it has encouraged an extraordinary
amount of innovation in many different contexts [9, p.8].

If the market for broadband service were perfectly competitive, then competition among content providers would not be in danger. However, the presence of few firms in this section of the industry, the very high costs incurred for entry, and the virtual monopoly enjoyed by firms operating at the last mile of the network all attest to imperfect competition between broadband providers [6][14][15].

The presence of firms with market power that control the platform through which web applications are distributed raises the possibility of strategic action within the platform. Broadband providers could theoretically use their market power to price discriminate among their customers, to vertically integrate into the content market, or both. If either of these actions were taken, the platform would no longer be neutral, and the incentive to innovate among content providers would be suboptimal. Thus, advocates of Net Neutrality conclude that some regulatory action must be taken in order to minimize strategic behavior by broadband providers. However, the form that this regulatory action should take varies among advocates.

Attempting to condense Net Neutrality into a positive economic framework may yield the following argument. The way to maximize social welfare across all Internet-related services is to maximize the incentive of content providers to innovate. Arguably, this requires ensuring perfect competition among content providers, which in turn requires a network that cannot be tilted to favor any one content provider. However, the presence of imperfect competition in the broadband market threatens competition among content providers: firms with market power can price discriminate against both consumers and content providers, and can use this power to influence the content market. From the point of view of society, this might be treated as a market failure, in that strategic action resulting from imperfect competition in broadband could create a market failure in the content market. If broadband providers favor certain content-producing firms, and all content producers must purchase the services of these providers to distribute their product, then unfavored firms would face artificially high transaction costs whenever the broadband providers attempted to discourage competition with their own products. In order to fix the problem, the behavior and pricing options of broadband providers must be regulated. If broadband providers cannot price discriminate or strategically control the content distributed over their section of the network, the neutrality of the Internet will be maintained.

This argument rests on several assumptions that Net Neutrality opponents question. Furthermore, the response of the broadband providers is not taken into account in this analysis. Relaxing some of these assumptions and examining the response of firms to this kind of regulation could undermine the claim that a Net Neutrality regime is optimal. Further discussion of this argument requires some knowledge of the history and structure of broadband services.

2 The Emergence of Broadband: Cable and DSL

Before the advent of cable and DSL, the physical structure of the Internet differed more radically than was apparent to the end users who used it both before and after these new services entered the market. The aforementioned classical Internet operated over phone lines, sending data packets across networks using TCP/IP, primarily for the purpose of constructing and visiting web pages on the World Wide Web, or for the use of email. Its capabilities were largely confined to these functions for the majority of end users. A final well-known aspect of the experience was the temporary connection created by placing a dial-up phone call.

Shortly before the new millennium, cable and telephone providers started to supply broadband service to end users en masse. In this environment, end users now experienced a permanent connection to the Internet, with the capability of receiving and sending much larger quantities of data. The early introduction of broadband also created a regulatory issue which, when examined closely, lends credence to the notion that broadband providers will attempt to strategically control networks in the absence of regulation. According to Werbach:

    While the rules require digital subscriber line (DSL) operators to carry any ISP, the leading cable operators signed exclusive contracts with two broadband ISPs: Excite@Home and Roadrunner. Other ISPs that wish to serve those customers cannot do so over the cable plant. Moreover, the cable ISPs are able to impose content restrictions such as limitations on the length of video streams that subscribers can access [13, p.52].

The crescendo of this institutionalized asymmetry came when the FCC classified cable Internet offerings as information services, which, under the Telecommunications Act of 1996 (passed by the Gingrich Congress under the Clinton presidency), are excluded from any special price or content regulation [13, pp. 42-43, 53]. This ruling recognized and endorsed the strategic behavior of broadband cable providers. Interestingly, the FCC waited until August 5, 2005 to classify broadband offered by telephone providers as an information service [3].

The fact that DSL did not have the opportunity to strategically control the network, whereas cable did, was simply an accident of regulatory lag. That this lag led to a situation where cable had the ability to act strategically, and chose to do so, might support the claims of regulatory ineffectiveness that opponents of Net Neutrality favor. Yet it also demonstrates that firms with the ability to strategically manipulate the network believe they have an incentive to do so. Now that the regulatory differences no longer exist, deeper study of the structure of the broadband industry is required to ascertain whether strategic action still benefits the individual providers.

3 Models of the Broadband Industry

Typical industries can generally be characterized by a chain of supply, which classifies all firms involved in an industry into three basic categories. This method, called the vertical method, can be used to outline the structure of the broadband industry: the first category contains manufacturers; the second, wholesalers, who connect manufacturers to retailers; and the third, retailers, who distribute products to consumers (this paradigm of the broadband industry is derived from C. S. Yoo [15]).

The other theoretical structure is called the layered model. Unlike the general method, this model has been tailored to the broadband industry after closer analysis (the seminal work for this model is Kevin Werbach's "A Layered Model for Internet Policy" [13]). This approach divides the industry into four different but overlapping layers. Net Neutrality advocates typically frame their arguments in terms of the layered model.

3.1 Vertical Structure

The vertical paradigm of the broadband industry may be the easiest to comprehend, and it provides a basic structure for examining the problem at hand. Breaking the industry into manufacturing, wholesale, and retail reveals the source of the market failure: in the broadband industry there is a market failure in the wholesale category, which might spill over into the retail category due to the prospect of vertical integration between the two. The following figure demonstrates this relationship:

Figure 1: Vertical model (derived from [15]).

Category        Description
--------        -----------
Retail          Last-mile providers who contract with end users.
Wholesale       Broadband providers who mass-purchase content and hold it as inventory for retail.
Manufacturing   Firms that produce Internet applications.

If the few firms in the wholesale category were to vertically integrate into the retail section, then the many firms that manufacture content would essentially be selling their content to firms with some monopsony power, since those firms would stand between them and consumers. At the same time, these consumer-facing firms would have the ability to manipulate the price at which these services were sold to consumers. One could consider these problems two sides of the same coin, because any content consumer who decided to become a content producer would face the corresponding imperfect competition.

A fundamental disagreement in the literature arises in the evaluation of the seriousness of this situation. Those who disagree with Net Neutrality argue that regulating broadband, a relatively new industry, might be premature. Given time, as more firms enter the industry, this problem could resolve on its own. Furthermore, the structure of the market might invalidate the claim that vertically integrating into the retail sector would give the wholesale firms any greater market power. Evidence for this argument rests on the Chicago School's analysis of vertical integration, and will be treated in section 4.

Net Neutrality advocates, however, stress that if these firms act strategically to control prices and content, the very foundation of the Internet would be destroyed. If this took place, and assuming the rapid expansion of the Internet was due to this foundation, immense losses to social welfare could result. The fundamental principle at stake, called End-to-End or e2e, underlies the set of TCP/IP protocols that govern the Internet. This principle requires that no strategic action take place within a network, something obviously antithetical to firms with market power [9]. Section 5, Policy Approaches and Critiques, takes a closer look at both of these arguments in terms of the layered model. Furthermore, the real chaos associated with the rapid rise of broadband and the Telecommunications Act of 1996 resulted from the breakdown of a horizontal classification system in place since the Communications Act of 1934, a system which uses the vertical model implicitly. The following figure attempts to bring structure and temporal ordering to the FCC's policies:

Figure 2: Horizontal classification (derived from [13]).

Title I: General jurisdiction (1934 Act)
  - Title II: Telephone
      - Telephone service (1996 Act)
      - Information service (1996 Act)
  - Title III: Radio
  - Title VI: Television (cable)

The problems with this framework have been alluded to earlier. Whereas cable fell under Title VI restrictions, which allowed for strategic action, telephone providers were subject to stricter Title II regulation. However, information services, although technically under Title II, were by law not subject to the same content and pricing regulations as classic telephone and communication services. As of 2005, broadband Internet service was classified as an information service for both cable and phone companies [3], essentially leveling the playing field. With this framework in mind, the industry can now be reinterpreted through the layered model.

3.2 Layered Structure

The layered structure, created by technologist Kevin Werbach, better captures the complexity of the broadband industry and allows for a more complete understanding of some of the nuances of policy prescriptions, be they action or inaction. The rationale for the layered model stems from the inability of the vertical model to handle the interconnectedness of the broadband industry. In response to these shortcomings, Werbach created a model that builds the industry up from four layers: physical, logical, applications, and content. The following figure describes these layers:

Figure 3: Layered model (derived from [15] and [13]).

Layer                        Function
-----                        --------
4. Content                   Information generated or distributed with application software.
3. Applications              Software for web browsing, video streaming, etc.
2. Logical infrastructure    TCP/IP and other protocols.
1. Physical infrastructure   Actual network hardware.

One of the distinguishing characteristics of this approach is that it does not suppose any particular firm to be confined to any one layer. The layers define the industry, and firms can and do provide services that span them. For example, a company that produces application software might create and distribute content with this software for advertising or other proprietary purposes. Most importantly, the model also acknowledges the possibility that firms in charge of the physical infrastructure (which are bound to be large) could operate in all four layers. Although not a direct appeal to Net Neutrality, Werbach makes the following observation about the application layer:

    By and large, applications need not be regulated to ensure competition, so long as the physical and logical infrastructure underneath is open. With open platforms, anyone can build new applications to compete with incumbent providers [13, p.63].

Although Werbach does not deal with the issue of Net Neutrality explicitly, this observation clearly echoes the opinions of Wu and Lessig regarding what kind of broadband competition is best for society. The reason that Net Neutrality advocates frame their arguments in terms of the layered model is that it clearly exposes the potential danger to competition in applications and content. The status of the overarching firm as a telephone or cable ISP makes no difference: both have the ability to strategically control their networks, and the legal authority to do so. The regulatory gap between 2002 and 2005 already demonstrates that at least one firm had the incentive to act strategically when the other could not. However, the behavior of firms when both have the ability to act strategically is still an open question. Finally, the question of the cost effectiveness of regulation, and of its ability to correct a market failure in broadband, has not yet been answered. The case for policy, through action or inaction, must now be considered.

4 Vertical Integration in the Broadband Industry

Vertical integration provides the mechanism through which broadband companies could undermine the neutrality of the Internet. When a firm vertically integrates, it merges with or enters adjacent or complementary markets for the good or service it supplies [5]. In the case of broadband, adjacent firms can best be understood using the vertical structure, while the layered structure provides the clearest layout of complementary services. From the point of view of the literature, the firms in question are those in the wholesale category, which roughly corresponds to the physical layer of the layered model. Vertical integration thus involves entry into, or acquisition of, retail or manufacturing firms by wholesale firms, which amounts to merging the infrastructure of the physical and application layers of the Internet. For example, if the broadband provider Comcast Cable (a wholesale purchaser or owner of the physical layer in the respective industry models) merged with or acquired Blizzard Entertainment (the developer of the popular online video game World of Warcraft), this would constitute vertical integration. Joseph Farrell and Phil Weiser explain the economic reasoning behind vertical integration:

    The classic formulation, offered by Augustin Cournot in 1838, is that separate complementary monopolies, each imposing a monopoly markup, wind up with a final product price that exceeds the overall monopoly price. As a result, both consumers and the producers are worse off than they would be if the two firms merged and charged a monopoly price for the two goods together. More generally, this insight explains that firms providing complementary activities or products are in a mutual position of 'vertical externality' [5, p.98].
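
Cournot's complementary-monopoly result can be made concrete with a short numeric sketch. The linear bundle demand, zero marginal costs, and the specific parameter value below are illustrative assumptions, not estimates for any real market:

    # Numeric sketch of Cournot's complementary-monopoly ("double markup")
    # result. Assumptions: linear demand for the bundle, Q = a - (p1 + p2),
    # and zero marginal costs; a = 12 is an arbitrary illustrative value.

    a = 12.0

    # Separate monopolists: best responses p_i = (a - p_j) / 2 intersect
    # at p1 = p2 = a / 3.
    p_sep = a / 3
    q_sep = a - 2 * p_sep
    joint_profit_sep = 2 * p_sep * q_sep

    # Merged firm: choose the total bundle price P to maximize P * (a - P),
    # which gives P = a / 2.
    p_merged = a / 2
    q_merged = a - p_merged
    profit_merged = p_merged * q_merged

    print(2 * p_sep, joint_profit_sep)   # 8.0 32.0
    print(p_merged, profit_merged)       # 6.0 36.0
    # The merged firm charges less and earns more, so both consumers and
    # producers gain, exactly the externality described in the quotation.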

The applications market is competitive, whereas broadband providers certainly enjoy market power due to barriers to entry. However, since the services the two provide complement one another, there is a case for efficiencies gained from vertical integration. Furthermore, broad legal and theoretical consensus follows the findings of the Chicago School on vertical integration (the court in [1] cited the Chicago School in ruling in favor of vertical integration, and [5], [14], and [15] broadly accept the argument). The Chicago School argued that for vertical integration to result in a Pareto-suboptimal allocation relative to the original allocation of goods and services, two requirements must be met:

Stipulations for suboptimal vertical integration (adapted from [15]):

1. The vertically integrating firm must have market power in its own market.
2. The market being entered must otherwise have high barriers to entry.

Since broadband providers would integrate into a competitive environment, these assumptions might not hold. Thus, an argument can be made that vertical integration in the broadband industry would be at least Pareto neutral. Furthermore, if efficiencies due to the complementary nature of the services exist, allowing integration might even be Pareto improving.

Net Neutrality advocates have responded to this argument either by asserting that the second assumption holds once vertical integration has been established, or by arguing that it is irrelevant to the broadband industry. The stance each party takes on these issues tends to shape its policy argument. Those who accept the argument that vertical integration in broadband is harmless feel that regulation should be confined to promoting the entry of more firms into broadband, and reject Net Neutrality as misguided. At the same time, those who question the second assumption come down powerfully in favor of price and content regulation to assure Net Neutrality. Finally, middle-of-the-road scholars have attempted to reimagine Net Neutrality in a context independent of the issue of vertical integration.

5 Policy Approaches and Critiques

Having examined the industry and the prospects for vertical integration, the issue of policy can now be addressed. As explained earlier, the stance taken on vertical integration in broadband tends to shape the policy recommendations found in the literature. In the following section, some of the major categories of policy recommendation are examined and compared.

The first and oldest argument for Net Neutrality goes by the name of the End-to-End argument. Proponents of this view feel that the neutral network of the classic Internet was optimal, and that any deviation from this architecture will destroy the incentive structures that have caused the Internet to expand. Losses to innovation and productivity would consequently be immense. The second argument comes from those who accept the harmlessness and potentially efficient prospects of vertical integration, concluding that supporters of e2e are shooting themselves in the foot. Finally, advocates of what has been called a Broadband Discrimination Regime have attempted to find a middle road that allows for vertical integration while keeping the core of Net Neutrality intact.

5.1 The End-to-End Argument

The problem of optimizing communication networks has been a subject of study in computer science for some time. One of the classic papers to offer a functional solution for an efficient network design was Saltzer et al.'s "End-to-End Arguments in System Design". The MIT researchers observed that, when sending packets across a network between two computers, there are a great many points where error-checking functions could be deployed to ensure that the packets have not been corrupted. An advantage of intermediate error checking is that it could catch problems early and save delivery time. However, as the probability of packet corruption inside the network falls, intermediate checking points become increasingly superfluous. Finally, if the probability falls below some threshold, it is more efficient to place error-checking software only at the 'ends', within the two computers, and simply initiate a retry if a packet fails to arrive intact [12]. Many communications systems have subsequently been designed around this observation. Furthermore, the architecture of the Internet, i.e. the basic protocols that all networks within the Internet share, follows this design principle.
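
As a rough illustration of this principle, the following sketch puts integrity checking only at the endpoints and models the network as a dumb pipe. The checksum choice and the simulated corruption rate are assumptions made for illustration, not a description of TCP's actual mechanics:

    # Sketch of the end-to-end principle: integrity checking lives only at
    # the endpoints, and the sender retries until the receiver confirms an
    # intact packet. SHA-256 and the 20 percent corruption rate are
    # illustrative assumptions.

    import hashlib
    import random

    def unreliable_network(packet: bytes, corruption_rate: float = 0.2) -> bytes:
        """A dumb pipe: occasionally flips one byte, performs no checking."""
        if random.random() < corruption_rate:
            i = random.randrange(len(packet))
            return packet[:i] + bytes([packet[i] ^ 0xFF]) + packet[i + 1:]
        return packet

    def send_with_retries(payload: bytes, max_tries: int = 20) -> int:
        """Return the number of attempts needed to deliver payload intact."""
        packet = payload + hashlib.sha256(payload).digest()  # checksum added at the sending end
        for attempt in range(1, max_tries + 1):
            received = unreliable_network(packet)
            body, check = received[:-32], received[-32:]
            # Verification happens only at the receiving end; no function
            # inside the network inspects or controls the content.
            if hashlib.sha256(body).digest() == check:
                return attempt
        raise RuntimeError("delivery failed")

    print(send_with_retries(b"a neutral network treats every packet alike"))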

In this environment, strategic action by the agents who own the respective networks would be impossible. Since no content-handling function exists within the network, only at the ends, the owners of the network have no mechanism to control the content shared between users. One of the most noted supporters of e2e, Lawrence Lessig, elaborates on this result:

    One consequence of this [e2e] design is a principle of non-discrimination among applications. Lower-level network layers should provide a broad range of resources that are not particular to or optimized for any single application, even if a more efficient design for at least some applications is thereby sacrificed [9, p.6].

This is the result of the fundamental architecture of the Internet. Clearly, in order to allow for the kind of content discrimination discussed earlier, broadband providers must place a function somewhere within their networks to control and monitor content. It has been established earlier that these firms have the legal authority to do so. Thus, the criticism by advocates of e2e that vertical integration into the content market by broadband providers would destroy the Internet as it has previously been understood is not an exaggeration. However, this in itself does not justify action to prohibit firms from making decisions that are currently legal.

In order to make the case for a government mandate of neutrality, e2e supporters typically raise two issues. The first has been mentioned earlier, and involves the incentive to innovate that arguably comes from a neutral network. However, advocates also raise a second issue concerning freedom of information and expression [10]. The first has been considered in detail, because it can be scrutinized theoretically. The second point, while it certainly should not be dismissed, is mostly beyond the scope of this text. But the reader should be aware that this argument, like the last-mile argument, has a great deal of purely normative support. In order to maintain Net Neutrality from an e2e perspective, the necessary policy action is open access. This follows from the possibility that, if broadband providers vertically integrate into the content market, they can erect barriers to entry in the content market, due to their special position as shown in the layered model. In the words of Lessig:

    If a regulated entity threatens to force the adoption of an architecture which is inconsistent with the Internet's basic design, and if that action affects a significant portion of a relevant Internet market, then the burden should be on the party taking that action to justify this deviation from the Internet's default design [9, p.62].

The default design, of course, is End-to-End. Thus, in the ideal case of a Net Neutrality regime under the e2e philosophy, no firm could place functions inside its own network without making a legal case for their inclusion; one might imagine this would most often involve some kind of security protocol. Absent any other functions inside the network, broadband providers would be unable to control the content consumed by any particular user. In this case, vertical integration into the content market would be completely ineffective in boosting the market power of the broadband firms, because user choice would be beyond their control. This would preserve competition in the content market, where the incentive to innovate would be strongest. Consequently, if the assumption that maximizing content competition maximizes social welfare holds, and the regulation does not otherwise alter the firms' behavior, then this could be the best policy course.

5.2 The Argument for Last-mile Competition

The approach of fostering competition where broadband firms are fewest is the closest to a "hands off" method. Supporters of this approach correctly see the end-to-end argument as, at bottom, a call for price regulation: in order to restrain the market power of broadband providers, end-to-end does call for restrictions on both prices and activities in the broadband industry. Thus, for last-milers, the issue is that "[t]he government must weigh policy choices carefully: are the benefits of price regulation likely to exceed its costs?" [8, p.1]. Neutrality opponents believe the answer is no.

One of the primary criticisms last-milers offer of the e2e argument is that it might be wholly counterproductive. Net Neutrality advocates do not consider the way price regulation might affect the incentives of broadband providers. Regulatory movement towards standardized pricing and services could severely limit innovation on the part of broadband providers. This would place a bound on the ability of content providers to innovate, at which point the content-innovation assumption upon which e2e is based would fail. Yoo concisely states this point of view:

Simply put, allowing network owners to employ different protocols can foster
innovation by allowing a wider range of network products to exist. Conversely,
compulsory standardization can reduce consumer surplus by limiting the variety of
products available [15, p.18].

Furthermore, the issue of regulatory lag could significantly hinder the effectiveness of Net Neutrality policy, specifically because it targets such a dynamic market. This issue was discussed previously in relation to the Werbach paper: regulatory lag led to a three-year window, between 2002 and 2005, in which cable and DSL providers operated under different legal requirements for the same service. Since the Internet market continues to evolve so rapidly, price and content restrictions could quickly become redundant or a barrier to innovation. A recent example involves the adoption of wireless Internet, or Wi-Fi, technologies. Wi-Fi provides the same service as landline-based broadband, but in a fundamentally different way. It is discussed in section 6; for now it is enough to remark on its existence and rapid expansion.

Given these criticisms that end-to-end type regulation could be either counterproductive or completely ineffective, the last-milers assert that the best policy is a method of fostering competition where the broadband market is most concentrated. Economists such as Hahn believe that:

    Instead of imposing net neutrality, government should remove artificial regulatory barriers that slow the development of broadband and other information service technologies. Examples of such barriers include limitations placed on the use of spectrum and anticompetitive local rules, which limit the number of broadband providers and dictate the kinds of services providers can send over their broadband lines [8, p.2].

This call is echoed by Yoo [15, p.43]. In effect, the last-milers call for federal regulatory agencies to take the exact opposite action, though their position still calls for the agencies to act: instead of national price and content standardization, the result would be a national policy of deregulation. Federal powers would thus be used as a tool to curtail the local, idiosyncratic regulations that contribute to the concentration of broadband service providers.

5.3 The Broadband Discrimination Regime

The argument for the hands-off approach raised the possibility that regulatory lag and standardized pricing could be counterproductive to the goals of Net Neutrality. Supporters of e2e have, to some degree, recognized these complications. Even Lessig acknowledges that "...as a practical matter, building security features and other content-distinguishing elements may be inevitable, at least at the applications level" [9, p.18]. In this spirit, law professor Tim Wu offers what he calls a Broadband Discrimination Regime. This regulatory approach accommodates competition in the content market while preserving the ability of broadband providers to exercise some control over their own pricing. Thus, while it has roots in the e2e approach, it does address some of the concerns of the last-milers. It has been saved for last both for this reason and because the FCC has tacitly embraced it as its approach to Net Neutrality policy (see [2]).

As a Net Neutrality supporter, Wu makes the case for the superiority of e2e network design using the same reasoning about competition among application providers, arguing that it provides the best incentive for innovation. He calls the meritocratic selection of applications an "evolutionary approach". In attributing the rise of the Internet to this approach, he notes that "[b]ackers of an evolutionary approach to innovation take the Internet, the fastest growing communications network in history, as evidence of the superiority of a network designed along evolutionary principles" [14, pp.146-147]. The neutral platform allows application providers to adjust perfectly to the tastes of consumers.

However, this is a static setup that implicitly assumes the given neutral platform has enough bandwidth capacity to satisfy the needs of every offered application. It might still be true that a neutral platform could allocate finite resources efficiently according to the tastes of users, but if this requires an artificial restriction of bandwidth capacity, then the process is not technically efficient. Wu argues this point in the context of quality of service (QoS) for bandwidth-intensive applications like VoIP and video streaming:

    To the extent open access regulation prevents broadband operators from architectural cooperation with ISPs for the purpose of providing QoS dependent applications, it could hurt the cause of network neutrality. By threatening the vertical relationship required for certain application types, it could maintain IP's discrimination in favor of data applications [14, p.150].

If the last-mile criticism about the incentive for broadband providers to improve their network service proves valid, then mandated Net Neutrality would make the market statically efficient but dynamically inefficient. For example, the movement, as of 2010, toward real-time streaming of events in high definition could be stopped in its tracks by mandated neutrality, as it requires a massive expansion of bandwidth. The current popular applications might be better refined to the tastes of users, yet this implicitly requires instituting a bias towards less bandwidth-intensive programs. In this situation, the accuracy of the positive claim that mandated Net Neutrality is objectively better than doing nothing is at best unclear.

The conclusion Wu derives from this observation is that discrimination by bandwidth use does not conflict with Net Neutrality. He notes that "certain classes of applications will never function properly unless bandwidth and quality of service are guaranteed. Hence, the absence of bandwidth management can interfere with application development and competition" [14, p.155]. This insight provides the foundation for the Broadband Discrimination Regime. Wu develops a legal framework with the goal of allowing broadband providers to discriminate among users by bandwidth, while forbidding the same providers from controlling the user's choice of application for that bandwidth. Since monitoring bandwidth use requires providers to place functions or protocols inside their networks, the pure e2e design would be dead. However, since neutrality still exists in principle, this regime is a kind of hybrid of the previous two.

Putting the regime into practice requires passing legislation that sets the rules by which broadband providers can discriminate among their customers. Wu offers an example of such a law in his paper, where he defines the rights of broadband providers in positive terms. The following list gives the situations in which Wu feels broadband providers should have the right to restrict network access:

Broadband discrimination legislation [14]:

1. Compliance with federal, state, or local laws.
2. Prevention of physical harm to a network.
3. Control of viruses, worms, and spam.
4. Elimination of delay or jitter that undermines QoS.
5. Prevention of unauthorized access to a network.
6. Other purposes sanctioned by the FCC.

The key to the success of this regime, and of the legislation putting it into practice, is, according to Wu, that they can "...distinguish between forbidden grounds of discrimination, those that distort secondary markets, and permissible grounds, those necessary to network administration and harm to the network" [14, p.170]. Thus, the Broadband Discrimination Regime defines a set of rules to manage the trade-off between competition in the applications layer and innovation in the physical layer. Broadband providers are allowed to discriminate, but not for the purpose of discouraging competition among content providers. Consequently, the applications layer should remain competitive, which should ensure that, absent other occurrences, the applications market remains efficient.

5.3.1 Endorsement by the FCC

In 2005, the FCC issued several policy statements that attempted to address the issue of regulating the broadband industry. One such statement, FCC 05-153, placed cable and DSL under the same regulatory restrictions, as mentioned earlier. Another statement, FCC 05-151, which closely followed it, sought to define the role the FCC felt it should play in regulating the broadband industry in general. It outlined four principles that the FCC felt would ensure the best environment for consumers. Furthermore, it read that "...the Commission has jurisdiction necessary to ensure that providers of telecommunications for Internet access or Internet Protocol-enabled (IP-enabled) services are operated in a neutral manner" [2]. This is important because the Commission is stating that, even without a law similar to the one proposed by Wu, the FCC has the authority to mandate Net Neutrality. Furthermore, the four principles themselves should look familiar. They state that, to encourage broadband deployment and preserve and promote the open and interconnected nature of the public Internet:

Principles [2]:

1. Consumers are entitled to access the lawful Internet content of their choice.
2. Consumers are entitled to run applications and use services of their choice, subject to the needs of law enforcement.
3. Consumers are entitled to connect their choice of legal devices that do not harm the network.
4. Consumers are entitled to competition among network providers, application and service providers, and content providers.

Although these principles are defined from a consumer perspective, and their application is discretionary rather than mandatory, one cannot deny the influence of the Broadband Discrimination Regime in this policy statement. Furthermore, the fourth principle, competition, seems to be a recognition of the last-mile argument, as does the general format of the statement. The FCC acknowledges its authority to mandate Net Neutrality, but it does not announce its intention to do so. It also issues principles that it would like to uphold, as opposed to rules it will enforce. This reflects a clear appreciation for the potential pitfalls of regulation raised by the last-mile supporters. Thus, as far as policy intentions can be gleaned from this statement, the FCC appears to have largely accepted the wisdom of Net Neutrality from Wu's perspective, while taking into account some last-mile criticisms.

6 Economic Experiments and Wi-Fi

The market for Internet services has experienced, and continues to experience, rapid technological progress. The rise of large-scale wireless broadband that began just after the year 2000 is an example of such progress. These events lend credence to the claims of Net Neutrality opponents that it would be difficult to regulate such a dynamic industry effectively. Understanding the process that has generated technologies such as wireless Internet can provide greater insight into the nature of the competition among broadband providers, and might lead some to question the assumption that broadband infrastructure still creates a natural monopoly.

The concept of an economic experiment is important for a market undergoing technological shocks, such as the market for Internet service. Greenstein states that such experiments "...pertain to any market experience that alters knowledge about the market value of a good or service" [6, p.2]. Firms in new markets see them as a laboratory, where knowledge about methods for distributing and pricing goods is gained at some cost. When weighing the benefits and costs of engaging in certain types of economic experiments, firms must consider current and expected regulatory requirements. Thus, regulation affects not only the good a firm currently supplies in a market with frequent technological shocks, but also the goods firms expect to supply in the future. This relationship between the firm and the regulator is more complicated than the one presented by Wu.

According to Greenstein, there are two kinds of economic experiments: directed and undirected. Directed experiments represent the purposeful actions of individual firms to gain knowledge about their markets, whereas undirected experiments involve groups of firms or sectors learning about some external change in their market. For example, AOL's decision to use email addresses and login names with natural language in the run-up to marketing web browsers to households can be considered a directed economic experiment [6, p.13], while the rise of Wi-Fi followed from an undirected economic experiment. According to Greenstein:

    Wi-Fi did not arise from a single firm's innovative experiment. Rather, Wi-Fi began as something different that evolved through economic experiments at many firms. The evolution arose from the interplay of strategic behavior, coordinated action among designers, deliberate investment strategies, learning externalities across firms, and a measure of simple and plain good fortune [6, p.12].

Wi-Fi is an example of the kind of fundamental innovation that last-milers argue should be fostered by public policy, while strict price or content regulations should be avoided. One could argue that such regulations would limit the ability of firms to undertake economic experiments. The result would be a kind of dynamic inefficiency, in which firms have a disincentive to pursue better technologies because of regulation. Furthermore, in their paper on wireless network neutrality, Hahn and his coauthors make a similar observation: "Given the high level of competition in the wireless industry, an individual operator should be entitled to experiment with different business models, especially where there is unlikely to be any anticompetitive effect" [7, p.7]. This extends the last-mile argument's reasoning that promoting new technologies is the superior policy.

Finally, the presence of Wi-Fi also calls for a reexamination of the requirements for suboptimal vertical integration. If Wi-Fi is a viable alternative to landline-based Internet service, the assumption that broadband providers are a natural monopoly becomes questionable. If all providers of wireless services can now be considered competitors, because individuals can substitute away from landlines to Wi-Fi, then there is no longer a market failure in broadband. However, since wireless still has limited availability and an average bandwidth cap significantly lower than that of landlines [7], its present status as a competitor to broadband service is limited.

7 Game Theory Applied to Broadband

Since the main providers of broadband service are cable and DSL companies, and these companies are assumed to have market power resulting from economies of scale, insights into Net Neutrality arguments might be gained by examining the effect of regulation on the strategic interactions of the firms. Game theory is an ideal and standard method for examining this kind of imperfect competition. In the following setup, assume there are two agents: a cable firm and a DSL firm. The behavior of these two agents will be analyzed with several games: the simple Bertrand, the Cournot, the differentiated Bertrand, the Stackelberg, and the sequential differentiated Bertrand. The firms will compete with respect to either content (which will be treated as equivalent to quantity) or prices. The Broadband Discrimination Regime will be in place when price is considered, and no regulation when content is considered. The models presented can be found in many intermediate textbooks; the analysis that follows is the author's own and, to the best of his knowledge, has not been conducted in this fashion elsewhere. (A related, more sophisticated analysis by Kenneth Cheng, which found that "abandoning net neutrality discourages improvements in service," is summarized at http://news.ufl.edu/2007/03/07/net-neutrality/; a treatment of the general models can be found in [11].)

7.1 Simple Bertrand

In a simple model of Bertrand competition, two firms with market power selling a homogeneous product set their prices simultaneously. There is no collusion, and the firm that sets the lowest price captures the entire market. The mathematical setup of the model appears as follows, where BR_i denotes the best response of firm i:

(1) max_{p1} Π1 = F(p1, p2)
(2) max_{p2} Π2 = F(p1, p2)

Best responses: BR_i : p_i = MC_i, and Π_i = 0, for i = 1, 2.

The resulting Nash equilibrium of this model is that each firm sets its price equal to its marginal cost, which mirrors the efficient outcome under perfect competition. In the event that the cable and DSL firms had virtually identical cost structures and provided virtually homogeneous products, this model implies that a Net Neutrality policy based on Wu's analysis would be ideal. Even with market power, allowing the firms to strategically set their prices while forbidding discrimination of content would be efficient.
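
The undercutting logic behind this equilibrium can be illustrated with a minimal best-response sketch on a discrete price grid; the marginal cost, tick size, and starting prices are arbitrary assumptions:

    # Best-response sketch of simple (homogeneous-product) Bertrand
    # competition. Firms alternately undercut each other until neither
    # can profitably move, which happens at price = marginal cost.

    mc = 10          # common marginal cost (assumed)
    tick = 1         # smallest price decrement
    p1, p2 = 50, 50  # arbitrary starting prices

    def best_response(rival_price: int) -> int:
        """Undercut the rival by one tick while that remains profitable;
        otherwise price at marginal cost."""
        return max(mc, rival_price - tick)

    while True:
        new_p1 = best_response(p2)
        new_p2 = best_response(new_p1)
        if (new_p1, new_p2) == (p1, p2):
            break
        p1, p2 = new_p1, new_p2

    print(p1, p2)  # both converge to mc = 10, so equilibrium profits are zero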

However, there are problems with applying this model to the broadband industry. Simply put, while cable and DSL provide similar services, the products are not homogeneous and the cost structures are fundamentally different. Another problem is that, assuming different marginal cost functions, the firm with the lowest marginal cost should win the entire market; despite the differences in cost, this has not taken place in broadband. Still, it could be the case that the services and cost structures of the industry will converge, in which case this model would apply.

7.2 Cournot Duopoly

Another classic model involves firms with market power strategically setting the quantity of the good they will produce. This form of competition, called Cournot competition, can also be useful for the broadband industry. The setup is similar to the Bertrand model, but the variable of choice is quantity. The mathematical model appears as follows, where the subscript "m" denotes monopoly outcomes and "b" Bertrand outcomes:

(1) max_{q1} Π1 = F(q1, q2)
(2) max_{q2} Π2 = F(q1, q2)

Best responses: BR_i : q_m < q_i < q_b, and 0 < Σ Π_i < Π_m, for i = 1, 2.

In this Nash equilibrium, the resulting quantity and price are less efficient than in the simple Bertrand model, but more efficient than under monopoly. One advantage of this model for the broadband industry is that it continues to work if the cost functions differ. This would be a setup without Net Neutrality, where cable and DSL companies are able to control bandwidth or content. The lesson here is that, in the absence of Net Neutrality, broadband providers do have an incentive to act strategically to control content. Unlike the simple Bertrand, there are long-run profits to be made in this market, which means there is deadweight loss to society. If steps could be taken to move this market towards a simple Bertrand, there are potential Pareto improvements ("potential" because the theory of the second best shows that gains from incomplete adjustments are sometimes uncertain). Finally, another result of the Cournot model is that, as the number of firms goes to infinity, the market asymptotically approaches efficiency. Thus, the last-mile argument's logic regarding promoting entry seems valid under these circumstances.
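
A small sketch of the symmetric n-firm Cournot model with linear demand illustrates both results: the duopoly price sits between the monopoly and competitive levels, and the price approaches marginal cost as n grows. All parameter values are illustrative assumptions:

    # Symmetric n-firm Cournot with linear inverse demand P = a - Q and
    # common marginal cost c: each firm produces q = (a - c) / (n + 1),
    # so the market price is P = (a + n*c) / (n + 1). The parameters are
    # assumptions chosen only to show the comparative statics.

    a, c = 100.0, 20.0

    def cournot_price(n: int) -> float:
        """Equilibrium price with n identical Cournot competitors."""
        return (a + n * c) / (n + 1)

    for n in (1, 2, 5, 100):
        print(n, round(cournot_price(n), 2))
    # n=1 reproduces the monopoly price (60.0); n=2, the cable/DSL duopoly,
    # gives 46.67; as n grows the price approaches marginal cost (20.0),
    # the asymptotic-efficiency result behind the last-mile entry argument.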

The application of the Cournot model to the broadband industry also has shortcomings. It might be improved by introducing some probabilistic elements. For example, broadband providers might fear that using their market power to its full Cournot potential could signal to regulators that steps should be taken to move them towards less profitable simple Bertrand competition. The threat of the Broadband Discrimination Regime could then act as a tool of moral suasion constraining cable and DSL companies.

7.3 Differentiated Bertrand

In order to address the heterogeneity of cable and DSL service, the Bertrand model with differentiated products offers another lens through which the broadband industry can be examined. In this model, the two firms compete with respect to price, but some differentiation in the products creates a kind of consumer loyalty. A consequence of this assumption is that a firm does not immediately lose its entire market by raising its price above its rival's, only some variable portion. The setup of this model is as follows, where D1 and D2 represent the demand curves the firms face:

(1) max_{p1} Π1 = F(p1, D1), with D1 = f(p1, p2) and ∂D1/∂p2 > 0
(2) max_{p2} Π2 = F(p2, D2), with D2 = g(p1, p2) and ∂D2/∂p1 > 0

Best responses: BR_i : p_b < p_i < p_m, and 0 < Σ Π_i < Π_m, for i = 1, 2.

This Nash equilibrium is similar to the Cournot equilibrium. Since each firm sets a price above the simple Bertrand price, there will be profits, deadweight loss, and thus inefficiency. The size of this inefficiency depends on the relative sizes of some of the model's parameters. Thus, by this model, there is no ready assurance that a Broadband Discrimination Regime would be superior to a last-mile policy. Assuming that pricing differentiated services and controlling content are analogous in the broadband industry, the parameters of each model would need to be estimated and evaluated in order to decide which regime would be best from a social welfare perspective. If the parameters could be reliably estimated using structural econometrics, then regulators could determine whether price and output under one model would be significantly closer to the efficient outcome than under the other. If accounting for regulatory costs did not change this result, a clear choice between the last-mile and broadband discrimination policies could be reached.
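
A minimal parameterized sketch shows how the equilibrium markup depends on the demand parameters; the linear demand system and all numbers below are assumptions chosen only for illustration:

    # Differentiated Bertrand duopoly with linear demands
    #   q1 = alpha - beta*p1 + gamma*p2
    #   q2 = alpha - beta*p2 + gamma*p1
    # common marginal cost c, and gamma < beta (own-price effects dominate).
    # The first-order conditions give the symmetric equilibrium price
    #   p* = (alpha + beta*c) / (2*beta - gamma).
    # All parameter values are illustrative assumptions.

    alpha, beta, gamma, c = 100.0, 2.0, 1.0, 10.0

    p_star = (alpha + beta * c) / (2 * beta - gamma)
    q_star = alpha - beta * p_star + gamma * p_star
    profit = (p_star - c) * q_star

    print(round(p_star, 2), round(profit, 2))  # 40.0 1800.0
    # p* exceeds marginal cost and profits are positive, so some deadweight
    # loss remains; its size depends on beta and gamma, which is why the
    # text argues the parameters would have to be estimated structurally.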

7.4 Market Entry: Sequential Games

Since the ability of other firms to enter the broadband industry is a point of contention between supporters and opponents of Net Neutrality, it should be examined further. Section 6 has already presented evidence that cable and DSL might face competition from wireless Internet services. However, it might be beneficial to examine market entry in a more general way under the two regimes. The Stackelberg model and the sequential differentiated Bertrand are the two most common models of market entry. Their mathematical setups are similar to the static Cournot and differentiated Bertrand, but they are now sequential games, solved by backwards induction. In each game there is an incumbent and a potential entrant; the entrant's best response is solved first, and the incumbent optimizes taking it into account. Both models and their results are presented below, where the subscript "c" denotes Cournot outcomes, "db" differentiated Bertrand outcomes, and an asterisk (*) a best-response price or quantity:

Stackelberg model:

(1) (Player 2, entrant)   max_{q2} Π2 = F(q1, q2)
(2) (Player 1, incumbent) max_{q1} Π1 = F(q1, q2 = t(q1*))

Results: Σ Π_i < Σ Π_c, and Π1 > Π2, for i = 1, 2.

Sequential differentiated Bertrand:

(1) (Player 2, entrant)   max_{p2} Π2 = F(p2, D2 = g(p1, p2)), with ∂D2/∂p1 > 0
(2) (Player 1, incumbent) max_{p1} Π1 = F(p1, D1 = f(p1, p2 = m(p1*))), with ∂D1/∂p2 > 0

Results: Σ Π_i > Σ Π_db, p1 > p2, and Π2 > Π1, for i = 1, 2.

In the Stackelberg model, the incumbent has the advantage and makes higher profits than the entrant. However, the resulting price and quantity are closer to the efficient outcome than in the Cournot model. Conversely, the sequential Bertrand results in the follower obtaining larger profits than the incumbent, and yields an outcome that is less efficient than the static differentiated Bertrand. From the perspective of broadband regulators, it appears that the last-mile prescription of fostering entry into the market might be complicated by the very act of allowing the incumbent firms control over content: that control weakens the incentive for other firms to enter the market. A Broadband Discrimination Regime, on the other hand, is friendlier to those seeking to enter the market, but the outcome is actually less efficient than its static differentiated Bertrand counterpart. As with the static models, making the optimal policy choice would once again require a structural approach to estimating the parameters of the models and determining the relative distance from the efficient outcome. If one model of entry were significantly closer to the competitive outcome (after accounting for regulatory costs), a correct policy decision could be reached.

7.5 A Bayesian Interpretation

One of the issues not yet treated by this analysis is how the probability of Net Neutrality regulation might affect the behavior of broadband providers. For example, given the assertion in policy statement FCC 05-151 that the Commission has the authority to mandate neutrality, firms face a choice of which game they would voluntarily play, given their assessment of the credibility of the FCC's claim. Since the FCC stated only its ability, and not its intention, to act, the firms can still legally pursue a route of content control. However, this might provoke the FCC to act, forcing the broadband providers to accept regulation that will make them worse off. Consequently, the firms face a tradeoff between pursuing Cournot profits and raising the probability of intervention. For a simple mathematical illustration, assume the broadband provider perceives the probability that the FCC will mandate Net Neutrality if it engages in content control as δ, and that δ is increasing in Cournot profits, i.e., ∂δ/∂Πc > 0. Also, assume that the firms are risk neutral.

(1) If δ = 1, then Π = Πdb

(2) If δ = 0, then Π = Πc

(3) E[Π] = δ Πdb + (1 − δ) Πc

Clearly, the firm's decision to engage in open Cournot competition depends on the size of the Cournot profits relative to the differentiated Bertrand profits, and on whether the corresponding increase in δ pushes the probability to unity. If Cournot profits are larger, but not large enough to prompt action, the firms should engage in Cournot competition. Otherwise, the providers would engage in differentiated Bertrand competition by choice. Finally, if differentiated profits are larger, broadband providers might prefer the regulation. However, this would leave them vulnerable to falling into the simple Bertrand outcome, and make the entry of competitors more likely.
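
This tradeoff can be illustrated numerically. The profit levels and the linear, capped functional form assumed for δ(Πc) below are purely hypothetical:

```python
# A minimal sketch of the risk-neutral decision rule E[Pi] = d*Pi_db + (1-d)*Pi_c.
# The profit levels and the response function delta(Pi_c) are hypothetical
# assumptions, not estimates of FCC behavior or broadband profits.

def intervention_prob(pi_c, k=0.04):
    """Perceived probability of an FCC mandate, increasing in Cournot profits."""
    return min(1.0, k * pi_c)

def expected_profit(pi_c, pi_db):
    delta = intervention_prob(pi_c)
    return delta * pi_db + (1 - delta) * pi_c

pi_db = 10.0                       # profit under a neutrality regime
for pi_c in (12.0, 18.0, 25.0):    # candidate content-control (Cournot) profits
    e = expected_profit(pi_c, pi_db)
    choice = "content control" if e > pi_db else "voluntary neutrality"
    print(f"Pi_c = {pi_c:4.1f}, delta = {intervention_prob(pi_c):.2f}, "
          f"E[Pi] = {e:5.2f} -> {choice}")
```

In this example the provider prefers content control so long as the implied δ stays below unity; once larger Cournot profits push δ to one, the expected gain vanishes and neutrality is accepted by choice.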

This analysis is key to the interpretation of the April 2010 ruling by the U.S. Court of Appeals for the District of Columbia Circuit that, as the law currently stands, the FCC "cannot support its exercise of ancillary authority over Comcast's network management practices" [4]. From a mathematical perspective, this might dramatically weaken the expected increase in the probability of intervention by the FCC if broadband providers were to engage in Cournot competition. Thus, one consequence of this ruling, according to this analysis, is an increase in the possibility of the broadband industry using its power to control content, i.e., engaging in Cournot competition. However, if the current ruling leads to actual legislation defining the powers of the FCC in relation to Net Neutrality, then the Bayesian game ends. In that case, the previous sequential and static analysis would be of better use.

8 Conclusion

The arguments for and against Net Neutrality rest, as they should, on assumptions about what kind of policy would be best for social welfare. Supporters of Net Neutrality contend that it has already proven its worth in the success of the Internet, and should be protected if threatened. Opponents argue that attempting to enforce Net Neutrality could be self-defeating, and that a better policy would be simply to promote competition where the original market failure lies: among broadband providers. However, examination of the structure of the industry reveals a unique interconnectedness between broadband providers and content providers, which could be exploited by broadband firms to the detriment of the content market. Furthermore, the fast pace of technological change presents a challenge for network regulation in general. As the rapid rise and spread of Wi-Fi demonstrates, even the imperfect competition among broadband providers is not certain to last. Yet there is no denying that these same providers could have the capacity to hold off such technological improvements, and this capacity would only be strengthened by allowing them unregulated control of content providers.

A look into the broadband industry from the perspective of game theory reveals that determining a clear best policy might be possible, but it rests on econometric and benefit-cost analysis that has, to the knowledge of the author, not been undertaken. The actions of the FCC until April of 2010 could be interpreted as a compromise between supporters and opponents. By asserting its authority to mandate Net Neutrality, while declaring the ability but not the intention to act, the FCC created a Bayesian game that offered the possibility of having Net Neutrality without the cost of regulation, while still acknowledging the importance of last mile competition. From a social welfare perspective, this seems like a reasonable solution. However, the decision by the Court of Appeals in April 2010 changed the dynamic of the Bayesian game. Without legislative action, the probability of FCC intervention has fallen, and the incentive to control the content market has risen. This assumes that profits from content control are superior to those from price competition, which seems reasonable over the long run, although it has not been empirically justified. If this is true, however, restoring the ability of the FCC to mandate Net Neutrality would be more efficient, with the important qualification that it is not compelled to do so.

9 Appendix: Terms and Concepts
• IP or Internet Protocol: The core protocol that governs the addressing and transmission of data across the Internet. Raw data is divided into packets, which are routed to other hosts and reassembled upon arrival by similar procedures. For more see TCP/IP or Internet Protocol Suite.

• ISP or Internet Service Provider: Firms, typically privately owned, that sell access to the Internet as a commodity. The form of this access is determined by the physical infrastructure used to connect to the network: cable companies typically use cable lines, while phone companies use DSL. The use of wireless technology by ISPs has also made access by Wi-Fi more common.

• Internet Backbone: The overarching network infrastructure that connects local networks around the world. These high-capacity connections are owned and operated by a mixture of private, government, and academic institutions.

• TCP/IP or Internet Protocol Suite: This set of protocols includes the fundamental
logical systems that govern the classical Internet. Email, simple web browsing, and small
file transfers are controlled globally by these protocols.

• TCP or Transmission Control Protocol: A transport-level protocol that runs on top of IP. TCP manages the exchange of IP packets between end hosts, with the goal of ensuring that data arrives complete, in order, and uncorrupted. While IP handles the delivery of individual packets, TCP governs the reliability of the overall connection. For more see TCP/IP or Internet Protocol Suite.

• VoIP or Voice Over Internet Protocol: A bandwidth intensive technology for con-
ducting voice communication via the Internet. The process requires speech to be converted
into a format that can be transmitted in the form of data packets to other users, and then
reassembled with minimal delay. It serves as a substitute for traditional telephone com-
munication.

References
[1] Continental T.V., Inc. v. GTE Sylvania Inc., 694 F.2d 1132, September 1982. http://openjurist.org/694/f2d/1132/continental-tv-inc-v-gte-sylvania-incorporated-p.

[2] FCC 05-151. Federal Communications Commission, September 2005. http://fjallfoss.fcc.gov/edocs_public/attachmatch/FCC-05-151A1.pdf.

[3] FCC 05-153. Federal Communications Commission, August 2005. http://www.askcalea.net/archives/docs/20050923-fcc-05-153.pdf.

[4] Comcast Corporation v. FCC, No. 08-1291, United States Court of Appeals for the D.C. Circuit, April 2010. http://pacer.cadc.uscourts.gov/common/opinions/201004/08-1291-1238302.pdf.

[5] Joseph Farrell and Phil Weiser. Modularity, vertical integration, and open access policies: Towards a convergence of antitrust and regulation in the internet age. Harvard Journal of Law and Technology, 17:85–134, 2003. http://ssrn.com/abstract=247737 or doi:10.2139/ssrn.247737.

[6] Shane Greenstein. Economic experiments and neutrality in internet access. NBER Innovation Policy and the Economy, 8:59–109, 2008.

[7] Robert W. Hahn, Robert E. Litan, and Hal J. Singer. The economics of wireless net neutrality, 2007. http://ssrn.com/abstract=98311.

[8] Robert W. Hahn and Scott Wallsten. The economics of net neutrality, 2006. http://www.bepress.com/ev/vol3/iss6/art8.

[9] Mark A. Lemley and Lawrence Lessig. The end of end-to-end: Preserving the architecture of the internet in the broadband era, 2000. http://ssrn.com/abstract=452220 or doi:10.2139/ssrn.452220.

[10] Lawrence Lessig. Code and the commons, 1999. http://cyber.law.harvard.edu/works/lessig/Fordham.pdf.

[11] Walter Nicholson and Christopher Snyder. Microeconomic Theory: Basic Principles and Extensions, 10th edition. Thomson South-Western, 2008.

[12] J. H. Saltzer, D. P. Reed, and D. D. Clark. End-to-end arguments in system design. ACM Transactions on Computer Systems, 2(4):277–288, 1984. http://web.mit.edu/Saltzer/www/publications/endtoend/endtoend.pdf.

[13] Kevin Werbach. A layered model for internet policy. Journal on Telecommunications and High Technology Law, 2, 2002.

[14] Tim Wu. Network neutrality, broadband discrimination. Journal on Telecommunications and High Technology Law, 2:141–175, 2003.

[15] Christopher S. Yoo. The economics of net neutrality: Why the physical layer of the internet should not be regulated, 2004. http://www.pff.org/issues-pubs/pops/pop11.11yoonetneutrality.pdf.

