
Reply Comments of Public Knowledge In the Matter of A National Broadband Plan for Our Future

July 29, 2009


Before the
Federal Communications Commission
Washington, D.C. 20554

GN Docket No. 09-51

July 21, 2009

Public Knowledge submits the following comments in the above captioned
proceeding. The opportunity to shape our nation’s first National
Broadband Plan drew hundreds of comments, producing thousands of pages of
recommendations and evidence. Public Knowledge (PK) could not hope to
cover in detail every comment deserving of support or requiring
refutation. As the Commission continues to develop its broadband plan in
cooperation with the public, PK anticipates that it will provide more
detailed additional comments and responses. To maximize impact at this
early stage, PK focuses these reply comments on a set of foundational
issues that, in combination with the initial comments, provide a
framework for moving forward. Specifically:

  • The Commission Should Reject Calls for Copyright Filtering,
    or Any Other Discriminatory “Network Management
    Techniques”.
    The Commission should reject arguments
    from the entertainment industry to require or encourage ISPs to filter
    content in the name of preventing the unauthorized distribution of
    copyrighted material. As detailed in the attached Public Knowledge
    whitepaper, “Forcing the Net Through a Sieve,” copyright
    filtering simultaneously blocks non-infringing material while failing
    to stop a significant amount of infringing material from being
    distributed. This quest for a copyright filtering silver bullet
    threatens the First Amendment rights of users, imposes huge costs on
    providers, and potentially undermines the “safe harbor”
    provisions of the Digital Millennium Copyright Act. In the absence of
    any evidence that could justify these costs, the Commission should
    reject calls to include recommendations for copyright filtering in the
    National Broadband Plan.

    Similarly, the Commission should reject the false dichotomy between
    “non-discrimination” and “network management.”
    Providers offer no convincing rationale as to why they cannot engage in
    reasonable network management in a non-discriminatory way. To the
contrary, the Commission’s lengthy proceeding involving Comcast
and its degrading of peer-to-peer (P2P) protocols proved that, absent
Commission enforcement, providers will discriminate among applications
without concern for their customers, but that, when ordered to do so by the
Commission, they can and will manage their networks in a non-discriminatory
manner.

  • The Commission Has an Obligation to Protect Consumers,
    Particularly With Regard to Consumer Privacy
    . While best
    practices and principles for industry self-regulation have a distinct
    role to play in the development of online privacy protections, the
    incredibly diverse number and type of interested business entities will
    make such processes toothless without effective enforcement mechanisms
    from the relevant agencies. The Commission should reject the
    self-serving arguments that consumers are so sophisticated that they do
    not need protection, or that privacy protection comes at too high a
    cost to the industry. The Commission’s history with
    telecommunications and cable demonstrates that strong privacy
    protection is both necessary and affordable for the industry.

  • The Deregulation of the Last Ten Years Broke the Broadband
    Market, and Only Aggressive Competition Policy Can Fix
    It.
    Outside the circle of incumbent providers and their
    traditional supporters, a surprisingly diverse set of commenters
    support PK’s initial analysis that the FCC’s
    ideologically-driven deregulatory policies have created a market
    lacking in competition. As a result, all sectors of the economy pay a
    “market power tax” in the form of higher prices, slower
    speeds, and poorer coverage. As the comments submitted by the
    Government of Japan show, only a national broadband policy that gives
    competitors access to necessary facilities such as unbundled access and
    interconnection can create a robustly competitive environment.

  • The FCC Has Broad Discretion To Reform USF.
Over 50 commenters supported reforming the Universal Service Fund (USF)
    to facilitate broadband deployment. The FCC has broad authority to
    restructure the program to facilitate broadband even without further
    Congressional action.

  • The National Broadband Plan should reflect the valuable
    role of state and local government
    . Several commenters
    asked the Commission to focus on preemption of state and local
    government as a means of facilitating national broadband deployment.
    Although PK takes no position on pending proceedings, a focus on
    preemption as the basis for the National Broadband Plan would ill serve
    our digital future. Instead, the focus of the National Broadband Plan
    should be to engage government at every level in the challenge of
    universal deployment and adoption.

ARGUMENT

I. ISPs Should Not be Allowed or Encouraged to Engage in Copyright
Filtering

The National Broadband Plan should not permit or encourage Internet
Service Providers (ISPs) to use automated technologies in order to
“filter” their networks for copyright infringement. Such techniques
would adversely affect the free speech interests of ordinary Americans,
undermine the safe harbors available to ISPs under copyright law, and
harm the open nature of the Internet. Furthermore, no filtering
technology currently in existence or likely to be developed in the future
will effectively prevent copyright infringement. Contrary to claims made
by some commenters in these proceedings, filtering networks for copyright
infringement is not a form of “network management;” instead,
it is a form of content management.

Some commenters urge that the National Broadband Plan should permit and
even encourage broadband service providers to use “network
management” techniques to prevent the infringement of copyrighted
works,[1] citing
purported benefits such as stemming the allegedly massive tide of
copyright infringement,[2] reducing the network congestion that illegal content
supposedly creates,[3] and facilitating greater broadband
adoption.[4] Several
of these commenters also argue that such practices are consistent with
the FCC’s Internet Policy Statement and with the federal policy
of discouraging illegal activity on the Internet.[5]

While these commenters are vague about what these “network
management” techniques would entail, joint comments filed by the
American Federation of Television and Radio Artists et al.
identify blocking illegal sites, watermarking and acoustic and video
fingerprinting as some examples of “network management”
techniques.[6] As
the attached whitepaper on copyright filtering explains, watermarking and
fingerprinting are methods that can be used to mark certain content as
proprietary.[7] In
order to prevent copyright infringement, ISPs would then have to compare
these identifying marks against every bit of information traveling over
the Internet, and either prevent matching content from reaching its
destination or apply some other policy based on the copyright
owner’s preference.[8] ISPs might also employ another method known as
traffic inspection to block all content using certain protocols from
reaching its destination.[9] These techniques are designed to manage the flow of
content rather than traffic on the Internet, and thus constitute content
management rather than network management techniques, despite the claims
of some commenters.
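To make the mechanics concrete, the following is a deliberately simplified sketch, in Python, of the comparison step described above. The fingerprint database, function names, and "block" policy are illustrative assumptions of ours, not any vendor's actual system; real fingerprinting is far more elaborate, but the structural point is the same: the device in the middle decides a packet's fate based on what content it carries.

```python
import hashlib

# Hypothetical database supplied by rights holders, mapping a content
# fingerprint to the policy the copyright owner prefers (illustrative only).
FINGERPRINT_POLICIES = {
    # SHA-256 of the bytes b"test", used here as a stand-in "protected work"
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08": "block",
}

def fingerprint(payload: bytes) -> str:
    """Toy stand-in for acoustic/video fingerprinting: hash the payload."""
    return hashlib.sha256(payload).hexdigest()

def filter_packet(payload: bytes) -> str:
    """Compare a packet's content against every known fingerprint.

    Note what this cannot know: whether the sender holds a license, or
    whether the use is a fair use. It only knows that the bits match.
    """
    if FINGERPRINT_POLICIES.get(fingerprint(payload)) == "block":
        return "dropped"      # a decision about content, not about congestion
    return "forwarded"

if __name__ == "__main__":
    print(filter_packet(b"test"))           # dropped: matches the database entry
    print(filter_packet(b"anything else"))  # forwarded
```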

While preventing copyright infringement is an important objective, using
copyright filtering to achieve that goal would be counterproductive and
harmful to businesses and ordinary Americans.

A. Filters Would Be Ineffective in Preventing Copyright Infringement:

Filtering technologies currently available or likely to be developed in
the future would be both overinclusive and underinclusive.[10] Filters would be
overinclusive because they would block a significant amount of lawful
content along with any infringing content. They would be underinclusive
because they would inevitably allow infringing content to pass through
unmolested.

In order to be effective, filters would have to identify infringing
content, check it against a database of protected content made available
by rights holders and then determine if a particular user was authorized
to send or receive that content. While filters may, in some situations, be
able to prevent users from sending or receiving content they did not
license, they would fail to identify whether that content is being used
under an exception to copyright law that would allow for legal,
unlicensed use. The prime example of this is fair use. Because fair use
is a case-by-case determination and involves nuances in the application
of law, no automated filter would be able to effectively determine
whether a particular use is fair. Thus, filters would inevitably block
legal uses of content.

Filters would also be underinclusive. For example, complexities in
copyright licenses are likely to make it difficult for filters to
determine the existence of a license in many situations. This may be the
case where the license specifies the number of copies a user is allowed
to make and distribute.[11] In addition, users will undoubtedly devise methods
for circumventing copyright filters. For example, if a filter is designed
to block specific protocols, protocol obfuscation[12] would allow a user to escape the
filter. Similarly, encryption could be used to prevent detection by
filters that utilize content inspection technology.[13] Because it takes relatively little
effort on the part of end users to use these techniques, such practices
are likely to become common if ISPs were to start filtering their
networks, thereby undermining the efficacy of the filters.[14]
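A similarly simplified sketch shows why encryption defeats this kind of content inspection. The toy XOR cipher below is only an illustration standing in for the real encryption end users would employ, but any cipher has the same effect on the filter: the bits on the wire no longer match any registered fingerprint.

```python
import hashlib
from itertools import cycle

def fingerprint(payload: bytes) -> str:
    """Same toy fingerprint as before: hash the bytes seen on the wire."""
    return hashlib.sha256(payload).hexdigest()

def toy_encrypt(payload: bytes, key: bytes) -> bytes:
    """Toy XOR stream cipher, standing in for real end-to-end encryption."""
    return bytes(b ^ k for b, k in zip(payload, cycle(key)))

if __name__ == "__main__":
    work = b"some protected work"
    key = b"secret shared only by the two endpoints"
    # The filter in the middle sees only the ciphertext, whose fingerprint
    # matches nothing that rights holders have registered.
    print(fingerprint(work))
    print(fingerprint(toy_encrypt(work, key)))
```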

B. Filters Would Harm Users’ Free Speech Rights:

Both copyright law and communications law seek to regulate certain
aspects of speech while also promoting free speech. Copyright filtering
would disturb this structure by endangering fair use and imposing a prior
restraint on speech.

Under copyright law, a copyright holder is never granted complete control
over a copyrighted work.[15] Limitations on and exceptions to copyright prevent
copyright law from conflicting with the First Amendment rights of
citizens. Fair use and other limitations such as the requirement of
originality, the idea/expression dichotomy, and the doctrine of thin
copyright[16]
allow for free expression in many forms, including protected forms of
speech like parody and criticism. As the Supreme Court has explained,
“Copyright … does not impermissibly restrict free speech, for
it grants the author an exclusive right only to the specific form of
expression … and it allows for ‘fair use’ even of the expression
itself.”[17]

A fair use of copyrighted work is therefore protected free speech.
Proponents of copyright filtering suggest that the filtering of
copyrighted material would be a straightforward and entirely legal
process. However, the nuances of copyright law make distinguishing
between a lawful and infringing use of a piece of copyrighted content
challenging even for courts. As such, no filtering technology, no matter
how advanced, would ever be able to make fair use determinations with 100
percent accuracy. Furthermore, because filters would operate in the
middle of the network, users would find it difficult to determine whether
or not their transmissions were blocked, and thus would be precluded
from presenting any fair use defense.

Additionally, by prescreening content before it can ever reach its
destination, filtering would act as a prior restraint on speech, in
violation of principles of copyright and communications law. In
Suntrust Bank v. Houghton Mifflin Co.,[18] the 11th Circuit Court of
Appeals refused to grant a preliminary injunction because the court felt
that the defendant presented a viable fair use defense and the copyright
owner suffered only monetary harm. The Court explained that the
“public interest is always served in promoting First Amendment
values…”[19]

Like the Copyright Act, the Communications Act seeks to protect and
promote free speech. Section 326 of the Act prohibits the censorship of
radio communications. As the Supreme Court has explained in FCC v.
Pacifica,[20] while sanctioning indecent broadcasts after the
fact is permissible under the Communications Act, prescreening content
would be a violation of section 326. While this provision pertains to
radio communications, it reflects federal policy against censorship, a
policy that Congress sought not to disturb when it passed the
Communications Act.[21]

In Houghton Mifflin and in Pacifica, courts refused to
sanction a prior restraint on speech even though the restraint would have
been imposed by a court or an administrative agency after a reasoned
decision. These concerns would become even more acute in the
case of copyright filtering conducted by private corporations that would
not have to provide any justification for their actions. Thus, contrary
to claims by proponents, filtering would contradict federal policy.

C. Copyright Filters Might Undermine the Safe Harbor Provisions Provided
by the DMCA

The safe harbor provisions of the Digital Millennium Copyright Act (DMCA)
codified in section 512 of the Copyright Act protect ISPs from monetary
liability for the infringement of their users. Congress enacted these
provisions as a means to protect ISPs from the specter of uncertain
copyright infringement liability, in order to allow the nascent Internet
to develop.[22]
These provisions represent a carefully crafted balance between the rights
of copyright owners and ISPs, a balance that should not be disturbed by
the National Broadband Plan. Because copyright filtering could undermine
these safe harbors, the National Broadband Plan should not encourage or
condone copyright filtering.

The DMCA’s safe harbor provisions are based on the premise that
ISPs act as mere conduits for information.[23] Thus, in order to qualify for the
safe harbors, ISPs are required to meet certain conditions. The first of
these conditions is that all material that travels over the network must
be “initiated by or at the direction of a person other than the
service provider.”[24] This means that any transaction that takes place
on the network must be initiated by someone at the edge of the
network–either a client or a server–but may not be initiated by someone
in the middle of the network. If an ISP implemented a copyright filter,
that ISP could arguably become an active participant in the chain of
transmission. Instead of merely passing a bit of data along, the ISP
would inspect, categorize, and possibly interrupt, delay or discard that
bit of data. In so doing, the ISP could potentially be disqualified from
the DMCA’s safe harbor protections and therefore, would be exposed to
liability for any infringement that takes place over its network.

The second requirement that an ISP must meet in order to qualify for DMCA
safe harbor protection could similarly be jeopardized by filtering. This
requirement states that the transmission of data must occur
“through an automatic technical process without selection of the
material by the service provider.”[25] This use of the word “selection”
is not further clarified in the statute, creating an open question as to
what degree of filtering would qualify as “selection”. Depending on the
level of sophistication of the prioritization process, certain packet
management techniques could be interpreted as constituting a
“selection” of material. If an ISP could be described as
actively selecting what material is allowed to travel over its network,
its safe harbor protection could be jeopardized. Furthermore, this
selection process could quickly rise to the level of an “editorial
function” (i.e. choosing to prioritize data from a preferred source
over a non-preferred source), which would indisputably disqualify an ISP
from DMCA safe harbor protection.[26]

D. Filters Would Harm the Open Nature of the Internet

The open nature of the Internet has fostered innovation and creativity
and has allowed the Internet to become the democratic medium that it is
today. Copyright filtering would change all of this. Contrary to claims
made by proponents,[27] filtering would not be consistent with either the
Internet Policy Statement adopted by the Commission or the
Comcast Order.

While the Comcast Order observes that blocking certain content
may be justified in some circumstances, it goes on to observe that
network management techniques that are not application- or
content-neutral pose a danger to the open nature of the Internet and that
the “danger of network management techniques being used for
anticompetitive ends is acute.”[28] Further, the order’s justification for
blocking infringing content cannot be read to condone the blocking of all
content regardless of legality. As explained above, copyright filtering
would indiscriminately block content, without regard to the lawful,
unlicensed uses and other rights guaranteed by copyright law.

Proponents of filtering seek to justify filtering on the grounds that
illegal content causes networks to be congested and that, as such,
filtering would “ensure ease of access and the provision of greater
services, including entertainment services.”[29] Network providers have not
provided sufficient bandwidth usage data to evaluate this claim and there
are indications that peer-to-peer traffic may not be consuming nearly as
much bandwidth as is suggested by these proponents.[30] Furthermore, filters based on
content inspection technology would have to download a substantial amount
of any one piece of content in order to inspect it for
infringement.[31]
This process would inevitably slow networks. Thus, instead of reducing
congestion and thereby increasing speed, filtering would reduce speed and
introduce greater latency, thereby undermining the goals of the National
Broadband Plan.

E. Conclusion

The purpose of the National Broadband Plan is to create a roadmap that
will facilitate greater broadband speeds and encourage widespread
adoption of broadband services throughout the country. These goals cannot
be achieved if ISPs are allowed to utilize techniques like copyright
filtering, which would reduce speeds and harm the fundamental rights of
citizens. While preventing copyright infringement is important, any
attempt to address this problem should not be made at the expense of
users, innovators and legitimate businesses. Therefore, the undersigned
organizations urge the Commission not to recommend that Congress
encourage or permit copyright filtering as part of a National Broadband
Plan.

II. Any Service Provider Who Connects to the Internet Should Not be
Allowed to Engage in Discriminatory Behavior

Several content providers have argued that the Commission must allow ISPs
to discriminate, monitor user communications, and block those
communications in order to deter online copyright
infringement.[32]
As we have explained in these reply comments[33] and the attached whitepaper on
copyright filtering,[34] this approach will increase Internet congestion,
decrease innovation and adoption, and have little to no effect on
copyright infringement. While the Songwriters Guild of America attempts
to justify filtering by claiming that “easily 90% of [peer-to-peer]
traffic is unlawful” and that “[n]eutral applications, such
as P2P and other file-sharing programs, have been taken over by illegal
file traffickers,” the opposite appears to be true: lawful uses of
these technologies are growing far faster than unlawful
ones.[35] This
demonstrates exactly why the innovation that occurs within the
Internet’s open architecture must not be hampered by those who
would cripple tomorrow’s innovative lawful uses while reacting to
yesterday’s bad actors.

Several parties conflate the idea that the Commission should only protect
lawful communications with the idea that the Commission should allow
network operators to search all communications for unlawful activities.
Congress has given copyright owners numerous legal tools to combat
copyright infringement, including online infringement. From statutory
damages which can reach up to $150,000 for a single copy of a single
work[36]
to
service provider takedown notices backed by that same potential
liability,[37]
copyright owners have no shortage of tools with which to enforce their
rights. Law enforcement has tools at its disposal, including lawful
interception of data with appropriate Fourth Amendment safeguards, to
address child pornography, fraud, and other kinds of online crime.

What the Commission and the law have never done, and should never do, is
allow private interests to search everyone’s data for unlawful
content. Although there are no doubt people who drive around playing
unlawfully copied CDs, we do not allow private interests to set up
checkpoints at highway onramps, in order to search every vehicle and its
occupants for evidence of unlawful activity. While fraud inevitably
occurs on phone networks, we do not allow phone carriers to listen in on
all phone calls to ensure that no one is engaging in any illegal
activity. We should not allow these interests to set up checkpoints on
the onramps to our communications networks, in order to search every
message for potentially infringing material. Just as an entity must
give up discrimination in order to reap the benefits of connecting to the
national phone network,[38] so too should it give up discrimination in order to
provide access to the global network that is the Internet.

A. Allowing Service Providers to Discriminate is Dangerous and
Unnecessary

The danger of allowing continuous monitoring of our nation’s
communications goes far beyond overzealous copyright enforcement. These
same technologies are used by governments to monitor the communications
of their citizens.[39] In fact, during the recent events surrounding
Iranian elections, it was easy for the Iranian government to take a
filtering infrastructure that had been purportedly put in place for law
enforcement purposes and leverage it for political purposes and the
suppression of speech.[40] Even our own government is not immune from the
allure of joining forces with private parties to monitor its
citizens.[41] In
fact, the U.S. government sought and received the cooperation of a
private company in order to monitor the communications of its
citizens.[42]
Building these capabilities directly into the networks makes it far
easier for both private and governmental parties to abuse them; a
National Broadband Plan must take into account the full implications of
allowing or encouraging technologies whose primary purpose is to monitor
and interfere with the public’s communications.

Even if the public could ultimately discipline any monitoring and
filtering which harmed the network, these types of interference are
extremely difficult to detect. Comcast engaged in targeted, potentially
anticompetitive interference with customer communications as well as the
forgery of data for an unknown period of time before being discovered by
a network engineer trying to utilize the BitTorrent protocol – for
lawful purposes – on his home Comcast connection. Perhaps worse,
when he and other researchers identified the nature and source of the
problem and made evidence of Comcast’s activities public, Comcast
denied them completely. In fact, for months after a legal proceeding had
been initiated at the Commission, Comcast denied its activities or
attempted to obfuscate the consequences of those activities. For more
than a year
after those activities had been made public by third
parties,[43]
Comcast neither disclosed nor changed its basic behavior,[44] and change was only
forthcoming after the Commission stepped in and ordered it.[45] As technology advances,
interference is likely to become even more difficult to recognize, and
discriminatory, anticompetitive, or otherwise unfair practices even
harder to identify and correct without regulatory protections.

In helping to design a National Broadband Plan, the Commission must not
conflate “network management” with “content
management.” ISPs can manage their networks to ensure fair
distribution of bandwidth without choosing between different
types, protocols, or providers of data – activity which really
constitutes the management of content, not of the network. Suggestions
that there is only a choice between discrimination and reasonable network
management are a red herring; reasonable, non-discriminatory network
management can and should be the requirement.

B. The Market Does Not Solve the Problems Created by Discrimination

In its comments, Time-Warner Cable argues that “recent history
confirms that network operators will be responsive to consumer demands,
including in particular when their business practices are perceived as
unreasonable,”[46] pointing to Verizon Wireless’ reversal of
its blocking of NARAL’s SMS messages as evidence that the market solves
these problems, and others have reiterated the claim that the market will
fix potential problems.[47] That example is inapposite at best and misleading
at worst. In the NARAL case, there was a politically powerful
victim[48] and
the background threat of a regulatory remedy.[49] Further, despite that single party
finding a public remedy, Verizon Wireless continues to publicly
discriminate against other parties who wish to engage in lawful
communications.[50]

Other carriers, like AT&T, explicitly forbid their users from using
lawful video services such as SlingBox[51] that might compete with their own offerings.
Carriers attempt to justify these restrictions as blunt instruments for
“network management” to prevent congestion caused by
bandwidth-intensive applications. But while they restrict some
applications, those same carriers continue to allow comparable services
on the network, as long as those services are from corporate partners
like Major League Baseball.[52] Further, on top of restricting competing
applications, these carriers impose bandwidth caps, which both address
concerns about “too much use” of the network more directly and
artificially reduce the demand for competing products.[53]

Time-Warner also completely ignores the events surrounding
Comcast’s throttling of BitTorrent traffic, where no amount of
public outcry or Commission attention could reverse the practice (or even force
Comcast to admit its conduct) until the Commission ordered
Comcast to respond and subsequently to cease.[54] For over one year, as Comcast
interfered with BitTorrent traffic, subscribers and innovators suffered.
This serves as proof positive that the non-competitive broadband market
and the high switching costs for broadband customers simply do not
constrain such behavior, which threatens the very innovation and
creativity that made the Internet the most important communications
medium of our time. Further, even in cases where – after extensive
litigation – ISPs worked with application providers to find
solutions to problems, these solutions have occurred at the application
layer without requiring ISP discrimination, pointing to the need for
standards-driven, application-side solutions rather than unfettered ISP
“management.”[55]

C. Nondiscrimination is Compatible With Edge-Based Security and Parental
Control Solutions

It is also important to distinguish between “no discrimination by
service providers” and “no discrimination by users and
applications.” We need not foreclose the possibility of
standards-based, user-requested prioritization of a
user’s own traffic,[56] which is supported by at least one Internet
Engineering Task Force Standard.[57] Nor should we read a nondiscrimination
requirement, as some suggest, to prevent ISPs and others from offering
services that allow users to “shield themselves or their children
from certain sites or from online security threats”[58] through activities at the
edge of the network. ISPs are welcome and encouraged both to participate
in the standards processes and to provide subscribers with access to
tools to help them choose which data they want or prefer – but not
to choose for the customer by applying filters at the network level.
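For illustration, the sketch below shows what user-requested prioritization at the edge might look like in practice. We assume, purely for purposes of the example, the Differentiated Services mechanism (an application marking its own packets with a code point); the constants and the choice of mechanism are our own illustrative assumptions rather than a description of the standard cited above.

```python
import socket

# DSCP "Expedited Forwarding" (46), shifted into the upper six bits of the
# IP type-of-service byte. The value and the use case are illustrative.
EF_TOS = 46 << 2

def open_user_prioritized_connection(host: str, port: int) -> socket.socket:
    """Open a TCP connection whose packets the user's own application has
    asked the network to treat as latency-sensitive.

    The request is made at the edge, by the user; whether and how networks
    honor it is a matter for open standards, not for an ISP silently
    choosing among applications on the user's behalf.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    if hasattr(socket, "IP_TOS"):  # not exposed on every platform
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_TOS)
    sock.connect((host, port))
    return sock

if __name__ == "__main__":
    conn = open_user_prioritized_connection("example.com", 80)
    conn.close()
```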

D. Nondiscrimination Protects and Encourages Innovation

Rather than “threaten[ing] to harm consumers by thwarting the
continued deployment of broadband networks,”[59] nondiscrimination principles
prevent service providers from choosing which data they prefer. It is
discrimination that suppresses innovation on the Internet, discourages
the adoption of new services, and restricts the very demand which makes
broadband providers deploy and improve their networks.

Verizon argues that “there is no reason to assume that alternative
platforms and environments that are more managed cannot also foster
innovation.”[60] Whether or not alternative platforms which compete
with the Internet might foster innovation is irrelevant to our national
broadband plan; to protect the innovation which the open Internet has
brought, our national broadband infrastructure must continue to be open
and attempts to close it must be resisted.[61] To this end, service providers must
not be allowed to leverage their control of access to the Internet to
competitively favor their own offerings. If “alternative platforms
and environments” are to compete with the Internet, they must
compete with the Internet as the democratic platform it is now –
not with an Internet selectively crippled to favor other offerings.

Most importantly, we must not allow service providers to interfere with
the functioning of the network itself, in an attempt to turn back the
clock to eliminate the Internet as a competitive service platform. A
service provider might attempt to do this by engaging in behavior that
creates winners and losers at the edge of the network.[62] By virtue of their
special placement in between users and application providers, ISPs are
uniquely positioned to engage in this sort of behavior. If “[s]ome
services – such as backing up data online – may require lots
of capacity, but be less time sensitive or less affected by latency or
jitter [while] [o]ther services – such as VoIP – may not
require much bandwidth, but may suffer if network conditions result in
latency,” we should continue to allow innovation to address these
problems at the application level and standards level – not to
allow service providers to make the Internet behave differently for users
of different providers and make it harder for application
providers to solve these challenges. And while “all bits” may
not be “created equal,”[63] users – not service providers – should
decide which bits are more important or time-sensitive.

If Verizon Wireless degraded your phone connection to Comcast’s customer
service center, you might be more likely to order Verizon FiOS broadband
services. If they degraded all your calls to the family and friends on
an unlimited calling plan, you might use fewer minutes, costing Verizon
less money. If they delayed calls to a business customer’s
overloaded call center–essentially making a value judgment that the
call center was too congested and calls needed to be managed by the
provider–they would actually hinder that center’s ability to
increase its capacity and provide better service to the customer. And
even if you somehow knew these things were happening – which is by
no means a given – depending on where you live and work, you might
have few or no options even if you were willing to accept the high
transaction cost of switching carriers. We do not allow this type of
discrimination on the phone network, and we should not allow it on the
Internet either.

E. The Problem of Open Networks and Nondiscrimination Has Not Been Solved
by the Market or the Commission’s Current Rules

Several parties wish to dial back the clock, arguing that the
Commission’s Internet Policy Statement does not or should
not apply to wireless broadband providers.[64] The assertion that this policy was
designed only for wired networks is without support beyond the fact that
it was originally released in a proceeding regarding wireline
providers.[65]
Nothing in the text of the statement indicates any relationship between
the type of network and the users’ rights; to the contrary, the
allowance for “reasonable network management” is deliberately
platform-neutral and flexible. Further, as none of the characteristics of
wireless networks mandate that those networks must be
“managed” by engaging in content- or type-based
discrimination, wireless networks should be equally subject to
nondiscrimination principles.

Nor are networks already open, as some parties suggest.[66] The examples provided
prove that the opposite is true. For instance, while Google’s
Android platform will “soon be available from
Verizon,”[67] phones built on that platform for Verizon will not
be offered on or transferable to Sprint, despite the fact that Sprint
uses the same network technologies. Further, while the iPhone does allow
third-party applications, those applications must be approved and the
phone itself is not available on any U.S. carrier but
AT&T.[68]
These are hardly the hallmarks of “open networks.”

In fact, the current state of the wireless industry demonstrates why the
open model currently enjoyed by the Internet world fosters far more
innovation than the closed, fragmented model that the wireless carriers
seek to preserve. Although there is undoubtedly some innovation
in the wireless world, it is hindered by exclusive contracts and
extensive permission-based approvals processes for devices and software.
Imagine a world in which a Comcast broadband customer (a) could only buy
an Apple computer, (b) couldn’t bring their Apple computer to
Time-Warner Cable, and (c) needed permission from Comcast in order to use
the web browser, chat program, or email program of their choice. Such a
fragmented Internet would never have produced the explosion of innovation
and wealth that the standardized Internet has, yet this is the wireless
market that many Americans are forced to navigate. The National Broadband
Plan is an opportunity to ensure that we continue to work toward the
goal of an open broadband ecosystem and all that that offers.

The Commission should not be distracted by red-herring arguments that
openness principles would require every device to run every application
and force devices like the Amazon Kindle off of shelves or that every
application provider would be forced into untenable
situations.[69]
Openness is about preventing broadband access providers from
restricting which devices attach to the network and which applications
can run on those devices, either directly or through alleged
“network management” techniques. It is not about controlling
how devices are made or telling application providers–who are users of
the Internet just like broadband subscribers–how to act. When the
Commission held that AT&T had to allow non-AT&T devices on the
phone network, AT&T was not then forced to design and offer every
device that its competitor did, nor were other companies who used the
phone network forced to behave any differently. AT&T simply was not
allowed to exclude its competitors or deny them access to the
network.[70] As
it did for the wireline phone market in the past, a cross-platform
“open access requirement would promote customer choice without
inhibiting market forces.”[71]

Far from “produc[ing] uncertainty and potentially undermin[ing]
future private investment in broadband deployment,”[72] non-discrimination rules
provide certainty about how an ISP may act, and more importantly provide
certainty that innovative new applications will receive a fair chance to
succeed in the market, driving investment in the application space that
has been the source of true economic growth in the Internet economy.

III. A National Broadband Plan Needs Meaningful ISP Privacy Regulations

Some commenters have asserted that it is in the consumers’ interest
for regulators to do nothing further in the realm of regulating privacy.
The U.S. Chamber of Commerce, for instance, has cited privacy regulation
as a potential cost to providers, which might raise prices for certain
products and services.[73] However, consumers, as citizens, have rights more
fundamental than low prices, and the need for regulation to protect
communications privacy has been recognized in America from the inception
of the United States.[74] The right to privacy in telecommunications has
evolved in various statutes and regulations through the centuries, but it
is either naive or disingenuous to claim that these require no updating
to reflect the technological and social changes of the past decades.
Existing laws may vary widely depending upon the outdated distinctions
between different types of infrastructure. The same inconsistency and
loophole-filled quality widely lamented in other areas of
communications law is present in privacy protections as well.

While best practices and principles for industry self-regulation have a
distinct role to play in the development of online privacy protections,
the incredibly diverse number and type of interested business entities
will make such processes toothless without effective enforcement
mechanisms from the relevant agencies. New market niches are constantly
developing, and within new niches, competition would be insufficient to
police bad actors. Even in more mature markets, such as last-mile
Internet service, a paucity of competitors means that competition for
privacy would have little effect on consumers voting with their wallets.

For example, it would be entirely unhelpful if all of the competitors in
an industry adopted the same policy protecting their interests at the
expense of the consumer. Unlike with applications such as Facebook, where
consumers can easily migrate to different platforms and a user revolt has
real teeth, consumers cannot shop for an ISP based on its privacy policy. This
is especially true where (a) consumers have little choice of provider to
start with, (b) consumers are generally informed of privacy policies
immediately before signing (after they have already invested significant
time and effort into making a choice), and (c) providers can change
privacy policy at will, requiring consumers to remain vigilant for
changes and overcome research costs and significant switching costs to
try to find another provider with better privacy terms—who may
change those terms at will.

A number of comments, mostly from major information service providers and
the groups that represent them, have opined that “notice and
choice” regimes are sufficient to protect privacy. Citing the
increasing sophistication of their customers, these commenters maintain
that, given adequate notice and choice, consumers will opt for a
sensible, customized balance of privacy and disclosure.

For instance, Time Warner cites the fact that consumers expect providers
to post privacy policies as a sign of sophistication.[75] However, the more
relevant question of how well consumers understand the language of
those policies is not addressed in its comments.[76] Researchers have found that large
percentages of consumers believe that the mere existence of a privacy
policy prevents such commonplace activities as affiliate sharing and
online behavioral analysis. Such results clearly show that the
marketplace and existing regulations have failed to provide consumers
with the tools necessary for them to make informed decisions.

Mere transparency is not enough for informed decisions. An incredible
amount of information flows over broadband networks. This information can
be categorized into varying levels of privacy sensitivity with nearly
infinite granularity. The number of entities interested in varying layers
of this information is vast and continually growing. Add to this the lack
of standard interfaces or methods for managing preferences, and even a
hypothetical, ideally knowledgeable consumer would have to invest a great
deal of time and energy to exercise his right to privacy.

Within its section on privacy, Verizon comments on the need for
regulation to “apply to all businesses and all
technologies.”[77] Certainly, a difference in the devices or the type
of cabling that carries information should not affect the principles
protecting users’ legal rights. However, Verizon’s examples of two
different technologies—cookies and deep packet inspection—may
not be the best illustration for technological neutrality.

Communications privacy is at its core about the flow of information. If,
regardless of the technology, the information is processed, stored, and
disclosed in the same way, there should be no difference in its
regulatory treatment. However, to the extent that different technologies
handle information differently, different rules are necessary.

Verizon claims, for instance, that the difference between gathering data
via cookies and via deep packet inspection is not so wide as it might at
first appear. In both cases, a large amount of data regarding browsing
preferences and online behavior can be compiled. However, Verizon seems
to assume that DPI, in this case, is being used solely for the
advertising purposes claimed by its proponents to date.

One of several major differences between cookie and DPI-based behavioral
analysis is the information that is accessible to the party collecting
consumer data.[78] While a legitimate advertiser may only have an
interest in the same sort of data available from a cookie-based
behavioral advertising network, that is cold comfort to the consumer
whose every packet is subject to interception by a third party.
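A schematic way to see the difference in scope, using illustrative data structures of our own invention rather than any vendor's actual schema: the cookie-based observer is confined to sites participating in its network, while a device inspecting packets in the access network sits in the path of everything the subscriber sends or receives.

```python
from dataclasses import dataclass

@dataclass
class CookieObservation:
    """What a cookie-based advertising network can record: a visit to a
    site that participates in that network, tied to a browser identifier."""
    browser_cookie_id: str
    participating_site: str
    page: str

@dataclass
class PacketInspectionObservation:
    """What a device inspecting traffic inside the access network can record:
    any flow the subscriber sends or receives, to any destination, potentially
    including the content itself rather than just the address."""
    subscriber_line: str
    destination: str
    protocol: str
    payload_excerpt: bytes

if __name__ == "__main__":
    print(CookieObservation("abc123", "news.example", "/front-page"))
    print(PacketInspectionObservation("line-42", "mail.example:993", "IMAPS", b"..."))
```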

Verizon says, and we agree, that it is not technologies themselves but
the uses to which they are put that should be of concern to regulators.
However, this does not require that regulators turn a blind eye to the
practical differences between technologies, or their differing potentials
for abuse.

IV. The National Broadband Plan Should Reverse the Commission’s Policy of
Deregulation

The first round of comments in this proceeding made one point abundantly
clear: The broadband market is broken and needs fixing. This repair work
begins with the recognition that the policy of vigorous deregulation set
forth most explicitly in the 2005 Wireline Framework has failed.[79] The Commission has
authority to reexamine the classification of residential broadband access
and reclassify it as a Title II “telecommunications” service,
or to impose sufficient structural separation requirements,
interconnection requirements and other safeguards as it deems necessary
to remedy the current moribund state of competition. But unless the
National Broadband Plan recognizes the current dysfunctional state of the
market and attacks the problem at its root, the effort to develop a
universally accessible, affordable national broadband infrastructure will
remain equally dysfunctional and ineffective.

A. The Comments Clearly Document A Broken Broadband Market That Fails To
Provide For Our National Broadband Needs

In numerous comments, parties outside of the orbit of the incumbent
carriers told story after story about a situation that favors incumbents
but leaves out businesses, consumers and competitors, with the attendant
ills that are brought on by a market failure and a lack of competition.
Comptel summarized the matter as succinctly as possible:

The short answer is that Congress would not have needed to allocate 7.2
billion dollars of taxpayer money to increase broadband deployment and
affordability if existing mechanisms had been effective and efficient at
ensuring broadband access for all Americans.[80]

The lack of affordable, high-speed access that arises from the current
concentrated market structure acts as a restraint and a “duopoly
tax” on businesses well beyond the traditional telecommunications
and technology sectors. For example, small business is often cited as the
mainstay of the U.S. economy. The inability of many small businesses to
get broadband access at affordable prices and useful speeds therefore
restrains the broader economy and depresses its ability to create jobs
and expand business opportunities. The Broadband Institute of California
summarized this predicament clearly:

Access to broadband at appropriate upload speeds is essential to small
urban and rural businesses. However, the lack of significant competition
in the market for broadband services provided to small businesses has
resulted in higher prices and lower speeds for small businesses. As a
result, either due to a lack of affordability or availability most small
businesses do not subscribe to the types of broadband technologies (T-1
lines) best suited to their needs.[81]

The New Jersey Division of Rate Counsel, speaking on behalf of ratepayers
of that state and, by extension, of consumers across the country,
accurately portrayed the kind of pseudo-competitive environment that
exists even in areas considered well-served, and the harm this imposes on
consumers:

In some locations, two suppliers are present, but this duopoly does
not equate to effective competition:
although it is preferable that
consumers can choose between two suppliers (as opposed to having a
single option), a duopoly does not represent effective
competition.[82]

Comptel placed a dollar value on this “duopoly tax” in terms
of direct impact to retail customers. In California, the adoption of
deregulatory policies similar to those adopted by the Commission resulted
in ever-increasing rates for local service that cost consumers more than
$100 million annually, while other charges have increased as
well.[83]

How did this appalling situation come about? As PK and others documented
in the initial comments, this environment is the inevitable consequence
of the Commission’s determination to abandon proven regulatory
means used by Congress and the Commission to promote competition in the
past, such as line sharing and the regulation of dominant providers.
Instead, the Commission pursued the untested and unsubstantiated theory
that requiring potential competitors to create redundant networks would
somehow spur “intermodal” competition superior to the already
existing “intramodal” competition brought about by structural
and behavioral regulation. While ideologically appealing, this method
proved an utter failure in addressing the realities of the costs of
network construction. The ability of incumbents to use their existing
assets and market power to thwart competitive entry and raise the cost to
competitors doomed this approach from the beginning. Only a reversal of
these policies can restore a competitive environment. As industry veteran
Fred Goldstein told the Commission in the first round of comments:

The Commission over the past eight years essentially gutted over two
decades of procompetitive, pro-consumer policies. It destroyed workable
elements of the regulatory structure and created instead an unworkable
mess.[84]

B. Open Access, Either by Reclassification or Other Means, is a Necessary
Pre-Requisite to Fixing the Existing Dysfunctional Dynamic

Many commenters agreed not only with PK’s diagnosis of market
failure, but also with PK’s prescription for a better future:
reclassification of broadband access as a Title II service and/or a
return to proven pro-competitive policies of structural separation. For
example, the Consumer Federation of America (CFA) and Consumers Union
(CU) argued:

The FCC must change course, if it is to advance the nation toward
universal broadband service by adopting the following principles and
specific measures.

The FCC must get back to basics and define broadband as Title II service
eligible for universal service support as the means to ensure that all
people of the United States have adequate facilities at charges that are
just, reasonable, affordable and nondiscriminatory. The Commission
should adopt an experiential approach to defining broadband, with any
technology capable of supporting the range of activities in which
broadband users engage being eligible for support with universal service
funds.[85]

The National Association of Telecommunications Officers and Advisers
(NATOA) likewise observed that open network policies would provide
numerous consumer benefits through increased competition and increased
efficiency overall.[86] PK echoes NATOA’s arguments that open
network regulation would dramatically improve the market to the advantage
of the public in the following ways:

  • Multiple service providers competing head to head over a common
    platform is a more efficient use of resources and will fuel innovation
    in broadband services, which will accelerate economic growth and
    benefit local communities.

  • Open access can enable network neutrality through the benefits of
    competition and consumer choice, without requiring complex regulatory
    oversight of neutrality compliance.

  • Open access negates the inherent monopoly nature of next generation
    fiber networks.

  • In open networks, new service providers will market the network.

  • Open access will maximize utilization of network capacity, allowing
    full realization of the incredible potential of technologies such as
    fiber optics for providing tremendous amounts of
    bandwidth.[87]

The Government of Japan provided a stark, if embarrassing, example of
what the proper regulatory structure can accomplish in contrast to our
supposedly “vibrantly competitive free market.” In 1999 and
2000, the government made certain that copper-based local loop unbundling,
line-sharing, and collocation rules were established, and followed
that by unbundling the fiber network as well.[88] Earlier this year, Japan made certain
that even that nation’s most advanced networks would be unbundled.
Similar policies are also in place for wireless networks in Japan. The
result is that consumers pay a mere six cents per hundred kilobits per
second.[89]
Further, despite the prediction by proponents of deregulation that such a
regulatory framework would discourage investment in capacity and slow
adoption, speeds for service have increased[90] and market penetration has as
well.[91]

Japan is not alone in taking action to improve its broadband through
sensible, pro-competitive regulation. Viviane Reding, the European
Union’s commissioner for information society and media, recently
announced a new regulatory proposal aimed at building on the unbundling
rules she helped to institute in the EU. In a June 25 speech entitled,
“Towards a European Strategy of High Speed Broadband for All: How
to Reward the Risk of Investment into Fibre in a Competitive
Environment,”[92] Reding said of the new proposed rule (known there
as a Recommendation):

From the one side, I have heard the criticism that the Recommendation
does not recommend a generalized roll-back or even a dismantlement of
ex ante regulation.

Firms, it is said, need to be given regulatory holidays – by means
of the law, by means of overly broadly defined markets, by means of new
markets or by means of a dogmatic preference for passive over active
remedies – otherwise they will simply not invest.

We all know that this is a criticism which simply is not going to fly. I
have spent the last years fighting for effective competition in telecoms
markets. I am not going to turn my back on our policy of liberalization
and pro-competitive ex ante regulation. After all, this policy
has led to successful and often deep market entry in the past, and it
has contributed to wide usage and take-up of services. The last thing we
need is new monopolies, and the poverty and artificial scarcity of
services that would inevitably go with it.

Other nations have learned the lessons that we have forgotten, to their
benefit and to our detriment. Now is the time, as the FCC considers a
national broadband plan, to take a focused, fact-based, experience-based,
clear-eyed look at what works and what doesn’t. The dismal experience in
our own country with deregulation, contrasted with the experiences of our
global competitors under a regulatory regime better calibrated to ensure
affordable high-speed broadband access, should provide us with both a
clear path forward and a warning, if we should fail to act while other
nations move ahead.

V. Meaningful Universal Access Requires Affirmative Government Action At
Every Level

As is recognized by many commenters, a successful National Broadband Plan
will fully engage government at every level and in a variety of roles.
Understandably, commenters in the NBP focus on the federal role and how
the federal government can make meaningful broadband access available to
all citizens of the United States in a timely manner.[93] In particular, a broad
consensus emerged around the prospect of universal service reform, with
well over 50 sets of comments supporting reforming the existing Universal
Service Fund (USF) for broadband – a position PK also supported in
its initial comments.

PK fully supports adopting mechanisms that would shift existing USF funds
from supporting legacy services to providing support for broadband
infrastructure, network operation, and digital inclusion programs. As is
discussed below, the language of Section 254 provides the Commission with
considerable flexibility to shape the USF to embrace these objectives.
When read in concert with other provisions of the Act, it is clear that
the Commission has the necessary statutory authority to pursue USF reform
along the recommended lines, even absent new authorization from Congress.

In addition to this positive role of supporting broadband through USF,
several commenters have urged the Commission to preempt state authority
in a number of areas.[94] The unfortunate impression gleaned from these
comments, and from previous FCC action preempting local franchising, is
that local governments are invariably a hindrance to broadband
deployment.

While taking no position on the specific proceedings cited by various
parties, PK therefore emphasizes in these replies that a successful
National Broadband Policy must engage government at every
level – federal, state, and local. While some situations may require
federal standards and preemption, these should be regarded as the
exception, rather than the rule. As the Commission’s experience in
preempting local franchising authority has demonstrated, the preemption
of local authority can have unanticipated consequences, such as the
negative impact on Public Educational and Governmental Access Channels
(“PEG channels”).

As PK observed in its initial comments, the National Broadband Plan
should ensure that all levels of government play a positive role in
ensuring not only universal coverage but also effective outreach and
training sufficient to make access meaningful. Rather than view local
governments with suspicion, a successful national broadband plan will
include genuine efforts to engage local and state governments as partners
in the greatest infrastructure investment our country has undertaken
since the development of the federal highway system. While the Commission
should certainly move expeditiously to resolve outstanding proceedings on
preemption of local authority, it would be a mistake to regard preemption
as the sole focus of the National Broadband Plan and its relationship
with state and local government.

A. The Commission Has Authority To Act On USF Reform

The Commission has broad authority to create, manage and operate the USF.
Although the roots of the USF are older than the Communications Act
itself, the FCC created the antecedent to the existing USF in response to
the breakup of the AT&T monopoly.[95] Concerned that the considerable changes
occasioned by the creation of a competitive long distance market and the
replacement of a national monopoly with seven Regional Bell Operating
Companies (“RBOCs”) would make phone service unaffordable in high-cost
areas or for the poor, the Commission created a “Universal Service
Fund . . . to ensure that telephone r
