No Neutral Ground: The Problem of Net Neutrality
By Dr. Brian Dellinger
On Nov. 21, the Federal Communications Commission announced plans to revisit its Obama-era Internet regulations. It seems likely that the resulting vote will repeal the policies often referred to as net neutrality. The name is, perhaps, misleading; to support net neutrality is to support placing the Internet more fully under government supervision. The related political debate often divides traditional allies with arguments for free expression pitted against defenses of small government.
To understand net neutrality, one must see its position in technical history. Traditionally, Internet service providers (ISPs), such as Comcast and Verizon, have guaranteed their customers a certain quantity of bandwidth — that is, a certain amount of data per unit of time. It was assumed that even a voracious user would rarely use his maximum bandwidth, and services were priced under this assumption. In practice, ISPs also allowed customers to access whatever websites they wished. While there was no legal protection for this behavior, technical complexities made discrimination by website infeasible. The result was a largely open web: anyone with a blog could potentially reach millions.
In the early 2000s, the situation changed. Technological innovations enabled providers to determine which site a user visited. In principle, an ISP could now sell “packages” of websites, in a fashion resembling cable television: “basic Internet” for news and Facebook, say, or “premium Internet” for those who wanted more. These years also saw the rising popularity of streaming video services like Netflix and YouTube. Users now binge-watched videos, consuming their maximum available bandwidth for hours at a stretch. Such trends increased costs for the ISPs, leading them to investigate new responses: restricted access to high-usage sites, artificially slow downloads, and so on.
Net neutrality stands in opposition to these changes. Broadly, under net neutrality, the government requires ISPs to treat all web traffic in the same way: no limiting access, no reducing speed. Since 2005, the FCC has several times established net neutrality regulations; each time, the courts struck down the rules on the grounds that the FCC lacked the authority to regulate ISPs. In response, in 2015 the FCC reclassified broadband Internet as a telecommunications service, placing it under FCC jurisdiction, and promptly passed net neutrality rules. With the political shift of the 2016 elections, new FCC Chairman Ajit Pai began rolling back these regulations — hence the upcoming vote.
Both sides of the debate have merit. Concerns that ISPs might slow targeted websites are not idle speculation. Comcast did precisely that to Netflix in 2014. Indeed, Comcast and others have done little to engender public trust in their behavior. Comcast had pledged for years not to “prioritize Internet traffic or create paid fast lanes.” That pledge disappeared from its website less than a day after Pai announced policy changes.
It is also true that the meritocratic nature of the Internet — its enabling of anyone to win a following through quality work — has been one of its most notable virtues. A world of “basic Internet,” in which new entrants might be simply unreachable, would reduce its value as a platform for new ideas.
Despite these fair concerns, arguments against the FCC rollback seem insufficient. It is difficult to deny that price incentives have drastically shifted over the last decade. If streaming video is generating much of the ISPs’ expenses, it makes intuitive sense that providers might demand Netflix share those costs or might price service by total consumption rather than maximum bandwidth. Nor are the corporations supporting net neutrality any more trustworthy than the ISPs. Setting Netflix aside, supporters such as Google and Facebook seek to block ISPs from trading in users’ private information — a trade on which these companies themselves depend. For them, net neutrality eliminates the competition.
Other objections rely too heavily on speculation. While a “fast lane” Internet would be a marked shift, the brief history of the Web is one of constant change. Indeed, the rise of mobile browsing, which often limits the user to app-specific websites and now constitutes a majority of all web usage, may produce a greater alteration than any that net neutrality would prevent.
Further, the Internet is historically the result of market activity rather than top-down regulations. If one approves of its remarkable evolution to this point, it seems peculiar to assert that this is the moment to freeze it through government action. Given how few accurately predicted that evolution, it seems hubristic to assert how it will change next. Perhaps, as the ISPs argue, the increased revenue from a non-neutral Internet would enable the expansion of broadband networks, ending regional monopolies of service providers. Such a change might ultimately produce a faster, more accessible Internet — or it might not. But the experiment seems worth the risk.
Finally, whatever one’s feelings on net neutrality, the 2015 rules should be seen for what they are: a staggering expansion of bureaucratic power by decree of the bureaucracy itself. The result is an ugly patchwork of overlapping authority between the FCC and the Federal Trade Commission, with ISPs disfavored over similar services. This reclassification can never be a stable solution; it will always be vulnerable to precisely the kind of unilateral repeal currently occurring. If the public supports net neutrality, then let it be defended through the proper channel: by laws, not bureaucratic fiat.
Dr. Brian Dellinger is an assistant professor of computer science at Grove City College. His research interests are artificial intelligence and models of consciousness.