A constant challenge for those who seek to limit the size and scope of government is the common failure amongst the public to understand the difference between intentions and consequences. Although problematic for all aspects of life, this is a particularly pernicious problem in the realm of governance, where it is easy for the average American who sees little of what actually occurs in Washington DC to take an idealized view of the abilities of government and the motives of those who make up its workforce.
A prime example of this problem is the debate over net neutrality. Most agree that it would be bad for the internet if the internet service providers (ISPs) that connect users to it arbitrarily blocked or throttled access to certain sites. The internet has thrived as a bastion of freedom, and no one who appreciates the vast economic and social benefits derived from its emergence wants that to change. Yet due to their misunderstanding of both the architecture of the internet and the government’s interest in it, it is those who claim most loudly to want to save the internet who have put it in jeopardy.
By seeking to make the government arbiter of the net, agitators for regulation to enforce net neutrality would put responsibility for the net’s protection in the hands of those least capable of dealing with its complex and continuously evolving nature. To make matters worse, they would do so to fight off a largely imagined problem.
The primary stated concern of the pro-regulation camp is that ISPs will abuse their positions to block or slow particular content. In Esquire’s fever dreams, for instance, this includes slowing access to forums that might help those grieving for deceased loved ones, then soliciting account upgrades to restore speeds to normal. As hysterical as this imagined scenario is, it’s not hard to see the legitimate problems that could arise should ISPs liberally pick high-speed winners and losers. So to protect against this hypothetical problem, net neutrality proponents demand that all data be treated equally. What their demand ignores is that internet data has never been equal, and that we should not want it to be if the goal is to keep the internet innovative and robust.
Some data is simply more important than other data. Sometimes this is because the content is more important, such as a 9-1-1 call placed through a VoIP service versus a typical YouTube cat video. In other cases the difference stems from the user experience inherent to a particular medium. A few seconds of delay in loading a page of text, in other words, is less bothersome to the customer than a constantly buffering video stream.
Prioritization that recognizes varying degrees of data importance is not some future threat; it is current reality. ISPs already recognize that data importance varies for the reasons mentioned above and react accordingly. Netflix, despite seeking regulations to prohibit paid prioritization, is even now negotiating with ISPs to ensure its customers receive quality service. Yet we don’t see today the internet dystopia net neutrality agitators suggest is inevitable when data is treated unequally. Rather, prioritization has been a positive for consumers, who benefit through access to more dependable services, and it could be a bigger boon still with a more certain regulatory environment.
Generally, it wouldn’t much matter that consumers don’t entirely understand how the internet works. However, fear-mongering by politicians and special interests turns public ignorance into fertile ground for expanding government power and advancing favored business agendas.
For instance, content providers like YouTube and Netflix make a lot of money through bandwidth-heavy video distribution services. At peak hours these two companies alone can account for about half of all traffic. Unsurprisingly, they don’t want to pay for an equivalent share of bandwidth costs, preferring instead to continue the business model that has made them successful – namely, forcing ISPs to bear part of their operating costs, which get passed on to all internet consumers regardless of whether they use YouTube or Netflix.
And like all businesses, they want legal protections to lock in the current system and shield them from disruptive market changes. So they push for so-called net neutrality regulations that would enable them to continue freeloading on existing infrastructure without being asked to pony up for the costs of maintenance and improvements needed to handle the heavy strain their services place upon it. But that’s a case where they should be careful what they wish for, as law moves much slower than markets. The same law or regulation that might offer protection today will be their chains tomorrow, forcing implementation of a system long after it has been rendered obsolete. That’s just the nature of government, and why market planning always fails.
Allied with these special business interests are power-hungry politicians and bureaucrats who never miss an opportunity to expand their personal authority. President Obama is among them, and is urging the FCC to regulate the internet as a utility by invoking Title II of the Communications Act of 1934. That’s right, the people who claim to want to save the internet think it should be governed under legislation written before it existed and while television was still in its infancy. Even generously counting the update to the law that took place 18 years ago – though the Title II authority comes from the original legislation – it still predates modern high-speed broadband technology.
The Telecommunications Act of 1996 that was used to update the Communications Act demonstrates perfectly the slow pace of government vis-à-vis markets, as it is already severely outdated. Its authors sought to foster competition within distinct markets, or between companies providing the same service using the same underlying technology. What happened in reality is that various technologies advanced to the point that multiple solutions were available to solve the same problem – VoIP could compete with wired and wireless telephone companies, and online video with cable television. As a result, many companies presently competing to offer the same services are operating under different regulatory regimes, which distorts the market and discourages innovation.
Under Title II, a small, unelected bureaucracy would hold vast power over the internet. And an internet where the FCC is free to effectively set rates and control infrastructure would look much different than the expansive, innovative sphere to which we’ve grown accustomed.
If such a regime had been in place over recent decades, we simply would not enjoy many of the conveniences that we do today. ISPs would be forced to beg political appointees unlikely to possess much technical knowledge before introducing new products and services. That’s a recipe for stagnation. This is the same government, after all, which managed to spend billions just on a website – and couldn’t even get it to work on time.
Much of the fervor for regulating ISPs is understandably driven by widespread perceptions that existing providers are generally awful. This is not without merit, as Comcast is often considered one of the worst companies in America. But this is in large part due to the many existing rules and regulations that work to suppress competition in the industries in which it operates. The solution then is not to tighten control and erect yet more barriers, but to remove those limiting competition.
It is a mostly laissez-faire approach that has given the internet room to grow and thrive, and enabled its culture of radical freedom. A socialist-style forced equality to promote net neutrality would kill the innovative spirit of the internet and strangle it into mediocrity.
Brian Garst is a political scientist, commentator, and advocate for free markets and individual liberty. He also blogs at BrianGarst.com and you can find him on Twitter @BrianGarst.