Once again there has been a lot of press discussion about network neutrality, with FCC Chair Genachowski's announcement that he will put a proposal, "Preserving a Free and Open Internet," to his fellow commissioners, and EC Commissioner Neelie Kroes's statements on the same issue in Europe. The recent debacle of Comcast charging Level 3 a peering fee to deliver Netflix traffic, together with Comcast's intent to purchase NBC (see below), reinforces the need to protect and enshrine an Open Internet, given the huge concentration of market power in the cable/telco duopoly and the lack of competition in the Internet telecom marketplace. This is not only true for the US, but also for other countries like Canada, where restrictions on foreign ownership have created a similar vortex of market concentration between media and telecom/cable companies.
The highly respected Internet pioneer David Reed, I think, summed up the issue quite well in a recent blog post (http://www.reed.com/blog-dpr/?p=64): the Internet is a "separate" thing. It is often confused with and equated with broadband, but the two are not the same. The Internet is essentially an agreed-upon set of protocols for the sharing and transmission of data over virtually any medium, while broadband is one of many possible infrastructures for delivering the Internet to users.
To my mind, treating the Internet as a separate "thing", distinct in particular from broadband, is an important concept, and it is why regulation of an Open Internet needs to be handled differently from regulation of broadband. This is not the first time that regulators and policy makers have recognized a new technology as requiring special treatment. As I have blogged before (http://billstarnaud.blogspot.com/2010/11/how-will-we-know-when-internet-is-dead.html), cable TV in its early days was also given special regulatory treatment in Canada and the US, in recognition that it was a separate thing and not just another telecom service. The outcome of regulating cable TV as a separate thing was the creation of a strong and vibrant cable TV industry in North America. Countries that allowed the telcos to compete with the cable companies in those early years, such as Australia, largely killed off this important industry sector.
I fear the same thing will happen to the Open Internet today as happened to cable TV in Australia in the 1970s. The cablecos and telcos will continue to push for ways to control and modify the Internet, especially in the wireless domain. They will morph it into many different variants of "Internet-like" architectures and "special" services, and essentially kill the Open Internet as we know it today. This is why I also agree with David Reed that making a special exemption from network neutrality for wireless broadband is a bad idea. The case for treating wireless Internet differently rests on the misguided assumption that wireless is a narrow, single-channel, low-bandwidth service. In reality there is an incredible wave of innovation occurring in the wireless market, with multi-channel cognitive radio integrating WiFi, WhiteFi, mesh radio and so on, that will provide more than sufficient bandwidth to treat the Internet over wireless the same as the Internet over wires.
However, I am not as hopeful as David Reed about regulatory protection for an Open Internet. The incumbents will emasculate any regulation through the courts or by lobbying their political friends. A case in point is Canada, where the regulator has imposed some of the most stringent open-access requirements anywhere on both the cable and telephone industries, and yet for the most part these requirements have been thwarted by the incumbents gaming the system.
From the lessons we have learned about cable television, I believe that if we want a truly Open Internet we need to deploy an infrastructure that is independent of the telcos and cablecos. Fortunately, we have most of the important components of such an infrastructure in place, thanks to the deployment of R&E networks nationally and regionally. With the added capabilities of the many community networks funded by BTOP, such as UCAN, it is well within the realm of possibility to deploy both a wired and a wireless National Public Internet (NPI) committed to the principles of an Open Internet. I am not advocating that we replace the telcos and cablecos and their "Internet-like" services. But much as PBS and NPR provide an alternate voice to the mainstream broadcasters, an NPI could ensure that there remains an independent and open Internet, with all the benefits that entails in terms of innovation, freedom of speech and freedom of assembly.
I also believe that deployment of an NPI is critical for the future of R&E networks, as the mandate of many R&E networks is to provide services to researchers and educators that are not available on the commercial Internet. The Internet would never have been created by the telcos/cablecos in the first place, as its basic principles of openness and intelligence at the edge are fundamentally counter to those of large bureaucratic monopolies. New advances in wireless 5G networking, the green Internet, cloud computing and next-generation optical technologies are in many ways an even greater threat to the incumbents. The next wave of Internet innovation, especially in the wireless domain, is likely to have even more profound consequences than what we saw over the last two decades.
To my mind an NPI is more than deploying a network; it should also be about providing services such as transit exchanges, peering routes and free Internet at all public institutions. New technologies such as 100G and 1000G wavelengths will ensure that there is plenty of bandwidth on the backbones, and technologies like distributed federated forwarding tables will allow deployment of low-cost (and hopefully zero-carbon) routing using hundreds, if not thousands, of ordinary PCs, much in the same way Google revolutionized data center computing.
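To give a flavour of the federated-forwarding idea, here is a minimal Python sketch of a routing table partitioned across a handful of commodity nodes, with each node doing longest-prefix match on its own shard. The node names, routes and the first-octet partitioning scheme are all illustrative assumptions for the sketch, not a description of any real design:

```python
import ipaddress

NUM_NODES = 4  # stand-in for "hundreds of ordinary PCs"

# Illustrative routing entries: (prefix, next hop). Hypothetical values.
ROUTES = [
    ("10.0.0.0/8", "nodeA"),
    ("10.1.0.0/16", "nodeB"),
    ("192.168.0.0/16", "nodeC"),
    ("0.0.0.0/0", "default-gw"),
]

def shard_for(ip_or_prefix: str) -> int:
    """Assign an address or prefix to a node by its first octet
    (a toy partitioning scheme; works for both forms)."""
    return int(ip_or_prefix.split(".")[0]) % NUM_NODES

# Build each node's local shard of the federated forwarding table.
shards = {n: [] for n in range(NUM_NODES)}
for prefix, nexthop in ROUTES:
    shards[shard_for(prefix)].append((ipaddress.ip_network(prefix), nexthop))

def lookup(dst: str) -> str:
    """Query the shard responsible for the destination, plus the shard
    holding the default route; return the longest matching prefix's hop."""
    addr = ipaddress.ip_address(dst)
    candidates = shards[shard_for(dst)] + shards[shard_for("0.0.0.0")]
    matches = [(net, hop) for net, hop in candidates if addr in net]
    return max(matches, key=lambda m: m[0].prefixlen)[1]

print(lookup("10.1.2.3"))    # nodeB (10.1.0.0/16 is the longest match)
print(lookup("172.16.0.1"))  # default-gw (only the default route matches)
```

A real system would of course need coordinated updates across shards and a cleverer partitioning than hashing on the first octet, but the basic division of one big forwarding table across many cheap machines is the point.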
Initially an NPI may not be accessible to all users, because of the current duopoly in broadband access. But the growing number of community fiber networks brings the ability to deliver next-generation 5G wireless using hubs at schools and libraries. By obtaining its own IMSI codes, as advocated by Rudolph van der Berg (http://internetthought.blogspot.com/), and using open source GSM base stations, coverage of most citizens and machines (e.g. sensor networks, grids, etc.) should be within the realm of possibility.
An early example of what an NPI may look like is the R&E network in Alberta, Canada: Cybera (www.cybera.ca). Cybera has installed a Transit Exchange, which allows it to aggregate members' commercial Internet traffic and pass it directly to an Internet Service Provider (ISP) of their choice. This group-buying arrangement secures Cybera members the kind of low-cost Internet rates usually reserved for large corporations. Cybera has also set up initial peering connections with the Toronto Internet Exchange (TorIX) and the Seattle Internet Exchange (SIX), where users can take advantage of direct connections and avoid the inevitable queuing for bandwidth that takes place during peak use periods on the regular commercial Internet. These services are available not only to the academic community but also to small businesses and communities connected through the Alberta province-wide broadband network, SuperNet. Cybera has been quite clever: rather than trying to establish its own peering connections at TorIX and SIX, it shares peering routes with other R&E networks. This is something I have been advocating for some time among all international R&E and community networks; such an arrangement could reduce Internet costs for users by as much as 90%. The advent of 100G and soon 1000G waves will obviate any concern about bandwidth congestion.
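The scale of savings available from shared peering can be illustrated with a simple back-of-envelope calculation. Every price and traffic fraction below is an assumption invented for the sketch, not a figure from Cybera or any real exchange:

```python
# Toy cost model: buying all bandwidth as retail transit versus sending
# most traffic over shared peering plus group-purchased bulk transit.
# All numbers are illustrative assumptions, not real pricing data.

retail_transit = 10.0   # assumed retail transit price, $/Mbps/month
group_transit = 5.0     # assumed bulk rate from group buying, $/Mbps/month
peering_cost = 0.5      # assumed amortized cost of shared peering ports
peer_fraction = 0.95    # assumed share of traffic reachable via peering

traffic_mbps = 1000     # a member network's total traffic

# Baseline: a small network buying everything as retail transit.
baseline = traffic_mbps * retail_transit

# With shared peering routes: most traffic over peering, the remainder
# over transit purchased at the aggregated bulk rate.
shared = (traffic_mbps * peer_fraction * peering_cost
          + traffic_mbps * (1 - peer_fraction) * group_transit)

savings = 1 - shared / baseline
print(f"monthly cost: ${shared:,.0f} vs ${baseline:,.0f}")
print(f"savings: {savings:.0%}")
```

Under these invented numbers the savings work out to roughly 93%, the same order as the "up to 90%" figure; the real outcome depends entirely on actual peering reach, port costs and transit pricing.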
In conclusion, while we should continue to press on the regulatory front for an Open Internet, if nothing else to prevent egregious harm to the Internet and society by the incumbents, I think ultimately the only way we will protect and ensure an Open Internet is to deploy the technology ourselves. We have the tools. We have the means.
For more information:
A personal perspective on the evolving Internet and Research and Education Networks
By choosing to make a stand on traffic ratios with peers, Comcast is fighting directly against over-the-top video, pure and simple. This way they can state to regulators that they will not interfere with over-the-top traffic, as they did recently during the NBC merger oversight, even while trying to create a world where they *always* get paid on both ends for that same traffic. And while they must compete in a duopoly for the consumer end, they can set whatever price they like for transit, because there is no way to bypass the consumer connections they hold at any one time. Quite elegant, actually. Why build a new toll booth when you can just re-purpose the one you have and close down all the other gates?
There has long been a dispute about whether traffic ratios are a real and important criterion in peering, or simply a conveniently labeled bargaining chip in a game of power. But we're going to see that debate move beyond the traditional crowd of IP nerds now, I think.
Bill St. Arnaud
- Bill St. Arnaud is a consultant and research engineer who works with clients around the world on a variety of subjects, such as next-generation Internet networks and practical solutions to reduce CO2 emissions, including free broadband and dynamic charging of eVehicles. He is the author of many papers and articles on these topics and a frequent guest speaker. For more details on his research interests see https://www.researchgate.net/profile/Bill_Arnaud