Monday, April 27, 2009

NAB: FTTH provider’s customers bury their own fiber

LAS VEGAS -- A Norwegian triple-play provider has a unique solution to the pesky problem of digging up consumers' yards to bury fiber-to-the-home. Lyse Tele, an overbuilder that launched its fiber-based all-IP solution in 2002, installs the fiber right to the edge of a customer's lawn, then gives the customer instructions on how to bury their own fiber cable to the house.


Thursday, April 23, 2009

Why sites like Pirate Bay will continue to be popular

[Despite the efforts of the Swedish courts (where the judge in the Pirate Bay case has now been found to have a serious conflict of interest), sites like Pirate Bay will continue to pop up across the net because they meet consumers’ need for easy and convenient access to movies and songs. Michael Geist’s excellent blog documents why the content industry is trapped in a business model that prefers to punish its customers rather than provide them with a service they clearly want. So-called “piracy” could easily be eliminated if the industry listened more to its customers’ actions than to its lawyers’ edicts. The other major challenge in this area is geo-blocking, where content availability is limited to certain domestic markets. Some people feel this is just another example of undermining network neutrality, under which all sites on the Internet should be accessible to all users equally. Here are some excerpts from great articles pointed to in Michael Geist’s blog – BSA]

Michael Geist Blog
Big Entertainment Wants to Party Like It's 1996

Written by Cory Doctorow
The entertainment industry wants to retreat to the comfort of 1996. […]And most importantly, the laws regulating copyright and technology were almost entirely designed by the entertainment industry. They could write anydamnfoolthing and get it passed in Congress, by the UN, in the EU.
In 2009, the world is populated by people who no longer believe that "Thou shalt sell media on plastic discs forever" came down off the mountain on two stone tablets. It's populated by people who find the spectacle of companies suing their own customers by the thousands indefensible. It's populated by activists who've figured out that the Internet is worth saving and that the entertainment industry is prepared to destroy it.
And the entertainment industry hasn't figured that out, and that's why they're doomed.

Why Hollywood is so slow to catch up on offering all of its movies and shows online
I would gladly pay a hefty monthly fee for [movie download] service—if someone would take my money. In reality, I pay nothing because no company sells such a plan. Instead I've been getting my programming from the friendly BitTorrent peer-to-peer network. Pirates aren't popular these days, but let's give them this—they know how to put together a killer on-demand entertainment system.
I sometimes feel bad about my plundering ways. Like many scofflaws, though, I blame the system. I wouldn't have to steal if Hollywood would only give me a decent online movie-streaming service. In my dreams, here's what it would look like: a site that offers a huge selection—50,000 or more titles to choose from, with lots of Hollywood new releases, indies, and a smorgasbord of old films and TV shows. …Don't gum it up with restrictions, like a requirement that I watch a certain movie within a specified time after choosing it. The only reasonable limit might be to force me to stream the movies so that I won't be able to save the flicks to my computer. Beyond that, charge me a monthly fee and let me watch whatever I want, whenever I want, as often as I want.
The current offerings are nowhere close to this dream service. [..]So why won't anyone in Hollywood build my service? The reason isn't stupidity. When I called people in the industry this week, I found that many in the movie business understand that online distribution is the future of media. But everything in Hollywood is governed by a byzantine set of contractual relationships between many different kinds of companies—studios, distributors, cable channels, telecom companies, and others.
A movie will stay in the pay-per-view market for just a few months; after that, it goes to the premium channels, which get a 15- to 18-month exclusive window in which to show the film. That's why you can't get older titles through Apple's rental plan—once a movie goes to HBO, Apple loses the right to rent it. (Apple has a much wider range of titles available for sale at $15 each; for-sale movies fall under completely different contracts with studios.) Between them, Starz and HBO have contracts to broadcast about 80 percent of major-studio movies made in America today. Their rights extend for seven years or more. After a movie is broadcast on Starz, it makes a tour of ad-supported networks (like USA, TNT, or one of the big-three broadcast networks) and then goes back to Starz for a second run. Only after that—about a decade after the movie came out in theaters—does it enter its "library" phase, the period when companies like Netflix are allowed to license it for streaming. For most Hollywood releases, then, Netflix essentially gets last dibs on a movie, which explains why many of its films are so stale.
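The windowing sequence described above can be sketched as a simple timeline model. The window lengths below are rough figures taken from the article; the theatrical and ad-supported lengths, and the helper name `window_at`, are my own assumptions for illustration, not actual contract terms:

```python
# Rough model of the Hollywood release-window sequence described above.
# Lengths are in months; theatrical and ad-supported figures are assumed.
WINDOWS = [
    ("theatrical", 4),              # assumed
    ("pay-per-view", 4),            # "just a few months"
    ("premium (HBO/Starz)", 18),    # 15- to 18-month exclusive window
    ("ad-supported networks", 24),  # assumed length of the network tour
    ("premium second run", 18),     # assumed
]

def window_at(months_since_release: int) -> str:
    """Return the distribution window a title sits in; once every
    window has passed, it reaches the 'library' phase where services
    like Netflix are finally allowed to license it for streaming."""
    elapsed = 0
    for name, length in WINDOWS:
        elapsed += length
        if months_since_release < elapsed:
            return name
    return "library"
```

With these illustrative lengths a title is tied up for years before the library phase; with realistic contract terms the article puts the full cycle at about a decade, which is why Netflix "gets last dibs."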
Couldn't the studios just sign new deals that would give them the right to build an online service? Well, maybe—but their current deals are worth billions, and a new plan would mean sacrificing certain profits for an uncertain future. Understandably, many are unwilling to take that leap.
Just like in the music business, eventually the entire home-video market is sure to move online, and many consumers will abandon pirate sites in favor of easy-to-use legal services. The music industry lost a lot of money when it dithered over this transition, and now the movie business seems to be making the same mistake. It could be raking in a lot of cash by selling us easy online rentals. Until it works out a plan to do so, there's always BitTorrent.

Monday, April 20, 2009

Building a better Internet - the Pouzin Society

[Here is a group worth following – especially some of their white papers with contributions from luminaries such as John Day, David Meyer, Mike O’Dell – BSA]

The Pouzin Society, named after Louis Pouzin, the inventor of datagrams and connectionless networking, announces its initial organizing meeting. The society’s purpose is to provide a forum for developing viable solutions to the current Internet architecture crisis.

About 15 years ago, it became clear that IPv4 was reaching its limits, and the IETF responded by creating IPv6. In 2006 came the tacit admission that there continue to be fundamental scaling problems in the Internet routing architecture which would only be exacerbated by IPv6, and that Moore's Law could not save us this time. Several solutions were proposed, all based on revising IPv6 addressing using the concept of a locator/identifier split. Work has proceeded diligently, but a few months ago it became clear that not only was this approach fatally flawed, but by implication, so was IP, or any variation of it. Academic efforts, beginning with NewArch and continuing with FIND and GENI, are no closer to finding a solution than we were a decade ago.

In the meantime, “Patterns in Network Architecture” has appeared, describing a simple new architecture that not only solves existing problems but also predicts capabilities as yet unconsidered.

Initial meetings will be held in conjunction with FutureNet in Boston, May 4 - 7. There will be a one day organizing meeting on May 4 to discuss collaboration and next steps. On May 6 and 7 there will be a working meeting at Boston University on the specific topic of the current addressing crisis. There will be considerable work refining architectural details, but the central goal of the effort is to form a group that builds implementations of this new network architecture to evaluate its scalability, security, and other pertinent characteristics.

Wednesday, April 15, 2009

World's first real time map of science activity may predict scientific innovation

[From International Science Grid this week—BSA]

Scientists at Los Alamos National Laboratory (LANL) in New Mexico have produced what they call the world's first "Map of Science" — a high-resolution graphic depiction of the virtual trails scientists leave behind whenever they retrieve information from online services.
The research, led by Johan Bollen of LANL, and his colleagues at the Santa Fe Institute, collected usage-log data gathered from a variety of publishers, aggregators, and universities from 2006 to 2008. Their collection totaled nearly 1 billion requests for online information. Because scientists usually read articles in online form well before they can be cited in print, usage data reveal scientific activity nearly in real-time, the map's creators say.
“This research will be a crucial component of future efforts to study and predict scientific innovation, as well as novel methods to determine the true impact of articles and journals,” Bollen said to the Public Library of Science.

Developing a Coherent Cyberinfrastructure from Local Campus to National Facilities

[Another excellent report by Educause on the challenges and strategies of developing a coherent national cyber-infrastructure strategy. I particularly note the observation that “The use of conventional perimeter firewalls, which might be appropriate for parts of the campus constituency, must not burden high-speed flows between on-campus users and resources and those off campus.” To address this problem CANARIE, working in partnership with regional networks, has extended the backbone optical network to a number of individual labs on campuses across Canada. In some cases we have even installed backbone ROADM equipment right on campus to allow researchers access to potentially hundreds of lambdas. To do this we had to extend dark fiber connections across the campus, bypassing the campus network, from the local regional and/or CANARIE POP. This is necessary not only for high-end users but also for health and government data users who are subject to HIPAA requirements. It is only possible with facilities-owned networks, as universities are unlikely to make this investment where the underlying backbone service provider changes every five years or so. It also goes without saying that a comprehensive cyber-infrastructure strategy will be essential for institutions likely to pay substantially more for electricity under upcoming cap-and-trade schemes: cyber-infrastructure accounts for between 30 and 50% of the electrical consumption at a research-intensive university. – BSA]

Developing a Coherent Cyberinfrastructure from Local Campuses to National Facilities: Challenges and Strategies

Tuesday, April 14, 2009

Will bandwidth caps be the next battle for Network Neutrality?

[Increasingly we are seeing carriers impose bandwidth caps and a variety of tiered services for Internet usage. Although some sort of bandwidth cap may be necessary for egregious users, their proliferation and adoption by cablecos and telcos flies in the face of the fact that growth of Internet traffic is slowing substantially, as evidenced by the data provided by Andrew Odlyzko. The cablecos and telcos seem to be the only industry that intentionally punishes its biggest customers; given declining growth rates, they should be rewarding them. One suspects other motives may be at play, as detailed in the following blogs and e-mails. From David Farber and Lauren Weinstein’s lists – BSA]

Andrew Odlyzko Presentation
Internet traffic growth and implications for access technologies

[From a posting by Richard Forno on David Farber’s list]

Download capping is the new DRM

Much like everyone reading this article, I'm a genuine supporter of advancement in hardware and technology services. Suffice to say, I was happy with the progression of Internet connection services over the years.
Recently, however, I would have to say that Internet connection advancement in the U.S. and Canada has been purely an interest of the corporations that provide them and not about serving the consumer-- you--and the advancement of technology in America in general.

In late March, I wrote an article on Tom's Hardware explaining why HDCP (High-bandwidth Digital Content Protection) is the bane of movie watchers everywhere. Not only is HDCP an invasive technology that kills the enjoyment of movies for enthusiasts, it does nothing to stop pirates. We all know this to be true.

Don't think for a moment, though, that big media doesn't know this--they absolutely do. Now they have a new plan. Since big media can't directly go after pirates, they've decided to go after the group of people who they think can't do a thing about it: anyone using an Internet connection.

< - >

Download capping is the new DRM.

It ensures several things:

- You will be more hesitant to download movies and music legitimately-- even though you've paid to watch/listen.
- You will watch more cable TV (so you can see all those great ads).
- You will accidentally pay more for less.
- Pirates get a whacking.

Big media and ISPs can't effectively eliminate piracy by going after pirates directly or stop online video and music streaming services. So they have a better plan now: go after everyone.

How Much Time Warner's Broadband Caps Will Screw You

The money quote comes from Time Warner's SEC filing as highlighted in this article and reveals the anti-competitive issue for consumers and video producers.

One position from Time Warner's COO:

Moreover it's clear that Time Warner fully expects its customers to keep climbing that bandwidth ladder over time, ratcheting themselves into sweeter and sweeter profit positions thanks to the tiered strategy plus known usage increases, thanks to a money quote (literally) from Hobbs: "Broadband data is such a great product. I think there will be some customers who don’t use much that will select the lower tier. But over time, they will use more and move up to the higher price plans." This too reveals that metered broadband isn't really about saving the internet or ensuring great customer experience for the more "polite," less bandwidth-hungry users on the network — it's about setting up a tiered scheme in which Time Warner stands to make an incredible amount of bank as general demand for internet usage is increasing.

And now the anti-competitive "money quote" from TW:

Thanks also to TW's SEC filing we know it's additionally about thwarting the increasing competition their cable video business faces from competitors delivering video content over the web: "TWC faces competition from a range of other competitors, including, increasingly, companies that deliver content to consumers over the Internet, often without charging a fee for access to the content. This trend could negatively impact customer demand for TWC’s video services, especially premium and On-Demand services."

Time Warner's data caps could cost us much more than high monthly fees, if the company has its way.

On Fri, Apr 10, 2009 at 10:08 AM, David Farber wrote:
Like the virus in 28 Days Later, Time Warner's internet-strangling broadband caps are spreading all over the country. They've got brand new pricing plans too, and yep, they suck. Let's look.
The old cap scheme was pretty limited, only going up to a max of 40GB. Now they've got a whole Skittles bag of caps. Here's how Time Warner Cable's COO Landel Hobbs breaks it down, all while breaking out the familiar warning that the internet is about to die if you don't limit your porn consumption to two times a day—MAX:
Internet demand is rising at a rate that could outpace capacity within a few years. According to industry analysts, the infrastructure may not be able to accommodate the explosion of online content by 2012. This could result in Internet brownouts.

• 1GB with 768kbps downstream for $15/month with $2/GB overcharges
• 10, 20, 40 and 60GB will go with Roadrunner Lite, Basic, Standard and Turbo packages, respectively, and maintain the same pricing. Overage is $1/GB.
• 100GB will be the new Road Runner...Turbo (I'm not sure why there are two Turbo packages) which is 10Mbps downstream and 1Mbps upstream for $75/month. This is still an order of magnitude more restrictive than AT&T and Comcast, who have caps of 150GB and 250GB, respectively.
• A 50Mbps/5Mbps down/up speed tier is coming for $100/month.
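The arithmetic behind these capped tiers is simple: a flat base price plus a per-GB charge for anything over the cap. Here is a minimal sketch using the tier figures reported above (the helper name `monthly_bill` is mine, not Time Warner's):

```python
def monthly_bill(base_price: float, cap_gb: float, used_gb: float,
                 overage_per_gb: float) -> float:
    """Base price plus per-GB charges for usage beyond the cap."""
    overage_gb = max(0.0, used_gb - cap_gb)
    return base_price + overage_gb * overage_per_gb

# The $15/month, 1 GB tier with $2/GB overage: a month with 40 GB of
# use (modest by streaming-video standards) costs $15 + 39 * $2 = $93.
print(monthly_bill(15, 1, 40, 2))   # 93.0

# The same 40 GB on the 100 GB, $75/month tier stays at the base price.
print(monthly_bill(75, 100, 40, 1))  # 75.0
```

The sketch makes the tiering strategy concrete: light users are pushed toward cheap tiers whose overage charges punish any growth in usage, exactly the "move up to the higher price plans" dynamic Hobbs describes below.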

Tuesday, April 7, 2009

The economic benefits of R&E networks and impact on broadband

[Two excellent reports on the future of R&E networks have just been released. The first, from the New Zealand research and education network, is an in-depth study of the economic benefits of R&E networks. It is one of the most comprehensive studies I have seen on the subject, and yet I think it only scratches the surface. Besides the direct monetizable and indirect benefits, I believe R&E networks will have an even greater importance in society going forward: working with industry to pilot new ways to accelerate research outcomes from academia to industry through the use of cyber-infrastructure, validating new network business models, facilitating new broadband solutions for consumers and, most importantly, helping address the biggest challenge of all, climate change. The second report, “Unleashing Waves of Innovation,” documents the contributions that R&E networks have made from the very inception of the Internet. Not only did the Internet start with this community, but it has continued to be at the forefront of adopting new business models and architectures. For example, once-radical concepts such as condominium fiber and wavelengths started with this community and have since been adopted by forward-thinking carriers. The standard for network neutrality and the need for a public Internet that is for everyone has largely been carried by the R&E network community. Thanks to Donald Clark for this pointer – BSA]

Economic evaluation of NRENs

Unleashing Waves of Innovation

Deep Packet Inspection and Privacy, Liberty and Freedom of Association

[Although Deep Packet Inspection (DPI) is often represented as no more than a simple traffic management tool used by cablecos, telcos and some ISPs, it is one of those technologies that can have profound social, economic and political consequences. The Internet is now so pervasive that in a few short years it has become the most important tool for exercising our most cherished rights, including privacy, freedom of speech and freedom of association. I am pleased to see that the Canadian Privacy Commissioner and many others recognize this profound tension between technology and social responsibility. Although DPI is often used to block or limit traffic the carriers deem abusive, such as P2P, its unchecked and unconditional use raises concern among many that it can easily be subverted into a tool of repression and interception. I hope the upcoming hearings at the Canadian Radio-television and Telecommunications Commission (CRTC) on Network Neutrality will address these issues as well – BSA]

Canadian Privacy Commissioner Collection of Essays on DPI and Privacy

Ralf Bendrath also has just compiled a little reading list with the limited non-computer-engineering academic literature around DPI, and issued a sort-of-call-for-papers for social sciences / law / humanities colleagues who are working on this:

The next killer app for the Internet - dematerialization

[I am here at the fantastic Freedom to Connect conference, which is well worth watching on webcast. Many scientists are warning that the planet may be close to a tipping point where we will experience runaway global warming (see Andy Revkin’s recent blog at the NYTimes for a summary of several studies on this issue). We simply may not have the luxury of time for small incremental adaptations to address the challenge of global climatic disruption. One of the major ways we can reduce our CO2 footprint is through dematerialization, where we replace physical products with virtual ones delivered over the Internet. Some studies indicate that we can reduce CO2 emissions by as much as 20% through dematerialization. I argue that dematerialization can be further amplified through carbon rewards (instead of carbon taxes), where consumers are rewarded with a variety of virtual products in exchange for reducing their carbon footprint in other walks of their lives. But to take advantage of this opportunity we need open high-speed broadband networks everywhere. Hence the importance of conferences such as Freedom to Connect. From a pointer by Tim O’Reilly on Dave Farber’s list – BSA]

Andy Revkin blog on the planet being at a tipping point

Interesting seybold piece on the environmental impact of publishing:

Have you ever considered what the carbon footprint of a magazine or an eReader is? What about the carbon footprint of your publication? Not everyone cares about carbon footprints or defers to the authority of science on climate change, but when Coke, Pepsi and Apple begin to carbon-footprint their products, and Taco Bell begins to open LEED-certified restaurants with low carbon footprints, it may be time to start.
According to information recently released by Apple, the lifecycle carbon footprint of an iPhone is responsible for the emission of 121 pounds of CO2-equivalent greenhouse gas emissions over the course of a three year expected lifetime of use. Over 10 million iPhones have been sold to date. Though it is not a direct comparison, it is interesting to note that Discover magazine estimated that the lifecycle carbon footprint of each copy of its publication is responsible for 2.1 pounds of carbon dioxide emissions, the same amount produced by twelve 100-watt light bulbs glowing for an hour or a car engine burning 14 ounces of gasoline. Over the next few years it can be expected that the reporting of lifecycle data and the carbon labeling of all products will move from the margins to the mainstream - including the footprinting of print and digital media products. Welcome to the age of low carbon everything.
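The two figures quoted above invite a rough break-even comparison. The arithmetic below is mine; only the input figures (121 lbs per iPhone lifecycle, 2.1 lbs per magazine copy) come from the article:

```python
IPHONE_LIFECYCLE_LBS = 121.0  # lbs CO2e over a 3-year lifetime (Apple)
MAGAZINE_COPY_LBS = 2.1       # lbs CO2 per copy (Discover's estimate)

# Number of magazine copies whose combined footprint equals one
# iPhone's lifecycle emissions: roughly 58 copies, i.e. nearly five
# years of a monthly publication for a single reader.
break_even_copies = IPHONE_LIFECYCLE_LBS / MAGAZINE_COPY_LBS
print(round(break_even_copies, 1))  # 57.6
```

As the article notes, this is not a direct comparison (a device delivers far more than one publication over its lifetime), but it shows how lifecycle figures let print and digital media be weighed in the same units.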

There are billions of kilowatt-hours of electricity embodied in the paper, ink and digital technologies we use each day, and close to a kilogram of CO2 is emitted for each kilowatt-hour, but the energy and greenhouse gas emissions associated with print and digital media supply chains have typically been overlooked, misunderstood or underestimated. Those days are drawing to an end. Increasingly, major brands like Walmart, Pepsi, Coke, Taco Bell and Timberland see carbon footprinting and carbon disclosure as an opportunity to differentiate themselves and grow - even in the face of a global recession.
Before you use one of the many carbon calculators popping up on the Web to measure the carbon footprint of whatever medium you use, it's important to realize that the results can vary dramatically - as do their underlying assumptions. Most fail to employ standards. Until now, lots of calculators and "carbon neutral" companies have made promises to help you reduce your footprint. But there's been no single authority or regulatory agency to dictate how carbon usage should be calculated or disclosed. Standards and specifications for carbon footprinting such as ISO 14040, ISO 14064 and PAS 2050 now do exist, and open standards-based Web 2.0 platforms like AMEE are now available that enable accurate carbon footprinting, like-for-like comparisons and large-scale supply chain analysis.

Carbon rewards instead of carbon taxes