[A just-released report on Recovery Act investments in broadband stresses the critical role of the middle mile in connecting small rural communities through anchor institutions such as universities, schools, hospitals, etc. The report cites the investment NSF made in the early regional networks as a precedent. NRENs and/or RONs can play a critical role in this regard -- BSA]
http://bit.ly/8nTBBA
RECOVERY ACT INVESTMENTS IN BROADBAND:
LEVERAGING FEDERAL DOLLARS TO CREATE JOBS AND CONNECT AMERICA
Monday, December 21, 2009
Wednesday, December 16, 2009
Bill St. Arnaud is leaving CANARIE
All:
With mixed emotions and many fond memories I will be leaving CANARIE as of January 8.
Over my 15-year tenure at CANARIE I am very proud to have made a small contribution to several significant developments in the areas of customer-owned networks, user-controlled lightpaths, infrastructure as a service, various broadband initiatives and, most recently, in looking at how networks and cyber-infrastructure can help address the challenge of climate change.
I strongly believe that research and education networks will continue to play a critical role in today's society, not only in supporting next-generation research such as cyber-infrastructure but also in continuing to demonstrate new Internet and broadband architectures and business models.
I now look forward to pursuing new opportunities related to my ongoing passion for Internet networking, especially in the area of developing network and ICT tools to mitigate climate change.
I will continue to personally blog, tweet and e-mail as usual on the subjects of the Internet, climate change and R&E networking in general at the following coordinates:
e-mail: Bill.St.Arnaud@gmail.com
twitter: BillStArnaud
Facebook: Bill St. Arnaud
skype: pocketpro
blog: http://billstarnaud.blogspot.com/
or http://green-broadband.blogspot.com
It has been a fantastic experience working at CANARIE, and I will depart with lasting memories of working with so many engaging and brilliant colleagues within CANARIE and throughout the world.
Until we meet again
Bill
Thursday, December 3, 2009
New digital strategy launched for UK universities by JISC
[JISC has been at the forefront of many global technology
developments for education and research. They have been one of the
early champions of the use of clouds and understanding the impact of
Green IT-- BSA]
http://www.jisc.ac.uk/Home/news/stories/2009/12/strategy.aspx
JISC launches 2010-2012 strategy
The UK is at risk of losing its world-leading reputation for education unless it continues to invest in digital technologies to meet the ever-changing needs of modern learners, researchers and the academic community, says JISC in its three-year strategy, which launches today.
The strategy outlines a vision of the future in which a robust
technological infrastructure is required to meet the shifting needs of
the 21st-century education community. JISC believes it is crucial that
the UK’s education system continues to compete on the international
stage by investing in innovation, research and increasing the
availability of online resources.
Recent JISC projects, such as the Google Generation and sustainable
ICT studies, have defined a new world for teaching and learning and
have outlined the infrastructure needed to support it. With new
technologies constantly evolving, sustained investment is needed to
pioneer their use. Over the last decade JISC has invested its research
and development funds in around 200 universities and colleges to help
uncover new products, approaches and systems as well as increase
skills and capacity.
JISC, through JANET, has developed a world-leading computer network and
technical backbone which has transformed the way that technology is
used and understood. Now, a network once used by a select few purely
for cutting-edge research allows millions in education and research to
share, manipulate, analyse and reuse digital content from around the
world. It is also the first national research and education network in
the world to complete a 100Gbit/s network trial, a speed nearly two
hundred thousand times faster than the average broadband connection.
As the web continues to transform life in the education sector, JISC,
through its services, will guide individuals and organisations to make
effective use of digital technology through training and staff
development.
JISC’s strategy outlines four key areas of investment:
* effective, creative approaches to teaching and an enhanced
learning experience;
* increased research quality and innovative approaches to support
the research process;
* efficient and effective institutions;
* shared infrastructure and resources.
Within these four areas, focus will be given to online learning,
management information systems, cloud computing, innovation and
impact.
The launch of JISC's strategy 2010-2012 follows a period of
consultation in which UK higher and further education institutions,
membership bodies, mission groups, and key partners were invited to
respond and help inform the strategy’s final direction. The strategy
has been written to ensure JISC’s planned future investment
priorities focus on the areas of greatest importance to those in
education and research.
Monday, November 30, 2009
AARNet salutes the 20th anniversary of the Internet in Australia
[Congratulations to AARNet. Australia has been undertaking some very innovative approaches to networking - first with AARNet and now the NBN. There is a great video on YouTube celebrating their 20th anniversary: http://www.youtube.com/watch?v=fPDZd4VNIWI. Some excerpts -- BSA]
http://www.computerworld.com.au/article/327868/aarnet_salutes_20th_anniversary_internet_australia
Pioneer of the Internet launches book to commemorate historical milestones
Sydney, AUSTRALIA – 26 November 2009 – The Governor-General of Australia, Ms Quentin Bryce, AC, will launch a book today at Admiralty House, commissioned by AARNet (Australia’s Academic and Research Network) to commemorate the 20th anniversary of the Internet in Australia.
AARNet – 20 years of the Internet in Australia documents the history of how the Internet network was established in Australia through AARNet. The book explores how Australia’s commercial Internet network, as we know it today, was originally developed by AARNet. It also documents key individuals, events and milestones that led to the growth and development of a high-speed Internet network dedicated to Australia’s research and education institutions.
[...]
The need for a dedicated high-speed Internet network to serve the research and education community grew out of the special demands for a network with the speed and capacity to manage innovative projects and collaboration between Australian and international researchers. AARNet's unique governance and funding arrangements meant its network was always more technically advanced and affordable. AARNet has showcased how innovation and collaboration are possible and is future-proofing potential applications for the National Broadband Network.
[..]
Today, AARNet serves over one million users in Australia’s research, tertiary education and scientific sectors. AARNet continues to demonstrate its relevance and importance in promoting collaboration and innovation in Australia through its high-speed network, which will complement the advent of the National Broadband Network.
Tuesday, November 17, 2009
Harnessing Openness to Improve Research, Teaching and Learning in Higher Education
[A very useful overview of the impact of open courseware, open source software, open collaboration, and much more. The report emphasizes the point that institutions should be focusing on developing new tools and policies that support openness rather than those that restrict access or require prior permission, such as federated access, Shibboleth, Eduroam, etc. While some applications will always require such permission-based technologies, they should always be seen as a last resort, subject to identifying alternative open solutions. Thanks to Mike Nelson for this posting on Dave Farber's IPer list – BSA]
"Harnessing Openness to Improve Research, Teaching and Learning in Higher Education," a report by the Digital Connections Council of the Committee for Economic Development Committee
http://www.ced.org/images/library/reports/digital_economy/dcc_opennessedu09.pdf
CED’s Digital Connections Council (DCC), a group of information technology experts from trustee-affiliated companies, was established to advise CED on the policy issues associated with cutting-edge technologies.
The rise of the Internet and the digitization of information are affecting every corner of our lives. In a series of reports we have examined how these two changes are increasing the “openness” of information, processes and institutions.
The degree of openness of information, for example, can differ dramatically. To the extent that people have access to information, without restrictions, that information is more open than information to which people have access only if they are subscribers, or have security clearances, or have to go to a particular
location to get it. But accessibility, quite similar to the concept of transparency, is only one aspect of openness. The other is responsiveness. Can one change the information, repurpose, remix, and redistribute it? Information (or a process or an institution) is more open when there are fewer restrictions on access, use, and responsiveness.
The Internet, in particular, has vastly expanded openness. It is changing the nature of information, processes and institutions by making them more accessible to people next door and around the world. It also makes information more responsive—capable of being enhanced, or degraded, through the digital contributions of anyone interested enough to make the effort, be they experts, devoted amateurs, people with an ax to grind, or the merely curious.
In this report we examine higher education through the lens of openness. Our goal is to understand the potential impact of greater openness on colleges and universities. Like other service industries such as finance or entertainment, higher education is rooted in information—its creation, analysis, and transmission
—and the development of the skills required to utilize it for the benefit of individuals and society.
But finance and entertainment have been transformed by greater openness, while higher education appears, at least in terms of openness, to have changed much less. We aim, in this report, to identify some of the potential gains from making higher education more open. We also make a series of concrete recommendations for
policy makers and for institutions of higher education that should help harness the benefits of greater openness.
[…]
"Harnessing Openness to Improve Research, Teaching and Learning in Higher Education," a report by the Digital Connections Council of the Committee for Economic Development Committee
http://www.ced.org/images/library/reports/digital_economy/dcc_opennessedu09.pdf
CED’s Digital Connections Council (DCC), a group of information technology experts from trustee-affiliated companies, was established to advise CED on the policy issues associated with cutting-edge technologies.
The rise of the Internet and the digitization of information are affecting every corner of our lives. In a series of reports we have examined how these two changes are increasing the “openness” of information, processes and institutions.
The degree of openness of information, for example, can differ dramatically. To the extent that people have access to information, without restrictions, that information is more open than information to which people have access only if they are subscribers, or have security clearances, or have to go to a particular
location to get it. But accessibility, quite similar to the concept of transparency, is only one aspect of openness. The other is responsiveness. Can one change the information, repurpose, remix, and redistribute it? Information (or a process or an institution) is more open when there are fewer restrictions on access, use, and responsiveness.
The Internet, in particular, has vastly expanded openness. It is changing the nature of information, processes and institutions by making them more accessible to people next door and around the world. It also makes information more responsive—capable of being enhanced, or degraded, through the digital contributions of anyone interested enough to make the effort, be they experts, devoted amateurs, people withan ax to grind, or the merely curious.
In this report we examine higher education through the lens of openness. Our goal is to understand the potential impact of greater openness on colleges and universities. Like other service industries such as finance or entertainment, higher education is rooted in information—its creation, analysis, and transmission
—and the development of the skills required to utilize it for the benefit of individuals and society.
But finance and entertainment have been transformed by greater openness while higher education appears, at least in terms of openness, to have changed much less. We aim, in this report to identify some of the potential gains from making higher education more open. We also make a series of concrete recommendations for
policy makers and for institutions of higher education that should help harness the benefits of greater openness.
[…]
Friday, November 6, 2009
Larry Lessig on the "culture of permission" versus Eduroam and Shibb
Today at Educause Larry Lessig, as usual, gave a brilliant talk on
the "culture of permissions" and how the Hollywood interpretation of
copyright is distorting the sharing of knowledge, culture and science.
Increasingly we are entering a world where you need a priori
"permission" to do anything, including accessing networks or sharing
knowledge, regardless of whether the underlying information is in
the public domain or not.
Larry Lessig's talk can be seen at:
http://www.educause.edu/Resources/EDUCAUSE2009FacetoFaceConferen/ItIsAboutTimeGettingOurValuesA/175767
Paradoxically, just prior to Larry's talk, Ken Klingenstein, on behalf
of the Shibboleth team, received a special award of recognition from
Educause. I have great admiration for what Ken and his team have done,
and I fully appreciate we will need federated access tools like
Shibboleth and Eduroam for certain applications such as shared
computational resources, etc. But on the other hand I worry that these
technologies represent the thin edge of the wedge in terms of
deploying a "permission" culture on our campuses. They may be a
necessary evil, but we must be vigilant to ensure that they are
limited to only those applications that truly need federated identity
management and do not become a proxy for publishers and software
companies to control access and distribution of their products and
effectively become a tool to limit access to information at our
universities.
The essence of universities is to allow uncensored access to
information and data, not only for researchers and educators but to
the greater community in which they serve. Most institutions freely
allow members of the public to use the library and browse the stacks
including reading journals and other material. But increasingly, as we
move into the digital age where everything is on line, this important
public service is being restricted through various permission tools
like identity management or closed wireless networks. Although there
are legitimate privacy and security concerns with allowing open access,
let us not sacrifice openness and innovation on the altar of security
and privacy.
Eduroam, in particular, to my mind exemplifies this culture of
permissions. In the spirit of providing open access to the community
in which they serve, I have always argued that universities should
provide open wireless networks for any visitor to the campus, not just
visiting academics from another institution. Many airports and
municipalities provide open access wireless networks and I am puzzled
why this is so rarely found at our universities and colleges. Airports
probably have much greater security concerns than universities, and yet
many feel secure in offering open wireless access.
Let us avoid getting caught up in technological wizardry and, for every
application and service, really think hard about whether there is a way
to deliver it in an open manner, whether it is a network, data or a
journal. Only as a last resort should we look to "permission"
technologies, whether for networks or federated access. End of rant.
Here is another great blog post on this subject:
Innovation in Open Networks - Creative Commons, the Next Layer of
Openness
-------------------------------------------------------------------------------------------------------------------
http://joi.ito.com/weblog/2009/10/30/innovation-in-o.html
The explosion of innovation around the Internet is driven by an
ecosystem of people who work in an open network defined by open
standards. However, the technical ability to connect in an
increasingly seamless way has begun to highlight friction and failure
in the system caused by the complicated copyright system that was
originally designed to "protect" innovation. Just as open network
protocols created an interoperable and frictionless network, open
metadata and legal standards can solve many of the issues caused by
copyright and dramatically reduce the friction and cost that it
currently represents.
Hell hath no fury like a vested interest masquerading as a public
issue.
the "culture of permissions" and how the Hollywood interpretation of
copyright is distorting the sharing of knowledge, culture and science.
Increasingly we are entering a world where you need to a priori
"permission" to do anything including accessing networks or sharing
and knowledge regardless of whether the underlying information is in
the public domain or not.
Larry Lessig's talk can be seen at:
http://www.educause.edu/Resources/EDUCAUSE2009FacetoFaceConferen/ItIsAboutTimeGettingOurValuesA/175767
Paradoxically just prior to Larry's talk, Ken Klingenstein, on behalf
of the Shibboleth team received a special award of recognition from
Educuase. I have great admiration for what Ken and his team have done,
and I fully appreciate we will need federated access tools like
Shibboleth and Eduroam for certain applications such as shared
computational resources, etc. But on the other hand I worry that these
technologies represent the thin edge of the wedge in terms of
deploying a "permission" culture on our campuses. They may be a
necessary evil, but we must be vigilant to ensure that they are
limited to only those applications that truly need federated identity
management and do not become a proxy for publishers and software
companies to control access and distribution of their products and
effectively become a tool to limit access information at our
universities.
The essence of universities is to allow uncensored access to to
information and data, not only for researchers and educators but to
the greater community in which they serve. Most institutions freely
allow members of the public to use the library and browse the stacks
including reading journals and other material. But increasingly, as we
move into the digital age where everything is on line, this important
public service is being restricted through various permission tools
like identity management or closed wireless networks. Although there
are legitimate privacy and security concerns of allowing open access
let us not sacrifice openness and innovation on the alter of security
and privacy
Eduroam, in particular, to my mind exemplifies this culture of
permissions. In the spirit of providing open access to the community
in which they serve, I have always argued that universities should
provide open wireless networks for any visitor to the campus, just not
visiting academics from another institution. Many airports and
municipalities provide open access wireless networks and I am puzzled
why this is so rarely found at our universities and colleges. Airports
probably have much greater security concerns then universities and yet
many feel secure in offering open wireless access.
Let us avoid in getting caught up in the technology wizardry and for every
application and service really think hard if there is a way to deliver
a service in an open manner whether it is a network, data or journal.
Only as a last resort should we look to "permission" technologies
whether the it is networks or federated access. End of rant.
Here is a another great blog on this subject
Innovation in Open Networks - Creative Commons, the Next Layer of
Openness
-------------------------------------------------------------------------------------------------------------------
http://joi.ito.com/weblog/2009/10/30/innovation-in-o.html
The explosion of innovation around the Internet is driven by an
ecosystem of people who work in an open network defined by open
standards. However, the technical ability to connect in an
increasingly seamless way has begun to highlight friction and failure
in the system caused by the complicated copyright system that was
originally designed to "protect" innovation. Just as open network
protocols created an interoperable and frictionless network, open
metadata and legal standards can solve many of the issues caused by
copyright and dramatically reduce the friction and cost that it
currently represents.
Hell hath no fury as a vested interested masquerading as a public
issue
Tuesday, October 20, 2009
The Economics of Federal Government Cloud Computing Analyzed
[From Slashdot. I would argue that "green" clouds using
follow-the-wind/follow-the-sun architectures would have even more
dramatic savings, as documented in recent MIT and Rutgers papers -- BSA]
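To make the follow-the-sun/follow-the-wind idea concrete, here is a minimal scheduler sketch in Python. All site names, energy prices and carbon intensities are invented for illustration and are not drawn from the MIT or Rutgers papers; the point is simply that movable batch jobs get dispatched to whichever data centre currently has the cheapest or greenest power.

# Minimal follow-the-sun/follow-the-wind scheduler sketch. All site
# data is illustrative; a real system would pull live energy prices
# and renewable availability from each facility.
sites = {
    "us-east": {"energy_price": 0.14, "carbon_g_per_kwh": 450},
    "us-west": {"energy_price": 0.11, "carbon_g_per_kwh": 120},  # hydro-heavy
    "iceland": {"energy_price": 0.07, "carbon_g_per_kwh": 20},   # geothermal
}

def pick_site(sites, w_price=0.5, w_carbon=0.5):
    """Score each site by normalized price and carbon; lower wins."""
    max_price = max(s["energy_price"] for s in sites.values())
    max_carbon = max(s["carbon_g_per_kwh"] for s in sites.values())
    def score(s):
        return (w_price * s["energy_price"] / max_price
                + w_carbon * s["carbon_g_per_kwh"] / max_carbon)
    return min(sites, key=lambda name: score(sites[name]))

print("Dispatch movable batch jobs to:", pick_site(sites))  # -> iceland

A production scheduler would also weigh data-transfer costs and job deadlines before moving work between sites.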
http://it.slashdot.org/story/09/10/19/0053207/The-Economics-of-Federal-Cloud-Computing-Analyzed
"With the federal government about to spend $20B on IT
infrastructure, this highly analytical article by two Booz Allen
Hamilton associates makes it clear that cloud computing has now
received full executive backing and offers clear opportunities for
agencies to significantly reduce their growing expenditures for data
centers and IT hardware. From the article: 'A few agencies are already
moving quickly to explore cloud computing solutions and are even
redirecting existing funds to begin implementations... Agencies should
identify the aspects of their current IT workload that can be
transitioned to the cloud in the near term to yield "early wins" to
help build momentum and support for the migration to cloud
computing.'"
http://govcloud.ulitzer.com/node/1147473
/"Of the investments that will involve up-front costs to be recouped
in outyear savings, cloud-computing is a prime case in point. The
Federal Government will transform its Information Technology
Infrastructure by virtualizing data centers, consolidating data
centers and operations, and ultimately adopting a cloud-computing
business model. Initial pilots conducted in collaboration with Federal
agencies will serve as test beds to demonstrate capabilities,
including appropriate security and privacy protection at or exceeding
current best practices, developing standards, gathering data, and
benchmarking costs and performance. The pilots will evolve into
migrations of major agency capabilities from agency computing
platforms to base agency IT processes and data in the cloud. Expected
savings in the outyears, as more agencies reduce their costs of
hosting systems in their own data centers, should be many times the
original investment in this area." [2]
The language in the budget makes three key points: (1) up-front
investment will be made in cloud computing, (2) long-term savings are
expected, and (3) the savings are expected to be significantly greater
than the investment costs.
Booz Allen Hamilton has created a detailed cost model that can create
life-cycle cost (LCC) estimates of public, private, and hybrid clouds.
We used this model, and our extensive experience in economic analysis
of IT programs, to arrive at a first-order estimate of each of the
three key points in the President's budget.
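Those three points reduce to straightforward arithmetic. The sketch below is a hedged, first-order illustration in Python, not the Booz Allen LCC model itself; every figure is an assumption chosen only to show how up-front investment compares with cumulative outyear savings.

# First-order comparison of up-front cloud migration cost against
# recurring data-centre savings. All figures are illustrative
# assumptions, not numbers from the Booz Allen model.
upfront_investment = 50e6    # one-time migration cost ($)
annual_dc_cost     = 100e6   # current annual data-centre spend ($)
cloud_fraction     = 0.30    # share of workload migrated to the cloud
savings_rate       = 0.50    # cost reduction on migrated workloads
years              = 8       # analysis horizon

annual_savings = annual_dc_cost * cloud_fraction * savings_rate
cumulative = annual_savings * years
print(f"Annual outyear savings: ${annual_savings / 1e6:.0f}M")
print(f"{years}-year savings:         ${cumulative / 1e6:.0f}M")
print(f"Savings multiple:       {cumulative / upfront_investment:.1f}x the investment")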
Thursday, October 15, 2009
Most Internet traffic bypasses tier-one networks
[Once again the world's R&E networks have been at the forefront of this revolution. Most R&E networks around the world arrange direct peering with major content providers and Tier 2 networks. This saves anywhere from 40-50% of Internet transit costs for their customers and is also a major, if not the primary, source of income for most R&E networks. Some R&E networks are now talking about exchanging their respective peering routes to create a global Tier 1 peering consortium, which would further reduce costs for their connected institutions. Of course, this is only possible if you have an extensive optical backbone with lots of capacity to add wavelengths, etc. - another example of how the optical revolution is changing the market dynamics of the Internet. CANARIE's UCLP was originally designed for this scenario, to enable R&E networks and institutions to do low-cost remote peering. From a posting on Dewayne Hendricks' list -- BSA]
From: (Dewayne Hendricks)
Study: Most Internet traffic bypasses tier-one networks
Telephony Online
By Ed Gubbins
The majority of Internet traffic now goes through direct peers and
does not flow through incumbent tier-one telecom networks, according
to a recent report from Arbor Networks, which sells network management
and security products.
Tier-one incumbents were once the chief providers of connectivity
between content companies like Google and local or regional broadband
providers like Comcast. But over time, Google and other content
providers have built out their own infrastructure, connecting more
directly to end users and bypassing those tier-one intermediaries.
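A back-of-envelope calculation shows where the 40-50% transit savings cited in the note above can come from. The traffic volume, transit price and peering fees below are illustrative assumptions, not figures from the Arbor report:

# Back-of-envelope transit vs. peering economics for an R&E network.
# Traffic volumes and prices are illustrative assumptions.
total_traffic_gbps = 20.0
peerable_fraction  = 0.50      # share of traffic reachable via direct peers
transit_price      = 5.0       # $ per Mbps per month
peering_costs      = 10_000.0  # flat monthly exchange-port and circuit fees

transit_only = total_traffic_gbps * 1000 * transit_price
with_peering = (total_traffic_gbps * 1000 * (1 - peerable_fraction)
                * transit_price + peering_costs)
print(f"Transit only: ${transit_only:,.0f}/month")
print(f"With peering: ${with_peering:,.0f}/month")
print(f"Savings:      {100 * (1 - with_peering / transit_only):.0f}%")  # -> 40%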
Wednesday, September 2, 2009
My submission at CRTC traffic throttling hearing
Full submission is available at:
http://www.cippic.ca/uploads/File/Attachment_C_pt_1_-_St_Arnaud_Report.pdf
Recommendations:
• Avoid use of DPI (deep packet inspection). The use of DPI raises serious privacy concerns that have not been resolved.
• Traffic management techniques should be applied uniformly to all users. Targeting a sub-set of users can be arbitrary and therefore unfair; application targeting is an example of this. For example, throttling all users of P2P file-sharing applications will also catch the many such users who generate only a small amount of traffic.
• Disproportionate targeting of lower-tier customers should be avoided. If, for example, a customer with a 1 Mbps line and a customer with a 10 Mbps line were, in combination, generating enough traffic to congest a link they shared, they should each be throttled in proportion to the bandwidth they have paid for (see the sketch after this list).
• Targeting newly developed protocols or applications should be avoided. Such innovations may make easy targets at first, while they are employed by only a small subset of users, since interfering with such traffic will affect fewer customers. This is true of P2P throttling. Telcos/cablecos can now say that a relatively small portion of users are generating a large amount of P2P traffic and so this type of traffic should be targeted. However, given the efficiency in bandwidth distribution P2P offers customers, its use is only likely to increase in the future. As P2P use becomes more ubiquitous, the rationale that a small number of users are generating large amounts of P2P traffic will become increasingly inaccurate. More importantly, allowing telcos/cablecos to target a newly developed application or protocol because (a) it currently has a small number of users and (b) it happens to be very effective and so generates a large amount of traffic is likely to hinder innovation.
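As promised above, here is a minimal sketch of the proportional-throttling recommendation in Python. The function name and the congestion scenario are invented for illustration; a real traffic shaper would enforce these caps per-packet in the network element.

# Throttle users in proportion to the bandwidth they have paid for.
def proportional_limits(paid_mbps, link_capacity_mbps):
    """Cap each user's rate so a congested shared link is divided in
    proportion to the bandwidth each user has paid for."""
    total_paid = sum(paid_mbps.values())
    if total_paid <= link_capacity_mbps:
        return dict(paid_mbps)  # no congestion: everyone gets full rate
    scale = link_capacity_mbps / total_paid
    return {user: rate * scale for user, rate in paid_mbps.items()}

# The example from the recommendation above: a 1 Mbps and a 10 Mbps
# customer sharing a link that can only carry 8 Mbps right now.
print(proportional_limits({"1mbps_user": 1, "10mbps_user": 10}, 8))
# -> both throttled by the same ratio (8/11), preserving proportionality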
My testimony at FCC broadband workshop
[The FCC has been tasked with developing a national broadband strategy and is holding a series of workshops. I was invited to give a short presentation on some of the ideas we have been working on in Canada and elsewhere. Here are my speaking notes – Bill]
www.broadband.gov
My presentation and background slides can be found at
http://www.slideshare.net/bstarn/fcc-broadband-workshop
Good morning
First of all, I would like to thank the FCC staff for inviting me to speak at this event, and I applaud their initiative in this area. These workshops will be very critical in defining a national broadband vision, not only for the US but for other countries around the world as well.
I am Bill St. Arnaud, Chief Research Officer for CANARIE.
CANARIE is the Canadian equivalent of Internet2.
Our mandate is a bit broader in that we have been tasked to advance Canada's telecom and Internet networks and applications.
We work closely with organizations like Internet2, NLR and Educause in the US, and with institutions like UCSD.
As everyone knows, the Internet originated with the R&E community.
Not many people realize, however, that the R&E community is also a major pioneer in new broadband architectures and business models.
The R&E community has long experience operating its own networks, nationally and locally, and many university networks are equivalent to those that would be deployed in a small city.
New broadband concepts like condominium networks, customer-owned and controlled networks, hybrid networking, etc. all started with the R&E community.
[First slide]
In my opinion the biggest challenge in developing a national broadband vision is defining a business case
Many people think that government is going to invest billions of dollars in a national broadband deployment
In this era of trillion-dollar deficits and near-bankrupt state and local governments, I very much doubt that governments will be able to make any significant investments in broadband.
So we have to look at the private sector as the primary vehicle for deploying broadband
But the business case for the private sector to deploy national broadband is also very weak, especially if we want multiple facilities-based competitors.
I think there is general agreement that multiple facilities-based competition is the ideal solution, as competition drives innovation, lower prices and more choices for the consumer.
But the business case for traditional NGA deployment is very weak and is predicated on 40% take-up and triple-play revenues of $130.
And of course revenues from triple play are gradually being undermined as video and voice services migrate to the Internet in the coming years.
Even with those numbers, high-speed broadband based on fiber will only reach about 40% of customers.
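A quick, hedged sanity check of that business case: only the 40% take-up and the $130 triple-play revenue come from the talk, while the build cost and gross margin in the Python sketch below are illustrative assumptions.

# Why 40% take-up makes the NGA business case hard. Take-up and the
# $130 triple-play figure are from the talk; build cost and margin
# are illustrative assumptions.
cost_per_home_passed = 1500.0  # assumed fiber build cost per home ($)
take_up              = 0.40
arpu_per_month       = 130.0   # triple-play revenue per subscriber ($)
margin               = 0.40    # assumed gross margin on that revenue

margin_per_home_passed = take_up * arpu_per_month * margin  # $/month
payback_years = cost_per_home_passed / (margin_per_home_passed * 12)
print(f"Margin per home passed: ${margin_per_home_passed:.0f}/month")
print(f"Simple payback:         {payback_years:.1f} years")
# -> roughly $21/month and a ~6-year payback before operating costs,
#    which only worsens as video and voice revenue migrate online.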
So what we need is to experiment with new business models to underwrite the cost of next generation broadband
NEXT SLIDE
Some good examples are the "Homes with Tails" concept that some Google analysts are advocating, in which the customer owns the last mile.
Another is green broadband, where the cost of the broadband infrastructure and service is bundled with the customer's energy bill; the customer is encouraged to reduce their energy consumption, while the service provider makes money from the energy bill rather than from triple play. There are now several pilots around the world adopting this model.
As you may have heard CANARIE has launched a modest Green IT pilot program to help industry and academia capture new business opportunities in this field
Other examples include the condominium fiber deployment in the Netherlands being led by KPN in partnership with Reggefiber.
Another good example is the Swisscom national condo fiber project being deployed in partnership with numerous energy companies in that country
So my number one suggestion to the FCC is that they work with the R&E community and fund a number of NGA pilots that promote facilities-based competition.
For more information please see the links on your screen
Tuesday, June 23, 2009
American R&E networks are the linchpin for broadband strategy
[It is exciting to see the important leadership role that R&E networks in the USA and other countries are playing in the development and deployment of national broadband strategies. R&E networks have the unique skill sets and knowledge needed to deploy low-cost open-infrastructure networks through their existing extensive fiber deployments. They already connect many of the key public institutions, such as schools and libraries, that are an essential part of any broadband strategy. More importantly, the R&E networks were the original organizations to bring the Internet to the masses and have played a critical innovation role ever since in new network business models such as customer-owned & controlled networks, green broadband, etc. Thanks to Mike Totten for these pointers –BSA]
New Coalition Will Work to Bring Broadband Internet Access to the Public (United States)
http://chronicle.com/wiredcampus/article/3821/new-coalition-will-work-to-bring-broadband-internet-access-to-the-public
[June 11]
New Coalition Will Work to Bring Broadband Internet Access to the Public
A group of education, health, and library advocates has formed a new coalition to expand broadband Internet access. It will focus on how to most efficiently bring access to the public by using community institutions — including community colleges and other higher-education institutions — as a base.
The new Schools, Health and Libraries Broadband Coalition is made up of 28 commercial and not-for-profit groups, including the American Library Association, Internet2, and Educause. It will seek federal money to provide broadband access first through “anchor institutions,” such as colleges, schools, libraries, and hospitals, since millions of people rely on those institutions already. The coalition says the high-speed connections could help schools and community colleges offer specialized courses and distance learning, could help health-care facilities make better use of telemedicine, and could help colleges and universities advance research.
“There’s not enough money in the stimulus bill to bring fiber optics to everybody’s home,” said the coalition’s coordinator, John Windhausen Jr. “One of the best ways is to bring the broadband to where the most people are likely to get it.”
[…]
New Coalition to Promote the Deployment of High-Capacity Broadband to Anchor Institutions and the Community (United States)
https://wiki.internet2.edu/confluence/display/realtime/2009/06/11/Schools%2C+Health+and+Libraries+Broadband+Coalition+Announced
[June 11]
NLR Offers Available, Leading-Edge Infrastructure for America's Network
http://www.nlr.net/release.php?id=45
National LambdaRail (NLR), the cutting-edge network for advanced research and innovation owned by the U.S. research and education community, announces "America's Network" by offering its coast-to-coast platform and demonstrated expertise as the foundation for a national broadband infrastructure.
NLR has submitted this offer to the Federal Communications Commission, and urges the Commission to take advantage of NLR's readily available, secure, high-speed and user-neutral national backbone as a giant leap forward towards the realization of the goals of the American Recovery and Reinvestment Act (ARRA) and broadband in the U.S.
The NLR proposal presents a concrete roadmap for realizing the ARRA vision of stimulating innovation, research and economic development by rapidly upgrading the country's broadband infrastructure.
The nearly 30 state and multi-state regional optical networks (RONs) interconnected to NLR provide the nucleus of a truly nationwide broadband backbone, America's Network. With federal assistance, for a fraction of the cost of constructing a new nationwide network infrastructure, broadband connectivity at the highest speeds can be brought quickly to Americans in every state in the Union.
Just as a broad range of community organizations across the country -- from schools to public libraries to hospitals -- today enjoy broadband services by connecting to NLR through their regional network, NLR, as the infrastructure for America's Network, would enable all kinds of users and providers to benefit from inexpensive broadband access to an already existing, leading-edge network at state-of-the-art speed and efficiency.
The Internet has emerged as a key driver of economic growth
[Here are two interesting pointers on how the Internet has emerged as a key engine of economic growth. Thanks to Rob Tai, via a posting on the Google policy blog, and to Dan Murphy –BSA]
http://www.viddler.com/explore/geoffdaily/videos/2/117.192/
Economic Impacts from Investments in Broadband
Broadband infrastructure is essential for the effective participation of businesses and organizations in today's economy. Research conducted by Strategic Networks Group (SNG) finds that the local economic growth and secondary investment enabled by broadband is 10 times the initial broadband investment, and the contribution to Gross Domestic Product (GDP) is 15 times the initial investment. Broadband today is as vital as electrification was in the 1930s, and increased participation in the digital economy results in economic and quality-of-life improvements.
Investments in broadband infrastructure strengthen regional economies by improving skills, competitiveness, and service delivery, thus enhancing the local business environment. Broadband provides communications and information-sharing capabilities that have transformative effects on business operations and relationships. Opportunities and additional value are created from new or improved products and services, the ability to create and leverage new business models, better management of resources, and increased operational efficiencies. The ‘playing field’ is leveled between large and small, urban and rural, established businesses and entrepreneurs.
• For businesses of any size, broadband and e-business applications enable them to expand their market reach and revenue opportunities, improve service to customers, and become more competitive and profitable in the global economy. In particular, broadband delivers affordable e-business solutions for small and medium enterprises through hosted applications and Internet-based service innovations.
• For organizations of any size, broadband and e-business applications enable them to improve service, become more cost effective and continue to be effective in the face of increased service demand and shrinking budgets.
• For individuals and residences, broadband delivers services such as Internet telephony (VoIP), Internet connectivity, movies and IPTV, and tele-health, as well as new opportunities such as working from home.
The impacts from investments in broadband infrastructure and broadband-enabled applications are measurable and significant. The effects of an investment in broadband are twofold. The first comes from the construction of the broadband infrastructure itself. The second comes when the affected residents, businesses and organizations invest in the skills, know-how, tools and facilities to take advantage of broadband and e-business solutions so they can participate more effectively in the digital economy. These secondary economic effects take longer, but have far greater impact, as they improve productivity, competitiveness and quality of service on an ongoing basis.
The table below provides two examples of the economic impact of broadband investments (infrastructure and applications such as e-business, tele-medicine and remote learning) from studies conducted by SNG in Canada in 2004 and 2003. Broadband impact data on new revenues, reduced costs, and new jobs were collected directly from businesses and organizations. Investment in broadband increased the retention and expansion of existing businesses and organizations within the local area. As businesses became more profitable and competitive, they expanded, which increased local employment, local training, local spending and investment in facilities and equipment. These outcomes flow through to the rest of the local economy creating a multiplier effect from the initial broadband investment. Input-Output impact models were used to calculate the contribution to GDP:
Investment and Impacts of Broadband Infrastructure

                                                       Case 1:             Case 2:
                                                       community           e-learning, tele-medicine
                                                       broadband           and broadband
                                                       infrastructure      infrastructure
Initial investment in broadband
  infrastructure by government                         $10 million         $10 million
Leveraged investment (first-round effect)
  from other sources (private sector,
  other levels of government, etc.)                    $116 million        $101 million
Total investment                                       $126 million        $111 million
Contribution to GDP from total investment              $164 million        $150 million
Contribution to total employment*                      2,100               4,800
Contribution to taxes* (state and federal)             $61 million         $32 million

*Note – impacts on total employment and taxes vary according to the types of jobs created and investments made. Adapted from studies conducted by SNG in 2003 for the Department of Trade & Industry, United Kingdom, and in 2004 for Industry Canada.
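As a quick sanity check, the headline multiples quoted at the start of this article (roughly 10 times leveraged investment and 15 times GDP contribution) fall directly out of the table's figures. A minimal Python sketch of the arithmetic:

    # Multiplier arithmetic implied by the SNG table above
    # (all dollar figures in millions, taken from the table).
    cases = {
        "community broadband": {"initial": 10, "total": 126, "gdp": 164},
        "e-learning / tele-medicine": {"initial": 10, "total": 111, "gdp": 150},
    }

    for name, c in cases.items():
        leverage = (c["total"] - c["initial"]) / c["initial"]  # secondary investment multiple
        gdp_multiple = c["gdp"] / c["initial"]                 # GDP contribution multiple
        print(f"{name}: {leverage:.1f}x leveraged investment, {gdp_multiple:.1f}x GDP")

The second case works out to 10.1 times leveraged investment and 15.0 times GDP contribution, matching the cited figures almost exactly; the first case is slightly higher on both counts.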
Infrastructure investment in broadband provides a healthy return in its own right, even for relatively small and rural communities. Like typical shovel-ready investments, broadband construction has an immediate job impact in the local economy. More significantly, as the table above shows, investment in broadband infrastructure increases the retention and expansion of businesses and organizations in a community or region.
The positive effects of broadband infrastructure become even more important in times of economic uncertainty, especially for smaller, rural communities, which often have less economic diversity and resilience to withstand an economic slowdown. The availability of broadband gives enterprises in smaller or rural communities options to expand their market reach and become more efficient during difficult economic times. Individuals can use broadband to find new opportunities, including supplemental education and skills training, out-of-region employment through teleworking or working from home, and starting up new businesses.
In parallel with addressing broadband gaps in coverage, capacity and quality of service, local businesses and organizations need to understand how to 'connect the dots' and move from having broadband to participating effectively in the digital economy. This involves promoting awareness of the benefits of broadband and e-business solutions, followed by support for implementations that improve business operations and service delivery. This often requires intensive engagement and repeated visits; however, such a grass-roots approach with individual businesses and organizations is needed for them to understand and apply the lessons of best-in-class e-business solutions and success models. Such local linkages are needed to show how investing in broadband infrastructure offers a significant and immediate economic lift to a community or region. Addressing local broadband infrastructure gaps and accelerating adoption of e-business solutions are fundamental to promoting regional economic development built on competitive businesses and skilled human resources.
Just like electrification in the 1930s, broadband service (coverage, capacity and quality of service) is critical to the long-term prosperity of communities, and gaps need to be addressed, whether in un-served rural areas or under-served regions. Service providers that have not invested in these regions during good economic times are even less likely to do so during a slowdown. Furthermore, just as during the electrification expansion of the 1930s, many municipalities do not have the financial or human resources to make the broadband investment themselves.
In summary, investing in broadband infrastructure now can alleviate the challenges faced by communities and regions in the short term, while providing a positive impact on long term economic and community development.
[From Google Policy Blog..]
With news of bankruptcies and bailouts dominating the headlines recently, it's easy to lose sight of one of the bright spots in our economy: the Internet. In an incredibly short amount of time the Internet has emerged as a key driver of economic growth, creating millions of American jobs that generate hundreds of billions of dollars in economic activity.
… According to Harvard Business School professors John Deighton and John Quelch, the Internet is responsible for 3.1 million American jobs and $300 billion in economic activity spread throughout the United States. As Professors Deighton and Quelch put it, the web "has created unprecedented opportunities for growth among small businesses and individual entrepreneurs."
As the report makes clear, it's difficult to overstate the social and economic benefits of the Internet on the United States. Unlike any other platform in history, it has empowered entrepreneurs to start new businesses and connect with customers around the world, and has provided users with access to unprecedented amounts of information.
We think it's important for policymakers to understand the social and economic benefits of the Internet. That's why I was happy to see IAB also announce this afternoon the launch of the Long Tail Alliance, a group of small independent online businesses working to educate policymakers about the benefits of online advertising and to advocate against burdensome restrictions that would damage the Internet economy. In conjunction with the release of the new study, a group of Long Tail Alliance members representing 25 Congressional districts and 13 states took a maiden voyage to Washington to tell Congress their story. Check out some of what they have to say at "I Am the Long Tail."
http://www.iab.net/about_the_iab/recent_press_releases/press_release_archive/press_release/pr-061009-value
WASHINGTON, D.C. (June 10, 2009) – Interactive advertising is responsible for $300 billion of economic activity in the U.S., according to a new study released today by the Interactive Advertising Bureau (IAB). The advertising-supported Internet represents 2.1% of the total U.S. gross domestic product (GDP). It directly employs more than 1.2 million Americans with above-average wages in jobs that did not exist two decades ago, and another 1.9 million people work in jobs that support those directly employed in Internet-related work. A total of 3.1 million Americans are employed thanks to the interactive ecosystem. These are the key findings of the first-ever research to analyze the economic importance, as well as the social benefits, of the Internet.
“This is the first time anyone has undertaken a comprehensive analysis of the size and scope of the Internet economy and measurement of its economic and social benefits,” said Professor Deighton, the Harold M. Brierley Professor of Business Administration at Harvard Business School, and an author of the study. “I am convinced the results of this study will prove useful for business leaders, legislators and the educational community.”
“This study underscores that the Internet ecosystem is generating an increasing level of economic activity in every corner of the nation,” said Professor Quelch, the Lincoln Filene Professor of Business Administration at Harvard Business School and a co-author of the study.
The study looks at the entire interactive marketing ecosystem, which includes:
• The ad-supported Internet, narrowly defined as the content and usage supported by an estimated $23.4 billion of Internet advertising in 2008
• E-commerce
• E-mail, the cornerstone of lead generation and customer care for many companies
• Enterprise websites, the Web sites that businesses, large and small, develop and maintain for communication.
Among some of the other important findings:
• Small businesses have thrived as a result of the Internet:
o There are more than 20,000 Internet-related small businesses in the U.S. that provide a variety of services such as web hosting, ISP services, web design, publishing, and Internet-based software consulting. Many of these businesses have 10 or fewer employees.
• Internet-related employment is particularly important to certain areas of the country but exists in every one of the 435 U.S. Congressional Districts. Some Congressional Districts have more than 6,000 Internet-related employees.
• Interactive advertising has substantially reduced what consumers have to pay for access to the Internet and for e-commerce products and services. In addition to its financial contribution to the U.S. economy, the Internet has produced large social consequences as an infrastructure and platform, providing American society comprehensive qualitative benefits that include:
o Universal access to an almost unlimited source of information
o Increased productivity (output per unit of capital or labor, or increased consumer utility at a lower cost)
o Innovation in business practices, consumer behavior, commerce and media
o Empowerment of entrepreneurs to start small businesses, find customers and grow
o Environmental benefits derived from saving natural resources and lowering pollution through reduced use of petroleum-based fuels and paper
Thursday, June 4, 2009
How R&E networks can help small business
[Here is an excellent blog from the BCnet web site describing the benefits of transit exchanges for small businesses in British Columbia. BCnet and several other R&E networks such as KAREN in New Zealand operate transit and/or peering exchanges for the benefit of their research and education institutions. Not only do these facilities provide a valuable service to their members in terms of providing competitive access to multiple service providers locally, they also help small businesses and community networks to access a number of competitive commercial providers. BCnet has established a number of transit exchanges at several communities throughout British Columbia and operates them on the community’s behalf—BSA]
https://wiki.bc.net/atl-conf/display/BCNETPUBLIC/2009/05/27/Transit+Exchange+helps+Novus+Entertainment+Save+on+Internet+Costs+and+Improve+Performance
Transit Exchange helps Novus Entertainment Save on Internet Costs and Improve Performance
Novus Entertainment Inc., a premier Vancouver-based provider of television, high-speed Internet and digital phone services, reports cost savings and improved performance from connecting to BCNET's Vancouver Transit Exchange.
Chris Burnes, Manager of ISP Operations with Novus Entertainment, is responsible for connecting Novus to the BCNET Transit Exchange. Burnes spoke to us about his experience: "We came into BCNET's Transit Exchange at Harbour Centre on a one-month trial to test the waters. During that first month, we immediately received customer feedback about the improved network performance. In particular, we have a number of university students as residents, and they have found the network to be extremely fast when they are collaborating and sharing data with the universities in Vancouver." Novus Entertainment has been using the BCNET Transit Exchange for over two years.
Novus delivers advanced high speed Internet services to Metro Vancouver's high-rise residents. Their claim to fame is delivering the fastest residential service in Western Canada and having built their own state-of-the-art fibre optic network throughout downtown Vancouver. Their residential network boasts speeds of up to 50 Mbps without a modem.
Peering Delivers Cost Savings
Novus is directly peering with BCNET and over 40 different research and higher education organizations, as well as industry and government, at the Vancouver Transit Exchange. Burnes commented, "One of the biggest advantages to peering with BCNET and its members is that we are realizing large cost savings by reducing our data trafficking fees. We can siphon off a portion of our traffic through BCNET, rather than the traffic going to one of our upstream providers, which would incur a megabit per second fee. Through BCNET's Transit Exchange, we have the most direct access."
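The economics Burnes describes are straightforward: any traffic delivered over settlement-free peering at the exchange avoids the per-megabit fees charged by upstream transit providers. A back-of-the-envelope Python sketch, with all numbers invented purely for illustration (they are not Novus's actual figures):

    # Hypothetical illustration of transit savings from peering at an exchange.
    # Every value below is an assumption for the sake of the example.
    total_traffic_mbps = 1000        # assumed 95th-percentile traffic level
    peered_fraction = 0.30           # assumed share of traffic reachable via peering
    transit_fee_per_mbps = 10.0      # assumed $/Mbps/month upstream transit fee

    peered_mbps = total_traffic_mbps * peered_fraction
    monthly_savings = peered_mbps * transit_fee_per_mbps
    print(f"Traffic offloaded via peering: {peered_mbps:.0f} Mbps")
    print(f"Monthly transit fees avoided: ${monthly_savings:,.0f}")

The more of a provider's traffic that terminates at networks present on the exchange, the larger the fraction of upstream fees avoided.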
A Marketplace of Internet Services
In addition, the Exchange provides a marketplace to purchase network services from nine commercial Internet upstream providers. The advantage of having a "shopping mall" of service providers at one central location, according to Burnes, is the simplicity and low cost of switching providers. Burnes says, "If we want to get the most competitive transit fees, we can easily switch providers at the Vancouver Transit Exchange. It is as simple as getting a VLAN to a new provider."
Finally, Burnes has a larger vision for the future. He sees the growth potential of the Exchange and the opportunity to form new alliances with organizations that are connected to the Exchange, "Interconnecting opens up new possibilities for partnerships and collaborations with Vancouver organizations." Burnes is currently talking to a few Vancouver organizations that are peering at the Exchange about potential business ventures.
Wednesday, May 27, 2009
Must read - radically new data transfer protocol - SNDTP
From International Science Grid This Week
http://www.isgtw.org/?pid=1001810
[Image: the snail-based system in feed-forward action. Courtesy Herbert Bishko.]
If you think you have problems with the sometimes slow pace at which information travels from one computer to another, then consider the solution offered by this scientific paper: “Snail-based Data Transfer Protocol.”
It describes an experiment in data transfer using real, genuine, live snails, along with a “lettuce-based guidance system.”
No lie.
The paper's authors, Shimon Schocken, dean of the Efi Arazi School of Computer Science, Herzliya, and Revital Ben-David-Zaslow of the Department of Zoology at Tel Aviv University, Israel, reported that their experiment delivered a data transfer rate of 37 million bits per second — faster than ADSL.
[….]
The paper does admit to a drawback: “In some regions, most notably France, culinary habits may pose a denial-of-service (DOS) problem.”
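The headline figure is just payload divided by transit time, as with any "sneakernet". A toy Python calculation — the payload, distance and speed below are assumptions chosen to land near the reported 37 Mbit/s, since the paper's exact setup is not reproduced above:

    # Toy "sneakernet" throughput calculation in the spirit of the snail paper.
    # The payload, distance and speed are illustrative assumptions only.
    payload_bits = 4.7e9 * 8        # one 4.7 GB DVD carried by the snail
    distance_m = 5.0                # assumed course length
    snail_speed = 0.005             # assumed pace, about 5 mm/s

    transit_time_s = distance_m / snail_speed
    throughput_mbps = payload_bits / transit_time_s / 1e6
    print(f"Effective throughput: {throughput_mbps:.1f} Mbit/s")  # ~37.6 Mbit/s

As always with physical-transport "protocols", the bandwidth is enormous; it is the latency that hurts.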
Tuesday, May 19, 2009
Economic Benefits of Broadband in age of peak oil and cap and trade
[A very good, comprehensive analysis of the economic benefits of broadband. I would also argue that broadband will be even more critical to our future economy as we enter the age of peak oil and cap-and-trade. As the well-known economist Jeff Rubin argues, the age of global trade is coming to an end. Only virtual products and services delivered over broadband networks will be able to transcend this economic transition – BSA]
http://www.nyls.edu/user_files/1/3/4/30/84/187/245/Brogan,%20SPRING%202009,%2018%20MEDIA%20L.%20&%20POL%E2%80%99Y.pdf
Source: USTelecom
New USTelecom analysis reveals that the broadband-fueled Information, Communications and Technology (ICT) industries far outpace all other sectors in contributions to U.S. economic growth and provide among the highest-earning, fastest-growing jobs in the country. The analysis, “The Economic Benefits of Broadband & Information Technology,” was published in the latest edition of New York Law School’s Media Law & Policy journal.
The converging sectors of broadband, media and information technology add nearly $900 billion annually to the nation’s economy and are expanding at a rate that is two to five times faster than the overall U.S. economy. The USTelecom analysis also points out that jobs in the broadband/ICT sector pay 50% more than the hourly national average, and broadband-enabled jobs are projected to remain among the leading high-growth areas for at least the next 10 years.
Based on our analysis of the Bureau of Labor Statistics’ Occupational Employment Statistics, there were more than 10 million broadband/ICT jobs in 2007, with 5.7 million in ICT industries such as broadband service providers, content producers and equipment manufacturers. Additionally, the data shows 4.4 million ICT-centric jobs in industries outside of the ICT sector (e.g., network administrators in schools and hospitals). And, these employment tallies do not count the many more jobs made possible by broadband as an enabling technology—for example, self-employed rural Americans working from home over the high-speed Internet.
http://beta.theglobeandmail.com/globe-investor/a-coming-world-thats-a-whole-lot-smaller/article1141752/
Mr. Rubin has taken his long-standing forecast that inevitably declining production and rising demand will send oil prices inexorably higher - over $200 (U.S.) a barrel by 2012 or earlier, just for a start - and imagines how the world will have to change to adjust to such a reality.
Like many oil crisis prophets, Mr. Rubin is a disciple of "peak oil" theory - the concept that world oil production is near its peak, and is destined to a long, slow decline, as existing low-cost oil fields dry up and new supplies become harder and more expensive to unlock.
But unlike many previous peak oil books, which typically don't get much past "we're in big trouble," Mr. Rubin's conclusions are refreshingly optimistic. His world of the oil-starved future, at least for Western societies, looks a lot like the bygone years of our fond memory, where people work and vacation nearer to home, eat locally grown foods and buy locally produced goods, and suburban sprawl is replaced by revitalized cities.
"I think it will really restructure the economy in ways that people haven't even begun to imagine," he said. "But I think, ironically, it's going to be a return to the past ... in terms of the re-emergence of local economies."
Indeed, the book's title is derived from this central argument - that expensive fuel will force a reversal of globalization, as long-distance trade becomes increasingly expensive and impractical. The only alternative may be a relentless cycle of economic shocks triggered by oil price surges.
"Chances are, we're going to bang our head on this oil constraint very soon in an economic recovery, unless certain things change. And I don't think we're going to have to wait five years to test that Rubin hypothesis. I think in the next six to 12 months you're going to see that," he said.
"There's a lot of historical context to suggest that we can change, that things have evolved in response to economic signals," he said. "But in order to change ... we're going to have to rearrange a whole lot more things than perhaps we recognize."
Moving to IPv4 address market trading - critical role for R&E networks?
[There are several excellent articles in this month's issue of The Internet Protocol Journal. Of particular note is the article by Niall Murphy of Google and David Wilson of HEAnet (the Irish R&E network) on the issues and challenges of moving to a white market for the trading of IPv4 address space. There is growing recognition that there may not be enough time or market momentum to move to IPv6. Some also argue that the new common bearer service is HTTP, and that URI endpoints and routing, rather than IPv6 addresses, provide far more flexibility in network architecture; companies like Solace Systems already build URI/XML routers. As such, at least as an interim strategy, the RIRs may need to institute some form of market trading of IPv4 addresses. This has major significance for universities and research institutes, as they hold the largest number of unallocated and unused address blocks in the world. As the article points out, without a carefully planned trading system the fragmentation of the global routing tables could grow significantly. R&E networks could play a critical role, working with the RIRs, in ensuring route aggregation without violating provider-independent (PI) address space. And as R&E networks expand into peering and transit services, they might be able to offer aggregation of routes to a larger community outside of academia. All this points to a future where R&E networks may have to play a critical national coordination role for both academia and the global Internet community, especially as community networks and the science community face growing demands to deliver citizen-science services. For example, an R&E network could assign address sub-blocks from a local university to a local community that is committed to building an open infrastructure network or transit exchange. This would maintain route aggregation while putting unallocated address space to use – BSA]
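To make the aggregation point concrete, the sketch below uses Python's ipaddress module to carve community sub-blocks out of a larger institutional block. The prefixes are from the RFC 5737 documentation ranges, not any university's actual allocation: as long as the covering prefix is announced on everyone's behalf, the global routing table sees a single aggregate rather than a swarm of more-specifics.

    # Sketch of delegating sub-blocks while preserving route aggregation.
    # Prefixes are documentation addresses, purely illustrative.
    import ipaddress

    university_block = ipaddress.ip_network("198.51.100.0/24")

    # Carve out two sub-blocks for community networks or transit exchanges.
    community_blocks = list(university_block.subnets(new_prefix=26))[:2]
    print("Delegated sub-blocks:", community_blocks)

    # If the R&E network announces the covering block for everyone,
    # the more-specifics collapse back into a single route.
    aggregated = list(ipaddress.collapse_addresses(community_blocks + [university_block]))
    print("Routes visible globally after aggregation:", aggregated)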
Other must-read articles in this month's issue are "Resource Certification" by Geoff Huston and "Host Identity Protocol: Identifier/Locator Split" by Andrei Gurtov and Miika Komu.
http://www.cisco.com/web/about/ac123/ac147/about_cisco_the_internet_protocol_journal.html
Monday, April 27, 2009
FTTH provider’s customers bury their own fiber
http://telephonyonline.com/residential_services/news/lyse-tele-burying-fiber-cable-0421/
NAB: FTTH provider’s customers bury their own fiber
LAS VEGAS -- A Norwegian triple-play provider has a unique solution to the pesky problem of digging up consumers' yards to bury fiber-to-the-home. Lyse Tele, an overbuilder that launched its fiber-based all-IP solution in 2002, installs the fiber right to the edge of a customer's lawn, then gives the customer instructions on how to bury their own fiber cable to the house.
[..]
Thursday, April 23, 2009
Why sites like Pirate Bay will continue to be popular
[Despite the efforts of the Swedish courts (where the judge in the Pirate Bay case has now been found to have a serious conflict of interest), sites like Pirate Bay will continue to pop up throughout the net because they meet consumers' need for easy and convenient access to movies and songs. Michael Geist's excellent blog documents why the content industry is trapped in a business model that prefers to punish its customers rather than provide them with a service they clearly want. So-called "piracy" would largely disappear if the industry paid more attention to its customers' actions than to its lawyers' edicts. The other major challenge in this area is geo-blocking, where content availability is limited to certain domestic markets. Some feel this is just another way of undermining network neutrality, under which all sites on the Internet should be equally accessible to all users. Here are some excerpts from great articles linked from Michael Geist's blog – BSA]
Michael Geist Blog
http://www.michaelgeist.ca/
Big Entertainment Wants to Party Like It's 1996
http://www.internetevolution.com/document.asp?doc_id=175415&
Written by Cory Doctorow
The entertainment industry wants to retreat to the comfort of 1996. […]And most importantly, the laws regulating copyright and technology were almost entirely designed by the entertainment industry. They could write anydamnfoolthing and get it passed in Congress, by the UN, in the EU.
[…]
In 2009, the world is populated by people who no longer believe that "Thou shalt sell media on plastic discs forever" came down off the mountain on two stone tablets. It's populated by people who find the spectacle of companies suing their own customers by the thousands indefensible. It's populated by activists who've figured out that the Internet is worth saving and that the entertainment industry is prepared to destroy it.
And the entertainment industry hasn't figured that out, and that's why they're doomed.
Why Hollywood is so slow to catch up on offering all of its movies and shows online
http://www.slate.com/id/2216328/pagenum/all/#p2
[..]
I would gladly pay a hefty monthly fee for [movie download] service—if someone would take my money. In reality, I pay nothing because no company sells such a plan. Instead I've been getting my programming from the friendly BitTorrent peer-to-peer network. Pirates aren't popular these days, but let's give them this—they know how to put together a killer on-demand entertainment system.
I sometimes feel bad about my plundering ways. Like many scofflaws, though, I blame the system. I wouldn't have to steal if Hollywood would only give me a decent online movie-streaming service. In my dreams, here's what it would look like: a site that offers a huge selection—50,000 or more titles to choose from, with lots of Hollywood new releases, indies, and a smorgasbord of old films and TV shows. …Don't gum it up with restrictions, like a requirement that I watch a certain movie within a specified time after choosing it. The only reasonable limit might be to force me to stream the movies so that I won't be able to save the flicks to my computer. Beyond that, charge me a monthly fee and let me watch whatever I want, whenever I want, as often as I want.
[..]
The current offerings are nowhere close to this dream service. [..]So why won't anyone in Hollywood build my service? The reason isn't stupidity. When I called people in the industry this week, I found that many in the movie business understand that online distribution is the future of media. But everything in Hollywood is governed by a byzantine set of contractual relationships between many different kinds of companies—studios, distributors, cable channels, telecom companies, and others.
[..]
A movie will stay in the pay-per-view market for just a few months; after that, it goes to the premium channels, which get a 15- to 18-month exclusive window in which to show the film. That's why you can't get older titles through Apple's rental plan—once a movie goes to HBO, Apple loses the right to rent it. (Apple has a much wider range of titles available for sale at $15 each; for-sale movies fall under completely different contracts with studios.) Between them, Starz and HBO have contracts to broadcast about 80 percent of major-studio movies made in America today. Their rights extend for seven years or more. After a movie is broadcast on Starz, it makes a tour of ad-supported networks (like USA, TNT, or one of the big-three broadcast networks) and then goes back to Starz for a second run. Only after that—about a decade after the movie came out in theaters—does it enter its "library" phase, the period when companies like Netflix are allowed to license it for streaming. For most Hollywood releases, then, Netflix essentially gets last dibs on a movie, which explains why many of its films are so stale.
Couldn't the studios just sign new deals that would give them the right to build an online service? Well, maybe—but their current deals are worth billions, and a new plan would mean sacrificing certain profits for an uncertain future. Understandably, many are unwilling to take that leap.
Just like in the music business, eventually the entire home-video market is sure to move online, and many consumers will abandon pirate sites in favor of easy-to-use legal services. The music industry lost a lot of money when it dithered over this transition, and now the movie business seems to be making the same mistake. It could be raking in a lot of cash by selling us easy online rentals. Until it works out a plan to do so, there's always BitTorrent.
Monday, April 20, 2009
Building a better Internet - the Pouzin Society
[Here is a group worth following – especially some of their white papers with contributions from luminaries such as John Day, David Meyer, Mike O’Dell – BSA]
http://pouzinsociety.org/
The Pouzin Society, named after Louis Pouzin, the inventor of datagrams and connectionless networking, announces its initial organizing meeting. The society’s purpose is to provide a forum for developing viable solutions to the current Internet architecture crisis.
About 15 years ago, it became clear that IPv4 was reaching its limits, and the IETF responded by creating IPv6. In 2006 came the tacit admission that there continue to be fundamental scaling problems in the Internet routing architecture which would only be exacerbated by IPv6, and that Moore's Law could not save us this time. Several solutions were proposed, all based on revising IPv6 addressing using the concept of a locator/identifier split. Work has proceeded diligently, but a few months ago it became clear that not only was this approach fatally flawed, but by implication, so was IP, or any variation of it. Academic efforts, beginning with NewArch and continuing with FIND and GENI, are no closer to finding a solution than we were a decade ago.
In the meantime, “Patterns in Network Architecture” has appeared, describing a simple new architecture that not only solves existing problems but also predicts capabilities as yet unconsidered.
Initial meetings will be held in conjunction with FutureNet in Boston, May 4 - 7. There will be a one-day organizing meeting on May 4 to discuss collaboration and next steps. On May 6 and 7 there will be a working meeting at Boston University on the specific topic of the current addressing crisis. There will be considerable work refining architectural details, but the central goal of the effort is to form a group that builds implementations of this new network architecture to evaluate its scalability, security, and other pertinent characteristics.
Wednesday, April 15, 2009
World's first real-time map of science activity may predict scientific innovation
[From International Science Grid this week—BSA]
http://www.isgtw.org/?pid=1001700
Scientists at Los Alamos National Laboratory (LANL) in New Mexico have produced what they call the world's first "Map of Science" — a high-resolution graphic depiction of the virtual trails scientists leave behind whenever they retrieve information from online services.
The research, led by Johan Bollen of LANL and his colleagues at the Santa Fe Institute, collected usage-log data gathered from a variety of publishers, aggregators, and universities from 2006 to 2008. Their collection totaled nearly 1 billion requests for online information. Because scientists usually read articles in online form well before they can be cited in print, usage data reveal scientific activity nearly in real time, the map's creators say.
“This research will be a crucial component of future efforts to study and predict scientific innovation, as well as novel methods to determine the true impact of articles and journals,” Bollen said to the Public Library of Science.
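The core of the method is simple to sketch: consecutive requests within a user session become edges in a journal-to-journal network, and the aggregated edge weights drive the map layout. A minimal version with invented sample data follows; the actual LANL pipeline over a billion requests is of course far more involved.
# Minimal sketch of turning usage logs into a clickstream network:
# consecutive requests in the same session become journal-to-journal edges.
from collections import Counter

# (session_id, journal) pairs in time order -- invented sample data
log = [(1, "Nature"), (1, "Science"), (1, "Cell"),
       (2, "Nature"), (2, "Science")]

edges = Counter()
for (s1, a), (s2, b) in zip(log, log[1:]):
    if s1 == s2 and a != b:        # same session, different journal
        edges[(a, b)] += 1

print(edges.most_common())         # weighted edges feed the map layout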
Developing a Coherent Cyberinfrastructure from Local Campus to National Facilities
[Another excellent report by Educause on the challenges and strategies of developing a coherent national cyber-infrastructure strategy. I particularly note the observation that “The use of conventional perimeter firewalls, which might be appropriate for parts of the campus constituency, must not burden high-speed flows between on-campus users and resources and those off campus.” To address this problem CANARIE, working in partnership with regional networks, has extended the backbone optical network to a number of individual labs on various campuses across Canada. In some cases we have even installed backbone ROADM equipment right on campus to allow researchers access to potentially hundreds of lambdas. To do this we had to extend dark fiber connections across the campus, bypassing the campus network, from the local regional and/or CANARIE POP. This is necessary not only for high-end users, but also for health and government data users who are subject to HIPAA requirements. It is only possible with facilities-owned networks, as universities are unlikely to make this investment where the underlying backbone service provider changes every 5 years or so. It also goes without saying that developing a comprehensive cyber-infrastructure strategy will be essential for those institutions likely to pay substantially more for electricity under upcoming cap-and-trade rules: cyber-infrastructure accounts for between 30 and 50% of the electrical consumption at a research-intensive university.—BSA]
Developing a Coherent Cyberinfrastructure from Local Campuses to National Facilities: Challenges and Strategies
http://www.educause.edu/Resources/DevelopingaCoherentCyberinfras/169441
Tuesday, April 14, 2009
Will bandwidth caps be the next battle for Network Neutrality?
[Increasingly we are seeing carriers look to impose bandwidth caps and a variety of tiered services for Internet usage. Although I think some sort of bandwidth cap may be necessary for egregious users, their proliferation and adoption by cablecos and telcos flies in the face of the fact that the growth of Internet traffic is slowing substantially, as evidenced by the data provided by Andrew Odlyzko. The cablecos and telcos seem to be the only industry that intentionally punishes its biggest customers; given declining growth rates, they should be rewarding them. One suspects other motives may be at play, as detailed in the following blogs and e-mails. From David Farber and Lauren Weinstein’s lists – BSA]
Andrew Odlyzko Presentation
Internet traffic growth and implications for access technologies
http://www.dtc.umn.edu/~odlyzko/talks/index.html
[From a posting by Richard Forno on David Farber’s list]
Download capping is the new DRM
http://www.tomshardware.com/news/time-warner-cable-internet-drm,7530.html#xtor=RSS-181
Much like everyone reading this article, I'm a genuine supporter of advancement in hardware and technology services. Suffice to say, I was happy with the progression of Internet connection services over the years.
Recently, however, I would have to say that Internet connection advancement in the U.S. and Canada has served purely the interests of the corporations that provide it--not the consumer (you), or the advancement of technology in America in general.
In late March, I wrote an article on Tom's Hardware explaining why HDCP (high definition content protection) is the bane of movie watchers everywhere. Not only is HDCP an invasive technology that kills the enjoyment of movies for enthusiasts, it does nothing to stop pirates. We all know this to be true.
Don't think for a moment, though, that big media doesn't know this--they absolutely do. Now, they have a new plan. Since big media can't directly go after pirates, they've decided to go after the group of people who they think can't do a thing about it: anyone using an Internet connection.
< - >
Download capping is the new DRM.
It ensures several things:
- You will be more hesitant to download movies and music legitimately-- even though you've paid to watch/listen.
- You will watch more cable TV (so you can see all those great ads).
- You will accidentally pay more for less.
- Pirates get a whacking.
Big media and ISPs can't effectively eliminate piracy by going after pirates directly or stop online video and music streaming services. So they have a better plan now: go after everyone.
How Much Time Warner's Broadband Caps Will Screw You
The money quote comes from Time Warner's SEC filing as highlighted in this article and reveals the anti-competitive issue for consumers and video producers.
http://www.obsessable.com/news/2009/04/10/time-warner-makes-broadband-cap-concessions-unlimited-plan-for-150-month/
One position from Time Warner's COO:
Moreover, it's clear that Time Warner fully expects its customers to keep climbing that bandwidth ladder over time, ratcheting themselves into sweeter and sweeter profit positions thanks to the tiered strategy plus known usage increases, as revealed by a money quote (literally) from Hobbs: "Broadband data is such a great product. I think there will be some customers who don’t use much that will select the lower tier. But over time, they will use more and move up to the higher price plans." This too reveals that metered broadband isn't really about saving the internet or ensuring a great customer experience for the more "polite," less bandwidth-hungry users on the network — it's about setting up a tiered scheme in which Time Warner stands to make an incredible amount of bank as general demand for internet usage increases.
And now the anti-competitive "money quote" from TW:
Thanks also to TW's SEC filing we know it's additionally about thwarting the increasing competition their cable video business faces from competitors delivering video content over the web: "TWC faces competition from a range of other competitors, including, increasingly, companies that deliver content to consumers over the Internet, often without charging a fee for access to the content. This trend could negatively impact customer demand for TWC’s video services, especially premium and On-Demand services."
Time Warner's data caps could cost us much more than high monthly fees, if the company has its way.
On Fri, Apr 10, 2009 at 10:08 AM, David Farber wrote:
http://i.gizmodo.com/5206697/how-much-time-warners-broadband-caps-will-screw-you
Like the virus in 28 Days Later, Time Warner's internet-strangling broadband caps are spreading all over the country. They've got brand new pricing plans too and, yep, they suck. Let's look.
The old cap scheme was pretty limited, only going up to a max of 40GB. Now they've got a whole Skittles bag of caps. Here's how Time Warner Cable's COO Landel Hobbs breaks it down (a quick cost sketch follows the tier list), all while breaking out the familiar warning that the internet is about to die if you don't limit your porn consumption to two times a day—MAX:
Internet demand is rising at a rate that could outpace capacity within a few years. According to industry analysts, the infrastructure may not be able to accommodate the explosion of online content by 2012. This could result in Internet brownouts.
• 1GB with 768kbps downstream for $15/month with $2/GB overcharges
• 10, 20, 40 and 60GB will go with Roadrunner Lite, Basic, Standard and Turbo packages, respectively, and maintain the same pricing. Overage is $1/GB.
• 100GB will be the new Road Runner...Turbo (I'm not sure why there are two Turbo packages) which is 10Mbps downstream and 1Mbps upstream for $75/month. This is still an order of magnitude more restrictive than AT&T and Comcast, who have caps of 150GB and 250GB, respectively.
• A 50Mbps/5Mbps down/up speed tier is coming for $100/month wh
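To put those tiers in perspective, overage billing is simple arithmetic. A quick sketch follows; the $50/month base price for the "Standard" tier is an assumption for illustration, since the article only says existing tiers keep their current pricing.
# Quick cost sketch for the capped tiers quoted above (2009 figures).
def monthly_bill(usage_gb, cap_gb, base_price, overage_per_gb):
    # Base price plus per-GB charges for usage beyond the cap.
    return base_price + max(0, usage_gb - cap_gb) * overage_per_gb

# A household using ~60 GB/month on the 40 GB "Standard" tier, assuming a
# ~$50/month base price and the $1/GB overage quoted in the article:
print(monthly_bill(60, 40, 50.0, 1.0))   # -> 70.0 dollars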
Tuesday, April 7, 2009
The economic benefits of R&E networks and impact on broadband
[Two excellent reports on the future of R&E networks have just been released. The first is from the New Zealand research and education network, which has undertaken an in-depth study on the economic benefits of R&E networks. This is one of the most comprehensive studies I have seen on the subject, and yet I think it only scratches the surface in terms of the economic benefits. Besides the direct monetizable and indirect benefits, I believe R&E networks will have an even greater importance in society going forward: working with industry to pilot new ways to accelerate research outcomes from academia to industry through the use of cyber-infrastructure, validating new network business models, facilitating new broadband solutions for consumers and, most importantly, helping address the biggest challenge of all: climate change. The second report, “Unleashing Waves of Innovation”, documents the contributions that R&E networks have made from the very inception of the Internet. Not only did the Internet start with this community, but it has continued to be at the forefront of adopting new business models and architectures. For example, once-radical concepts such as condominium fiber and wavelengths started with this community and have now been adopted by forward-thinking carriers. The standard for network neutrality and the need for a public Internet that is for everyone has largely been carried by the R&E network community. Thanks to Donald Clark for this pointer – BSA]
Economic evaluation of NRENs
http://www.karen.net.nz/assets/Uploads/Publications/REANNZ-Economic-Value-Report-Summary.pdf
http://www.karen.net.nz/assets/Uploads/Publications/REANNZ-Economic-Value-Report-Full.pdf
Unleashing Waves of Innovation
http://www.cra.org/ccc/docs/init/Unleashing.pdf
Deep Packet Inspection and Privacy, Liberty and Freedom of Association
[Although Deep Packet Inspection (DPI) is often represented as no more than a simple traffic management tool used by cablecos, telcos and some ISPs, it is one of those technologies that can have profound social, economic and political consequences. The Internet is now so pervasive that in a few short years it has become the most important tool for exercising our most cherished rights, including privacy, freedom of speech and freedom of association. I am pleased to see that the Canadian Privacy Commissioner and many others recognize this profound dichotomy between technology and social responsibility. Although DPI is often used to block or limit what is deemed by the carriers to be abusive traffic such as P2P, its unchecked and unconditional usage raises concern amongst many that it can easily be subverted into a tool of repression and interception. I hope that the upcoming hearings at the Canadian Radio-television and Telecommunications Commission (CRTC) on Network Neutrality will address these issues as well – BSA]
Canadian Privacy Commissioner Collection of Essays on DPI and Privacy
http://dpi.priv.gc.ca/index.php/essays/
Ralf Bendrath also has just compiled a little reading list with the limited non-computer-engineering academic literature around DPI, and issued a sort-of-call-for-papers for social sciences / law / humanities colleagues who are working on this.
The next killer app for the Internet - dematerialization
[I am here at the fantastic Freedom to Connect conference, which is well worth watching on webcast (http://freedom-to-connect.net/). Many scientists are warning that the planet may be close to a tipping point where we will experience runaway global warming (see Andy Revkin’s recent blog from the NYTimes for a summary of several studies on this issue). We simply may not have the luxury of time for small incremental adaptations to address the challenge of global climatic disruption. One of the major ways we can reduce our CO2 footprint is through dematerialization, where we replace physical products with virtual ones delivered over the Internet. Some studies indicate that we can reduce CO2 emissions by as much as 20% through dematerialization. I argue that dematerialization can be further amplified through carbon rewards (instead of carbon taxes), where consumers are rewarded with a variety of virtual products in exchange for reducing their carbon footprint in other areas of their lives. But to take advantage of this opportunity of dematerialization we need open high-speed broadband networks everywhere. Hence the importance of conferences such as Freedom to Connect. From a pointer by Tim O'Reilly on Dave Farber’s list – BSA]
Andy Revkin blog on the planet being at a tipping point
http://dotearth.blogs.nytimes.com/2009/03/28/tipping-points-and-the-climate-challenge/
Interesting Seybold piece on the environmental impact of publishing:
http://www.seyboldreport.com/ready-or-not-welcome-age-low-carbon-publishing
Have you ever considered what the carbon footprint of a magazine or an eReader is? What about the carbon footprint of your publication? Not everyone cares about carbon footprints or defers to the authority of science on climate change, but when Coke, Pepsi and Apple begin to carbon footprint their products, and Taco-Bell begins to open LEED-certified restaurants with low carbon footprints, it may be time to start.
According to information recently released by Apple, the lifecycle carbon footprint of an iPhone is responsible for the emission of 121 pounds of CO2-equivalent greenhouse gas emissions over the course of a three year expected lifetime of use. Over 10 million iPhones have been sold to date. Though it is not a direct comparison, it is interesting to note that Discover magazine estimated that the lifecycle carbon footprint of each copy of its publication is responsible for 2.1 pounds of carbon dioxide emissions, the same amount produced by twelve 100-watt light bulbs glowing for an hour or a car engine burning 14 ounces of gasoline. Over the next few years it can be expected that the reporting of lifecycle data and the carbon labeling of all products will move from the margins to the mainstream - including the footprinting of print and digital media products. Welcome to the age of low carbon everything.
There are billions of kilowatt hours of electricity embodied in the paper, ink and digital technologies we use each day, and close to a kilogram of CO2 is emitted for each kilowatt-hour, but the energy and greenhouse gas emissions associated with print and digital media supply chains have typically been overlooked, misunderstood or underestimated. Those days are drawing to an end. Increasingly, major brands like Walmart, Pepsi, Coke, Taco Bell and Timberland see carbon footprinting and carbon disclosure as an opportunity to differentiate themselves and grow - even in the face of a global recession.
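The figures quoted above are easy to sanity-check yourself. A back-of-envelope sketch using only numbers from the article, plus a pound-to-kilogram conversion:
# Back-of-envelope arithmetic with the figures quoted above.
LB_PER_KG = 2.20462

iphone_lifecycle_lb = 121    # lb CO2e over the assumed 3-year life
magazine_copy_lb = 2.1       # lb CO2 per copy (Discover's estimate)

# A monthly magazine read over those same three years:
print(12 * 3 * magazine_copy_lb)         # -> 75.6 lb, same order as the phone

# At "close to a kilogram of CO2 per kilowatt-hour", the phone's footprint
# corresponds to roughly this many kWh of embodied electricity:
print(iphone_lifecycle_lb / LB_PER_KG)   # -> ~55 kg, i.e. ~55 kWh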
[…]
Before you use one of the many carbon calculators popping up on the Web to measure the carbon footprint of whatever medium you use, it's important to realize that the results can vary dramatically - as do their underlying assumptions. Most fail to employ standards. Until now, lots of calculators and "carbon neutral" companies have made promises to help you reduce your footprint. But there's been no single authority or regulatory agency to dictate how carbon usage should be calculated or disclosed. Standards and specifications for carbon footprinting such as ISO 14040, ISO 14064 and PAS 2050 now do exist, and open standards-based Web 2.0 platforms like AMEE are now available that enable accurate carbon footprinting, like-for-like comparisons and large-scale supply chain analysis.
Carbon rewards instead of carbon taxes
http://green-broadband.blogspot.com/
Friday, March 13, 2009
Implications of Post-neoclassical Economics for Telecom Regulation and Policy
[Here is an excellent conference/workshop on some of the issues of applying classical economic theory to telecom and ICT. Tim Cowen, Visiting Professor, City University School of Law, also posted an excellent analysis of these issues on Gordon Cook's famous Arch-Econ list, reprinted with Tim’s permission –BSA]
The New Economics of ICT:
Implications of Post-neoclassical Economics for the Information & Communications Technology Sectors
March 20, 2009
Columbia Business School
Uris Hall, 116th & Broadway
Room 307
New York, NY
http://www4.gsb.columbia.edu/citi/neweconomics
Neoclassical economics has long been a tool and model, albeit distorted on occasions, for policymakers in the development of legislative and regulatory rules. In particular it has been applied in the information & communications technology (ICT) sectors with such policies as the long-run incremental costs rules, appeals to economies of scale and scope or, inappropriately, reliance on two or three firms to emulate perfect competition’s results. However, economics has moved well beyond these simple, static concepts. Experimental, behavioral, developmental, institutional, neuro-, complexity and network economics are now part of the economists’ tool kit. Although not yet well integrated, they show the flaws in neoclassical analysis. Similar advances have been made in financial theory and practice and the disciplines are, finally, becoming linked.
While the “new economics” (and finance) has permeated many sectors, it has not yet had a significant impact on the ICT sectors. Reliance on the old paradigm is maintained. For example, Ofcom in the United Kingdom failed to adopt the real options methodology, or any other dynamic method, in determining access pricing.
The objective of this Symposium is to understand the implications of the New economics & financial models for the ICT sector. What do they mean for policymakers, investors, and industry leaders?
---------------
[From a posting on Gordon Cook’s Arch-econ list by Tim Cowen, Visiting Professor, City University School of Law-BSA]
A few thoughts about the direction that policy is going in the EU and how that is relevant to this list below.
In the EU there are an increasing number of references to "behavioral economics" in speeches by policy makers and indeed EU Commissioners, e.g.:
europa.eu/rapid/pressReleasesAction.do?reference=SPEECH/08/660&format=HTML&aged=0&language=EN
For example, I attended a presentation by Commissioner Kuneva at King's College in London last month, where she was speaking about her approach to competition policy; she explicitly referred to behavioral economics in her review of the importance of the market as part of the mechanism that drives the EU. Interestingly, she also made a point that is critical for all to understand: the market mechanism is the cornerstone of democracy, and given her personal history of growing up in Eastern Europe, she felt strongly that the market is vital to ensure personal freedom, particularly from the perils of state control.
The basis of US antitrust came from similar thinking: antitrust was created to bust the trusts that developed as a consequence of free market capitalism in the late 1800s and strip power from the robber barons. The trusts were seen at the time as a threat to democracy, as too much power was concentrated in too few hands. The market wasn't working to provide opportunity, innovation, growth, and personal freedom. It had become controlled, and there was a threat to democracy.
I have provided the link above to the behavioral economics conference held by the Commission on the 28th of November last year. In the comments that Commissioner Kuneva made, she drew out the thought that consumers do not always act in their best economic self-interest. This is a challenge to much of classical economic thinking, or at least to those people who read Adam Smith's Wealth of Nations without reading the Theory of Moral Sentiments.
Consumers have been shown by behavioralists to value many different things, and their value systems drive their choices. As the Commissioner pointed out, this is important in formulating the extent and degree and the way in which regulation should be implemented. She drew out four different issues: Default Bias (when making decisions we default to a previously successful behavior or rule), Framing (weighing losses above potential gains, leading to risk aversion), Present Bias (one in the hand is worth two in the bush), and Choice Overload.
In particular she pointed out that these issues are critical for the regulation of industries such as telecommunications and energy.
The thinking and references are useful in identifying predictably irrational decisions. They inform policy makers thinking about entrenched monopoly. For example, an understanding of the default bias idea is important when assessing the extent of entrenched monopoly. It is also a bit more accessible than talking about switching costs or loosely covering different motivations with the redefinition of common phrases and words.
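As a side note on the Framing bias mentioned above: it has a standard quantitative treatment. Here is a sketch of the Kahneman-Tversky prospect-theory value function, using their commonly cited 1992 parameter estimates; this is offered as background, not something from Cowen's post.
# Sketch of the Kahneman-Tversky prospect-theory value function, which
# formalizes "weighing losses above potential gains". Parameters are their
# commonly cited 1992 estimates (losses weigh ~2.25x equal-sized gains).
ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25

def value(x):
    # Subjective value of a gain (x >= 0) or loss (x < 0).
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** BETA)

print(value(100))    # ->  ~57.5
print(value(-100))   # -> ~-129.5: the loss feels more than twice as bad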
(I have always had a problem with talking about people's motivations in terms of 'utility' and other such expressions that are used to redefine commonly used language to mean the exact opposite of normal usage. How can happiness be encompassed by the word utility? To say that a social worker, often motivated by caring and feeling for common humanity, is motivated by personal utility indicates more about the cynical mentality of the analyst/economist who has to see everything in terms of personal benefit than the reality of people's motivations. We can define a pot as a panhandle, but it's still a pot to most people.)
Social goods and public goods are at the moment being redefined, and the role of the market is under intense scrutiny. I am not one that says Greenspan got it all wrong, but I don't think that Keynes was all wrong either. These issues are central to the regulation of telecommunications, as those policy makers concerned with outcomes need to understand all aspects of market failure, not just the ones we have seen before.
regards
tim
Thursday, January 29, 2009
Google announces tools to determine if your ISP is blocking or throttling traffic
[Another exciting development, in addition to Glasnost and Switzerland, for determining if your carrier is blocking or throttling certain types of traffic. Although no one disagrees that carriers have the right to manage their network, it should be done on a non-discriminatory and transparent basis, both in terms of the source of the traffic, the application, AND the class of user. Blocking P2P traffic is obviously in the self-interest of many carriers who distribute video outside of their IP network, and deliberately throttling traffic asymmetrically between low-speed and high-speed subscribers should not be used as a technique to push lower-speed subscribers to purchase higher-speed services—BSA]
Posted by Vint Cerf, Chief Internet Evangelist, and Stephen Stuart, Principal Engineer
When an Internet application doesn't work as expected or your connection seems flaky, how can you tell whether there is a problem caused by your broadband ISP, the application, your PC, or something else? It can be difficult for experts, let alone average Internet users, to address this sort of question today.
Last year we asked a small group of academics about ways to advance network research and provide users with tools to test their broadband connections. Today Google, the New America Foundation's Open Technology Institute, the PlanetLab Consortium, and academic researchers are taking the wraps off of Measurement Lab (M-Lab), an open platform that researchers can use to deploy Internet measurement tools.
Researchers are already developing tools that allow users to, among other things, measure the speed of their connection, run diagnostics, and attempt to discern if their ISP is blocking or throttling particular applications. These tools generate and send some data back-and-forth between the user's computer and a server elsewhere on the Internet. Unfortunately, researchers lack widely-distributed servers with ample connectivity. This poses a barrier to the accuracy and scalability of these tools. Researchers also have trouble sharing data with one another.
M-Lab aims to address these problems. Over the course of early 2009, Google will provide researchers with 36 servers in 12 locations in the U.S. and Europe. All data collected via M-Lab will be made publicly available for other researchers to build on. M-Lab is intended to be a truly community-based effort, and we welcome the support of other companies, institutions, researchers, and users that want to provide servers, tools, or other resources that can help the platform flourish.
Today, M-Lab is at the beginning of its development. To start, three tools running on servers near Google's headquarters are available to help users attempt to diagnose common problems that might impair their broadband speed, as well as determine whether BitTorrent is being blocked or throttled by their ISPs. These tools were created by the individual researchers who helped found M-Lab. By running these tools, users will get information about their connection and provide researchers with valuable aggregate data. Like M-Lab itself these tools are still in development, and they will only support a limited number of simultaneous users at this initial stage.
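The M-Lab tools themselves are not shown here, but the basic shape of a throughput probe is easy to sketch: time a fixed-size download and divide. The URL below is a placeholder for illustration, not an actual M-Lab endpoint.
# Minimal throughput probe: time a fixed-size download and report Mbit/s.
# The URL is a placeholder for illustration, not a real M-Lab endpoint.
import time
import urllib.request

def measure_mbps(url):
    start = time.monotonic()
    data = urllib.request.urlopen(url).read()
    elapsed = time.monotonic() - start
    return (len(data) * 8 / 1e6) / elapsed   # megabits per second

print(measure_mbps("http://example.com/testfile"))   # hypothetical test file
Real tools like the M-Lab diagnostics do considerably more, of course: multiple parallel streams, both directions, and analysis of where in the path the bottleneck or interference sits.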
At Google, we care deeply about sustaining the Internet as an open platform for consumer choice and innovation. No matter your views on net neutrality and ISP network management practices, everyone can agree that Internet users deserve to be well-informed about what they're getting when they sign up for broadband, and good data is the bedrock of sound policy. Transparency has always been crucial to the success of the Internet, and, by advancing network research in this area, M-Lab aims to help sustain a healthy, innovative Internet.
You can learn more at the M-Lab website. If you're a researcher who'd like to deploy a tool, or a company or institution that is interested in providing technical resources, we invite you to get involved.
--
Posted By Google Public Policy Blog to Google Public Policy Blog at 1/28/2009 03:32:00 P
Tuesday, January 6, 2009
Three revolutionary technologies that may impact the future Internet
1. “Twisted” light in optical fibers
-----------------------------------
[“Twisted” light has the potential to dramatically increase the bandwidth of optical networks. Already researchers are using modulation techniques borrowed from wireless, such as quadrature phase-shift modulation, to achieve data rates in excess of 560 Gbps on a single wavelength in a DWDM system, and it is expected that data rates in excess of 1000 Gbps per wavelength will be possible soon. These techniques will work with existing DWDM networks and dramatically increase their bandwidth capacity to tens if not hundreds of terabits. Optical Orbital Angular Momentum (OOAM) has the potential to add an almost unlimited number of phase states to the modulated signal and further increase the capacity to thousands of terabits. Up to now the challenge has been how to couple OOAM-modulated signals into single-mode fiber. – BSA]
http://ieeexplore.ieee.org/stamp/stamp.jsp?arnumber=04388855
In summary, we have analyzed and verified the generation of an optical vortex carrying OAM directly in fiber starting with a fiber core mode for the first time to our knowledge. This is achieved by transferring OAM from an acoustic vortex generated in the fiber. Analysis of the coupling coefficient of this acousto-optic interaction verifies independent conservation of spin and orbital angular momenta.
Also see
http://www.iop.org/EJ/article/1367-2630/9/9/328/njp7_9_328.html#nj250899s1
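The reason OAM promises extra capacity is that modes with different topological charge l, carrying an azimuthal phase factor exp(i*l*phi), are mutually orthogonal, so each could in principle carry an independent data stream. A quick numerical check of that orthogonality:
# Numerical check that OAM phase modes exp(i*l*phi) with different
# topological charge l are orthogonal over one azimuthal turn.
import numpy as np

phi = np.linspace(0, 2 * np.pi, 10000, endpoint=False)

def mode(l):
    return np.exp(1j * l * phi)

def overlap(l1, l2):
    # Normalized inner product <mode_l1 | mode_l2> around the azimuth.
    return abs(np.mean(np.conj(mode(l1)) * mode(l2)))

print(overlap(2, 2))   # -> 1.0  (same mode)
print(overlap(1, 3))   # -> ~0.0 (distinct charges carry independent channels)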
2. Truphone Brings Skype To iPhone & iTouch
--------------------------------------------
http://gigaom.com/2009/01/05/truphone-brings-skype-to-iphone-itouch/
[Now you can make Skype calls on your iTouch or iPhone using any WiFi network and avoid expensive cell phone charges and long-distance fees. Excerpt from the GigaOM web site—BSA]
Geraldine Wilson, who was recently appointed as the chief executive of Truphone, told me in a conversation earlier today that Truphone wants to “offer our users a comprehensive communications experience. We started out as a voice app but now we are broadening it to other applications.”
By doing so, Wilson and Truphone founder James Tagg believe that they will give Truphone users a reason to stay inside the application longer, creating more opportunities to make phone calls and bringing in much-needed revenues. “In a mobile environment it is hard to switch between different applications, and that is why we are creating a single application environment,” Tagg says.
3. New Internet-ready TVs put heat on cable firms
------------------------------------------------
[Excerpts from Globe and Mail – BSA]
http://business.theglobeandmail.com/servlet/story/RTGAM.20090105.wrtvweb06/BNStory/Business/home
For years, technology companies have tried in vain to bring the Internet onto the screen at the centre of North American living rooms. Although TV shows have made the migration to the Web, to date, it has been a one-way road.
Now, a new breed of Internet-connected televisions is threatening to shake up both the technology and broadcasting industries while making millions of recently purchased high-definition TVs yesterday's news.
Yesterday, LG Electronics Inc. unveiled a new line of high-definition TVs at the Consumer Electronics Show in Las Vegas that will include software from Netflix Inc. – to allow users to download movies and television programs directly to their TVs over an Internet connection.
Mr. McQuivey said Internet-connected TVs will have truly arrived when we see major Web video services like Hulu.com start taking viewers away from cable companies.
Hulu – a joint project of NBC Universal Inc. and News Corp., which is not yet available in Canada – is ad-supported and offers free on-demand videos, allowing users to watch popular U.S. programs at their convenience.
“[Cable companies] have the most to lose and it's their business model which is at greatest risk of redundancy in this transition,” said Carmi Levy, an analyst with AR Communications Inc. “Their consistent revenue stream will come under attack as new offerings come to the market. … It's similar to what the telephone companies have faced from voice over Internet telephony (VoIP), cellphones and free instant messaging tools.”
Top 10 IT Trends for Higher Education in 2009
[From Lev Gonick’s excellent blog, as mentioned in the NLR newsletter – www.nlr.net – BSA]
http://blog.case.edu/lev.gonick/
What happens when tough economic times combine with campus-wide fatigue over hype about the latest 'killer app' and a growing intolerance of service disruptions occasioned by security-related activities? I think the intersection of these three realities represents the most important challenges facing campus IT leadership in 2009.
The truth is that we've not seen three years of negative economic growth since the birth of the Internet. We are one year into the global recession, and the crystal-ball gazing underway on most campuses is not producing rosy scenarios. CIOs at most universities are closing in on 'core' operations as they look for ways to meet cost-cutting requirements after more than five years of marginal growth. CIOs are portfolio managers. Like their counterparts elsewhere, CIO portfolio management is really about combining requirements for operational excellence, customer service, and selective innovation (R&D) activity. In a three-year secular downturn, there are tough decisions ahead to keep strong performance in all three of these core activities.
For many university technology leaders, the emergence of web 2.0 technologies over the past couple of years represented a confluence of maturing underlying technologies and the rise of what we asserted was the first really promising set of mass collaboration tools. Here we were, sitting on the precipice of the long-promised 'transformational' potential of technology for the education enterprise, and then the economy tanked. In reality, the economic downturn is only one reason the campus community is less enamored with web 2.0 tools than most of us technologists are. For many across the university, the rate at which ever more exciting technologies are introduced has left them, to put it diplomatically, breathless. The hype over web 2.0 is only the most recent instantiation of the long-held view that we technologists are amusing ourselves and the rest of the campus to death, forever one gadget or applet away from the ultimate breakthrough.
Finally, whether it is the latest Facebook virus, botnets instigated from far-flung corners of the world, or the now-predictable 'urgent' security fixes from our favorite vendors, there is a real sense across the campus that the 'bad guys' are winning the war. What was once a nuisance that could be solved with a bit of end-user education and some hardware thrown at the problem has grown into a full-fledged war on the forces of evil on the Internet. As with recent international conflicts, most on the university campus are ready to conclude that we have neither a strategy for winning this war nor an exit strategy.
Combined, the economic blues, end-user fatigue, and a growing sense of collective vulnerability to those who would seek to harm us have the campus technology community facing its biggest set of challenges in 25 years.
Against that sobering backdrop, here are my top 10 IT trends for higher education for the year ahead, 2009.
1. To The Cloud and Beyond. Watch for significant moves in the university space going well beyond cloud email services. I expect we'll see the emergence of shared storage utilities and a range of 'web services' in 2009, following industry trends, campus economic pressure, and ecological considerations. The same resistance points will find their way into campus deliberations, but resistance is too expensive, distracts us from where we can bring real value, and is ultimately futile. Except for the most regulated storage requirements, there really is no alternative.
2. The Consumer Reigns Supreme. For five years there has been an academic debate in most large organizations about how we were going to manage the growing presence of consumer technologies within our enterprises. No more. The tsunami is here. Those of us still debating the merits of attending to Facebook, iTouch/iPhone, streaming media, massively multiplayer online gaming, mashups, and virtual reality platforms are staring at the wall of this tidal wave of consumer technologies. New trends in 2009 will likely include the first college-centered breakthroughs for mobile computing after mass notification. Watch for location-based and presence technologies embedded in smart phones and other devices (like the wi-fi enabled iTouch) to lead to the first set of scalable campus applets.
3. Streaming Media for Education Goes Mainstream. Students expect it. Teachers accept it. Network engineers will have to live with it. Academic technologists need to figure out how to scale it. In the next 12 months, I think YouTube, iTunes U, and the plethora of campus-based services for academic streaming media are going to hit Main Street. Economics plus assessment data now provide compelling evidence that student success is positively associated with integrating streaming media into the capture and review of traditional, instructor-centered delivery. In the next year I expect significant acceleration of video/speech-to-text efforts to provide real-time transcripts for enhanced search. I also expect that large repositories of meta-tagged and transcoded academic assets (classes, recitations, seminars, etc.) will begin to emerge, allowing students and faculty alike to run federated searches and mash up learning content (a toy sketch of transcript-indexed search follows this list).
4. SecondLife Goes Back to School. Initial exuberance and hype led hundreds of universities to experiment with 3D virtual worlds three years ago. The user-generated universe requires new pedagogy and curriculum considerations, and academic technologists and the education community have learned a lot over the past several years. Look for new functionality and education-centered capabilities over the next year. The net result should be an exciting and provocative set of new collaborative capabilities enabling more campus control and more flexible tools for learning. Dust off your avatar and get ready for one of the most important collaborative learning platforms to make inroads in the year ahead.
5. e-Book Readers Disrupt the College Textbook Marketplace. Early predictions of the demise of the college textbook market in 2008 were highly exaggerated. Sony and Amazon (among others) are in the e-book reader space for the long haul. Early in 2009, expect to see new hardware form factors reflecting a more mature and robust technology. More important, I think we'll see pilot activity among book publishers and the e-book publishing industry to work with campuses to create relevant learning tools embedded in their core technologies.
6. The IT Help Desk Becomes An Enterprise Service Desk. Long underfunded and staffed with underpaid students, the IT Help Desk world is, I think, about to hit an inflection point. Customer service matters. The truth is that, with a few important exceptions, most campus Help Desks are not our strongest service lines. An emergent group of higher-education-focused companies has entered this space and is offering a compelling value proposition for many campuses. On some campuses, the Berlin wall between IT Help Desks, Facilities, and other customer service organizations is also coming down. The trend line is about to hit a take-off point, and I think 2009 may well be the year.
7. Course Management Systems are Dead! Long Live Course Management Systems! Proprietary course management systems are heading for a brick wall. The combination of economic pressure, saturated markets, and the maturing life cycle of these once-innovative platforms means that 2009 may well be the year of change, or at least a year of serious planning for change. Relatively inexpensive, feature-comparable open source alternatives, combined with hard-won experience in transitioning the inventory of repeating courses from closed to open systems, make real change in this onetime bedrock of education technology a growing possibility. As product managers and management watch these trend lines, I think we might see incumbent players make a valiant effort to re-invent themselves before the market drops out from underneath them. Look for the number of major campuses moving (or making serious threats to move) away from closed systems to climb in the year ahead.
8. ERP? What's That? No, I don't think the large enterprise resource planning systems that undergird our major administrative systems are going to fall off the face of the earth like antiquated dinosaurs in the next 12 months. I do think that the ERP upgrades many campuses are now facing, planning, and staging will need to be re-positioned. At a minimum, I think we will see decisions to delay major upgrades for 18-24 months. It is also possible that pressure will grow this year on the duopoly of integrated systems providers to re-open their maintenance and other fee schedules in exchange for continuing multi-year commitments from the campus community. We will also see new models mature for hosting ERP services, both as shared services among the campus community and as commercial offerings. For these glacially moving systems, change is happening. It's just sometimes hard to see the rate of change until you're looking in the rear-view mirror 10 years from now.
9. In God We Trust -- Everyone Else Bring Data. Decision support software and data warehousing tools have been available on campus for well over a decade. While cultures of evidence are not well rooted in decision making on many university campuses, growing budget pressure is compelling campuses to make better-informed decisions. The small priesthood of campus analysts with the skills to support decision making has more job security than most. At the same time, look for new reporting tools and growing expectations that metrics, scorecards, and data analytics will be used to drive tough decision making on campus.
10. Smile, Interactive High Definition Video Conferencing Moves from the Board Room to the Research Lab and the Lecture Hall. Facing budget pressures and public pressure to go green, corporations around the world are investing in next-generation video conferencing. Moving operating dollars into infrastructure investments in this collaboration platform has led to significant reductions in travel costs, better space utilization, and a growing conscientiousness about carbon footprints. As businesses continue to look for capabilities to support global operations, video conferencing has become a daily part of life at many companies. The logic facing corporations now confronts the university community. Over the past 18 months some public universities have been mandated to reduce their carbon footprints; almost everyone else is facing growing operating pressures pinching travel and other budget lines. New students care about proactive green initiatives as part of their university experience. Over the next 12 months look for double-digit growth in campus adoption of next-generation video conferencing tools, including integrated collaboration technologies.
One more trend for good measure. Substitute this one if you disagree vehemently with any of the other items above.
11. The campus data center goes under the scope. Nearly every campus technology leader has been zinged for disaster recovery and business continuity planning. Add to this exponential demand from the research community for computational space to support high performance computing. The facilities community is under growing pressure to distribute the costs of power consumption across campus, and data centers consume disproportionate amounts of space, cooling, and power. Finally, growing green is a campus imperative, with potential operating savings from virtualization, data center optimization, and other greener strategies. Board audit committees and senior management are going to hold technology management accountable for robust data center operations in a highly constrained budget environment.
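As promised under trend 3, here is a minimal Python sketch of the kind of keyword search that meta-tagged, speech-to-text-transcribed lecture repositories would enable. The data model, field names, and sample records are hypothetical, invented purely for illustration:

# Minimal sketch: keyword search over meta-tagged, transcribed lecture assets.
# The data model and sample records below are hypothetical illustrations.
from dataclasses import dataclass, field

@dataclass
class LectureAsset:
    title: str
    course: str
    tags: set = field(default_factory=set)  # curator-supplied metadata tags
    transcript: str = ""                    # output of a speech-to-text pass

def search(assets, query):
    """Return assets whose tags or transcript mention the query term."""
    q = query.lower()
    return [a for a in assets
            if any(q == t.lower() for t in a.tags) or q in a.transcript.lower()]

# Two toy records standing in for a campus repository.
assets = [
    LectureAsset("Week 3: Fourier Series", "MATH 301",
                 {"fourier", "signals"},
                 "Today we derive the Fourier series of a square wave..."),
    LectureAsset("Introduction to Optics", "PHYS 210",
                 {"optics"},
                 "The orbital angular momentum of light..."),
]

for hit in search(assets, "fourier"):
    print(hit.course, "-", hit.title)

A federated search would simply fan such a query out to each participating campus repository and merge the results.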
I don't know about you, but my holiday gift wish list includes an extra bottle of Tylenol 3, a Teflon flak jacket, and a hope that structured innovation remains part of the campus IT portfolio. Against these multiple pressures, a focus on structured innovation is our best hope of keeping IT central to the University's strategic mission and activity.
Lev Gonick
Case Western Reserve University
Cleveland, OH
December 15, 2008