
Bill St. Arnaud is an R&E Network and Green IT consultant who works with clients on a variety of subjects such as next-generation research and education Internet networks. He also works with clients to develop practical solutions to reduce GHG emissions, such as free broadband and dynamic charging of eVehicles (see http://green-broadband.blogspot.com/). For more about me please see http://goo.gl/pOpwB

Monday, August 27, 2007

Future of Research Networks in Europe, China and United States


[Here are a couple of pointers to some analysis of future research network directions. I especially recommend the European EARNEST study, as its conclusions and recommendations will be input to the design of GEANT 3 in Europe -- BSA]

Results of EARNEST study https://wiki.internet2.edu/confluence/download/attachments/16835/Karel+Vietsch+Dorte+Olesen+EARNEST-DO-1.ppt?version=1

GEANT 3 https://wiki.internet2.edu/confluence/download/attachments/16835/David+West+CCIRN+GEANT2_3+26_8_07.ppt?version=1

China CNGI https://wiki.internet2.edu/confluence/download/attachments/16835/Xing+Li+02-CNGI-CERNET2-report.ppt?version=1

US Advanced Networking R&D plan http://www.nitrd.gov/advancednetworkingplan/



In France - download of music will be free - how the Internet is changing the music industry


[A couple of recent announcements illustrate how the Internet is radically changing the music business. The music business is one of the first of many industries that will be fundamentally changed by the Internet, especially with the advent of Web 2.0, open source and SOA. In my books, any industry that is dependent on take-down orders, clumsy security protocols, lawyers and government lobbying for survival is a SELL. Think movies, publishing and telecom. These exciting new developments in France, in my opinion, reflect the strong pro-competitive marketplace for broadband that has been enforced by regulators and policy makers in that country. The French telecom regulator is one of the most enlightened in the world. Thanks to various blogs and contributors including Benoit Felten, Om Malik, Dave Farber and Dewayne Hendricks -- BSA]

In France, Music will be Free http://gigaom.com/2007/08/22/in-france-music-will-be-free/

Neuf launches a "free" music offer as part of its triple-play package http://www.fiberevolution.com/2007/08/neuf-launches-a.html

Artist promote thyself
Thanks to new Web businesses, musicians can reach a bigger audience
-- and keep more of the profits for themselves

A full-time career in music seemed unlikely for Chris O'Brien, or at
least one that would pay the bills.

But these days, the 27-year-old Medford musician is selling thousands
of albums online, along with downloads from his debut CD,
"Lighthouse," and he soon plans to offer T-shirts, tickets, and other
merchandise on his MySpace page and personal website.

He credits at least part of his newfound business acumen to nimbit, a
sales, promotion, and distribution company in Framingham that helps
emerging artists build careers online.

"This is the era of the independent artist," O'Brien said. "It's
easier and more doable than it ever has been. People are opting to
remain independent because there's a lot more money to be had."

Nimbit is one of a growing number of businesses, including CD Baby
and Musictoday, that have helped make it easier for independent
musicians to make a living from their work and widely distribute
their music.

It is the brainchild of Patrick Faucher and Matt Silbert, who worked
for a Web firm, Stumpworld Systems, which developed some of the first
e-commerce sites for bands such as Phish and Aerosmith.

About five years ago, they decided to design a platform to help
budding bands, so they set out to take some of the features created
for the major acts and build a suite of Web tools that independent
artists could use.

Soon after, they merged with Artist Development Associates and added
direct-to-fan sales, along with production and promotion services,
creating a one-stop solution for artists to run their businesses.

In June, nimbit introduced its online merchandise table, the first
portable Web store that lets musicians sell CDs, DVDs, MP3s,
merchandise, and e-tickets from a single point of purchase, virtually
anywhere online. The tool can easily be embedded in any website,
blog, or e-mail that allows widgets.

"Increasingly, recording artists and consumers are uniting and
circumventing traditional channels for creating and distributing
music," said Mike Goodman, a media and entertainment analyst at
Yankee Group in Boston. "These days, musicians can do business
directly with consumers. They don't need a recording label. They
don't need a store. They don't need Ticketmaster, the way they used to."

Just a few years ago, Steve Roslonek, of Wethersfield, Conn., was
getting e-mail orders for his CDs and going to the post office once a
week to send off the packages. His growing success as a children's
musician made it almost impossible to keep up with the requests. With
the help of nimbit over the past several years, he has earned more
than $100,000 from sales of CDs, tickets, and merchandise.

The full article can be found here:


This is another sign that disruptive business models are having an
immense impact on traditional business models. The fact is that many
of these industries have no idea how to compete with technologies
they barely understand.


Survey says: only DRM-free music is worth paying for
By Ken Fisher | Published: August 05, 2007 - 10:32PM CT


One of the largest surveys of music consumers to closely examine the
question of Digital Rights Management (DRM) has an important two-part
message for the music industry. The first is that DRM is definitely
turning consumers off music sales, and charging them extra to get rid
of it may be an uphill battle. The second message is that knowledge
of DRM and its problems is spreading fast.

Entertainment Media Research, working with media law firm Olswang,
conducted lengthy online surveys with 1,700 UK music consumers,
selected from a pre-existing panel of more than 300,000 music
consumers in the UK (PDF: 2007 Digital Music Survey). What makes this
survey important is the fact that it was aimed squarely at the music-
buying public, not the anti-RIAA crowd, not the techno-libertarians,
and not our general readership. I've been told more than once that
the views on DRM found at publications like Ars Technica are "not
representative" of the general public. Perhaps this was once the
case, but it can no longer be maintained generally. At least in the
UK, the dirt on DRM is out, and it's spreading.
First, the bird's eye view: 68 percent of those with opinions on the
matter say that the only music worth purchasing is that which is DRM-
free. Yet less than half (39 percent) are willing to pay a little
extra for it, while 18 percent say that they'd rather save a little
dough and keep the DRM if they had to choose between the two. In the
middle is a mass of people with no opinion on the matter, because
they're not sure what DRM is or don't know their preference. That
will likely soon change.

Familiarity with DRM has grown significantly in the last year. In
2006, more than half of respondents had never heard of DRM, but that
number has dropped 16 percentage points in 2007, to 37 percent. The
number of people who claimed to have a good or exact knowledge of DRM
nearly tripled in that same timeframe.
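Those two figures pin down the 2006 baseline: if 37 percent of respondents had never heard of DRM in 2007, and that is 16 percentage points lower than in 2006, then 53 percent had never heard of it in 2006, which is consistent with the survey's "more than half" claim. A quick check:

```python
# Check the survey's year-over-year DRM-awareness figures.
never_heard_2007 = 37   # percent who had never heard of DRM in 2007
drop = 16               # percentage-point drop since 2006

never_heard_2006 = never_heard_2007 + drop
print(never_heard_2006)              # 53
assert never_heard_2006 > 50         # "more than half of respondents" in 2006
```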

Of those who have some idea of what DRM is, their views are largely—
but not entirely—negative. 61 percent said that DRM "invades the
rights of the music consumer to hear their music on different
platforms." 49 percent called it a "nuisance," and 39 percent
expressed concerns that DRM could have privacy implications. Despite
this, 63 percent agreed that DRM "is a good idea because it protects
copyrighted music from illegal file-sharers." In other words, the
idea of stopping illegal file-sharing via DRM doesn't bother these
consumers much, but the effect the effort is having on their own
purchases is not appreciated.

This view isn't surprising. Few are those who, in principle, believe
that all information (and content) should be "free"; the mainstream
viewpoint is still staunchly in the "artists should be compensated"
camp. This appreciation for the music business does not surmount all
other concerns, however.

Consumers aren't interested in a "nuisance" for the sake of stopping
file-sharers, and of course those of us who pay closer attention to
the world of DRM know that DRM actually does not stop file-sharing at
all. As this general truth spreads, so does dissatisfaction.

The takeaway from the survey is that DRM's bad reputation is
spreading among general music consumers, and there is a growing
aversion to purchasing music that comes with DRM. Despite this, the
general understanding of the struggle the industry faces with piracy
is still somewhat positive among those same consumers. Still, given
that file sharing in the UK is at an all-time high, it would appear
that the music industry needs to remove the digital locks on its
tunes, and fast.

http://HTDAW.livedigital.com/blog/89163

Prince Points the Way to a Brighter Future for Music
07.09.07 | 2:00 AM

In his autobiography, Miles Davis wrote that Prince was the only
musician in the world capable of moving music forward. Davis was
referring to musical prowess, but he may as well have been talking
about Prince's business acumen, as evidenced by his upcoming album
giveaway -- the latest in a long series of innovative maneuvers,
including his escape from a Warner Music Group contract in 1994,
early support for P2P trading and status as one of the first major
artists to sell music from his website.

Davis' last, best hope for the future of music most recently outraged
the music establishment by saying he'll give away CDs of his Planet
Earth album to British fans who purchase next week's Mail on Sunday
newspaper. In light of the giveaway, Sony/BMG refused to distribute
the album in Great Britain, provoking outbursts from music retailers
who had been cut out of the action.

Paul Quirk, co-chairman of Britain's Entertainment Retailers
Association, threatened: "The Artist Formerly Known as Prince should
know that with behavior like this he will soon be the Artist Formerly
Available in Record Stores."

Part of the problem, according to retailers, is that Prince's move
helped solidify a growing perception on the part of consumers that
music is free.

Jack Horner, creative and joint managing director for Frukt, a music-
marketing agency, said that while "people like (Prince) play a key
part in helping figure out what the models may be in the music
business of tomorrow, by giving away a whole album on the front of a
newspaper, there is a very clear devaluing of music, which is not a
positive message to send out right now."

Neither the Mail on Sunday nor Prince's camp would divulge how much
the newspaper paid Prince for the right to give his album away, but
it's clear Prince was paid upfront, and that nearly 3 million Mail on
Sunday readers -- plus everyone who bought tickets to one of his
shows -- will receive the CD for free. The giveaway almost certainly
contributed to Prince selling out 15 of his 21 shows at London's O2
Arena within the first hour of ticket sales. The venue (formerly the
Millennium Dome) holds around 20,000 people. If the remaining six
shows sell out, the series will gross over $26 million.
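The arithmetic behind that gross figure is easy to verify; the show count and approximate capacity are from the article, while the implied average ticket price is derived here, not reported:

```python
# Back-of-the-envelope check of the O2 Arena gross cited above.
shows = 21            # total London O2 shows
capacity = 20_000     # approximate venue capacity
gross = 26_000_000    # "over $26 million" if all shows sell out

total_tickets = shows * capacity
implied_price = gross / total_tickets   # derived, not reported in the article

print(total_tickets)                    # 420000
print(round(implied_price, 2))          # 61.9
```

An average ticket price of roughly $62 is plausible for an arena show, so the $26 million figure holds together.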

Combined with the undisclosed fee paid by the Mail on Sunday, it's
not a bad take for someone who's involved in a "very clear devaluing
of music."

Prince's latest gambit also succeeded by acknowledging that copies,
not songs, are just about worthless in the digital age. The longer an
album is on sale, the more likely it is that people can find
somewhere to make a copy from a friend's CD or a stranger's shared-
files folder. When copies approach worthlessness, only the original
has value, and that's what Prince sold to the Mail on Sunday: the
right to be Patient Zero in the copying game.

As with blogging and so many other things digital, music distribution
could become a competition to see who posts things first. In a sense,
music distribution would no longer be about space -- it would be
about time.

More bands and labels are likely to explore the idea of squeezing
extra value out of their music by selling off the right to be first,
as traditional sources of revenue continue to dry up. Universal's
recent insistence on an "at will" contract with Apple music store,
for instance, is thought to be part of a plan for the world's largest
record label to start selling the exclusive rights to debut certain
albums. And nowhere is it written in stone that music stores are the
only candidates for buying those rights.

Artists have licensed music to advertisers for decades, of course,
but this goes a step further: allowing the licensee to function as
the music's distributor (at least initially). If this idea catches
on, artists and labels looking to duplicate Prince's success will
have to proceed with caution if they want to avoid accusations of
selling out.

In the '90s, a popular slogan for posters and graffiti in and around
my college radio station was "Corporate Rock Sucks," and although
that attitude no longer seems prevalent, fans still routinely revolt
when they hear one of their favorite songs used in a car ad.

Prince ensured that the Mail on Sunday version of his album looks
identical to the one sold in stores, giving it the clear appearance
of coming with the paper, rather than being of the paper. Companies
that want to make a business out of music sponsorships, like RCRD LBL
(an upcoming collaboration between Engadget's Pete Rojas and Downtown
Records), will have to negotiate sponsorships with similar care. If
they do, brands, fans and bands large and small stand to benefit.

Eliot Van Buskirk has covered digital music since 1998, after seeing
the world's first MP3 player sitting on a colleague's desk. He plays
bass and rides a bicycle.

http://www.wired.com/entertainment/music/commentary/listeningpost/2007/07/listeningpost_0709



My Original Writing blog: http://itgotworse.livedigital.com


Open Source Business for Telecommunications and other Business


[This is an excellent journal for those who want to understand the business opportunities of the open source business model and why many venture capital companies are investing in them. Open source business models combined with SOA and Web 2.0 (sometimes referred to as Business 2.0 or Enterprise 2.0) have significant potential to radically transform today's business processes. Early adopters will have a significant first-mover advantage in the marketplace. Thanks to Tony Bailetti for this pointer -- BSA]

The August issue of the Open Source Business Resource (OSBR) is now available in PDF and HTML formats at http://www.osbr.ca. The OSBR is a publication of the Talent First Network (TFN).

The August issue in pdf format is at: http://www.osbr.ca/news/august07.pdf and in html format at: http://www.osbr.ca/archive.php


In the August issue of OSBR, Peter Liu from DragonWave Inc. examines open source telecommunications companies (OST) and finds that 9 of 12 OST companies are supported by venture capital.

Dave McIlhagga from DM Solutions Group and Ervin Ruci from Geocoder.ca describe how their respective businesses interact with the open source geospatial community. Bill White from Roaring Penguin Software relates the
business model that grew a one-man consultancy into a successful company which maintains its ties with the open source community.

Tammy Yuan explores the ways the Carrier Grade Linux Working Group is changing the proprietary face of the telecommunications industry.

Rowland Few details the Talent First Network's role within the Ontario open source ecosystem, Michael Weiss outlines the open source patterns lead project, and Peter Hoddinott answers questions regarding open source
business models and commercialization of open source assets.

This issue also contains an expanded upcoming events section, as well as newsbytes, recently published reports, letters to the editor, and how you can contribute to subsequent issues.

We hope you enjoy this issue of the OSBR and look forward to your feedback. If you wish to contribute to the OSBR, please contact Dru Lavigne, OSBR Editor at dru@osbr.ca

The challenges of broadband policy and regulation in North America


[At one time Canada had the second highest penetration of broadband in the world and the US was a bit lower. Both countries have declined substantially in their international rankings. Canada is now around number eight and the USA is in the high teens or low twenties, depending on which study you look at.

This decline in broadband ranking has caused considerable concern amongst the Internet "technocrati", not so much over the absolute rankings, which are always subject to debate, but over the growing trend of appearing to fall further behind the rest of the world. More ominously, it is not only the decline in our respective broadband rankings that is causing angst, but also the quality of the broadband being delivered. Most other countries that rank higher in terms of broadband deployment are also deploying various forms of fiber to the home with much higher data rates, as much as 100 Mbps in Japan, for example. When both the decline in broadband rankings and the low availability of higher-speed broadband are taken into consideration, the situation in North America looks even more dire.

It is generally accepted that greater broadband penetration and higher speeds are a reflection of a country's ability to innovate, support new business opportunities and achieve greater productivity. With the coming wave of new Internet tools and applications such as Web 2.0, SOA, mashups, new media, citizen science and so forth, a first-rate broadband infrastructure will be critical to our future society and economy. Various studies have shown that ICT in general, and broadband Internet in particular, have a significant and measurable impact on GDP.

Canada and the United States are quite unique in terms of how broadband Internet is delivered compared to most other countries. Both countries have a financially sound and vibrant cable industry which is providing strong competition to the traditional DSL services offered by the telephone companies.

The cable industry's penetration in North America has allowed a very competitive duopoly to exist. In most other countries the argument for regulatory intervention is much more compelling because there largely exists only one monopoly supplier. Structural separation, to varying degrees, is now the mantra of most regulators in those countries.

Structural separation in other infrastructure industries like electricity and gas has been extremely successful (see below). It has had a significant impact in allowing new business models to develop and in reducing costs. We are starting to see the same results with broadband Internet in those countries that have started down the path of structural separation, e.g. England and France. But it is as yet unknown whether structural separation will somehow provide the business case for building fiber to the home or any other type of next-generation network.

It is interesting to note that historically it was "regulated" structural separation that allowed the cable-telco duopoly to be created in North America in the first place (a fact often forgotten by many cableco executives). From the 1970s through the 1990s, regulators in Canada and the US prevented telephone companies from acquiring cable companies and/or offering cable TV services. In Canada this structural separation was mandated for cultural content reasons, and in the US it was done to prevent concentration of ownership in the media industry. This strong regulatory enforcement allowed for the creation of a thriving, and now extremely successful, cable industry. In countries where no structural separation was mandated between cable and telephone, the nascent cable industry usually failed to reach any substantial penetration.

As a result of the North American duopoly situation, other than in some remote rural and northern areas, there is no perceived market failure in delivering broadband. More importantly, most customers are quite happy with their current broadband service, and there is little demand from them for the greater bandwidth that would be afforded by fiber to the home.

There continues to remain the futile hope that wireless, in particular WiMax, will provide inter-modal competition. I remain skeptical, not from a technology perspective, but because of the business case. This is true regardless of whether it is muni WiFi or a more sophisticated commercial service. Any prospective retail broadband wireless company must compete with well-entrenched cable and telephone companies whose infrastructure has been largely amortized through the earlier delivery of their basic services of cable TV and telephone. More importantly, cable and telephone companies can offer a variety of bundled packages of triple play or quadruple play, which is beyond the reach of any prospective wireless broadband company (delivering HDTV over wireless broadband remains a stretch). An important indicator of these developments is the almost complete disappearance of the retail Internet industry in Canada and the US. Although a few small players exist in niche markets, the retail Internet industry has largely been displaced by the facilities-based providers. This was not due to any malfeasance on the part of the cablecos or the telcos, but to the economic reality of operating a very thin-margin business against large mass-market competitors.

So what is a regulator or policy maker to do? I personally don't believe re-regulation is the answer. More facilities-based competition is required. One of the puzzling questions is why the big telephone and cable companies aren't competing in each other's backyard. They have the financial resources and clout to be effective competitors, and yet they seem to be studiously avoiding competing against each other. In Canada we are further challenged by the trade barrier of restrictions on foreign telecom ownership.

I naively remain hopeful that a private sector solution will be found. There were many of the same laments of falling behind when Europe was leading with videotext or Japan with ISDN, GSM cellphones and so on. And yet at the end of day American ingenuity and entrepreneurship triumphed in all these cases, especially with the Internet.

In my opinion the university/research community has a critical role to play in helping develop new business models and architectures to address this challenge. The MIT "Living the Future" program is a good example of this type of thinking, where non-technical students on and off campus will be encouraged to use the network and develop their own applications and services. The following are some excellent pointers to articles posted on Dewayne Hendricks' and Dave Farber's lists. -- BSA]


Game Over: The U.S. is unlikely to ever regain its broadband leadership Robert X. Cringely


The question we were left with two weeks ago was "Why has America
lost its broadband leadership?" but it really ought to have been
"Whatever happened to the Information Superhighway?"

It died.

[...]



New OECD report shows limitations of US broadband public policy By Eric Bangeman

http://arstechnica.com/news.ars/post/20070715-new-oecd-report-shows-limitations-of-us-broadband-public-policy.html

The Organization for Economic Co-operation and Development has just released a 319-page report titled "OECD Communications Outlook 2007" (PDF: http://www.oecdbookshop.org/oecd/get-it.asp?REF=9307021E.PDF&TYPE=browse). As you may have guessed by the title and the size, it's a comprehensive look at the state of the telecommunications industry around the world. Of particular interest is the section on broadband deployment, which tracks usage, deployment, and pricing trends over the past couple of years. Overall, broadband has become faster and cheaper, especially in countries where there are a large number of cable and DSL providers. [...]

Competition is key

The OECD notes that the broadband situation is better in areas with multiple broadband options. "Price decreases and improved services have been the most marked in markets characterized by intense competition," says the report. "Competition may be the product of regulatory intervention, as in the case of local loop unbundling, or may be the result of new infrastructure-based competition."

The countries with the lowest cost per megabit per second are generally characterized by two things: a significant fiber infrastructure and a healthy amount of competition. In Japan and Korea, for instance, fiber is widespread, resulting in the fastest residential broadband speeds available anywhere. In Europe, the regulatory environment allows consumers in many countries to choose from any number of DSL and cable providers. When Nobel Intent correspondent Chris Lee moved into his flat in the Netherlands, he had no less than three cable and three DSL providers competing for his business, including one company—KPN—that offered both. France is another country with abundant broadband competition—and it has the fifth-cheapest broadband in the world in terms of price per Mbps.
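The "price per megabit per second" metric the OECD uses is straightforward to reproduce. A minimal sketch, with hypothetical offers (the numbers below are illustrative, not taken from the report):

```python
# Rank broadband offers by monthly price per Mbps, the metric the OECD
# report uses to compare countries. The offers below are hypothetical.
offers = [
    ("Fiber 100 Mbps", 100.0, 40.0),  # (name, downstream Mbps, monthly USD)
    ("Cable 20 Mbps", 20.0, 30.0),
    ("DSL 8 Mbps", 8.0, 25.0),
]

def price_per_mbps(offer):
    _name, mbps, price = offer
    return price / mbps

# Cheapest per delivered megabit first: fast fiber wins even at a
# higher sticker price, which is the report's core observation.
for name, mbps, price in sorted(offers, key=price_per_mbps):
    print(f"{name}: ${price / mbps:.2f} per Mbps")
```

On these made-up numbers the fiber offer is the most expensive per month but by far the cheapest per megabit, which is why fiber-heavy countries like Japan and Korea top the ranking.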

In contrast, the Federal Communications Commission's policy of deregulation (http://arstechnica.com/news.ars/post/
20050804-5168.html) has left most consumers faced with duopolies (at
best) and de facto monopolies (I live over 20,000 feet from the nearest DSLAM in Chicago, so DSL isn't an option for me). The situation is such that the nation as a whole is a broadband laggard, according to one FCC commissioner (http://arstechnica.com/news.ars/ post/20061109-8185.html). As a result of the FCC's policies, competition based on price and speed is spotty at best, and fiber deployments are in their early stages.

The FCC's vision of competition entails different broadband modes (e.g., cable versus DSL) rather than different providers offering the same type of service, which is why there have been rumblings about an "open access" requirement for the upcoming 700MHz auction. The FCC is on the wrong track, according to the OECD's reasoning. "Regulatory decisions across most OECD countries to allow the fixed PSTN's incumbents local loop to be unbundled has been a major factor in the development of OECD communications markets and stimulating the development and competitive provision of broadband offers," explains the report.




WHY WI-FI NETWORKS ARE FLOUNDERING
[SOURCE: BusinessWeek, AUTHOR: Olga Kharif]

The road is getting bumpier for cities and the companies they have
partnered with in a bid to blanket their streets with high-speed
Internet access at little or no cost to users. While 415 U.S. cities
and counties are now building or planning to build municipal Wi-Fi
networks, "deployments are slowing down slightly," says Esme Vos,
founder of consultancy MuniWireless.com. Vos's tally still marks a
nearly 70% jump from mid-2006, when there were 247 muni Wi-Fi
projects on tap, but that's down from the torrid pace of a year
earlier, when deployment plans doubled. Perhaps the clearest hint of
trouble ahead is that some of the companies partnering with cities on
these projects, including EarthLink and AT&T, are having second
thoughts about remaining in the municipal Wi-Fi business.



Muni Wi-Fi: Its limitations
Telephony Online
By Carol Wilson

The recent spate of criticism regarding municipal Wi-Fi networks
falls into two different categories: technology and business case.

Note: This is a three part article. Here are pointers to each part:

Part 1: muni_wifi_networks_071807/index.html>
Part 2: muni_wifi_networks_072707/>
Part 3: muni_wifi_limitations_080207/>



Structural separation has been very successful with electric power and gas
Latest data indicates significant reduction in prices and increased reliability
http://www.fraserinstitute.ca/shared/readmore.asp?sNav=pb&id=709

Scientists Create Their Own Web 2.0 Network With NanoHUB

[Some excerpts from GridToday article -- BSA]

www.gridtoday.com

Scientists Create Their Own Web 2.0 Network With NanoHUB


nanoHUB.org, a so-called science gateway for nano-science and nanotechnology housed at Purdue University, is taking the tools of Web 2.0 and applying them, along with a few tricks of its own, to further nano-scholarly pursuits.

The result is a Web site that is a required bookmark for people who get excited about algorithms, carbon nanotubes, nanoelectronics and quantum dots -- the current hot topics on the site.

Soon, other science disciplines, such as pharmacy and medical research, will be launched using the same technology.

"In nanoHUB, if you know the science you can begin to use the tools immediately. nanoHUB puts scientific tools into the hands of people who wouldn't normally touch them with a 10-foot pole."

The nanoHUB is a project of the National Science Foundation-funded Network for Computational Nanotechnology, a consortium of research universities, government agencies and corporate research labs.

Ian Foster, the University of Chicago's Arthur Holly Compton Distinguished Service Professor of Computer Science, director of the Computation Institute at Argonne National Laboratory and the person sometimes labeled the father of grid computing, says nanoHUB is one of the underappreciated successes of the United States' cyberinfrastructure.

Michael McLennan, a senior research scientist for the Office of Information Technology at Purdue, says that just as Google is famously powered by its secret algorithms, the secret sauce of nanoHUB is a software application that sits between the supercomputers at national research facilities that power the site and the Web interface. This "middleware," named Maxwell's Daemon, also finds available computing resources on national science grids and sends job requests to those computers faster than the blink of an eye.

"Maxwell is actually running back here at Purdue and reaching out to high-performance computing resources on the TeraGrid and other science grids across the nation," McLennan says. "This middleware is more sophisticated than running a Java applet in the Web browser."
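The article gives no implementation details of Maxwell's Daemon, but the dispatch pattern it describes, finding an available grid resource and forwarding the job to it, can be sketched roughly as follows. The class names, the least-loaded scheduling policy, and the resource names are all assumptions for illustration, not nanoHUB's actual middleware:

```python
# Toy sketch of grid-style job dispatch: pick the least-loaded resource
# with spare capacity and assign the job to it.
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    capacity: int                          # max concurrent jobs
    running: list = field(default_factory=list)

    def load(self):
        """Fraction of capacity currently in use."""
        return len(self.running) / self.capacity

def dispatch(job, resources):
    """Send the job to the least-loaded resource that has spare capacity."""
    available = [r for r in resources if len(r.running) < r.capacity]
    if not available:
        raise RuntimeError("no grid resource available")
    target = min(available, key=Resource.load)
    target.running.append(job)
    return target.name

# Hypothetical resource pool and job name.
grid = [Resource("teragrid-a", 2), Resource("teragrid-b", 4)]
print(dispatch("nanowire-sim-1", grid))    # teragrid-a
```

A real middleware layer would also handle authentication, queueing, and result staging, but the core loop of matching jobs to available resources looks much like this.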

nanoHUB is the first of several planned science hubs housed at Purdue.

"Eventually we will release Maxwell as open-source software once we test it, package it and write documentation for it," McLennan says. "However, there are still groups that don't want to build their own hubs, even with the middleware, and we are contracting with those groups to build hubs for them." The nanoHUB takes advantage of several Web 2.0 technologies:

* Like YouTube and Digg, nanoHUB consists of user-supplied content. On the site, users find software, podcasts, PowerPoint lectures and even Flash-based video tutorials.

* Like sites such as Flickr or YouTube, nanoHUB has dynamic tags that automatically aggregate into subject categories.

* Like Netflix, users can rate any resource on nanoHUB. Software, podcasts, lectures and contributors' contributions all can be rated by the community.

The real stars of nanoHUB are its simulation tools. So far, 55 nanosimulation software tools have been made available through the site for subjects such as nanoelectronics, chemistry and physics. These tools allow researchers to change data or views of their simulations and see real-time changes. In the last 12 months, there have been more than 225,000 such simulations run on nanoHUB.


SOA and web services for medical and government applications

[Rosie Lombardi is a journalist who has a very good web site on SOA for government applications. I particularly like her recent articles in Government CIO magazine. A good example is how SOA web services are used by the Government of Alberta to link various databases to track down deadbeat dads -- BSA]

CIO Government Review on SOA http://www.intergovworld.com/article/1cf1b5d40a01040801c4b5793333f8a2/pg1.htm


Rosie Lombardi's web site
http://www.rosie-lombardi.com/


Using SOA and web services to track down deadbeat dads http://www.rosie-lombardi.com/otherpubs/government/SOAblocks.html

[Some excerpts from original article--BSA]

In 2005, the Alberta Ministry of Justice deployed a SOA-based enhancement to its maintenance enforcement program (MEP), driven by new legislation enacted the year before. Under the so-called "Deadbeat Dad" legislation, the Ministry looked for ways to track and enforce adherence to court orders by withholding access to government services, explains Stuart Charlton, enterprise architect at BEA Systems Inc. in Toronto. "So if someone doesn't pay child support, they might suspend their driver's licence."

As in other provinces, Alberta's ministries and government departments are far from integrated. Tracking thousands of these MEP cases over time, sometimes more than 10 years, and building in the triggers for enforcement actions based on patterns of misbehaviour was a major system undertaking within the Justice Ministry. But developing processes to ensure actions are executed by a multitude of external ministries would have been a monumental inter-departmental undertaking.

Instead of labour-intensive back-and-forth between multiple departments - phoning, faxing, exchanging forms and information - the Ministry of Justice used Web services to automate the workflow.
A producer posts the services it has made available for common use - for example, an application that identifies an Albertan as an MEP case - through an electronic interface based on SOA standards, which can be used by any authorized consumer using the same Web-based technology. All the various conditions for an exchange between departments, including exceptions that require human judgement, are agreed and scripted in advance.
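The producer/consumer exchange described above can be sketched in miniature. Everything here - the case registry, the field names and the licence-renewal rule - is invented to show the pattern, not Alberta's actual schema or services:

```python
# Hypothetical sketch: the Justice Ministry (producer) publishes an
# "is this person an MEP case?" service, and a consuming department
# (e.g., motor vehicles) calls it before renewing a licence, with the
# enforcement conditions agreed and scripted in advance.

MEP_CASES = {"AB1234": {"arrears": 5200.00}}   # toy case registry

def mep_case_lookup(person_id):
    """The producer's published service: report whether a person is an
    enforcement case and whether sanctions currently apply."""
    case = MEP_CASES.get(person_id)
    return {"is_mep_case": case is not None,
            "withhold_services": bool(case and case["arrears"] > 0)}

def renew_drivers_licence(person_id):
    """A consumer department's scripted rule: deny renewal when the
    producer's service says sanctions apply."""
    status = mep_case_lookup(person_id)
    return "denied" if status["withhold_services"] else "renewed"
```

In the real deployment the lookup would be a SOAP/WSDL call across department boundaries rather than a local function, but the division of responsibility is the same.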



A Collaborative Orthopaedic Research Environment. http://eprints.ecs.soton.ac.uk/14409/

The role of collaboration in scientific and scholarly research is changing due to advances in Web technology. In particular, the need for collaborative science has generated demands for working environments that facilitate human communication and resource sharing among research communities. The Collaborative Orthopaedic Research Environment (CORE) project provides an infrastructure that combines clinical, educational and research activities in a Virtual Research Environment (VRE) for orthopaedic researchers to collaborate in the design, analysis, and dissemination of experiments. An overview of Service-Oriented Architecture (SOA) concepts is presented in this report before moving on to discuss the benefits and rationale of using a SOA in the context of the CORE project. A user requirements study conducted to guide the authors in designing the CORE VRE is also reported.

Monday, August 20, 2007

Network Neutrality and Non-discrimination



[Outside of the USA, network neutrality has largely been seen as a US problem because of the decision there to entirely de-regulate broadband. In the rest of the world most regulators still maintain a policy of non-discrimination and are smug in the belief that network neutrality is a non-issue for them. Non-discrimination means that carriers can offer Internet QoS, preferred carriage and other premium network services as long as they offer it on a non-discriminatory basis.

Most people would agree that carriers and ISPs have a duty and responsibility to maintain their networks and minimize the impact of denial-of-service attacks and congestion. But the engineering activities to achieve these goals can deliberately or inadvertently result in discriminatory action against perceived competitors or users of the network.

Examples include the case where many carriers and ISPs are deploying tools like Sandvine to block BitTorrent seeding. The BBC's new P2P service is threatened with blocking by many ISPs in the UK because of the congestion it could cause on their networks. AT&T has announced that it intends to block all illegal video and file sharing on its network. While AT&T may be going beyond the call of duty to prevent illegal video and file sharing, its actions have the benefit of enhancing its own IPTV video delivery.

So we face an interesting dilemma: when does a good engineering practice actually constitute discriminatory behaviour against a competitor or user, and clearly violate the tenets of network neutrality? One can easily imagine other scenarios where carriers block other applications, such as P2P VoIP, for legitimate congestion and security reasons. Can, or should, this activity be regulated? Who is to determine whether a legitimate engineering activity is in reality discriminatory behaviour?

Thanks to Frank Coluccio and Dewayne Hendricks for these pointers -- BSA]



Comcast Throttles BitTorrent Traffic, Seeding Impossible

From: dewayne@warpspeed.com (Dewayne Hendricks)
Comcast Throttles BitTorrent Traffic, Seeding Impossible

Written by Ernesto on August 17, 2007
Over the past weeks more and more Comcast users started to notice that their BitTorrent transfers were cut off. Most users report a significant decrease in download speeds, and even worse, they are unable to seed their downloads. A nightmare for people who want to keep up a positive ratio at private trackers and for the speed of BitTorrent transfers in general.

ISPs have been throttling BitTorrent traffic for almost two years now. Most ISPs simply limit the available bandwidth for BitTorrent traffic, but Comcast takes it one step further and prevents their customers from seeding. And Comcast is not alone in this: Canadian ISPs Cogeco and Rogers use similar methods on a smaller scale.

Unfortunately, these more aggressive throttling methods can’t be circumvented by simply enabling encryption in your BitTorrent client. It is reported that Comcast is using an application from Sandvine to throttle BitTorrent traffic. Sandvine breaks every (seed) connection with new peers after a few seconds if it’s not a Comcast user. This makes it virtually impossible to seed a file, especially in small swarms without any Comcast users. Some users report that they can still connect to a few peers, but most of the Comcast customers see a significant drop in their upload speed.

The throttling works like this: a few seconds after you connect to someone in the swarm, the Sandvine application sends a peer reset message (RST flag) and the upload immediately stops. Most vulnerable are users in a relatively small swarm, where you only have a couple of peers to upload the file to. Only seeding seems to be prevented: most users are able to upload to others while the download is still going, but once the download is finished, the upload speed drops to 0. Some users also report a significant drop in their download speeds, but this seems to be less widespread. The problem is worse on private trackers, likely because of the smaller swarm size.
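A user-side heuristic for spotting this behaviour is to look at how many seeding connections die from a peer RST within seconds of the handshake. The sketch below is illustrative only; the record format and thresholds are my guesses, not Sandvine's actual parameters:

```python
def looks_like_rst_throttling(connections, max_lifetime=5.0):
    """Rough heuristic for the behaviour described above: if the large
    majority of outbound seeding connections are closed by a peer RST
    within a few seconds of connecting, RST-injection traffic shaping
    is a plausible culprit. `connections` is a list of dicts with
    'duration' (seconds) and 'closed_by_rst' (bool)."""
    if not connections:
        return False
    suspicious = [c for c in connections
                  if c["closed_by_rst"] and c["duration"] < max_lifetime]
    return len(suspicious) / len(connections) > 0.8
```

A real detector would also have to capture the packets (e.g., with a sniffer) and distinguish forged RSTs from genuine peer disconnects, which is considerably harder.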

Although BitTorrent protocol encryption seems to work against most forms of traffic shaping, it doesn’t help in this specific case. Setting up a secure connection through VPN or over SSH seems to be the only solution. More info about how to set up BitTorrent over SSH can be found here.
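A typical SSH workaround of the kind referenced above uses OpenSSH's built-in SOCKS proxy (the hostname and port below are placeholders for a shell account you control):

```shell
# Open a local SOCKS5 proxy on port 1080, tunnelled through a remote
# shell account; -N means "no remote command, just forward traffic".
ssh -N -D 1080 user@remote.example.org

# Then point the BitTorrent client's proxy settings at
# SOCKS5 localhost:1080 so peer traffic rides the encrypted tunnel
# and the ISP's shaping gear cannot identify or reset it.
```

The trade-off is that all swarm traffic now traverses the remote host, so throughput is capped by that host's bandwidth.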

Last year we had a discussion whether traffic shaping is good or bad, and ISPs made it pretty clear that they do not like P2P applications like BitTorrent. One of the ISPs that joined our discussions said: “The fact is, P2P is (from my point of view) a plague - a cancer, that will consume all the bandwidth that I can provide. It’s an insatiable appetite.”, and another one stated: “P2P applications can cripple a network, they’re like leaches. Just because you pay 49.99 for a 1.5-3.0mbps connection doesn’t mean your entitled to use whatever protocols you wish on your ISP’s network without them provisioning it to make the network experience good for all users involved.”

[snip]


ISPs to BBC: We will throttle iPlayer unless you pay up
By Nate Anderson | Published: August 13, 2007 - 11:16AM CT

While the network neutrality debate can sometimes feel a bit theoretical in the US, it's a live issue in Europe, and this week it hit the pages of newspapers across the UK. What made news was a set of demands by UK ISPs, which banded together to tell the BBC that the ISPs would start to throttle the Corporation's new iPlayer service because it could overwhelm their networks. Unless the BBC pays up, of course.

Continued at: http://preview.tinyurl.com/2bcu78

YouTube for Science - SciVee

[From a posting on Slashdot --BSA]

http://science.slashdot.org/science/07/08/19/1328253.shtml



Shipud writes "The National Science Foundation, Public Library of Science and the San Diego Supercomputer Center have partnered to set up what can best be described as a 'YouTube for scientists', SciVee. Scientists can upload their research papers, accompanied by a video where they describe the work in the form of a short lecture and a presentation. The formulaic, technical style of scientific writing, the heavy jargonization and the need for careful elaboration often render reading papers a laborious effort. SciVee's creators hope that the appeal of a video or audio explanation of a paper will make it easier for others to quickly grasp its concepts and make it more digestible both to colleagues and to the general public."

http://www.scivee.tv/


Cisco announcement of virtualization and orchestration of networks and services


[There is a lot of buzz and excitement around virtualization and orchestration of networks and related services such as storage, computation, etc. Orchestration allows end users to configure and link together these virtualized services. This, of course, is something we have been promoting with UCLP for some time now (www.uclp.ca or www.inocybe.ca). UCLP uses web services and grid technology for the virtualization and BPEL for the orchestration. Some excerpts from Grid Today -- BSA]

http://www.gridtoday.com/grid/1684997.html

Cisco Takes On Virtualization Orchestration

Cisco announced today VFrame Data Center (VFrame DC), an orchestration platform that leverages network intelligence to provision resources together as virtualized services. This industry-first approach greatly reduces application deployment times, improves overall resource utilization and offers greater business agility. Further, VFrame DC includes an open API, and easily integrates with third party management applications, as well as best-of-breed server and storage virtualization offerings.

With VFrame DC, customers can now link their compute, networking and storage infrastructures together as a set of virtualized services. This services approach provides a simple yet powerful way to quickly view all the services configured at the application level to improve troubleshooting and change management. VFrame DC offers a policy engine for automating resource changes in response to infrastructure outages and performance changes. Additionally, these changes can be controlled by external monitoring systems via integration with the VFrame DC Web services application programming interface (API).

VFrame DC is a highly efficient orchestration platform for service provisioning which requires only a single controller and one back-up controller. The real time provisioning engine has a comprehensive view of compute, storage and network resources. This view enables VFrame DC to provision resources as virtualized services using graphical design templates. These design templates comprise one of four VFrame DC modular components: design, discovery, deploy and operations. These components are integrated together with a robust security interface that allows controlled access by multiple organizations.
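VFrame DC's policy engine and API are proprietary, so the sketch below only illustrates the pattern the paragraphs above describe - automated re-provisioning of shared resources in response to infrastructure events - with all names and structures invented:

```python
# Illustrative sketch only: not Cisco's API. A "policy engine" reacts
# to outage/recovery events by moving workloads between a shared pool
# of servers and the current allocations.

def apply_policy(event, pool, allocations):
    """On a server failure, move its workloads onto spare servers from
    the shared pool; on recovery, return the server to the pool."""
    if event["type"] == "server_down":
        failed = event["server"]
        for app in allocations.get(failed, []):
            if not pool:
                raise RuntimeError("no spare capacity for " + app)
            spare = pool.pop()
            allocations.setdefault(spare, []).append(app)
        allocations[failed] = []
    elif event["type"] == "server_up":
        pool.append(event["server"])
    return allocations
```

In a product like VFrame DC these events would arrive from external monitoring systems through the Web services API rather than as in-process dictionaries.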


Many application environments can benefit from VFrame DC, including: Web tier such as Apache and IIS; Oracle 11i application suite; SAP R3; BEA Weblogic; IBM Websphere; Oracle 10G RAC; multi-server clustered applications; grid-based applications; and large dynamic development and test environments.


Cisco also announced at a press conference today its vision for next-generation datacenters, called Data Center 3.0, which entails the real-time, dynamic orchestration of infrastructure services from shared pools of virtualized server, storage and network resources, while optimizing application service-levels, efficiency and collaboration. VFrame DC is an important step in delivering this vision.

Additional information and resources on Cisco VFrame DC can be found at http://newsroom.cisco.com/networkersUS/2007/.


Call for Participation: Trust and the Future of the Internet



"Call for Participation: Trust and the Future of the Internet

The Internet Society (ISOC) Board of Trustees is currently engaged in a
discovery process to define a long term Major Strategic Initiative to
ensure that the Internet of the future remains accessible to everyone. The
Board believes that Trust is an essential component of all successful
relationships and that an erosion of Trust - in individuals, networks, or
computing platforms - will undermine the continued health and success of
the Internet.

The Board will meet in special session the first week of October 2007 for
intensive study focused on the subject of trust within the context of
network enabled relationships. As part of this process, the Board is
hereby issuing a call for subject experts who can participate in the two
day discussion. Topics of interest include: the changing nature of trust,
security, privacy, control and protection of personal data, methods for
establishing authenticity and providing assurance, management of threats,
and dealing with unwanted traffic.

Participants will be selected based on a short paper summarizing
individual interests and qualifications as well as availability. The
retreat will be held in Toronto, Ontario (CA) . Travel and accommodation
costs will be covered by ISOC and participants should expect to arrive
October 4th and depart on the 6th or 7th. Expressions of interest may be
emailed to: Oct07-retreat @ elists.isoc.org and papers should not exceed
three pages. Papers must be received by August 24th, 2007, and the Program
Committee will make its selections on or before September 7th, 2007.
Subject experts will be allotted one hour for presentation on October 5th
and will be included in the day's round-table discussions. In order to
facilitate open discussion, final presentation materials should be
forwarded to ISOC no later than September 21st, 2007.

We look forward to a lively and informative meeting on this important
topic and encourage you to share this announcement with your communities
of interest."

Lucy Lynch
Director of Technical Projects
Internet Society (ISOC)


Some cool Web 2.0 mashup and workflow tools for science and business applications


[IBM's development group has produced some useful tools to help businesses and scientists get their workflow and mashup applications up and running. And the myGrid team has launched a new repository for the sharing and reuse of various scientific workflows. Thanks to Richard Ackerman and Ed Pimentl for these pointers -- BSA]

http://services.alphaworks.ibm.com/qedwiki/
http://www.alphaworks.ibm.com/tech/web2db2
http://services.alphaworks.ibm.com/



http://scilib.typepad.com/science_library_pad/2007/08/myexperiment---.html


myExperiment makes it really easy for the next generation of scientists to contribute to a pool of scientific workflows, build communities and form relationships. myExperiment enables scientists to share, re-use and repurpose workflows and reduce time-to-experiment, share expertise and avoid reinvention.

myExperiment introduces the concept of a workflow bazaar: a collaborative environment where scientists can safely publish their creations, share them with a wider group and find the workflows of others. Workflows can now be swapped, sorted and searched like photos and videos on the web.

myExperiment is a Virtual Research Environment which makes it easy for people to share experiments and discuss them.

We are currently working with our users to determine exactly how they want this site to work. We had a user meeting at the end of September 2006 to brainstorm myExperiment, and you can read some of the results from this meeting at our portal party wiki.

Currently, a lightweight repository of workflows and the Taverna BioService Finder are available.

Scientists should be able to swap workflows and publications as easily as citizens can share documents, photos and videos on the Web. myExperiment owes far more to social networking websites such as MySpace and YouTube than to the traditional portals of Grid computing, and is immediately familiar to the new generation of scientists. myExperiment provides a personalised environment which enables users to share, re-use and repurpose experiments - reducing time-to-experiment.

We expect to start with focused pilot myExperiment portals based upon case studies for the specific areas of Astronomy, Bioinformatics, Chemistry and Social Science.


Virtualization, SOA and Service Oriented "Infrastructure"


[Here are a couple of excellent articles on the concept of Service Oriented Infrastructure (SOI), where web services are used to represent various virtualized and real distributed computational, network, storage and data facilities. Web services provide a new management tool that allows managers of these facilities to quickly configure and re-arrange them as required. This, of course, was also the rationale behind UCLP - to allow users, whether enterprise managers or researchers, to reconfigure physical and virtual facilities, including network elements, as they saw fit for their application. Some excerpts from eWeek and GridToday -- BSA]

http://www.eweek.com/article2/0,1895,2158548,00.asp

Virtualization has proved itself in the data center, where companies are deploying the technology as a way to consolidate hardware, save on power and cooling costs, and enhance disaster recovery capabilities.

Now, industry observers say, virtualization will play a key role in the growing SOA (service-oriented architecture) movement. In fact, David Greschler, director of integrated virtualization strategy at Microsoft, in Redmond, Wash., calls virtualization "the key enabler for SOA."

"Everything is tied together. What virtualization does is provide a way for all these pieces [of IT infrastructure] to be separated from each other, but also to work together," Greschler said.

Those pieces he is talking about include applications, operating systems, presentation layers, virtual machines, and storage and network devices.

[...]

At this level resides concepts such as policy-based management and the enablement of self-managing virtualized systems, he said.



[From www.gridtoday.com]

Connecting the Dots: Applications and Grid Infrastructure
By Yaron Haviv, CTO, Voltaire


The move to grid or grid-like architectures within the datacenter
brings many benefits, such as growth of capacities, lower costs, and
support for an increasing number and variety of applications. This
trend has also brought additional infrastructure requirements and
associated challenges. Connecting and managing hundreds or thousands
of servers and networked storage, and incorporating server and storage
virtualization technologies, has created communication challenges,
network complexity and a steep learning curve for getting the most out
of the ability to virtualize infrastructure.

With all of this complexity to deal with, have we figured out how much
time and resources it takes to deploy applications over grids?

This article examines how grids built around a service-oriented
architecture (SOA) focusing on business tasks, business flows, and
service delivery will significantly shorten the time and efforts in
application deployment and configuration, while delivering the
greatest efficiencies. A datacenter grid model is proposed, which
includes considerations for deployment and provisioning tools for
applications, server and storage infrastructure, and high-performance
grid fabrics.

[..]

Service-Oriented Infrastructure (SOI) Management

While virtualization of servers, storage and fabrics is a key element
to achieving a flexible and more efficient datacenter, it also is
critical to develop a new approach to data center resource management.
Instead of manual procedures by which administrators create and
configure the infrastructure, infrastructure resources should be
dynamically created and configured based on the application
requirements. This is achieved through the use of SOI management
tools.

Fabric provisioning and SOI management tools, such as Voltaire
GridVision Enterprise software, depend on the use of dynamic and
unified datacenter fabrics, which have loose relationships between
resources and can be programmed to create whatever topology or logical
links are needed at a given time or to satisfy a given application
load.

These tools are complementary to many of the virtualization and
automation/provisioning tools in market today because they focus on
the infrastructure and connectivity aspects of virtual datacenter
resources. They can integrate with the server virtualization products
(such as Xen and VMware) and typically use an open and extensible API
for optional integration with server and storage provisioning tools.
Orchestration and scheduling tools can use the SOI Web services API
and object models to provision infrastructure as needed, collect
health/performance information and get notified on infrastructure
events and changes.
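As a hypothetical miniature of that pattern (GridVision's real API will certainly differ), an SOI manager exposes provisioning over once-wired gear plus event notification, and an orchestrator consumes both:

```python
# Invented names throughout: this only shows the shape of an SOI
# management interface as described above, not any vendor's API.

class SOIManager:
    def __init__(self, servers):
        self.free = list(servers)      # physical gear, wired once
        self.topologies = {}           # topology name -> its servers
        self.listeners = []            # orchestration/scheduling tools

    def provision(self, name, count):
        """Carve a logical topology out of the shared pool on demand."""
        if count > len(self.free):
            raise RuntimeError("insufficient free resources for " + name)
        self.topologies[name] = [self.free.pop() for _ in range(count)]
        self.notify({"event": "provisioned", "topology": name})
        return self.topologies[name]

    def subscribe(self, callback):
        # Orchestration tools register to be told about infrastructure
        # events and changes.
        self.listeners.append(callback)

    def notify(self, event):
        for cb in self.listeners:
            cb(event)
```

The "wired once" claim in the next paragraph corresponds to the fixed `servers` list here: the physical inventory never changes, only the logical topologies carved out of it.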

With a SOI, equipment can be wired once, thus eliminating physical
user intervention. Complex application deployment procedures that
cross organizational boundaries can be automated and conducted in few
minutes rather than days or weeks. They are less error-prone and
consume fewer resources. Furthermore, infrastructure can be
built-to-order to meet application-specific requirements with the
right balance of CPU, network and storage resources. Ultimately, this
makes applications on a grid more efficient and eliminates
bottlenecks.



Crowdsourcing as a tool for research and development

[From a posting on Dewayne Hendricks list -- BSA]





Randy Burge interviews Alpheus Bingham, co-founder of Innocentive via telephone

Alpheus Bingham knew something big had to shift in the way invention and innovation happened at pharmaceutical giant Eli Lilly. A top R&D executive at Lilly in the mid 1990s, Bingham, along with others, struggled to devise new ways to leverage knowledge to reduce the ridiculously high costs of developing new medicines.

Drug discovery moves at its own expensive glacial pace. Progress is throttled by complex tangles of chemistries, physiologies, mind-sets, regimens, efficacies, budgets, regulators, stockholders, and a thousand other variables. How does a company innovate its innovation?

Bingham scanned the environment for new methods and inspirations to generate more diversity and throughput in Lilly's R&D idea pool. Creative ferment was high, but the need for change was even higher. How did the Lilly team invent something as radical as crowdsourced R&D in an industry burdened by protocols and the status quo?

Lilly, in a bold move, launched e.Lilly to incubate nascent solutions like the one that became Bingham's crowdsourcing company, Innocentive. But launching Innocentive was the easy part: could such open-ended crowdsourced potential be integrated into the formal channels of R&D?


Innocentive is now adapting its crowdsourcing model to the social philanthropy arena and beyond. It is a story for the innovation ages.



Innocentive has recently partnered with the Rockefeller Foundation in an exciting new dimension for both organizations. The partnership evolves the Innocentive model to elicit and manage crowdsourced solutions for critical social medicine and other problems and challenges addressed by the Rockefeller Foundation. Tell us about this new crowdsourced philanthropic mission.




Tony Hey's presentation on the "Social Grid" - Web 2.0 and eScience

[I highly recommend taking a look at Tony Hey's recent presentation at OGF on the "Social Grid". Lots of excellent examples of using Web 2.0 and Amazon EC2/S3 for eScience applications. Thanks to Savas Parastatidis for this pointer from his excellent blog (but note his blog only works with Microsoft Internet Explorer) -- BSA]

Tony Hey's presentation at OGF http://www.ogf.org/OGF20/materials/828/The%20Social%20Grid%20(OGF20).pdf

Savas Parastatidis blog: http://savas.parastatidis.name/2007/05/13/4cab27f5-185d-41f9-9c1e-79517f139d6b.aspx


Tony Hey's slides on "The Social Grid" are now available (6.5MB PDF). The talk focused on the value of using the Web and its existing, stable technologies to deliver value to eScience. It also encouraged the Grid community to consider modern Web usage patterns and technologies, like "software-as-a-service", social networking and semantics, as the means to meet scientists' requirements in a manner familiar to them.

Web 2.0 Social Networking tools for dogs

[Thanks to Richard Ackerman for this pointer -- BSA]


http://www.sniflabs.com/

If you're passing through a dog park in Boston in the coming months and happen to catch a glimpse of a funny little device hanging off a pooch's collar, don't be surprised. A startup called SNIF Labs is gearing up to beta test a technology designed to help dogs--and their owners--become better acquainted.

SNIF Labs--the company's name is short for Social Networking in Fur--is developing what its website calls "a custom radio communications protocol" that allows special tags dogs wear on their collars to swap dog and owner information with other SNIF-tag users. When two dogs wearing tags come within range of each other, the tags start to swap dog and even owner information.

Once owners are back home and using the company's social-networking service, they can trade information about their dogs and themselves online.

from MIT Technology Review