Monday, December 17, 2007

Cloud computing is the way forward rather than the Grid


[Savas Parastatidis is part of Tony Hey's Technical Computing team at Microsoft. He recently gave an excellent presentation at SC07 called "The Web as the Platform for Research" at the Grid Computing Environments (GCE) workshop.

I completely agree with his assertion that “Cloud computing is the way forward rather than the Grid. Organizations are not going to be sharing resources [when] you can definitely get resources cheaper from the cloud rather than having to maintain them and then share them with others.”

The Amazon EC2/S3 services are a good example of this trend towards cloud computing, and I expect that soon many other large organizations, such as IBM, Microsoft and Google, will be offering cloud services as well.

As I mentioned in previous postings, the cost of using cloud services is often lower than the cost of powering and cooling a typical cluster at a university, never mind the overhead and operational costs of such a facility.
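
To make that concrete, here is a hedged back-of-the-envelope comparison. Every number in it (node count, power draw, cooling overhead, electricity and instance prices, utilization) is an assumption chosen for illustration, not measured data:

    # Back-of-the-envelope: campus cluster power/cooling vs. renting from EC2.
    # All figures below are illustrative assumptions, not real quotes.
    NODES = 50                 # small campus cluster
    WATTS_PER_NODE = 400       # assumed draw per node
    PUE = 2.0                  # assumed: one watt of cooling per watt of compute
    KWH_PRICE = 0.10           # assumed electricity price, $/kWh
    HOURS_PER_YEAR = 24 * 365

    cluster_kwh = NODES * WATTS_PER_NODE * PUE * HOURS_PER_YEAR / 1000.0
    cluster_cost = cluster_kwh * KWH_PRICE   # power + cooling only; staff and
                                             # hardware refresh would add more

    EC2_HOURLY = 0.10          # assumed per-instance-hour rate
    UTILIZATION = 0.30         # assumed: cluster is actually busy 30% of the time
    ec2_cost = NODES * HOURS_PER_YEAR * UTILIZATION * EC2_HOURLY

    print("Cluster power+cooling: $%.0f/year" % cluster_cost)   # ~ $35,000
    print("EC2, billed only when busy: $%.0f/year" % ec2_cost)  # ~ $13,000

The sketch turns on the utilization term: a lightly used cluster pays for power and cooling around the clock, while cloud capacity is billed only for the hours actually used.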

As well, many small commercial organizations are establishing a variety of free web services and tools for the research and education community using cloud services like Amazon's. They make money whenever a third-party researcher uses these free services on the compute cloud - sort of a sophisticated "click advertising" revenue model. This also frees the researcher from worrying about the complexity and politics of using a grid or local HPC resource; they can instead focus on the core competency of their research field. - BSA]

http://savas.parastatidis.name/2007/11/12/919a1978-9c1f-4b7b-8824-009363863b8e.aspx


Wednesday, October 31, 2007

Web 2.0 allows users to bypass the IT department

[Excerpts from Information Week article -- BSA]


http://www.informationweek.com/news/showArticle.jhtml?articleID=202601956&pgno=4&queryText=

By Andy Dornan
InformationWeek
Forget outsourcing. The real threat to IT pros could be Web 2.0. While there's a lot of hype and hubris surrounding wikis, mashups, and social networking, there's also a lot of real innovation--much of it coming from increasingly tech-savvy business users, not the IT department.

"We've cut IT staff by 20%, and we're providing a whole lot more in terms of IT services," says Ken Harris, CIO at nutritional products manufacturer Shaklee. Harris started with a mashup platform from StrikeIron; he found mashups such an effective way to integrate multiple Web services that he turned to Web-based service providers to replace in-house functions. Now, Shaklee gets its ERP from Workday and search from Visual Sciences, and it's looking at other IT functions that software as a service can replace.

Instead of passive consumers, Web surfers can become active creators.

All that interactivity ought to make Web 2.0 ideally suited for business use.
Of all Web 2.0 technologies, social networking is the one that gets vendors and venture capitalists most excited. At least 17 startups are pitching social networking technology to business customers (see table, Social Networking Technology Startups), while countless social networking Web sites are chasing individual users.


IT'S LOOSENING GRIP
Loss of IT control is a consistent theme as Web 2.0 penetrates business. The greatest upheaval is likely to come from enterprise mashups, which combine the social and technical aspects of Web 2.0 by letting users develop their own applications. Though very few businesses use mashups at present, those that do see great benefits, and larger players such as BEA, IBM, and Oracle are entering the game. Cutting out the middleman--that's the IT department--can be a great way of aligning business and technology.

"Mashups have let end users do more of what used to be done by IT," says Warren Breakstone, executive VP in charge of client services at investment tools provider Thomson Financial. Although not in the IT department, Breakstone started using a hosted mashup service from Serena Software and now runs a team of business analysts who develop Web-based applications for sales, marketing, and other personnel. "Now we're moving into traditional IT services: The IT department is using apps that we built."

Breakstone says this doesn't bring his team into conflict with the IT department. "It frees IT up to do those mission-critical tasks behind the scenes," he says.


User-Driven Innovation: Profiting from Mashups and Open APIs



[Here are a couple more pointers on the democratization of innovation. There are now a number of companies specializing in building platforms for mashups and open APIs. As well, in Europe, one expression of the emergence of this open innovation model is the European Network of Living Labs. As Artur puts it, the hope is that this new approach to open innovation could help redesign the Internet in a different way, with more involvement of the users; for example, by engaging media companies and media users in the process. Thanks to Michael Weiss and Artur Serra for these pointers
-- BSA]

User Innovation with Open APIs: Profiting from Mashups http://www.ebusinesscluster.com/media_lib/docs/Nov_2007_Wiess.pdf

Open Living Labs
www.openlivinglabs.org

Some telcos "Get it"


[Excerpts from my post on Internet Evolution -- BSA]

http://www.internetevolution.com/author.asp?section_id=506&doc_id=136710&

Although many telcos are treated as the popular whipping boys for all that’s wrong with today’s telecom environment, there’s a small number that are starting to understand the dynamics of the new marketplace.

A good example is Netherlands-based KPN Telecom NV (NYSE: KPN), which recently announced that it's joining forces with Reggefiber to speed up the rollout of FTTH (fiber to the home) in Almere, the fifth largest city in the Netherlands. Reggefiber already owns some networks in smaller towns and in parts of cities, including the project in Amsterdam. [..]

KPN, like many other telcos, is losing many customers to the local cable companies. Cablecos can offer full triple-play services -- high-speed Internet access, television, and telephone -- quite easily, whereas most telcos can only deliver Internet and telephony. Telcos are positioning themselves to take advantage of IPTV, but it’s still an open question whether this technology will succeed over DSL networks. The rollout in Almere is likely to spur more cooperation between Reggefiber and KPN in other parts of The Netherlands.

Swedish telco Telia has come to a similar conclusion: municipal open access networks may not be such a bad thing. Telia used to be a vociferous opponent of municipal open networks. But over time, many municipalities have discovered that they don't have the necessary skills and financing to manage an open network. Many municipal networks in Sweden have issued calls for proposals from third parties to operate their open access networks. Guess who won most of these deals?
[..]

What’s even more surprising is that some telcos are also starting to realize that peer-to-peer (P2P) networks, and other supposedly nefarious applications, are actually good services that should be encouraged rather than penalized. Currently, most telcos treat P2P users as costly parasites on their networks, because only a small number of users consume most of the bandwidth. Their knee-jerk response has been to block such traffic with Layer 7 filters from companies like Sandvine Inc. (London: SAND) or, in extreme cases, to disconnect these power users entirely.

More advanced telcos are starting to deploy technologies from a number of P2P companies, such as BitTorrent Inc. and Joost, which enhance the P2P experience for their customers rather than trying to block it. By distributing super node servers throughout the network, telcos can reduce unbalanced P2P traffic loads and provide a much better P2P experience for their customers. And as P2P companies increasingly obtain licensing arrangements with the music and film industry, the threat of legal action and charges of aiding and abetting piracy become less of a concern.
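
The super-node approach is, at bottom, traffic localization: steer P2P transfers toward nearby, on-net sources. Here is a toy sketch of the peer-ranking step; the operator prefixes and peer addresses are invented for illustration, and real deployments use much richer topology information:

    # Toy peer ranking: try peers inside the operator's own address space first,
    # keeping traffic on-net. Prefixes and addresses are illustrative only.
    import ipaddress

    ON_NET_PREFIXES = [ipaddress.ip_network("203.0.113.0/24"),
                       ipaddress.ip_network("198.51.100.0/24")]

    def is_on_net(ip):
        addr = ipaddress.ip_address(ip)
        return any(addr in net for net in ON_NET_PREFIXES)

    def rank_peers(candidates):
        """Sort candidate peers so on-net addresses are tried first."""
        return sorted(candidates, key=lambda ip: 0 if is_on_net(ip) else 1)

    print(rank_peers(["192.0.2.7", "203.0.113.42", "198.51.100.9"]))
    # ['203.0.113.42', '198.51.100.9', '192.0.2.7']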

As more telcos understand that their own survivability is at stake, the smarter ones will realize that new business models and relationships are critical to their success.
[..]


The end of Grid Computing

[I have always been skeptical of grid computing and networking when it is narrowly defined as "utility computing" or "bandwidth on demand" networks. Both concepts remind me of the bad old days of computing service bureaus and circuit-switched networks. In both cases extensive centralized administrative processes are required to ensure adequate capacity and support. Virtualization, on the other hand, provides a lot more user control of both the computation and network facilities. It also enables greater convergence between cyber-infrastructure and the next generation of the Internet. A good example of this approach is 4WARD, a very exciting program on network virtualization that the EU has just launched, which is very similar to CANARIE's UCLP initiative -- BSA]



Grids: On Demand or Virtual? http://www.canarie.ca/canet4/library/recent/Gridnets_Oct_17_2007.ppt

Convergence of Cyber-Infrastructure and Next Generation Internet http://www.canarie.ca/canet4/library/recent_presentations.html

4WARD
The Success of the Internet is also its Failure http://www.emobility.eu.org/Events/2007-09-04_PIMRC_Conference_Athens/General_4WARD_public.pdf

http://telzur.blogspot.com/2007/10/end-of-grid-computing.html

The End of Grid Computing?
In 2003, MIT Technology Review ranked "Grid Computing" among the 10 Emerging Technologies That Will Change the World [1]. Four years later, something is not going well with "Grid Computing". An indication that there is a problem can easily be seen by looking at the "Google Trends" plot for the term "Grid Computing":


[Google Trends plot for "Grid Computing" appeared here in the original post.]
This finding can be compared with another buzzword, "Virtualization", which is older than "Grid Computing" and yet is gaining more and more momentum:
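
The original post embedded the second Google Trends plot here. The same comparison can be reproduced programmatically; the sketch below uses the unofficial pytrends library, which postdates this post, so treat the API as an assumption to check against its current documentation:

    # Sketch: compare search interest in "grid computing" and "virtualization".
    # pytrends is an unofficial Google Trends client (pip install pytrends).
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US", tz=0)
    pytrends.build_payload(["grid computing", "virtualization"], timeframe="all")
    interest = pytrends.interest_over_time()  # pandas DataFrame, one column per term
    print(interest.tail())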


There is, however, one exception. The academic Grid still enjoys a lot of glory thanks to the huge, heavily funded European (EGEE) and other US projects. When LHC data taking starts at CERN, it will reach its peak importance. But it seems that for other scientific projects Grid Computing is not going to be such a success. It will remain "nice to have" but will never replace High-Performance Computing (HPC) on the one hand, or classical distributed computing tools such as Condor [2], which has existed for more than 20 years, on the other. Once the government funding is removed, all the hype around academic Grid Computing will decline very quickly as well. As was pointed out in an interesting talk by Fabrizio Gagliardi about the future of grid computing at the GridKa07 school, other kinds of Grid Computing infrastructures that stand on stable financial ground may emerge as successors: for example, Amazon's S3 and EC2, and the joint IBM and Google cloud computing initiative.


The democratization of Innovation - the Economist


[The latest issue of the Economist has an excellent report on "Innovation". One of the main themes of the report is that ICT and the Internet are transforming the way innovation is done, and the speed at which it happens. Innovation used to be largely the purview of "white coat" researchers at universities and industry labs. Governments have poured billions of dollars into university research, but have seen little in return in terms of innovation and new products or services. Now, thanks to the Internet, SOA, Web 2.0 tools, open source business models and the like, a much larger community can extract this knowledge from academic research and be full participants, if not leaders, in the innovation process. This will be a significant advantage for entrepreneurs and small business, and will have a significant transformative effect on our economy. Thanks to Craig Dobson for this pointer. Some excerpts from the Economist report -- BSA]

Democratization of Innovation
http://www.canarie.ca/canet4/library/recent/ManagementofInnovation_Carleton_Oct10_2007.ppt

Economist Report http://www.economist.com/specialreports/displayStory.cfm?story_id=9928154


How globalisation and information technology are spurring faster innovation

Now the centrally planned approach is giving way to the more democratic, even joyously anarchic, new model of innovation. Clever ideas have always been everywhere, of course, but companies were often too closed to pick them up. The move to an open approach to innovation is far more promising. An insight from a bright spark in a research lab in Bangalore or an avid mountain biker in Colorado now has a decent chance of being turned into a product and brought to market.

That is why innovation matters. With manufacturing now barely a fifth of economic activity in rich countries, the “knowledge economy” is becoming more important. Indeed, rich countries may not be able to compete with rivals offering low-cost products and services if they do not learn to innovate better and faster.

The move toward open innovation is beginning to transform entire industries

Mr Chesbrough's two books “Open Innovation” and “Open Business Models” have popularised the notion of looking for bright ideas outside of an organisation. As the concept of open innovation has become ever more fashionable, the corporate R&D lab has become decreasingly relevant. Most ideas don't come from there (see chart 4).

IBM is another iconic firm that has jumped on the open-innovation bandwagon. The once-secretive company has done a sharp U-turn and embraced Linux, the open-source operating system. IBM now gushes about being part of the “open-innovation community”, yielding hundreds of software patents to the “creative commons” rather than registering them for itself. However, it also continues to take out patents at a record pace in other areas, such as advanced materials, and in the process racks up some $1 billion a year in licensing fees.

For a business that uses open and networked innovation, it matters less where ideas are invented. Managers need to focus on extracting value from ideas, wherever they come from.

Infrastructure as a web service - IaaS


[Web services and SOA have much broader applicability than linking software systems together. These architectures are now being used to represent physical devices as web services as well. A good example of infrastructure as a web service is in the telecommunications environment, where tools like UCLP expose every telecommunications element and service (including management and control plane elements) and virtualize them as web services. This allows end users to compose or orchestrate their own network solutions, linking together servers, network links, instruments, control plane knobs and storage devices into a sophisticated articulated private network. Some excerpts from the article below --BSA]
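
To make the idea concrete, here is a minimal sketch of what a network element exposed as a web service looks like from the user's side. The endpoint, resource names and fields are hypothetical (UCLP itself used WSDL/SOAP interfaces); this only illustrates the orchestration pattern:

    # Hypothetical client for a UCLP-style lightpath provisioning service.
    # The URL and JSON fields are invented for illustration, not a real API.
    import json
    import urllib.request

    BASE = "https://nms.example.net/uclp/v1"   # hypothetical service endpoint

    def create_lightpath(src_port, dst_port, bandwidth_gbps):
        """Ask the provisioning service to cross-connect two ports."""
        body = json.dumps({"src": src_port, "dst": dst_port,
                           "bandwidth_gbps": bandwidth_gbps}).encode()
        req = urllib.request.Request(BASE + "/lightpaths", data=body,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)["lightpath_id"]

    # Compose an end-to-end articulated private network from two segments.
    seg1 = create_lightpath("ottawa:oc192-1", "chicago:oc192-4", 10)
    seg2 = create_lightpath("chicago:oc192-4", "amsterdam:oc192-2", 10)
    print("provisioned segments:", seg1, seg2)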

Argia Network Infrastructure Web Services http://www.inocybe.ca/

http://soa.sys-con.com/read/439721_1.htm


The many crucial jobs IT performs for a company are hard enough: provisioning employees and keeping their workstations up and running; protecting data to meet the stringent requirements imposed by Sarbanes-Oxley, HIPAA, PCI, and other regulations; managing data recovery and business continuity; and so on. The risks of operating with inadequate resources or of burning unnecessarily through corporate funds are unwelcome addenda to the IT department's burden.

Now imagine a world where you can scale your IT capacity up or down on command without any capital expenditure.

This world exists. It's enabled by a new business concept based on virtualizing the IT environment and is called Infrastructure as a Service. IaaS extends the remote hosting concept of Software as a Service (SaaS) to hardware.

The interest in IaaS can be attributed to significant increases in IT-enabled business models such as e-commerce, Web 2.0 and SaaS, which drive demand, and by advances in technology that enable it, including virtualization, utility computing, and data center automation.



Infrastructure as a Service is generally delivered in addition to a utility computing platform. As long as you have a platform like VMware for virtualization, you look identical to your infrastructure provider. So, if you wanted to push 30 machines or 50 machines out of your data center for 90 days, you could easily bring them back because you're both running the same virtual platform.
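
A sketch of that elasticity with today's tooling, using the boto3 AWS SDK (which postdates this article): the AMI ID is a placeholder, and the instance type and region are arbitrary choices:

    # Sketch: "push 30 machines out of your data center", then bring the
    # capacity back by terminating them. The AMI ID below is a placeholder.
    import boto3

    ec2 = boto3.resource("ec2", region_name="us-east-1")

    instances = ec2.create_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder machine image
        MinCount=30, MaxCount=30,
        InstanceType="t3.small",
    )
    ids = [i.id for i in instances]
    print("burst capacity running:", ids)

    # ...90 days later: release the capacity; billing stops with the instances.
    ec2.instances.filter(InstanceIds=ids).terminate()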



Gartner's top 10 strategic technologies for 2008


[Some excerpts from a Network World article -- BSA]


http://www.networkworld.com/news/2007/100907-10-strategic-technologies-gartner.html?netht=101007dailynews2&&nladname=101007dailynews


1. Green IT

This one is taking on a bigger role for many reasons, including an increased awareness of environmental danger; concern about power bills; regulatory requirements; government procurement rules; and a sense that corporations should embrace social responsibility.

But IT is still responsible for 2% of all carbon releases, and it’s coming from many sources. “Fast memory is getting to be a surprisingly high energy consuming item,” Claunch said.

2. Unified Communications (UC)



3. Business Process Management

BPM is more of a business discipline than a technology, but it is necessary to make sure the technology of service-oriented architectures (SOA) delivers business value, Cearley said. It’s also important for dealing with laws like Sarbanes-Oxley that require businesses to define processes, he said.

“SOA and BPM have common objectives,” Cearley said. “They’re both focused on driving agility, driving business process improvement, flexibility and adaptability within the organization. SOA is a key mechanism that makes BPM easier.”

4. Metadata Management

Metadata is the foundation for information infrastructure and is found throughout your IT systems: in service registries and repositories, Web semantics, configuration management databases (CMDB), business service registries and in application development.

“Metadata is not just about information management,” Cearley said. “You need to look beyond that. Metadata is everywhere.”


5. Virtualization 2.0

“Virtualization 2.0” goes beyond consolidation. It simplifies the installation and movement of applications, makes it easy to move work from one machine to another, and allows changes to be made without impacting other IT systems, which tend to be rigid and interlinked, Claunch said.

There are also disaster recovery benefits, since the technology lets you restack virtual systems in different orders in recovery centers, providing more flexibility.

“Virtualization is a key enabling technology because it provides so many values,” Claunch said. “Frankly it’s the Swiss Army knife of our toolkit in IT today.”

6. Mashups & Composite Applications

Mashups, a Web technology that combines content from multiple sources, have gone from being virtually unknown among IT executives to being an important piece of enterprise IT systems. “Only like 18 months ago, very few people (knew what a mashup was),” Cearley said. “It’s been an enormous evolution of the market.”

U.S. Army intelligence agents are using mashups for situational awareness by bringing intelligence applications together. Enterprises can use mashups to merge the capabilities of complementary applications, but don’t go too far.

“Examine the application backlog for potential relief via mashups,” the analysts stated in their slideshow. “Investigate power users’ needs but be realistic about their capabilities to use mashups.”
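
The mechanics behind Gartner's definition are simple enough to sketch: pull structured data from two sources and join it on a shared key in the client. Both feed URLs and record fields below are hypothetical placeholders:

    # Minimal mashup: join a list of field offices with a weather feed so each
    # office can be plotted on a map with current conditions. URLs are invented.
    import json
    import urllib.request

    def fetch_json(url):
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)

    offices = fetch_json("https://intranet.example.com/api/offices.json")
    weather = fetch_json("https://weather.example.org/api/current.json")
    weather_by_city = {w["city"]: w for w in weather}

    # The mashed-up view: one record per office, enriched from the second feed.
    for office in offices:
        w = weather_by_city.get(office["city"], {})
        print(office["name"], office["lat"], office["lon"],
              w.get("conditions", "n/a"))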

7. Web Platform & WOA

Web-oriented architecture, a version of SOA geared toward Web applications, is part of a trend in which the number of IT functions being delivered as a service is greatly expanding. Beyond the well-known software-as-a-service, Cearley said over time everything could be delivered as a service, including storage and other basic infrastructure needs.

“This really is a long-term model that we see evolving from a lot of different parts of the market,” Cearley said. It’s time for IT executives to put this on their radar screens and conduct some “what-if” scenarios to see what makes sense for them, he said.


9. Real World Web

Increasingly ubiquitous network access with reasonably useful bandwidth is enabling the beginnings of what analysts are calling the “real world Web,” Claunch said. The goal is to augment reality with universal access to information specific to locations, objects or people. This might allow a vacationer to snap a picture of a monument or tourist attraction and immediately receive information about the object, instead of flipping through a travel book.

10. Social Software

Social software like podcasts, videocasts, blogs, wikis, social bookmarks, and social networking tools, often referred to as Web 2.0, is changing the way people communicate both in social and business settings.

“It’s really been empowering people to interact in an electronic medium in a much richer fashion than we did with e-mail or corporate collaboration systems,” Cearley said.

The effectiveness of these tools for enterprise use varies, and some tools that have the potential to improve productivity aren’t yet mature enough for enterprise use, Gartner says. For example, wikis are highly valuable and mature enough for safe and effective enterprise use. Meanwhile, Gartner says prediction markets potentially have a lot of enterprise value but so far have low maturity. Podcasts, conversely, can be used safely and effectively but don’t have a lot of business value, the analyst firm said.


Mashups for the Masses - Intel Mash Maker


http://mashmaker.intel.com/

Intel® Mash Maker: Mashups for the Masses


Intel® Mash Maker is an extension to your existing web browser that allows you to easily augment the page that you are currently browsing with information from other websites. As you browse the web, the Mash Maker toolbar suggests Mashups that it can apply to the current page in order to make it more useful for you. For example: plot all items on a map, or display the leg room for all flights.

Intel® Mash Maker learns from the wisdom of the community. Any user can teach Mash Maker new mashups, using a simple copy-and-paste interface, and once one user has taught Mash Maker a mashup, this mashup will be automatically suggested to other users. Intel® Mash Maker also relies on the community to teach it about the structure and semantics of web pages, using a built-in structure editor.

Cyber-infrastructure, libraries and network science

[Richard Ackerman at CISTI maintains an excellent blog on libraries and eScience. His latest posting is well worth reading --BSA]


http://scilib.typepad.com/science_library_pad/2007/09/e-science-and-l.html

E-Science and libraries

As I said over a year and a half ago:

A lot of science today is very computation and data intense. I think there is a big role for academic libraries as custodians of data and research output.

Science Library Pad - the role of the academic library and librarian - January 13, 2006

Fortunately, there are lots of people thinking about the role of the library in relation to cyberinfrastructure, as well as e-research and publishing.

This month's D-Lib has a number of relevant articles and opinion pieces, including

Library Support for "Networked Science"
by Bonita Wilson
doi:10.1045/september2007-editorial

Cyberinfrastructure, Data, and Libraries, Part 1: A Cyberinfrastructure Primer for Librarians
by Anna Gold, Massachusetts Institute of Technology
doi:10.1045/september2007-gold-pt1

Cyberinfrastructure, Data, and Libraries, Part 2: Libraries and the Data Challenge: Roles and Actions for Libraries
by Anna Gold, Massachusetts Institute of Technology
doi:10.1045/september2007-gold-pt2

The editorial by Bonita Wilson points to "The Dawn of Networked Science" in The Chronicle of Higher Education (which is not open to read, so no link for them).

I also recommend the August 2007 CTWatch Quarterly, which was on the topic "The Coming Revolution in Scholarly Communications & Cyberinfrastructure".

A reminder that there will be an E-Science track in the upcoming OECD meeting on October 3, 2007.

For more information on this topic, you can see my E-Science category.


Burlington Telecom FTTH Case Study


[For any community that is looking to deploy a community fiber network, I highly recommend taking a look at this study on the Burlington project. Although the Burlington network bills itself as an "open access" network, its architecture, like that in Amsterdam, is built around home-run fiber terminating on GPON at the central office. GPON is used only to reduce interface costs, so if at a future date a customer wanted layer-one connectivity to a given service provider, bypassing the PON altogether, it would only require a simple fiber patch. This provides a future-proof architecture as new technologies and business models evolve. Thanks to Frank Coluccio and Tim Nulty for this pointer -- BSA]

Burlington Telecom Case Study
Christopher Mitchell, Director, Telecommunications as Commons Initiative christopher@ilsr.org | August 2007
http://www.newrules.org/info/bt.pdf


Wednesday, September 12, 2007

"Freeconomics", future of the Internet and network neutrality

[Here are some excellent additional pointers on the development of "freeconomics" and its resulting impact on the future of the Internet. As Susan Crawford points out in her insightful blog, carriers want to remold the Internet into the mobile telephone business model, where every service and application is monetized and they earn a small percentage on each transaction. Hence prioritized services, QoS and walled gardens are essential to their future. Counteracting that trend, however, is the development of many new free services and applications, from Skype to Google Earth. As I pointed out in my last post, many companies in the UK and France are offering free Internet, telephony and cable TV in creative bundles. If free is the way of the future Internet, then how is the infrastructure going to be paid for? One possible solution is something like our Green Broadband project. Some excerpts from Chris Anderson's and Susan Crawford's blogs, as well as an article in the Financial Times -- BSA]

Green Broadband and Free Gigabit Internet to the Home http://www.canarie.ca/canet4/library/customer/Green_Broadband.ppt

The Rise of Freeconomics
Chris Anderson http://www.longtail.com/the_long_tail/2006/11/the_rise_of_fre.html


[..]
I begin my economics of abundance speech with Carver Mead's mind-bending question: "What happens when things get (nearly) free?" His answer is that you waste them, be they transistors or megabytes of bandwidth capacity. You use them profligately, extravagantly, irresponsibly. You shift out of conservation mode and get into exploitation mode. You do crazy things like offering people the ability to put their whole music collection in their pocket, or promising the average email user that they'll never have to delete another message to conserve space. Just as Alan Kay "wasted" transistors to create the graphic user interface, we will all learn how to waste newly abundant resources, retraining our minds to ignore our instincts about costs and scarcity.

Today we have an unprecedented number of resources that are closing in on free when measured in units that were once meaningful to regular folks. Through the 1950s and 1960s Mead watched transistors drop from $100 each to $10, then $1, then $0.10, then a penny. Then, in the 1970s as transistors were integrated into semiconductor chips, they fell to a millicent and then a microcent. They're now nearly down to a nanocent--virtually free. Hard drives now go for about 30 cents per gigabyte, or .03 cents per megabyte (I remember my first 10-megabyte drive, which cost me a few weeks salary at the time). Bandwidth now costs less than ten cents per gigabyte at retail, and it wouldn't surprise me to hear that it's fallen below the penny-per-gigabyte level for big commercial outfits. How long would it have taken you to download a gigabyte of data in the old dial-up days, if you could even keep a connection open that long?
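
Anderson's closing question invites a quick calculation; the link speeds are the only assumptions added here:

    # How long does a gigabyte take over dial-up versus mid-2000s broadband?
    GIGABYTE_BITS = 8 * 10**9

    for name, bits_per_second in [("56k modem", 56_000), ("8 Mbit/s DSL", 8_000_000)]:
        hours = GIGABYTE_BITS / bits_per_second / 3600
        print("%s: %.1f hours" % (name, hours))
    # 56k modem: 39.7 hours (assuming the connection never dropped)
    # 8 Mbit/s DSL: 0.3 hours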

With apologies to Levitt and Dubner, I'll cheekily call the emerging realization that abundance is driving our world "freeconomics". Understanding when to shift out of scarcity mode and start giving away what you once held dear is a core competency for our age. Heck, there might even be a book in it!


Why giveaways are changing the face of business
Michael Schrage http://www.ft.com/cms/s/2/01e4b1a4-9741-11da-82b7-0000779e2340.html

Never in history has so much innovation been offered to so many for so little. The world’s most exciting businesses – technology, transport, media, medicine and finance – are increasingly defined by the word “free”.

Google charges users nothing to search the internet; neither does Yahoo nor Microsoft MSN. E-mail? Instant messaging? Blogging? Free. Skype, the Luxembourg-based company that is now a multibillion-dollar division of Ebay, offers free VOIP – Voice over Internet Protocol – telephone calls worldwide. San Francisco-based Craigslist provides free online classified advertising around the world.

In America, the Progressive insurance group gives comparison-minded shoppers free vehicle insurance quotes from its competitors. Innumerable financial service companies offer clients free tax advice, online bill payments and investment research. Michael O’Leary, Ryanair’s colourful founder, predicts his discount carrier may soon offer free tickets to his cost-conscious euro-flyers.

Of course, Milton Friedman, the Nobel economist, is right: just as “there’s no such thing as a free lunch”, there is also no such thing as a “free innovation”. These “free” offerings are all creatures of creative subsidy. Free search engines have keyword-driven advertisers. Financial companies use cash flow from profitable core businesses to cost-effectively support alluringly “free” money management services. Ryanair counts on the lucrative introduction of in-flight gambling to make its “free tickets” scenario a commercial reality. Innovative companies increasingly recognise that innovative subsidy transforms the pace at which markets embrace innovation. “Free” inherently reduces customer risk in exploring the new or improved – and bestows competitive advantage. To the extent that business models can be defined as the artful mix of “what companies profitably charge for” versus “what they give away free”, successful innovators are branding and bundling ever-cleverer subsidies into their market offerings. The right “free” fuels growth and profit. Technology has successfully upgraded King Gillette’s classic “razor & blades” business model.

The simple reality is that technology will continue eroding entry barriers to provocative cross-subsidy. The more digital or virtual a process, product or service, the faster and easier crafting clever subsidies become. Scale matters, too. Global scale facilitates global subsidies. Just as advertisers subsidise free Google searches, marketers can easily download advertising-supported “free” songs, videos and games into iPods, Sony PSPs and Nokia phones. Internet-based telephone calls similarly lend themselves to sponsorship: “This free call from your brother in New York is brought to you by Tesco . . . please press #1 to accept . . . ” While that prospect will not thrill traditional telecommunications companies, consumers might appreciate the “free” choice.


Susan Crawford on Network Neutrality http://scrawford.blogware.com/blog/_archives/2007/9/10/3221233.html

[...]
Non-neutrality also serves the interests of those who would like (more generally) to see the internet morphed into something much more akin to the current wireless model here in the US: a fully-monetized network, permitting use of particular applications that share their revenues with the network access provider. (This network would not be the same thing as the internet.)

Another document came out last week that ties this all together. It's from the ITU, and it's called "Trends in Telecommunication Reform 2007: The Road to Next-Generation Networks (NGN)."

The ITU defines "NGN" as a network that provides quality-of-service-enabled transport technologies. The idea is that packet transport will be "enriched with Multi Protocol Label Switching (MPLS) to ensure Quality of Service (QoS)."

Translation, as far as I can tell: packet transport becomes the same as circuit-switched transport. Prioritization is controlled; it's a network optimized on billing.

Digital rights management: Desirable, inevitable, and almost irrelevant


[Here are a couple of pointers on the issue of DRM. The first is by the famous iconoclast Andrew Odlyzko, titled "Digital rights management: Desirable, inevitable, and almost irrelevant".

The second pointer is from Paul Budde who was privileged to follow a presentation by the award-winning Canadian essayist and novelist, John Ralston Saul.

Both are well worth reading on the issue of DRM and on the related topic of Identity Management Systems (IdM), which are, in many cases, an essential component of DRM, are equally unproven and illogical, and have the potential to undermine some fundamental freedoms we take for granted in society -- BSA]



http://www.dtc.umn.edu/~odlyzko/doc/drm2007.pdf
Digital rights management: Desirable, inevitable, and almost irrelevant

DRM is attractive for several related reasons. Content providers feel they can get more control over their wares. Such control is comforting in general, and could enable new methods of charging, which might provide greater revenues. More generally, the Internet is enabling sellers to find out much more about buyers' ability and willingness to pay, and also (through DRM and other techniques) is providing sellers with tools to control usage (and thus prevent arbitrage), leading to unprecedented opportunities and incentives for price discrimination [8, 9]. Thus it should not be surprising that extensive efforts have gone into research, development, and deployment of DRM.

Yet the record of DRM so far is not too inspiring. And a rising chorus of voices (including Steve Jobs of Apple) is urging the content industry to give up or at least relax its insistence on DRM. The lecture summarized here will review the arguments of DRM skeptics. This abstract provides a very brief overview of some of the main points. References are given to my papers, where those points are explained in more detail, and citations are provided to the extensive literature on the subject.

The fundamental issue that limits current use and future prospects of DRM is ... to maximize the value of your intellectual property, not to protect it for the sake of protection.

DRM all too often gets in the way of maximizing the value of intellectual property. To some extent this is the fault of the DRM technologies. We simply do not know how to build secure systems. The last half a century demonstrates this conclusively.
[...]




Changing societies and the role of telecoms http://www.buddeblog.com.au/changing-societies-and-the-role-of-telecoms/



Introduction
I was recently privileged to follow a presentation by the award-winning Canadian essayist and novelist, John Ralston Saul. He was lecturing at the same conference at which I was speaking – the annual local government management conference (SOLGM) in Wellington New Zealand.

John has been hailed as one of the great thinkers of our time. He gave an awe-inspiring presentation, full of new ideas and philosophies about our society, culture, economic developments, the environment and aboriginal values.

[..]
John mentioned that our western societies still had not adjusted to the new economic order. Our modern economic society started somewhere around 1770 and was based on competition for limited resources, capital, labour and also the products we produced – food, clothing, luxury goods and so on were scarce. However since the 1960s the economic order has changed from one based on scarcity to one based on surplus.

[...]

The fallacy of Intellectual Property rights
As we have moved away from agricultural and manufacturing societies we have changed into a society that is based on ideas and information. It is therefore unforgivable that we are limiting the growth, and the flow, of ideas and information through Intellectual Property rules and regulations.

According to John, the inclusion of Intellectual Property into the WTO is severely hampering the flow of new ideas and information; it is attempting to control the dissemination of ideas, thus making the spread and sharing of them increasingly difficult. This is damaging the new economy!

In the digital media industry we see this happening through Digital Rights Management (DRM). This is a totally unsustainable state of affairs and needs to be changed.

There must surely be better ways to protect Intellectual Property.

At the BuddeComm Digital Media Roundtable in August clear evidence was given that people are prepared to pay for Intellectual Property, but that it needs to be based on conditions that are acceptable to society as a whole, and not on the power of the few who want to protect the scarcity economy.

[...]




This first-ever international policy forum on the participative web will bring together experts from around the world, from policy makers and academics to business executives and a wide range of civil society. The presentations and discussions, around the themes of Creativity, Confidence and Convergence, will contribute to the OECD Ministerial Meeting on The Future of the Internet Economy in Seoul, Korea, 17-18 June 2008 (www.oecd.org/FutureInternet).

Preparations for the 2007 Participative Web Technology Foresight Forum, hosted by the OECD and the Canadian government and taking place in Ottawa, Canada, on 3 October 2007, are well underway, and registration is now open.

Questions to be addressed in the Foresight Forum include: What does the future hold for the participative web? What are the trends and impacts on knowledge creation, business, users and governments? What are the implications for enhancing confidence and trust in the Internet? What is the government role in providing the right environment for stimulating Internet innovation and economic growth?

The details of the Forum and the means to register are on the OECD website at: http://www.oecd.org/futureinternet/participativeweb. The site includes many tools for online participation. Please participate!



Speakers include: John Lettice, Co-founder and Editorial Director of The Register, Jungwook Lim, Vice President for Service Innovation, Daum Communications, Robert Sutor, Vice President for Open Source and Standards, IBM, Jonathan Taplin, Professor, University of California, Michael Gill, Chief Executive Officer, Fairfax Business Media, Hemanshu Nigam, Chief Security Officer, MySpace, Bob Young, Founder, Lulu.com, Antony Williams, Author of ‘Wikinomics’, Paul Misener, Head of Global Public Policy, Amazon.com, Andrew Herbert, Managing Director, Microsoft Research, Cambridge, UK, Daniel E. Atkins, Director, Office of Cyberinfrastructure, The National Science Foundation, Kiyoshi Mori, Vice Minister Policy Coordination, Japanese Ministry of Internal Affairs and Communications, Bill St. Arnaud, Senior Director Advanced Networks, CANARIE Inc., Andres Monroy-Hernandez, MIT Lego and Lifelong Kindergarten initiative, Ginsu Yoon, VP International, Second Life, Quitterie Delmas, AGORAVOX and responsible for the blogosphere of French Presidential candidate Bayrou, Ellen Miller, Executive Director, Sunlight Foundation, Anne Bucher, Head of Division, Directorate General Information Society (European Commission), Gary Davis, Deputy Commissioner, Office of the Data Protection Commissioner, Ireland, Michael A. Geist, Canadian Research Chair of Internet and e-Commerce Law, Sangwon Ko, Executive Director, Division of Information Industry Research, Korea Information Society Development Institute, Chris Kelly, Vice President of Corporate Development and Chief Privacy Officer, Facebook, Richard Simpson, Director, E-Commerce Branch, Industry Canada / Chair ICCP OECD Committee, and notable representatives from civil society such as The Public Voice, CPTech, etc.



Monday, September 10, 2007

Will the future of the Internet be free?



[Chris Anderson - the author of "The Long Tail" and senior editor at Wired Magazine - is working on a new book about a new model for business on the Internet, aptly called Free. Google and other companies pioneered the concept, but there are now many other examples of how this business model is transforming even the delivery of basic Internet service.

For example, in the UK and France, because of strongly enforced structural separation (actually functional separation) and unbundling, there is a host of new companies offering creative bundles of services in which basic DSL Internet is free. TalkTalk, Sky and Iliad are examples of this innovative approach. In other cases companies are bundling free music downloads with their Internet service.

One of the best examples of this business model is Inuk networks - www.inuknetworks.com. They have been working with the UK university research network (UKERNA) to deliver free over the air TV, telephony and Internet to students in dormitories at universities throughout the UK. They make their money by selling "eyeballs" to the over the air broadcasters who then get a direct and exact measure of who is watching their programming.

As UKERNA only operates the network, there is natural structural separation between the network operator and those offering services. Kudos to UKERNA for demonstrating how research networks can showcase new Internet business models not only for academia, but for the larger consumer market as well.

Another good example is the Google Earth flight simulator versus purchasing a PC-based product.
-- BSA]


Chris Anderson's blog http://www.longtail.com/the_long_tail/2007/06/three_things_ab.html


Google Earth Flight Simulator http://marco-za.blogspot.com/2007/08/google-earth-flight-simulator.html



The Fiber Bible - a comprehensive overview of municipal and Fiber to the Home projects


[Thanks to Benoit Felten for this pointer. He also maintains an excellent blog on all sorts of FTTx activities worldwide -- BSA]


Benoit Felten's Blog: http://harmonica.typepad.com/fiberevolution/

The Fiber Bible: http://harmonica.typepad.com/fiberevolution/FttX-in-Europe-August-2007-A.doc

An overview of European (muni and other) Fiber to the Home and fiber backbone projects

Telco 2.0 - Web 2.0 and mashups for telcos



[Microsoft has established an interesting web site of over 145 mashups and Web 2.0 tools for network management and provisioning for telcos. Also, the Telephony Online podcast is very interesting, with some good examples of how mashups can be used in a telco network environment. However, I remain very skeptical that one big elephant in Redmond, WA can teach a bunch of other aging elephants to be nimble and agile like gazelles. Thanks to Frank Coluccio for this pointer -- BSA]

Microsoft Web site for mashups for telcos www.networkmashups.com

A Telephony Podcast: Microsoft and Telco 2.0

Executive Editor Rich Karpinski talks with Andy Chu, group manager for planning and strategy in Microsoft’s Communications Sector division, about Telco 2.0, networked mashups and how carriers can combine new Web 2.0 technologies with advanced network functionality to create altogether new services.

http://ct.pbinews.com/rd/cts?d=244-14118-32-26868-20726-744066-0-0-0-1

First e-VLBI data from China-Australia, China-Europe and Australia-Europe baselines


[CANARIE is pleased to have played a small part in this groundbreaking event by providing the lightpaths across North America linking Asia to Europe -- BSA]

Dear all

The JIVE press release of yesterday's successful intercontinental eVLBI
demo involving Australian, Chinese and European telescopes is available
at http://www.expres-eu.org/ShAuEu_fringes.html

A 300dpi image of the telescopes and paths involved is shown at
http://www.expres-eu.org/PHOTOS/evlbi_map_300dpi.jpg

Yesterday's event demonstrated the first real-time correlation results
from Chinese-Australian and Chinese-European baselines. Data were also
obtained from an Australian-European baseline for a short time as the
target source set and rose in observing areas on opposite sides of the
earth.

Many thanks to all the astronomers, network engineers and others who
worked so hard to make this such an impressive demonstration.

best wishes to all

George

Japan's Warp-Speed Ride to Internet Future


[Thanks to Jim Baller for this posting. I highly recommend Jim Baller's web site for those who are interested in municipal and community broadband. Lots of very useful and practical information, from pole attachments to building municipal networks. Some excerpts from the Washington Post article -- BSA]


Jim Baller's web site
www.baller.com

http://www.washingtonpost.com/wp-dyn/content/article/2007/08/28/AR2007082801990_pf.html

Japan's Warp-Speed Ride to Internet Future
TOKYO -- Americans invented the Internet, but the Japanese are running away with it.

Broadband service here is eight to 30 times as fast as in the United States -- and considerably cheaper. Japan has the world's fastest Internet connections, delivering more data at a lower cost than anywhere else, recent studies show.

Accelerating broadband speed in this country -- as well as in South Korea and much of Europe -- is pushing open doors to Internet innovation that are likely to remain closed for years to come in much of the United States.

The speed advantage allows the Japanese to watch broadcast-quality, full-screen television over the Internet, an experience that mocks the grainy, wallet-size images Americans endure.

Ultra-high-speed applications are being rolled out for low-cost, high-definition teleconferencing, for telemedicine -- which allows urban doctors to diagnose diseases from a distance -- and for advanced telecommuting to help Japan meet its goal of doubling the number of people who work from home by 2010.

"For now and for at least the short term, these applications will be cheaper and probably better in Japan," said Robert Pepper, senior managing director of global technology policy at Cisco Systems, the networking giant.

Japan has surged ahead of the United States on the wings of better wire and more aggressive government regulation, industry analysts say.

In sharp contrast to the Bush administration ... regulators here compelled big phone companies to open up wires to upstart Internet providers.

In short order, broadband exploded. Indeed, DSL in Japan is often five to 10 times as fast as what is widely offered by U.S. cable providers, generally viewed as the fastest American carriers. (Cable has not been much of a player in Japan.)

Perhaps more important, competition in Japan gave a kick in the pants to Nippon Telegraph and Telephone Corp. (NTT), once a government-controlled enterprise and still Japan's largest phone company. With the help of government subsidies and tax breaks, NTT launched a nationwide build-out of fiber-optic lines to homes, making the lower-capacity copper wires obsolete.

"Obviously, without the competition, we would not have done all this at this pace," said Hideki Ohmichi, NTT's senior manager for public relations.

His company now offers speeds on fiber of up to 100 megabits per second -- 17 times as fast as the top speed generally available from U.S. cable. About 8.8 million Japanese homes have fiber lines -- roughly nine times the number in the United States.

The burgeoning optical fiber system is hurtling Japan into an Internet future that experts say Americans are unlikely to experience for at least several years.

Shoji Matsuya, director of diagnostic pathology at Kanto Medical Center in Tokyo, has tested an NTT telepathology system scheduled for nationwide use next spring.

It allows pathologists -- using high-definition video and remote-controlled microscopes -- to examine tissue samples from patients living in areas without access to major hospitals. Those patients need only find a clinic with the right microscope and an NTT fiber connection.

Japan's leap forward, as the United States has lost ground among major industrialized countries in providing high-speed broadband connections, has frustrated many American high-tech innovators.

"The experience of the last seven years shows that sometimes you need a strong federal regulatory framework to ensure that competition happens in a way that is constructive," said Vinton G. Cerf, a vice president at Google.

Japan's lead in speed is worrisome because it will shift Internet innovation away from the United States, warns Cerf, who is widely credited with helping to invent some of the Internet's basic architecture. "Once you have very high speeds, I guarantee that people will figure out things to do with it that they haven't done before," he said.

Yet the story of how Japan outclassed the United States in the provision of better, cheaper Internet service suggests that forceful government regulation can pay substantial dividends.

For just $2 a month, upstart broadband companies were allowed to rent bandwidth on an NTT copper wire connected to a Japanese home. Low rent allowed them to charge low prices to consumers -- as little as $22 a month for a DSL connection faster than almost all U.S. broadband services.


Web 2.0, SOA and mashups for front-line military and medical applications


[The acronyms are getting a bit silly - Military 2.0, Medicine 2.0, Battlefield 2.0, etc. But the underlying application of Web 2.0, SOA and mashups to all sorts of endeavours continues unabated, and the impact of these technologies on various aspects of business and society remains significant. Some excerpts from NetworkWorld and a pointer from Richard Ackerman -- BSA]

http://www.networkworld.com/news/2007/082807-us-troops-swap-combat-ideas.html?netht=082807dailynews1&


Troopideas.com is not exactly “MySpace for war fighters," but it’s a Web site that invites frontline troops to post their ideas for improving the combat experience. Engineers and developers then use Web 2.0 techniques such as mashups and wikis to turn those ideas into reality.

Dozens of U.S. servicemen and women have so far posted submissions to the Web site since it was launched early in August. The ideas range from low tech to high tech: Creating a helmet-mounted mirror that lets a soldier see behind his back, for example, and a scheme for blocking the cell phone transmissions used to detonate roadside bombs.

Increasingly, many of these problems are being solved by applying Web 2.0 technologies, such as service-oriented architectures (SOA), wikis and social networking that can help compress the typical product development timetable.

[...]

The information battlefield

Integrating information and creating context for understanding is a recurring theme in problems and ideas percolating up from the field. One soldier described how difficult it is to find out what strike options to use against a potential target. Army units typically have to make separate cell phone calls, for example, to Navy, Air Force or Marine counterparts to discover what artillery or missile or aircraft assets are available. “That introduces tremendous delay,” says Loftus. “Often you have to make a [strike] decision before all those calls get completed.”

Gestalt used SOA tools and interfaces to pull information about available strike assets from the existing command and control systems of the different services, displaying the integrated data on a Web page. Now, an Army commander can see what options he has to strike a target, how long it will take for each to hit the target, and evaluate the results and effects of each weapon.
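
A hedged sketch of the integration pattern described above: fan out to each service's asset interface in parallel and merge the answers into one view. The endpoints and record fields are invented stand-ins, since the real command and control interfaces are obviously not public:

    # Pattern sketch: query several services' asset feeds concurrently, then
    # render one integrated view. All endpoints and fields are hypothetical.
    import json
    import urllib.request
    from concurrent.futures import ThreadPoolExecutor

    SERVICES = {
        "army": "https://c2.example.mil/army/assets.json",
        "navy": "https://c2.example.mil/navy/assets.json",
        "air":  "https://c2.example.mil/air/assets.json",
    }

    def fetch_assets(url):
        with urllib.request.urlopen(url, timeout=5) as resp:
            return json.load(resp)  # e.g. [{"asset": ..., "time_on_target_min": ...}]

    with ThreadPoolExecutor() as pool:
        results = dict(zip(SERVICES, pool.map(fetch_assets, SERVICES.values())))

    # One integrated page instead of three phone calls:
    for service, assets in results.items():
        for a in assets:
            print(service, a["asset"], a["time_on_target_min"])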


Medicine 2.0

http://scienceroll.com/medicine-20/

I’m pretty sure that web 2.0, the new generation of web services, will play (and already is playing) an important role in the future of medicine. These web tools, expert-based community sites, medical blogs and wikis can ease the work of physicians, scientists, medical students or medical librarians.

We believe that the new generation of web services will change the way medicine is practiced and healthcare is delivered.

So I decided to collect sites, presentations and services that could be helpful for medical experts. I collaborate with Ves Dimov (at clinicalcases.org), Bob Coffield (at healthcarebloglaw.blogspot.com), Brian Jefferson and Ken Civello (at askdrwiki.com).


I’ve had several opportunities to present my work in many clinics here in Debrecen, Hungary, but now I’m planning to search for universities around the world interested in a presentation like this one below. If you’re interested, just send me an e-mail to berci.mesko [at] gmail.com.

Medicine 2.0= web 2.0 + medicine

[...]

Monday, August 27, 2007

Future of Research Networks in Europe, China and the United States


[Here are a couple of pointers to some analysis of future research network directions. I especially recommend the European EARNEST study, as its conclusions and recommendations will be input to the design of GEANT 3 in Europe --BSA]

Results of EARNEST study https://wiki.internet2.edu/confluence/download/attachments/16835/Karel+Vietsch+Dorte+Olesen+EARNEST-DO-1.ppt?version=1

GEANT 3 https://wiki.internet2.edu/confluence/download/attachments/16835/David+West+CCIRN+GEANT2_3+26_8_07.ppt?version=1

China CNGI https://wiki.internet2.edu/confluence/download/attachments/16835/Xing+Li+02-CNGI-CERNET2-report.ppt?version=1

US Advanced Networking R&D plan http://www.nitrd.gov/advancednetworkingplan/



In France, music downloads will be free - how the Internet is changing the music industry


[A couple of recent announcements illustrate how the Internet is radically changing the music business. The music business is one of the first of many industries that will be fundamentally changed by the Internet, especially with the advent of Web 2.0, open source and SOA. In my books, any industry whose survival depends on takedown orders, clumsy security protocols, lawyers and government lobbying is a SELL. Think movies, publishing and telecom. These exciting new developments in France, in my opinion, reflect the strong pro-competitive marketplace for broadband that has been enforced by regulators and policy makers in that country. The French telecom regulator is one of the most enlightened in the world. Thanks to various blogs and contributors including Benoit Felten, Om Malik, Dave Farber and Dewayne Hendricks -- BSA]

In France, Music will be Free http://gigaom.com/2007/08/22/in-france-music-will-be-free/

Neuf launches a "free" music offer as part of its triple-play package http://www.fiberevolution.com/2007/08/neuf-launches-a.html

Artist, promote thyself
Thanks to new Web businesses, musicians can reach a bigger audience
-- and keep more of the profits for themselves

A full-time career in music seemed unlikely for Chris O'Brien, or at
least one that would pay the bills.

But these days, the 27-year-old Medford musician is selling thousands
of albums online, along with downloads from his debut CD,
"Lighthouse," and he soon plans to offer T-shirts, tickets, and other
merchandise on his MySpace page and personal website.

He credits at least part of his newfound business acumen to nimbit, a
sales, promotion, and distribution company in Framingham that helps
emerging artists build careers online.

"This is the era of the independent artist," O'Brien said. "It's
easier and more doable than it ever has been. People are opting to
remain independent because there's a lot more money to be had."

Nimbit is one of a growing number of businesses, including CD Baby
and Musictoday, that have helped make it easier for independent
musicians to make a living from their work and widely distribute
their music.

It is the brainchild of Patrick Faucher and Matt Silbert, who worked
for a Web firm, Stumpworld Systems, which developed some of the first
e-commerce sites for bands such as Phish and Aerosmith.

About five years ago, they decided to design a platform to help
budding bands, so they set out to take some of the features created
for the major acts and build a suite of Web tools that independent
artists could use.

Soon after, they merged with Artist Development Associates and added
direct-to-fan sales, along with production and promotion services,
creating a one-stop solution for artists to run their businesses.

In June, nimbit introduced its online merchandise table, the first
portable Web store that lets musicians sell CDs, DVDs, MP3s,
merchandise, and e-tickets from a single point of purchase, virtually
anywhere online. The tool can easily be embedded in any website,
blog, or e-mail that allows widgets.

"Increasingly, recording artists and consumers are uniting and
circumventing traditional channels for creating and distributing
music," said Mike Goodman, a media and entertainment analyst at
Yankee Group in Boston. "These days, musicians can do business
directly with consumers. They don't need a recording label. They
don't need a store. They don't need Ticketmaster, the way they used to."

Just a few years ago, Steve Roslonek, of Wethersfield, Conn., was
getting e-mail orders for his CDs and going to the post office once a
week to send off the packages. His growing success as a children's
musician made it almost impossible to keep up with the requests. With
the help of nimbit over the past several years, he has earned more
than $100,000 from sales of CDs, tickets, and merchandise.

The full article can be found here:


This is another sign that disruptive business models are having an
immense impact on traditional business models. The fact is that many
of these industries have no idea how to compete with technologies
they barely understand.


Survey says: only DRM-free music is worth paying for
By Ken Fisher | Published: August 05, 2007 - 10:32PM CT

free-music-is-worth-paying-for.html>

One of the largest surveys of music consumers to closely examine the
question of Digital Rights Management (DRM) has an important two-part
message for the music industry. The first is that DRM is definitely
turning consumers off music sales, and charging them extra to get rid
of it may be an uphill battle. The second message is that knowledge
of DRM and its problems is spreading fast.

Entertainment Media Research, working with media law firm Olswang,
conducted lengthy online surveys with 1,700 UK music consumers,
selected from a pre-existing panel of more than 300,000 music
consumers in the UK (PDF: 2007 Digital Music Survey). What makes this
survey important is the fact that it was aimed squarely at the music-
buying public, not the anti-RIAA crowd, not the techno-libertarians,
and not our general readership. I've been told more than once that
the views on DRM found at publications like Ars Technica are "not
representative" of the general public. Perhaps this was once the
case, but it can no longer be maintained generally. At least in the
UK, the dirt on DRM is out, and it's spreading.
First, the bird's eye view: 68 percent of those with opinions on the
matter say that the only music worth purchasing is that which is DRM-
free. Yet less than half (39 percent) are willing to pay a little
extra for it, while 18 percent say that they'd rather save a little
dough and keep the DRM if they had to choose between the two. In the
middle is a mass of people with no opinion on the matter, because
they're not sure what DRM is or don't know their preference. That
will likely soon change.

Familiarity with DRM has grown significantly in the last year. In
2006, more than half of respondents had never heard of DRM, but that
number has dropped 16 percentage points in 2007, to 37 percent. The
number of people who claimed to have a good or exact knowledge of DRM
nearly tripled in that same timeframe.
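To make the implied 2006 figure explicit, here is a minimal worked check; the 37 percent and 16-point numbers come from the survey summary above:

    # Back out the 2006 "never heard of DRM" share from the figures above.
    unaware_2007 = 37    # percent who had never heard of DRM in 2007
    drop_in_points = 16  # percentage-point decline from 2006 to 2007

    unaware_2006 = unaware_2007 + drop_in_points
    print(f"Never heard of DRM in 2006: {unaware_2006}%")  # 53% -- "more than half"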

Of those who have some idea of what DRM is, their views are largely—
but not entirely—negative. 61 percent said that DRM "invades the
rights of the music consumer to hear their music on different
platforms." 49 percent called it a "nuisance," and 39 percent
expressed concerns that DRM could have privacy implications. Despite
this, 63 percent agreed that DRM "is a good idea because it protects
copyrighted music from illegal file-sharers." In other words, the
idea of stopping illegal file-sharing via DRM doesn't bother these
consumers much, but the effect the effort is having on their own
purchases is not appreciated.

This view isn't surprising. Few are those who, in principle, believe
that all information (and content) should be "free"; the mainstream
viewpoint is still staunchly in the "artists should be compensated"
camp. This appreciation for the music business does not surmount all
other concerns, however.

Consumers aren't interested in a "nuisance" for the sake of stopping
file-sharers, and of course those of us who pay closer attention to
the world of DRM know that DRM actually does not stop file-sharing at
all. As this general truth spreads, so does dissatisfaction.

The takeaway from the survey is that DRM's bad reputation is
spreading among general music consumers, and there is a growing
aversion to purchasing music that comes with DRM. Despite this, the
general understanding of the struggle the industry faces with piracy
is still somewhat positive among those same consumers. Still, given
that file sharing in the UK is at an all-time high, it would appear
that the music industry needs to remove the digital locks on its
tunes, and fast.

http://HTDAW.livedigital.com/blog/89163

Prince Points the Way to a Brighter Future for Music
07.09.07 | 2:00 AM

In his autobiography, Miles Davis wrote that Prince was the only
musician in the world capable of moving music forward. Davis was
referring to musical prowess, but he may as well have been talking
about Prince's business acumen, as evidenced by his upcoming album
giveaway -- the latest in a long series of innovative maneuvers,
including his escape from a Warner Music Group contract in 1994,
early support for P2P trading and status as one of the first major
artists to sell music from his website.

Davis' last, best hope for the future of music most recently outraged
the music establishment by saying he'll give away CDs of his Planet
Earth album to British fans who purchase next week's Mail on Sunday
newspaper. In light of the giveaway, Sony/BMG refused to distribute
the album in Great Britain, provoking outbursts from music retailers
who had been cut out of the action.

Paul Quirk, co-chairman of Britain's Entertainment Retailers
Association, threatened: "The Artist Formerly Known as Prince should
know that with behavior like this he will soon be the Artist Formerly
Available in Record Stores."

Part of the problem, according to retailers, is that Prince's move
helped solidify a growing perception on the part of consumers that
music is free.

Jack Horner, creative and joint managing director for Frukt, a music-
marketing agency, said that while "people like (Prince) play a key
part in helping figure out what the models may be in the music
business of tomorrow, by giving away a whole album on the front of a
newspaper, there is a very clear devaluing of music, which is not a
positive message to send out right now."

Neither the Mail on Sunday nor Prince's camp would divulge how much
the newspaper paid Prince for the right to give his album away, but
it's clear Prince was paid upfront, and that nearly 3 million Mail on
Sunday readers -- plus everyone who bought tickets to one of his
shows -- will receive the CD for free. The giveaway almost certainly
contributed to Prince selling out 15 of his 21 shows at London's O2
Arena within the first hour of ticket sales. The venue (formerly the
Millennium Dome) holds around 20,000 people. If the remaining six
shows sell out, the series will gross over $26 million.
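A rough sanity check on that gross figure (a minimal sketch; it assumes the roughly 20,000-seat capacity cited above and a uniform average ticket price, which the article does not state):

    # Back-of-the-envelope check of the claimed O2 Arena series gross.
    shows = 21                  # total London shows
    capacity = 20_000           # approximate O2 Arena capacity, per the article
    claimed_gross = 26_000_000  # dollars, if all shows sell out

    tickets = shows * capacity
    implied_price = claimed_gross / tickets
    print(f"{tickets:,} tickets -> implied average price ${implied_price:.2f}")
    # 420,000 tickets -> implied average price $61.90, a plausible figure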

Combined with the undisclosed fee paid by the Mail on Sunday, it's
not a bad take for someone who's involved in a "very clear devaluing
of music."

Prince's latest gambit also succeeded by acknowledging that copies,
not songs, are just about worthless in the digital age. The longer an
album is on sale, the more likely it is that people can find
somewhere to make a copy from a friend's CD or a stranger's shared-
files folder. When copies approach worthlessness, only the original
has value, and that's what Prince sold to the Mail on Sunday: the
right to be Patient Zero in the copying game.

As with blogging and so many other things digital, music distribution
could become a competition to see who posts things first. In a sense,
music distribution would no longer be about space -- it would be
about time.

More bands and labels are likely to explore the idea of squeezing
extra value out of their music by selling off the right to be first,
as traditional sources of revenue continue to dry up. Universal's
recent insistence on an "at will" contract with Apple music store,
for instance, is thought to be part of a plan for the world's largest
record label to start selling the exclusive rights to debut certain
albums. And nowhere is it written in stone that music stores are the
only candidates for buying those rights.

Artists have licensed music to advertisers for decades, of course,
but this goes a step further: allowing the licensee to function as
the music's distributor (at least initially). If this idea catches
on, artists and labels looking to duplicate Prince's success will
have to proceed with caution if they want to avoid accusations of
selling out.

In the '90s, a popular slogan for posters and graffiti in and around
my college radio station was "Corporate Rock Sucks," and although
that attitude no longer seems prevalent, fans still routinely revolt
when they hear one of their favorite songs used in a car ad.

Prince ensured that the Mail on Sunday version of his album looks
identical to the one sold in stores, giving it the clear appearance
of coming with the paper, rather than being of the paper. Companies
that want to make a business out of music sponsorships, like RCRD LBL
(an upcoming collaboration between Engadget's Pete Rojas and Downtown
Records), will have to negotiate sponsorships with similar care. If
they do, brands, fans and bands large and small stand to benefit.

Eliot Van Buskirk has covered digital music since 1998, after seeing
the world's first MP3 player sitting on a colleague's desk. He plays
bass and rides a bicycle.

http://www.wired.com/entertainment/music/commentary/listeningpost/2007/07/listeningpost_0709





Open Source Business for Telecommunications and Other Businesses


[This is an excellent journal for those who want to understand the business opportunities of the open source business model and why many venture capital companies are investing in them. Open source business models combined with SOA and Web 2.0 (sometimes referred to as Business 2.0 or Enterprise 2.0) have significant potential to radically transform today's business processes. Early adopters will have a significant first-mover advantage in the marketplace. Thanks to Tony Bailetti for this pointer -- BSA]

The August issue of the Open Source Business Resource (OSBR) is now available in PDF and HTML formats at http://www.osbr.ca. The OSBR is a publication of the Talent First Network (TFN).

The August issue in pdf format is at: http://www.osbr.ca/news/august07.pdf and in html format at: http://www.osbr.ca/archive.php


In the August issue of OSBR, Peter Liu from DragonWave Inc. examines open source telecommunications companies (OST) and finds that 9 of 12 OST companies are supported by venture capital.

Dave McIlhagga from DM Solutions Group and Ervin Ruci from Geocoder.ca describe how their respective businesses interact with the open source geospatial community. Bill White from Roaring Penguin Software relates the
business model that grew a one-man consultancy into a successful company which maintains its ties with the open source community.

Tammy Yuan explores the ways the Carrier Grade Linux Working Group is changing the proprietary face of the telecommunications industry.

Rowland Few details the Talent First Network's role within the Ontario open source ecosystem, Michael Weiss outlines the open source patterns lead project, and Peter Hoddinott answers questions regarding open source
business models and commercialization of open source assets.

This issue also contains an expanded upcoming events section, as well as newsbytes, recently published reports, letters to the editor, and how you can contribute to subsequent issues.

We hope you enjoy this issue of the OSBR and look forward to your feedback. If you wish to contribute to the OSBR, please contact Dru Lavigne, OSBR Editor at dru@osbr.ca

The challenges of broadband policy and regulation in North America


[At one time Canada had the second highest penetration of broadband in the world and the US was a bit lower. Both countries have declined substantially in their international rankings. Canada is now around number eight and the USA is in the high teens or low twenties, depending on which study you look at.

This decline in broadband ranking has caused considerable concern amongst the Internet "technocrati", not so much about the absolute rankings, which are always subject to debate, but about the growing trend of appearing to fall further behind the rest of the world. More ominously, it is not only the decline in our respective broadband rankings that is causing angst, but the quality of the broadband being delivered. Most other countries that rank higher in terms of broadband deployment are also deploying various forms of fiber to the home with much higher data rates, as much as 100 Mbps for example in Japan. When both the decline in broadband rankings and the low availability of higher-speed broadband are taken into consideration, the situation in North America looks even more dire.

It is generally accepted that greater broadband penetration and higher speeds are a reflection of a country's ability to innovate and support new business opportunities and greater productivity. With the coming wave of new Internet tools and applications such as Web 2.0, SOA, mashups, new media, citizen science and so forth, a first-rate broadband infrastructure will be critical to our future society and economy. Various studies have shown that ICT in general, and broadband Internet in particular, have a significant and measurable impact on GDP.

Canada and the United States are quite unique in terms of how broadband Internet is delivered compared to most other countries. Both countries have a financially sound and vibrant cable industry which is providing strong competition to the traditional DSL services offered by the telephone companies.

The cable industry's penetration in North America has allowed a very competitive duopoly to exist. In most other countries the argument for regulatory intervention is much more compelling because there largely exists only one monopoly supplier. Structural separation, to varying degrees, is now the mantra of most regulators in those countries.

Structural separation in other infrastructure industries like electricity and gas has been extremely successful (see below). It has had a significant impact in allowing new business models to develop and in reducing costs. We are starting to see the same results with broadband Internet in those countries that have started down the path of structural separation, e.g. England and France. But it is as yet unknown whether structural separation will somehow provide the business case for building fiber to the home or any other type of next-generation network.

It is interesting to note that, historically, it was "regulated" structural separation that allowed the cable-telco duopoly to be created in North America in the first place (a fact often forgotten by many cableco executives). From the 1970s through the 1990s, regulators in Canada and the US prevented telephone companies from acquiring cable companies and/or offering cable TV services. In Canada this structural separation was mandated for cultural content reasons, and in the US it was done to prevent concentration of ownership in the media industry. This strong regulatory enforcement allowed for the creation of a thriving, and now extremely successful, cable industry. In countries where no structural separation was mandated between cable and telephone, the nascent cable industry usually failed to reach any substantial penetration.

As a result of this North American duopoly, other than in some remote rural and northern areas, there is no perceived market failure in delivering broadband. More importantly, most customers are quite happy with their current broadband service, and there is little demand from them for the greater bandwidth that would be afforded by fiber to the home.

There continues to remain the futile hope that wireless, in particular WiMax, will provide inter-modal competition. I remain skeptical, not from a technology perspective, but because of the business case. This is true regardless of whether it is muni WiFi or a more sophisticated commercial service. Any prospective retail broadband wireless company must compete with well-entrenched cable and telephone companies whose infrastructure has been largely amortized through the earlier delivery of their basic cable-TV and telephone services. More importantly, cable and telephone companies can offer a variety of bundled triple-play or quadruple-play packages that are beyond the reach of any prospective wireless broadband company (delivering HDTV over wireless broadband remains a stretch). An important indicator of these developments is the almost complete disappearance of the retail Internet industry in Canada and the US. Although a few small players exist in niche markets, the retail Internet industry has largely been displaced by the facilities-based providers. This was not due to any malfeasance on behalf of the cablecos or the telcos, but to the economic reality of operating a very thin-margin business competing against large mass-market competitors.

So what is a regulator or policy maker to do? I personally don't believe re-regulation is the answer. More facilities-based competition is required. One of the puzzling questions is why the big telephone and cable companies aren't competing in each other's backyard. They have the financial resources and clout to be effective competitors, and yet they seem to be studiously avoiding competing against each other. In Canada we are further challenged by the trade barrier of foreign telecom ownership restrictions.

I naively remain hopeful that a private sector solution will be found. There were many of the same laments about falling behind when Europe was leading with videotex, or Japan with ISDN, GSM cellphones and so on. And yet at the end of the day American ingenuity and entrepreneurship triumphed in all these cases, especially with the Internet.

In my opinion the university/research community has a critical role to play in helping develop new business models and architectures to address this challenge. The MIT "Living the Future" program is a good example of this type of thinking, where non-technical students on and off campus will be encouraged to use the network and develop their own applications and services. The following are some excellent pointers to articles posted on Dewayne Hendricks' and Dave Farber's lists. -- BSA]


Game Over: The U.S. is unlikely to ever regain its broadband leadership
Robert X. Cringely


The question we were left with two weeks ago was "Why has America
lost its broadband leadership?" but it really ought to have been
"Whatever happened to the Information Superhighway?"

It died.

[...]



New OECD report shows limitations of US broadband public policy
By Eric Bangeman

http://arstechnica.com/news.ars/post/20070715-new-oecd-report-shows-limitations-of-us-broadband-public-policy.html

The Organization for Economic Co-operation and Development has just released a 319-page report titled "OECD Communications Outlook 2007" (PDF: http://www.oecdbookshop.org/oecd/get-it.asp?REF=9307021E.PDF&TYPE=browse). As you may have guessed by the title and the size, it's a comprehensive look at the state of the telecommunications industry around the world. Of particular interest is the section on broadband deployment, which tracks usage, deployment, and pricing trends over the past couple of years. Overall, broadband has become faster and cheaper, especially in countries where there are a large number of cable and DSL providers. [..]

Competition is key

The OECD notes that the broadband situation is better in areas with multiple broadband options. "Price decreases and improved services have been the most marked in markets characterized by intense competition," says the report. "Competition may be the product of regulatory intervention, as in the case of local loop unbundling, or may be the result of new infrastructure-based competition."

The countries with the lowest cost per megabit per second are generally characterized by two things: a significant fiber infrastructure and a healthy amount of competition. In Japan and Korea, for instance, fiber is widespread, resulting in the fastest residential broadband speeds available anywhere. In Europe, the regulatory environment allows consumers in many countries to choose from any number of DSL and cable providers. When Nobel Intent correspondent Chris Lee moved into his flat in the Netherlands, he had no less than three cable and three DSL providers competing for his business, including one company—KPN—that offered both. France is another country with abundant broadband competition—and it has the fifth-cheapest broadband in the world in terms of price per Mbps.

In contrast, the Federal Communications Commission's policy of deregulation (http://arstechnica.com/news.ars/post/20050804-5168.html) has left most consumers faced with duopolies (at best) and de facto monopolies (I live over 20,000 feet from the nearest DSLAM in Chicago, so DSL isn't an option for me). The situation is such that the nation as a whole is a broadband laggard, according to one FCC commissioner (http://arstechnica.com/news.ars/post/20061109-8185.html). As a result of the FCC's policies, competition based on price and speed is spotty at best, and fiber deployments are in their early stages.

The FCC's vision of competition entails different broadband modes (e.g., cable versus DSL) rather than different providers offering the same type of service, which is why there have been rumblings about an "open access" requirement for the upcoming 700MHz auction. The FCC is on the wrong track, according to the OECD's reasoning. "Regulatory decisions across most OECD countries to allow the fixed PSTN's incumbents local loop to be unbundled has been a major factor in the development of OECD communications markets and stimulating the development and competitive provision of broadband offers," explains the report.




WHY WI-FI NETWORKS ARE FLOUNDERING
[SOURCE: BusinessWeek, AUTHOR: Olga Kharif]

The road is getting bumpier for cities and the companies they have
partnered with in a bid to blanket their streets with high-speed
Internet access at little or no cost to users. While 415 U.S. cities
and counties are now building or planning to build municipal Wi-Fi
networks, "deployments are slowing down slightly," says Esme Vos,
founder of consultancy MuniWireless.com. Vos's tally still marks a
nearly 70% jump from mid-2006, when there were 247 muni Wi-Fi
projects on tap, but that's down from the torrid pace of a year
earlier, when deployment plans doubled. Perhaps the clearest hint of
trouble ahead is that some of the companies partnering with cities on
these projects, including EarthLink and AT&T, are having second
thoughts about remaining in the municipal Wi-Fi business.

tc20070814_929868.htm?campaign_id=rss_tech>


Muni Wi-Fi: Its limitations
Telephony Online
By Carol Wilson

The recent spate of criticism regarding municipal Wi-Fi networks
falls into two different categories: technology and business case.

Note: This is a three part article. Here are pointers to each part:

Part 1: muni_wifi_networks_071807/index.html>
Part 2: muni_wifi_networks_072707/>
Part 3: muni_wifi_limitations_080207/>



Structural separation has been very successful with electric power and gas. The latest data indicates a significant reduction in prices and increased reliability.
http://www.fraserinstitute.ca/shared/readmore.asp?sNav=pb&id=709

Scientists Create Their Own Web 2.0 Network With NanoHUB

[Some excerpts from GridToday article -- BSA]

www.gridtoday.com

Scientists Create Their Own Web 2.0 Network With NanoHUB


nanoHUB.org, a so-called science gateway for nano-science and nanotechnology housed at Purdue University, is taking the tools of Web 2.0 and applying them, along with a few tricks of its own, to further nano-scholarly pursuits.

The result is a Web site that is a required bookmark for people who get excited about algorithms, carbon nanotubes, nanoelectronics and quantum dots -- the current hot topics on the site.

Soon, other science disciplines, such as pharmacy and medical research, will be launched using the same technology.

"In nanoHUB, if you know the science you can begin to use the tools immediately. nanoHUB puts scientific tools into the hands of people who wouldn't normally touch them with a 10-foot pole."

The nanoHUB is a project of the National Science Foundation-funded Network for Computational Nanotechnology, a consortium of research universities, government agencies and corporate research labs.

Ian Foster, the University of Chicago's Arthur Holly Compton Distinguished Service Professor of Computer Science, director of the Computation Institute at Argonne National Laboratory and the person sometimes labeled the father of grid computing, says nanoHUB is one of the underappreciated successes of the United States' cyberinfrastructure.

Michael McLennan, a senior research scientist for the Office of Information Technology at Purdue, says that just as Google is famously powered by its secret algorithms, the secret sauce of nanoHUB is a software application that sits between the supercomputers at national research facilities that power the site and the Web interface. This "middleware," named Maxwell's Daemon, also finds available computing resources on national science grids and sends job requests to those computers faster than the blink of an eye.

"Maxwell is actually running back here at Purdue and reaching out to high-performance computing resources on the TeraGrid and other science grids across the nation," McLennan says. "This middleware is more sophisticated than running a Java applet in the Web browser."

nanoHUB is the first of several planned science hubs housed at Purdue.

"Eventually we will release Maxwell as open-source software once we test it, package it and write documentation for it," McLennan says. "However, there are still groups that don't want to build their own hubs, even with the middleware, and we are contracting with those groups to build hubs for them." The nanoHUB takes advantage of several Web 2.0 technologies:

* Like YouTube and Digg, nanoHUB consists of user-supplied content. On the site, users find software, podcasts, PowerPoint lectures and even Flash-based video tutorials.

* Like sites such as Flickr or YouTube, nanoHUB has dynamic tags that automatically aggregate into subject categories.

* Like Netflix, users can rate any resource on nanoHUB. Software, podcasts, lectures and contributors' contributions all can be rated by the community.

The real stars of nanoHUB are its simulation tools. So far, 55 nanosimulation software tools have been made available through the site for subjects such as nanoelectronics, chemistry and physics. These tools allow researchers to change data or views of their simulations and see real-time changes. In the last 12 months, there have been more than 225,000 such simulations run on nanoHUB.


SOA and web services for medical and government applications

[Rosie Lombardi is a journalist who has a very good web site on SOA for government applications. I particularly like her recent articles in Government CIO magazine. A good example is how SOA web services are used by the Government of Alberta to link various databases to track down deadbeat dads -- BSA]

CIO Government Review on SOA http://www.intergovworld.com/article/1cf1b5d40a01040801c4b5793333f8a2/pg1.htm


Rosie Lombardi's web site
http://www.rosie-lombardi.com/


Using SOA and web services to track down deadbeat dads http://www.rosie-lombardi.com/otherpubs/government/SOAblocks.html

[Some excerpts from original article--BSA]

In 2005, the Alberta Ministry of Justice deployed a SOA-based enhancement to its maintenance enforcement program (MEP), driven by new legislation enacted the year before. Under this "Deadbeat Dad" legislation, the Ministry was concerned with finding ways to track and enforce adherence to court orders by withholding access to government services, explains Stuart Charlton, enterprise architect at BEA Systems Inc. in Toronto. "So if someone doesn't pay child support, they might suspend their driver's licence."

As in other provinces, Alberta's ministries and government departments are far from integrated. Tracking thousands of these MEP cases over time, sometimes more than 10 years, and building in the triggers for enforcement actions based on patterns of misbehaviour was a major system undertaking within the Justice Ministry. But developing processes to ensure actions are executed by a multitude of external ministries would have been a monumental inter-departmental undertaking.

Instead of labour-intensive back-and-forth between multiple departments - phoning, faxing, exchanging forms and information - the Ministry of Justice used Web services to automate the workflow.
A producer posts the services it has made available for common use - for example, an application that identifies an Albertan as an MEP case - through an electronic interface based on SOA standards, which can be used by any authorized consumer using the same Web-based technology. All the various conditions for an exchange between departments, including exceptions that require human judgement, are agreed and scripted in advance.
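As a rough illustration of the producer/consumer pattern just described, the sketch below shows a consuming department querying a published case-lookup service over HTTP. The endpoint, parameters and field names are invented for illustration; the article does not document the Ministry's actual interface:

    # Hypothetical consumer of a published "MEP case lookup" web service.
    # The URL and the JSON fields are illustrative only.

    import json
    from urllib.request import urlopen

    def is_mep_case(person_id):
        """Ask the (hypothetical) producer service whether this person
        has an active maintenance-enforcement case."""
        url = f"https://services.example.gov/mep/cases?person={person_id}"
        with urlopen(url) as response:
            record = json.load(response)
        return record.get("active_case", False)

    # A consuming department could then gate a renewal on the answer, e.g.:
    #   if is_mep_case(applicant_id):
    #       suspend_drivers_licence(applicant_id)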



A Collaborative Orthopaedic Research Environment. http://eprints.ecs.soton.ac.uk/14409/

The role of collaboration in scientific and scholarly research is changing due to advances in Web technology. In particular, the need for collaborative science has generated demands for working environments that facilitate human communication and resource sharing among research communities. The Collaborative Orthopaedic Research Environment (CORE) project provides an infrastructure that combines clinical, educational and research activities in a Virtual Research Environment (VRE) for orthopaedic researchers to collaborate in the design, analysis, and dissemination of experiments. An overview of Service-Oriented Architecture (SOA) concepts is presented in this report before moving on to discuss the benefits and rationale of using a SOA in the context of the CORE project. A user requirements study conducted to guide the authors in designing the CORE VRE is also reported.

Monday, August 20, 2007

Network Neutrality and Non-discrimination



[Outside of the USA, network neutrality has largely been seen as a US problem because of the decision there to entirely de-regulate broadband. In the rest of the world most regulators still maintain a policy of non-discrimination and are smug in the belief that network neutrality is a non-issue for them. Non-discrimination means that carriers can offer Internet QoS, preferred carriage and other premium network services as long as they offer them on a non-discriminatory basis.

Most people would agree that carriers and ISPs have a duty and responsibility to maintain their networks and minimize the impact of denial-of-service attacks and congestion. But the engineering activities to achieve these goals can deliberately or inadvertently result in discriminatory action against perceived competitors or users of the network.

Examples include the case where many carriers and ISPs are deploying tools like Sandvine to block BitTorrent seeding. The BBC's new P2P service is threatened with blocking by many ISPs in the UK because of its potential to congest their networks. AT&T has announced that it intends to block all illegal video and file sharing on its network. While AT&T may be going beyond the call of duty to prevent illegal video and file sharing, its actions have the benefit of enhancing its own IPTV video delivery.

So we face an interesting dilemma: when does good engineering practice actually constitute discriminatory behaviour against a competitor or user, and clearly violate the tenets of network neutrality? One can easily imagine other scenarios where carriers block other applications such as P2P VoIP for legitimate congestion and security reasons. Can, or should, this activity be regulated? Who is to determine whether a legitimate engineering activity is in reality discriminatory behaviour?

Thanks to Frank Coluccio and Dewayne Hendricks for these pointers -- BSA]



Comcast Throttles BitTorrent Traffic, Seeding Impossible

From: dewayne@warpspeed.com (Dewayne Hendricks)
Comcast Throttles BitTorrent Traffic, Seeding Impossible

Written by Ernesto on August 17, 2007
impossible/>
Over the past weeks more and more Comcast users started to notice that their BitTorrent transfers were cut off. Most users report a significant decrease in download speeds, and even worse, they are unable to seed their downloads. A nightmare for people who want to keep up a positive ratio at private trackers and for the speed of BitTorrent transfers in general.

ISPs have been throttling BitTorrent traffic for almost two years now. Most ISPs simply limit the available bandwidth for BitTorrent traffic, but Comcast takes it one step further and prevents their customers from seeding. And Comcast is not alone in this: Canadian ISPs Cogeco and Rogers use similar methods on a smaller scale.

Unfortunately, these more aggressive throttling methods can’t be circumvented by simply enabling encryption in your BitTorrent client. It is reported that Comcast is using an application from Sandvine to throttle BitTorrent traffic. Sandvine breaks every (seed) connection with new peers after a few seconds if it’s not a Comcast user. This makes it virtually impossible to seed a file, especially in small swarms without any Comcast users. Some users report that they can still connect to a few peers, but most of the Comcast customers see a significant drop in their upload speed.

The throttling works like this: a few seconds after you connect to someone in the swarm, the Sandvine application sends a peer reset message (RST flag) and the upload immediately stops. Most vulnerable are users in a relatively small swarm where you only have a couple of peers you can upload the file to. Only seeding seems to be prevented; most users are able to upload to others while the download is still going, but once the download is finished, the upload speed drops to 0. Some users also report a significant drop in their download speeds, but this seems to be less widespread. The effect is worse on private trackers, likely because of the smaller swarm sizes.
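The reset-injection behaviour described above can be observed on the wire. Below is a minimal sketch using the scapy packet library that counts TCP RST segments seen on a BitTorrent port; a burst of resets immediately after connections are established is the signature users reported. The port number and capture window are assumptions for illustration, not part of the original report:

    # Sketch: count TCP RST segments on a typical BitTorrent port.
    # Requires scapy (pip install scapy) and normally root privileges.

    from scapy.all import sniff, TCP

    BT_PORT = 6881  # common BitTorrent listen port -- an assumption; yours may differ
    rst_count = 0

    def on_packet(pkt):
        global rst_count
        if pkt.haslayer(TCP) and pkt[TCP].flags & 0x04:  # RST bit set
            rst_count += 1
            print(f"RST #{rst_count}: port {pkt[TCP].sport} -> {pkt[TCP].dport}")

    # Watch 60 seconds of traffic involving the BitTorrent port.
    sniff(filter=f"tcp port {BT_PORT}", prn=on_packet, timeout=60)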

Although BitTorrent protocol encryption seems to work against most forms of traffic shaping, it doesn't help in this specific case. Setting up a secure connection through VPN or over SSH seems to be the only solution. More info about how to set up BitTorrent over SSH can be found here.

Last year we had a discussion whether traffic shaping is good or bad, and ISPs made it pretty clear that they do not like P2P applications like BitTorrent. One of the ISPs that joined our discussions said: “The fact is, P2P is (from my point of view) a plague - a cancer, that will consume all the bandwidth that I can provide. It’s an insatiable appetite.”, and another one stated: “P2P applications can cripple a network, they’re like leaches. Just because you pay 49.99 for a 1.5-3.0mbps connection doesn’t mean your entitled to use whatever protocols you wish on your ISP’s network without them provisioning it to make the network experience good for all users involved.”

[snip]


ISPs to BBC: We will throttle iPlayer unless you pay up
By Nate Anderson | Published: August 13, 2007 - 11:16AM CT

While the network neutrality debate can sometimes feel a bit theoretical in the US, it's a live issue in Europe, and this week it hit the pages of newspapers across the UK. What made news was a set of demands by UK ISPs, which banded together to tell the BBC that the ISPs would start to throttle the Corporation's new iPlayer service because it could overwhelm their networks. Unless the BBC pays up, of course.

Continued at: http://preview.tinyurl.com/2bcu78

YouTube for Science - SciVee

[From a posting on Slashdot --BSA]

http://science.slashdot.org/science/07/08/19/1328253.shtml



Shipud writes "The National Science Foundation, Public Library of Science and the San Diego Supercomputing Center have partnered to set up what can best be described as a "YouTube for scientists", SciVee". Scientists can upload their research papers, accompanied by a video where they describe the work in the form of a short lecture, accompanied by a presentation. The formulaic, technical style of scientific writing, the heavy jargonization and the need for careful elaboration often renders reading papers a laborious effort. SciVee's creators hope that that the appeal of a video or audio explanation of paper will make it easier for others to more quickly grasp the concepts of a paper and make it more digestible both to colleagues and to the general public."

http://www.scivee.tv/


Cisco announcement of virtualization and orchestration of networks and services


[There is a lot of buzz and excitement around virtualization and orchestration of networks and related services such as storage, computation, etc. Orchestration allows end users to configure and link together these virtualized services. This, of course, is something we have been promoting with UCLP for some time now (www.uclp.ca or www.inocybe.ca). UCLP uses web services and grid technology for the virtualization and BPEL for the orchestration. Some excerpts from Grid Today -- BSA]

http://www.gridtoday.com/grid/1684997.html

Cisco Takes On Virtualization Orchestration

Cisco announced today VFrame Data Center (VFrame DC), an orchestration platform that leverages network intelligence to provision resources together as virtualized services. This industry-first approach greatly reduces application deployment times, improves overall resource utilization and offers greater business agility. Further, VFrame DC includes an open API, and easily integrates with third party management applications, as well as best-of-breed server and storage virtualization offerings.

With VFrame DC, customers can now link their compute, networking and storage infrastructures together as a set of virtualized services. This services approach provides a simple yet powerful way to quickly view all the services configured at the application level to improve troubleshooting and change management. VFrame DC offers a policy engine for automating resource changes in response to infrastructure outages and performance changes. Additionally, these changes can be controlled by external monitoring systems via integration with the VFrame DC Web services application programming interface (API).
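The article does not document the VFrame DC API itself, but the integration pattern it describes -- an external monitoring system driving provisioning changes through a Web services interface -- looks roughly like the hypothetical sketch below. Every endpoint and field here is invented for illustration:

    # Hypothetical monitor-driven provisioning call in the style described
    # above. The orchestrator URL and payload schema are invented.

    import json
    from urllib.request import Request, urlopen

    def scale_service(service, servers):
        """Ask the (hypothetical) orchestration API to resize a virtualized
        service to the requested number of servers."""
        payload = json.dumps({"service": service, "servers": servers}).encode()
        req = Request(
            "https://orchestrator.example.net/api/services/resize",
            data=payload,
            headers={"Content-Type": "application/json"},
        )
        urlopen(req)

    # An external monitor might react to load like this:
    #   if average_load("web-tier") > 0.8:
    #       scale_service("web-tier", 12)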

VFrame DC is a highly efficient orchestration platform for service provisioning which requires only a single controller and one back-up controller. The real time provisioning engine has a comprehensive view of compute, storage and network resources. This view enables VFrame DC to provision resources as virtualized services using graphical design templates. These design templates comprise one of four VFrame DC modular components: design, discovery, deploy and operations. These components are integrated together with a robust security interface that allows controlled access by multiple organizations.


Many application environments can benefit from VFrame DC, including: Web tier such as Apache and IIS; Oracle 11i application suite; SAP R3; BEA Weblogic; IBM Websphere; Oracle 10G RAC; multi-server clustered applications; grid-based applications; and large dynamic development and test environments.


Cisco also announced at a press conference today its vision for next-generation datacenters, called Data Center 3.0, which entails the real-time, dynamic orchestration of infrastructure services from shared pools of virtualized server, storage and network resources, while optimizing application service-levels, efficiency and collaboration. VFrame DC is an important step in delivering this vision.

Additional information and resources on Cisco VFrame DC can be found at http://newsroom.cisco.com/networkersUS/2007/.