Wednesday, April 18, 2007

eScience 2007 Conference


[I highly recommend the series of eScience conferences. If you have a chance, look at the presentations and topics from previous eScience conferences. While many cyber-infrastructure and grid conferences focus on the enabling technologies for eScience, such as grids, web services, workflow and the like, this conference focuses on their use to solve real scientific problems. The range of applications and uses for eScience is quite amazing -- BSA]

------------------------------------------------------------------------
3rd IEEE International Conference on e-Science and Grid Computing
e-Science 2007
Dec. 10-13, 2007, Bangalore, India

http://www.escience2007.org/


Sponsored/Organised By:
IEEE Computer Society's Technical Committee on Scalable Computing
The Centre for Development of Advanced Computing (C-DAC), India
Center For Computation & Technology (CCT), LSU, USA
The University of Melbourne, Australia
Indiana University, USA
-------------------------------------------------------------------------

The next generation of scientific research and experiments will be carried out by communities of researchers from organizations that span national boundaries. These activities will involve geographically distributed and heterogeneous resources such as computational systems, scientific instruments, databases, sensors, software components, networks, and people. Such large-scale and enhanced scientific endeavors, popularly termed e-Science, are carried out via collaborations on a global scale.

Grid computing has emerged as one of the key computing paradigms that enable the creation and management of an Internet-based utility computing infrastructure, called cyberinfrastructure, for the realization of e-Science and e-Business at the global level. To harness the potential of e-Science and Grid computing paradigms, several national and international projects around the world have been initiated to carry out research and innovation activities that will transform the goal of e-Science and Grid computing into a reality.

The e-Science 2007 conference, sponsored by the IEEE Computer Society’s Technical Committee for Scalable Computing (TCSC), is designed to bring together leading international and interdisciplinary research communities, developers, and users of e-Science applications and enabling IT technologies. The conference serves as a forum to present the results of the latest research and product/tool developments, and highlight related activities from around the world.

Topics

Topics of interest concerning e-Science and Grid computing include, but are not limited to, the following:

* Enabling Technologies: Internet and Web Services
* Collaborative Science Models and Techniques
* Service-Oriented Grid Architectures
* Problem Solving Environments
* Application Development Environments
* Programming Paradigms and Models
* Resource Management and Scheduling
* Grid Economy and Business Models
* Autonomic, Real-Time and Self-Organising Grids
* Virtual Instruments and Data Access Management
* Sensor Networks and Environmental Observatories in e-Science
* Security Challenges
* e-Science & Grid applications in Physics, Biology, Astronomy, Chemistry, Finance, Engineering, and the Humanities
* Web 2.0 Technology and Services for e-Science
[...]

e-Science 2007 will also feature workshops, tutorials, exhibits, and an industrial track. To organize or participate in these, please visit the conference website.

Program Chair:

Geoffrey Fox,
Indiana University, USA.

Broadband environmental sustainability challenge


[Although the competition is being held in Australia it is open to all -- BSA]

The Eckermann-TJA Prize is for the best paper offered to the Telecommunications Journal of Australia on the theme of applying broadband telecommunications to deliver significant benefits to environmental sustainability.

Conditions

1. Entries are invited for the Eckermann-TJA Prize for the best paper offered to the Telecommunications Journal of Australia on the theme of applying broadband telecommunications to deliver significant benefits to environmental sustainability. The deadline for submissions is 14 September 2007.
2. Entries will be judged on the extent to which they either:
1. demonstrate the benefits that an existing use of broadband technology is delivering in terms of environmental sustainability; or
2. propose innovative new uses of broadband technology that could deliver significant benefits in terms of environmental sustainability.
3. For the purposes of the Challenge, the following definitions apply:
1. broadband will be defined as communications network access connectivity that is “always on” and operates in excess of 256 kbps user bandwidth in each direction; and
2. environmental sustainability means any measure that conserves natural resources, reduces energy consumption or limits the impact of human habitation on the natural environment.
4. The paper judged best overall will be awarded a prize of $8,000, and the best paper that demonstrates benefits specifically in the area of water conservation will be awarded a prize of $2,000. The same paper may be awarded both prizes. The judging panel will reserve the right to split the $8,000 prize money if two or more papers are found to be of equal merit, or to postpone awarding either prize to the following year if no paper is found to have sufficient merit this year.
5. Prize winners are responsible for any tax liability that may be associated with the award of prize money.
6. Copyright and any intellectual property vested in entries will remain the property of the party submitting the entry. However, it is a condition of the Challenge that the Telecommunication Society of Australia (TSA) be granted permission to publish any or all entries in the Telecommunications Journal of Australia and/or on the TSA web site. It is the responsibility of the entrant to take such measures as are appropriate to safeguard their intellectual property against any risks to which this form of publication may expose it.
7. Entries should comprise the following elements:
1. a title (maximum 10 words);
2. a brief Abstract (maximum 100 words) that provides an overview of the entry; such Abstracts would be used in citations, and with links on a web site to help interested parties identify entries that may be of particular interest;
3. an Extended Summary (700-1000 words) that could be published as a stand-alone document to describe the key concepts, benefits etc;
4. the body of the entry (maximum 8,000 words) setting out more detailed explanations, supporting information, calculations and any other material pertinent to the entry, in a format complying with the TJA Guide to Authors.
8. Entries should be submitted electronically by email to editor.tsa@bigpond.com in both PDF and Microsoft Word document format. In publishing any entry, the TSA reserves the right to adjust the layout of the entry and include an endorsement such as: “This paper was an entry in the 2007 Eckermann-TJA Challenge, a competition conducted under the auspices of the Telecommunication Society of Australia. The challenge was conceived by Robin Eckermann and the prize is sponsored by Alcatel-Lucent, Water for Rivers, Multimedia Victoria, Greg Crew and Robin Eckermann.”
9. Entries will be judged by a Panel of not less than three persons appointed by the Telecommunication Society of Australia, including the Editor-in-Chief of TJA or his nominee. The judging panel reserves the right to reject any paper that it deems to be non-compliant with the rules of the competition, in bad taste or otherwise inappropriate given the theme and objectives of the Challenge. Any decisions made by the judging panel will be final and not subject to appeal.
10. Close relatives of members of the judging panel or any individual sponsors, officials of the TSA and employees of any corporate sponsor are not eligible to enter the Challenge.
11. In the event that no acceptable entries are received, the sponsors of the prize kitty may in their sole discretion decide to repeat the Challenge in 2008 or withdraw their sponsorship.
12. Entries may be lodged from 1st September 2007 and must be lodged by 14th September 2007.
13. A decision will be made and winner(s) notified by 15th November 2007. Prizes will be awarded at an appropriate TSA event on a date to be determined following notification.

P2P: From Internet Scourge to Savior


[Here is a set of pointers from various sources on the growing phenomenon of P2P applications. Many network operators and universities are trying to limit the impact of P2P applications, as only a small number of P2P users can cause significant traffic congestion on today's networks. But to my mind this is an opportunity rather than a problem: the increasing computational and storage capability at the edge means we may have to fundamentally rethink the architecture and business models of the current Internet. -- BSA]


http://gigaom.com/2007/04/15/p2p-now-for-pretty-much-everything/

P2P, now for Pretty Much Everything

[Excerpts from Om Malik's column]

At the dawn of the broadband era, peer-to-peer technology became closely associated with music file sharing, thanks to programs like Napster and Kazaa. Later, the emergence of protocols such as BitTorrent linked P2P to movie and television downloads. Over the years, P2P has found many legitimate uses and has worked its way into everyday life. Why… even Akamai, which had scoffed at P2P, decided to acquire Red Swoosh.

RawFlow is a 6-year-old company that has developed a peer-to-peer streaming software platform for sharing live video streams. It is currently developing a personal broadcasting technology called Selfcast, also based on P2P technologies.

Nevertheless, the Akamai-Red Swoosh deal, announced last week, prompted me to think about how pervasive P2P really has become in our lives.

* P2P telephony: Skype and its $4.2 billion price tag - that says it all.
* P2P TV: Joost and Babelgum are just a start. More like Zattoo are joining the party.
* Personal P2P: P2P sharing of photos, videos and other files with family and friends is becoming increasingly commonplace. The list of start-ups chasing this nascent market is growing by the month.
* P2P Video Delivery is growing in popularity, especially in places where 10 megabit/s broadband connections are commonplace.
* P2P data syncing between computers.
* Distributed computing is another area where we have seen P2P technology shine. SETI@Home is a good example.


[From a posting on Dewayne Hendricks' list -- BSA]

New peer-to-peer video-downloading options should please Internet users--and stave off a network meltdown.
By Wade Roush



Every few years, someone predicts the imminent collapse of the Internet. Bob Metcalfe--Ethernet inventor, 3Com founder, and Technology Review patron--famously said at a 1995 Web conference that he would eat his words from a pessimistic Infoworld column if the Web didn't disappear within a year under the strain of traffic overloads and other problems. In April 1997, Metcalfe contritely drank a milkshake containing the torn-up bits of his column. In 2004, Helsinki University of Technology professor Hannu Kari said spam and viruses would kill off the Internet by 2006. The fact that you're looking at this website now means Kari was wrong.

The point is that even the sharpest innovators and entrepreneurs have often had a hard time seeing how the next hurdle in the Internet's growth would be overcome--only to be surprised by some new resource, technology, or business idea that emerges in the nick of time. (In both 1996-97 and 2005-2006, Internet service providers responded in part by adding more bandwidth to their networks.)

But will that pattern hold out forever? The dam-breaking success of YouTube and Apple's iTunes Video Store--neither of which existed prior to 2005--has unleashed a huge new flow of digital video on the Internet, and many consumers now spend hours a day streaming or downloading everything from home movies to live sports and prime-time TV series. Because video files are so large compared with the Web pages and e-mail messages that used to dominate Internet traffic, backbone lines are under strain, and backbone operators such as AT&T and Verizon and Internet service providers such as Comcast are facing new costs they can't easily recoup, given the flat-rate pricing of most consumer broadband Internet access plans.

Hui Zhang, a computer scientist at Carnegie Mellon University who studies broadband networks, says that "2006 will be remembered as the year of Internet video. Consumers have shown that they basically want unlimited access to the content owners' video. But what if the entire Internet gets swamped in video traffic?"

This time around, the Internet may be saved by the unlikeliest of rescuers: the builders of peer-to-peer file-sharing networks. In the minds of many consumers--and many studio executives--P2P networks are still synonymous with digital piracy. After all, Napster, Kazaa, and other early peer-to-peer networks were playgrounds for copyright violators, who downloaded millions of music files they hadn't paid for. But today a number of researchers and entrepreneurs are arguing that peer-to-peer technology--which allows network members to retrieve content by tapping into the hard drives of other members who have already downloaded that content--is also great for distributing legitimately purchased, copyright-protected music and video. It might even lessen the burden on service providers and content distributors.
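To make the mechanics concrete, here is a minimal sketch of the piece-exchange idea described above: the file is split into pieces, each peer holds a subset, and the downloader fetches whichever needed piece is rarest in the swarm. This is a simplified, BitTorrent-style illustration (the swarm size and piece count are arbitrary assumptions), not any product's actual implementation:

```python
import random

# Simplified BitTorrent-style swarm: the file is split into pieces,
# each peer holds a subset, and the downloader always fetches the piece
# that fewest peers hold ("rarest first") from a random holder.
# Real protocols add piece hashing, choking, tit-for-tat, etc.

NUM_PIECES = 16

class Peer:
    def __init__(self, name, pieces):
        self.name = name
        self.pieces = set(pieces)   # piece indices this peer can serve

def rarest_first_download(swarm):
    have = set()
    while len(have) < NUM_PIECES:
        missing = [p for p in range(NUM_PIECES) if p not in have]
        # How many peers hold each piece we still need?
        availability = {p: sum(p in peer.pieces for peer in swarm)
                        for p in missing}
        piece = min(missing, key=lambda p: availability[p])
        holders = [peer for peer in swarm if piece in peer.pieces]
        if not holders:
            raise RuntimeError(f"no peer holds piece {piece}")
        source = random.choice(holders)
        print(f"fetched piece {piece:2d} from {source.name}")
        have.add(piece)
    return have

# One seed holding the whole file, plus a few partial peers.
swarm = [Peer("seed", range(NUM_PIECES))]
swarm += [Peer(f"peer{i}", random.sample(range(NUM_PIECES), k=6))
          for i in range(4)]
rarest_first_download(swarm)
```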

[snip]


[From a posting on Dewayne Hendricks' list -- BSA]
[Note: This item comes from reader Jock Gill. DLH]
Subject: Bootstrapping the Long Tail in Peer to Peer Systems

You might find this paper of interest. Jock



Bootstrapping the Long Tail in Peer to Peer Systems

Bernardo A. Huberman, HP Labs, Palo Alto, CA 94304
bernardo.huberman@hp.com
and Fang Wu, HP Labs, Palo Alto, CA 94304 fang.wu@hp.com

1 Introduction
The provision of digitized content on-demand to millions of users presents a formidable challenge. With an ever increasing number of fixed and mobile devices with video capabilities, and a growing consumer base with different preferences, there is a need for a scalable and adaptive way of delivering a diverse set of files in real time to a worldwide consumer base. Providing such varied content presents two problems. First, files should be accessible in such a way that the constraints posed by bandwidth and the diversity of demand are met without having to resort to client-server architectures and specialized network protocols. Second, as new content is created, the system ought to be able to swiftly respond to new demand for specific content, regardless of its popularity. This is a hard constraint on any distributed system, since providers with a finite amount of memory and bandwidth will tend to offer the most popular content, as is the case today with many peer-to-peer systems.

The first problem is naturally solved by peer-to-peer networks, where each peer can be both a consumer and provider of the service. Peer-to-peer networks, unlike client-server architectures, automatically scale in size as demand fluctuates, as well as being able to adapt to system failures. Examples of such systems are BitTorrent [4] and Kazaa, which account for a sizable percentage of all use of the Internet. Furthermore, new services like the BBC iMP (http://www.bbc.co.uk/imp/) show that it is possible to make media content available through a peer-to-peer system while respecting digital rights. It is the second problem, that of an adaptable and efficient system capable of delivering any file, regardless of its popularity, that we now solve. We do so by creating an implementable incentive mechanism that ensures the existence of a diverse set of offerings which is in equilibrium with the available supply and demand, regardless of content and size. Moreover, the mechanism is such that it automatically generates the long tail of offerings which has been shown [...]
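The constraint the authors highlight, that providers with finite memory and bandwidth gravitate to the most popular content, is easy to see in a toy simulation. In the sketch below (all parameters are illustrative assumptions, not taken from the paper), peers that cache only the most-requested files serve many requests yet leave almost the whole catalogue unavailable:

```python
import random
from collections import Counter

# Toy model of the problem described above: peers with finite storage
# that cache whatever is most requested end up serving only the head of
# a Zipf-like catalogue, starving the long tail. All numbers are
# illustrative assumptions, not taken from the paper.

CATALOGUE = 1000      # distinct files on offer
CACHE_SLOTS = 20      # finite storage per peer
REQUESTS = 100_000

# Zipf-like popularity: file i is requested with weight 1/(i+1).
weights = [1.0 / (i + 1) for i in range(CATALOGUE)]
demand = Counter(random.choices(range(CATALOGUE), weights=weights, k=REQUESTS))

# Popularity-driven caching: every peer independently keeps the most
# requested files, so all caches converge on the same small head and
# adding more peers adds no diversity.
head = {f for f, _ in demand.most_common(CACHE_SLOTS)}
served = sum(demand[f] for f in head)

print(f"requests served from caches: {served / REQUESTS:.1%}")
print(f"distinct files actually offered: {len(head)} of {CATALOGUE}")
# Typical result: roughly half the requests are served, but only 2% of
# the catalogue is ever available -- the gap the paper's incentive
# mechanism is designed to close.
```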



http://www.multichannel.com/article/CA6332098.html
It only takes about 10 BitTorrent users bartering files on a node (of around 500 users) of a traditional shared IP network to double the delays experienced by everybody else.
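That figure is plausible from simple queueing arithmetic: on a shared link, delay grows roughly as 1/(1 - utilization), so a handful of uplink-saturating users pushing a moderately loaded link toward saturation can double everyone's delay. A back-of-the-envelope sketch, with the link capacity and per-user rates chosen as assumptions to mirror the quoted scenario:

```python
# Back-of-the-envelope M/M/1 queueing: delay scales as 1/(1 - rho),
# where rho is link utilization. All numbers are assumptions chosen to
# mirror the quoted scenario, not measurements.

LINK_MBPS = 100.0   # shared access link
WEB_USERS = 490     # light users, mostly idle
WEB_RATE = 0.1      # Mbit/s average per light user
P2P_USERS = 10      # BitTorrent users saturating their links
P2P_RATE = 2.6      # Mbit/s average per BitTorrent user

def delay_factor(offered_mbps: float) -> float:
    """Queueing delay relative to an unloaded link, M/M/1-style."""
    rho = offered_mbps / LINK_MBPS
    assert rho < 1.0, "link saturated"
    return 1.0 / (1.0 - rho)

base = delay_factor(WEB_USERS * WEB_RATE)
with_p2p = delay_factor(WEB_USERS * WEB_RATE + P2P_USERS * P2P_RATE)
print(f"delay without P2P users: {base:.1f}x an idle link")
print(f"delay with 10 P2P users: {with_p2p:.1f}x an idle link")
print(f"everybody's delay grows by {with_p2p / base:.1f}x")
```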

OECD Report - PARTICIPATIVE WEB: USER-CREATED CONTENT


[Here is an excellent report from the OECD on what they call the participative web, which explores the economic impacts of the new Web 2.0 and web service tools for users being part of a business or creative process. I personally believe the participative web will have a big impact on all sorts of activities, from participatory democracy to radically new business models to new ways of doing science and research. Hopefully it will also become a key discussion point for the upcoming OECD Ministerial meeting on the Future of the Internet in Seoul in June 2008 -- BSA]


http://www.oecd.org/dataoecd/57/14/38393115.pdf?contentId=38393116

The concept of the “participative web” is based on an Internet increasingly influenced by intelligent web services that empower the user to contribute to developing, rating, collaborating on and distributing Internet content and customising Internet applications. As the Internet becomes more embedded in people’s lives, “users” draw on new Internet applications to express themselves through “user-created content” (UCC).

This study describes the rapid growth of UCC, its increasing role in worldwide communication and draws out implications for policy. Questions addressed include: What is user-created content? What are its key drivers, its scope and different forms? What are new value chains and business models? What are the extent and form of social, cultural and economic opportunities and impacts? What are associated challenges? Is there a government role and what form could it take?

Definition, measurement and drivers of user-created content: There is no widely accepted definition of UCC, and measuring its social, cultural and economic impacts is in the early stages. In this study UCC is defined as: i) content made publicly available over the Internet, ii) which reflects a “certain amount of creative effort”, and iii) which is “created outside of professional routines and practices”. Based on this definition a taxonomy of UCC types and hosting platforms is presented. While the measurement of UCC is in its infancy, available data show that broadband users produce and share content at a high rate, and this is particularly high for younger age groups (e.g. 50% of Korean Internet users report having a homepage and/or a blog). Given strong network effects a small number of platforms draw large amounts of traffic, and online video sites and social networking sites are developing to be the most popular websites worldwide. The study also identifies: technological drivers (e.g. more widespread broadband uptake, new web technologies), social drivers (e.g. demographic factors, attitudes towards privacy), economic drivers [...]



CANARIE Platforms Workshop and Funding Program




CANARIE’s Network-Enabled Platforms Workshop:
"Convergence of Cyber-Infrastructure and the Next-Generation Internet" June 26-27, 2007 Ottawa, Ontario


Purpose:
The purpose of CANARIE’s Network-Enabled Platforms Workshop is to explore the development of and participation in network-enabled platforms by Canadian researchers and other interested parties. The workshop will be an important step towards the launch of a CANARIE funding program in this area.

Background:
Over the past several months, the CANARIE Board has been discussing the organization’s vision and strategy for the 2007-12 period. One important focus has been the next-generation technologies and related network-based infrastructure that enable distributed communities of researchers and their public and private sector partners to collaborate more effectively. CANARIE’s involvement in this area could involve:
(i) Assisting distributed, collaborative communities to use CANARIE’s network and SOA- and Web 2.0-based middleware in the development of network-enabled “platforms” for research;
(ii) Supporting the development of middleware and other aspects of what has often been called “grid” or “cyber-infrastructure” to support platform development; and
(iii) Developing aspects of the “Next-Generation Internet” (NGI) that pertain to the future evolution of CANARIE’s network as one of the world’s leading-edge research networks. Work in these areas is viewed by the Board and by Industry Canada as complementing CANARIE's primary role in operating and upgrading its network to support a full range of research and educational applications, including network-enabled platforms at the national and international levels.

Network-Enabled Platforms:
The term "network-enabled platform" is taken to comprise the network, a range of infrastructure at the edge of the network and the services and tools that make these resources useable by a distributed community of collaborators. Of special interest are the data, servers, sensors, equipment and other resources used by the community, and the related data acquisition, storage, manipulation, sharing and analysis tools that are of primary concern to the collaborators. Web services, Web 2.0 and workflow tools are the most common middleware for linking these resources and supporting their ease of use. Dedicated, “partitionable” lightpath networks, virtual routers and other facilities are also common features of platforms.

Examples:
One of the clearest examples of a network-enabled platform is the Eucalyptus project (http://www.cims.carleton.ca/60.html). Eucalyptus was funded under the CANARIE CIIP program; the team has developed a portal for collaborative architectural design teams to link HDTV video conferencing, rendering, visualization, grid and UCLP lightpaths.
Other network-based research platforms in Canada are at an earlier stage of development. Many have been supported by CANARIE funding, including those relating to the Neptune project, the Canadian Light Source in Saskatoon, the provision of access to the data collected by the international telescopes in Hawaii, and the Grid X1 project for high energy physics. The bioinformatics community has developed several platforms, with much of the work being funded by Genome Canada and the regional genome centres.
Canadians are also participating in the early stages of formation of some international "grid" efforts that are in many ways network-enabled platforms under a different label. Some of these have arisen in the context of the Canada-California Strategic Innovation Partnership, including the creation of Optiputer nodes at CRC and Nortel, Ryerson’s becoming a member of CineGrid, and McGill’s Neurological Institute being linked with BIRN, the Biomedical Informatics Research Network.

Next-Generation Internet:
In parallel with CANARIE’s interest in network-enabled platforms is an on-going concern with next-generation Internet initiatives. The Internet of today continues to evolve rapidly and has become critical infrastructure for the economy and society. A consensus has developed, however, that the Internet's fundamental design constrains its ability to respond to many of the challenges it faces, such as security, scalability and reliability. Accordingly, new urgency is being placed on research regarding next-generation alternatives to the current Internet.
One model of the future Internet borrows from the concept of platforms. The GENI architecture, for example, proposes to link servers, sensors, equipment and other resources through massive virtualization and parallelization of network facilities in order to support multiple platforms for testing new Internet protocols and applications. CANARIE’s User Controlled LightPath (UCLP) software also enables deployment of multiple platforms on separate, parallel “lightpath” networks, also referred to as Articulated Private Networks, or APNs.
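To make the slicing idea concrete, here is a small hypothetical sketch of that concept: a shared substrate of lightpath segments is partitioned into independent parallel networks, each handed to a different community. The class and method names are invented for illustration and are not UCLP's or GENI's actual APIs:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

# Hypothetical data model of network slicing, in the spirit of GENI
# slices or UCLP Articulated Private Networks (APNs): a substrate of
# lightpath segments is partitioned into isolated parallel networks.
# All names here are invented for illustration only.

@dataclass
class Lightpath:
    end_a: str
    end_b: str
    gbps: int
    owner: Optional[str] = None   # None = unallocated

@dataclass
class Substrate:
    lightpaths: List[Lightpath] = field(default_factory=list)

    def carve_apn(self, owner: str, wanted: List[Tuple[str, str]]) -> List[Lightpath]:
        """Allocate one free segment per requested (a, b) pair to 'owner'."""
        apn = []
        for a, b in wanted:
            seg = next((l for l in self.lightpaths
                        if l.owner is None and {l.end_a, l.end_b} == {a, b}),
                       None)
            if seg is None:
                raise ValueError(f"no free lightpath between {a} and {b}")
            seg.owner = owner
            apn.append(seg)
        return apn

net = Substrate([Lightpath("Ottawa", "Chicago", 10),
                 Lightpath("Ottawa", "Chicago", 10),
                 Lightpath("Chicago", "Seattle", 10)])

# Two communities get parallel, non-interfering networks from one substrate:
physics = net.carve_apn("physics-grid", [("Ottawa", "Chicago")])
testbed = net.carve_apn("protocol-testbed",
                        [("Ottawa", "Chicago"), ("Chicago", "Seattle")])
print([(l.end_a, l.end_b, l.owner) for l in net.lightpaths])
```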

Potential focus of Funding Program:
Needless to say, CANARIE is not in a position to fund the development of a network-enabled platform in its entirety, whether a data-based platform like bioinformatics or a sensor-based platform like project Neptune. Accordingly, any CANARIE funding program would have to be focused, for example by directing it at encouraging those groups that have platforms to engage in development efforts associated with using more advanced middleware and other technologies to support the platform’s use.
One aspect of a CANARIE funding program in this area could be to encourage more Canadian participation in international platform development efforts, especially in new areas (i.e. the astronomy and physics communities, for example, are already active in international efforts). This is an area that requires further discussion at the workshop.
In short, the earmarks of an ideal new project relating to network-enabled platforms might be: 1. The area is a Canadian strength; 2. Pan-Canadian and perhaps international collaboration have been established; 3. The data flows to support the collaboration are potentially significant; 4. The community involved has some familiarity with web services and SOA; and 5. The community feels a need to become more sophisticated in its use of these technologies.
CANARIE also recognizes the potential for the development of network-enabled platforms in areas relating to the social sciences, humanities, digital libraries, collaborative animation, and distributed simulation, among others. As data needs expand and as the need for architectures and tools to support sharing and collaboration becomes apparent, CANARIE would like to assist communities such as these to become more aware of platform technologies and to use the tools and other resources that have been developed. Specifically, what CANARIE’s role should be with communities such as these, until such time as their national platforms have been initiated or developed, will be explored during the workshop.

Workshop Logistics:
To ensure effective discussion and interaction, participation in this workshop will be on an invitation-only basis. Anyone interested in deploying a platform or joining an existing one is invited to submit a request to attend the workshop.
Requests to attend and additional questions should be addressed by e-mail to: Platforms@canarie.ca


CBS to air on Joost

[For those who have not tried Joost, I highly recommend you download the Beta. The quality of the video is outstanding, even on a relatively slow broadband link. Joost is a peer-to-peer video streaming technology developed by the same people who created Skype -- BSA]

From www.convergedigest.com

CBS to Air on Joost
CBS agreed to provide video programming for the upcoming Joost service, which is expected to launch sometime this spring. Financial terms were not disclosed.

Content to be made available is coming from across the CBS divisions and includes new and previously-aired, full episodes from popular, current CBS programs, including the full CSI franchise: CSI: CRIME SCENE INVESTIGATION, CSI: NY and CSI: MIAMI, NCIS, NUMB3RS, SURVIVOR and brands including SHOWTIME, SHOWTIME CHAMPIONSHIP BOXING, CSTV, CBS EVENING NEWS WITH KATIE COURIC and CBS SPORTSLINE.

CBS is the first broadcast network to join Joost. http://www.joost.com 12-Apr-07

Hit BBC show "Sorted" to be available on Azureus P2P in HD


[It is heartening to see the BBC take such a forward-looking step in terms of distributing its content. From Dewayne Hendricks' list -- BSA]


[Note: Along with Joost, the p2p content space is starting to get very interesting! Vuze is still in beta, check it out like I am. Unlike Joost, you don't need an invite from someone to give it a try. DLH]

HIT BBC SHOW SORTED TO MAKE ITS U.S. PREMIERE ON AZUREUS VUZE PLATFORM IN HIGH DEFINITION


Palo Alto, Calif. – April 5, 2007 — Azureus, a global leader in aggregating and distributing long-form, high quality video via the Internet’s most popular media peer-to-peer (P2P) application, today announced that it will premiere the BBC’s hit six-part series, Sorted, on its next-generation peer-to-peer (P2P) platform dubbed Vuze (www.vuze.com), which launched today. Sorted represents the first in a series of programs that Azureus and BBC Worldwide will make available in High Definition on the Vuze platform, as part of a distribution partnership announced December 2006, the first ever peer-to-peer (P2P) deal for BBC.

Sorted follows the lives of six working-class postmen living in northwest England. Played by Neil Dudgeon (The Street, Bridget Jones: The Edge of Reason), Hugo Speer (Love Lies Bleeding, Bleak House, The Full Monty), Dean Lennox Kelly (Shameless, A Midsummer Night’s Dream), Will Mellor (Two Pints of Lager and a Packet of Crisps), Mark Womack (Merseybeat, Liverpool 1) and Cal Macaninch, the men and their significant others, played by Eva Pope (Shadow Man, Bad Girls), Tracy-Ann Oberman (EastEnders, Big Train) and Nina Sosanya, experience the complexities of modern-day love, work, family, secrets, passions and fears. Storylines are stitched throughout the series, but each episode is dedicated to one of the six postmen.

“Sorted is an entertaining, lively and enlightening series, and we are thrilled that BBC Worldwide has given us the opportunity to premiere it to a U.S. audience,” said Gilles BianRosa, CEO of Azureus. “Through the Vuze platform, viewers can experience shows like Sorted that they wouldn’t normally have access to, and enjoy viewing them in High Definition.”

Critically acclaimed during its initial broadcast run on BBC1, Sorted held approximately 20 percent of the audience share, beating out top rated series such as CSI: Miami. All six episodes of Sorted are available now at www.vuze.com.

Vuze breaks new ground by offering a powerful tool set of enhanced features and functions that enable content providers to easily publish, showcase and distribute high resolution, long form content in High Definition or DVD quality over the Internet. Premiering today with a revamped navigation system and search engine, Vuze was redesigned to accommodate the massive influx of content from publishers, large and small, and help viewers quickly find what they want. Formerly code-named Zudeo, the site already attracts more than two million unique monthly visitors after only two months in existence.

About Azureus Inc.

Azureus Inc. is the provider of the most popular P2P application for the transfer of large media files. With more than three years of technology innovation, proven robustness, and more than 140 million downloads of its application, Azureus users connect with one another from more than 100 countries in more than 40 languages.

Today, Azureus operates a leading global video aggregation and distribution platform driven by the exchange of long-form, High Definition or DVD quality videos, as well as licensed digital content from leading media companies. The company has recently announced content partnerships with Showtime, A&E Networks (including A&E, History, and Biography channels), BBC Worldwide, Bennett Media Worldwide, G4 TV, National Geographic, and Starz Media.

The new commercial-grade platform is supported by powerful peer-sharing technology, enabling its vast global community the ability to browse, share, search and discover unique multimedia entertainment in a high-resolution format. Visit www.vuze.com for more information.

Wednesday, April 4, 2007

New research program of "service science" to address needs of service economy

[Thanks to Mike Nelson for this pointer. Some excerpts from NYT article --BSA]

http://www.nytimes.com/2007/03/28/technology/28service.html?_r=1&oref=slogin

New Effort to Tap Technology to Aid the Service Economy

A group of large technology companies, universities and professional associations is creating a new organization to support and promote research into ways that technology can increase productivity and innovation in the economy’s service sector.

The creation of the organization, the Service Research and Innovation Initiative, will be officially announced today. It represents the latest step by technology companies and some universities to promote an emerging field that is being called “service science.”

The early academic programs are a blend of computing, social sciences, engineering and management. The aim of service science is to try to improve productivity and accelerate the development of new offerings in services, which account for about 80 percent of the United States economy and similarly large shares of other Western economies.

In the last couple of years, more than three dozen universities in several countries have added service science courses, and the National Science Foundation has begun financing a few service research projects.

Among corporations, I.B.M. has been a leader in promoting service science programs in universities, and it has reoriented its own research laboratories to focus more on services.

“We need a professional organization to help promote service science,” said James C. Spohrer, director of service research at the I.B.M. Almaden Research Center in San Jose, Calif. “It is one of the seed crystals around which the new discipline will form.”

I.B.M. and Oracle are founding corporate members of the Service Research and Innovation Initiative. Other company members of the organization’s advisory board include Accenture, Cisco, Computer Sciences, EMC, Hewlett-Packard, Microsoft and Xerox.

Researchers from several universities are also members, including some from the University of California, Los Angeles; the Wharton School at the University of Pennsylvania; and Arizona State University. The European Commission and a German research organization, the Fraunhofer Institute, are also members of the advisory committee.

It will hold a symposium on service research on May 30 in Santa Clara, Calif.

Dilemmas of Privacy and Surveillance

[Identity Management (IdM) systems are at the heart of most modern security, privacy and digital rights management systems. Many IT organizations are keen to deploy such systems to authenticate and authorize users' access to various digital rights services such as on-line music, eduroam services, databases, etc. However, sometimes anonymity can be a good thing - and often a better business model for both the IT provider and the consumer. We are starting to see this with on-line music sales, where some record companies are recognizing that DRM is hurting sales, and with eduroam, where some universities are recognizing there is much better value in providing open-access WiFi to the community as opposed to the complex technology of federated identity systems. Thanks to Johannes Ernst for his comments and pointer to the Berkman site -- BSA]


[Johannes Ernst reports:]

My own wake-up call came in a small workshop put on by the Berkman Center at Harvard about a year ago on Digital Identity, in which people such as Rebecca MacKinnon (ex-CNN chief in China, then fellow at Berkman) and Marc Rotenberg (ED of the Electronic Privacy Information Center) gave some rather striking examples of how that abuse might occur, and in some cases, has occurred already. And we haven't seen anything yet ... think of somebody compromising a regional health information organization (now so much in vogue), which requires identity technology to be able to function, in a place such as Washington, DC. Tom Clancy anybody?


The important thing to remember here is that identity deployments come in two flavors: centrally controlled and user-controlled. This difference is not so much a matter of technology (although some technologies do not lend themselves to decentralized, user-controlled deployment) as it is a matter of the fundamental architecture of the deployment. The more centrally controlled, the easier it is to abuse ("single point of control/failure" at the center). The more decentralized and user-controlled, the harder it is to abuse, for a variety of reasons.


Most identity technologies that people are familiar with are centrally controlled, including, say, passports, driver's licenses, social security numbers, health smart cards in some countries, frequent flier membership cards, etc. It is exactly against that background that newer, user-centric identity technologies are now popping up all over the place....
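The architectural difference is easy to sketch. In a centrally controlled deployment, every verification is a lookup against one authority's database; in a user-controlled deployment, the user presents a credential that a relying party can verify locally, with no central registry to query. A concept sketch using ordinary digital signatures (via the Python cryptography package; this illustrates the idea, not any particular identity product):

```python
# Concept sketch of user-controlled identity: the user holds a key pair
# and presents signed claims; the verifier checks the signature locally,
# with no central registry to query -- and hence no single point of
# control or failure. Uses the 'cryptography' package.

from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# --- user side: generate and keep the private key; share the public key
user_key = Ed25519PrivateKey.generate()
public_key = user_key.public_key()

# The claim reveals only what the relying party needs (authorisation,
# not identification) -- e.g. an age attribute, not a full identity record.
claim = b"attribute: over-18"
signature = user_key.sign(claim)

# --- relying-party side: verify locally against the user's public key
try:
    public_key.verify(signature, claim)
    print("claim accepted; no central database was consulted")
except InvalidSignature:
    print("claim rejected")
```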

http://cyber.law.harvard.edu/home/home?wid=10&func=viewSubmission&sid=2373




From: Brian Randell [From Dave Farber's IPer list]

The (UK) Royal Academy of Engineering has just issued a report on "Dilemmas of Privacy and Surveillance" that will, I trust, be of considerable interest to IP.

From their press release at:

http://www.raeng.org.uk/news/releases/shownews.htm?NewsID=378

> People think there has to be a choice between privacy and security;
> that increased security means more collection and processing of
> personal private information. However, in a challenging report to
> be published on Monday 26 March 2007, The Royal Academy of
> Engineering says that, with the right engineering solutions, we can
> have both increased privacy and more security. Engineers have a key
> role in achieving the right balance.
>
> One of the issues that Dilemmas of Privacy and Surveillance -
> challenges of technological change looks at is how we can buy
> ordinary goods and services without having to prove who we are. For
> many electronic transactions, a name or identity is not needed;
> just assurance that we are old enough or that we have the money to
> pay. In short, authorisation, not identification should be all that
> is required. Services for travel and shopping can be designed to
> maintain privacy by allowing people to buy goods and use public
> transport anonymously. "It should be possible to sign up for a
> loyalty card without having to register it to a particular
> individual - consumers should be able to decide what information is
> collected about them," says Professor Nigel Gilbert, Chairman of
> the Academy working group that produced the report. "We have
> supermarkets collecting data on our shopping habits and also
> offering life insurance services. What will they be able to do in
> 20 years' time, knowing how many donuts we have bought?"
>
> Another issue is that, in the future, there will be more databases
> holding sensitive personal information. As government moves to
> providing more electronic services and constructs the National
> Identity Register, databases will be created that hold information
> crucial for accessing essential services such as health care and
> social security. But complex databases and IT networks can suffer
> from mechanical failure or software bugs. Human error can lead to
> personal data being lost or stolen. If the system breaks down, as a
> result of accident or sabotage, millions could be inconvenienced or
> even have their lives put in danger.

The full report is at:

http://www.raeng.org.uk/policy/reports/pdf/dilemmas_of_privacy_and_surveillance_report.pdf

50% European productivity growth due to ICT

[Some excerpts from article in Converge Digest -- BSA]

www.convergedigest.com


Information/Communication Drives 50% of EU growth

Public and private information and communication technology (ICT) continues to grow faster than Europe's overall economy, and contributed nearly 50% of EU productivity growth between 2000 and 2004.

The European Commission's annual progress report on i2010 shows that Europeans are quickly embracing new online services. This is supported by a record number of new broadband connections: 20.1 million new broadband lines connected in the year to October 2006, with high broadband penetration rates in The Netherlands (30%) and the Nordic countries (25-29%). The online content market is forecast to grow rapidly for the next five years, as already seen with the explosive growth of online music sales and user-created content.

Six countries – Denmark, The Netherlands, Finland, Sweden, the UK and Belgium – all have higher broadband penetration rates than the US and Japan. Such broadband penetration levels have positive knock-on effects. For example, ICT deployment in Danish schools is the highest in Europe, and Danish businesses are the EU's most advanced Internet and eBusiness users; the British and Swedish workforces are the most skilled in ICT; the Dutch are the most avid consumers of games and music online; and Finland has Europe's highest use of public access points and invests the most in ICT research (64.3% of its R&D business expenditure). Sweden and Finland also spend 3.9% and 3.5% of their GDP on research, both above the EU's 3% target. http://www.europa.eu 30-Mar-07


The Internet unleashes intellectual power of the masses

[Excerpts from the Globe and Mail article. Also worth visiting is the web site of Wikinomics author Don Tapscott -- BSA]

http://www.theglobeandmail.com/servlet/story/RTGAM.20070329.weinsider29/BNStory/GlobeTQ/

Don Tapscott's web site
http://newparadigm.com/


The rules of innovation and competitive advantage in the new age of digital social networking and Web-based communities are not the same as those we've come to know and trust.

A key principle is to look outside rather than inside your firm for strategic direction and ideas. Rallying Web communities to contribute their thoughts and knowledge is an essential dynamic. Wikinomics is "the art and science, theory and practice of understanding how to harness collaboration for competitiveness," the author says.

The first case study in the book details how a mining firm grasped fortune from failure by appealing to the world of online communities and individuals for help in determining where on its property it should drill for gold. The CEO of Vancouver-based Goldcorp Inc. was inspired by the story of Linux where the operating system's development was achieved through Internet-based collaboration.

He launched the online Goldcorp Challenge contest, offering more than half a million dollars for help in determining the best places to look for mineral deposits on its property. Contestants had access to a file that contained all of the company's geologic data. The result: Approximately 110 targets were identified, half of which had not been earmarked by Goldcorp's own engineers. More than 80 per cent of these sites yielded substantial deposits -- a total of more than eight million ounces of gold.

"Companies that have the myopic view that [the only] unique qualified minds who can do everything for their business exist inside the company are making a huge mistake," Mr. Tapscott said during a recent interview. "This is a new paradigm and the future is going to be a bleak one for companies that don't move to exploit it."

Harvesting ideas outside corporate walls is a startling contention for many. Old thinking suggests that those who don't know your business don't have much to contribute, and the knowledge that exists among the masses can't possibly be of much value.

It might also need a leap of faith in the good of people and their ideas. Mr. Tapscott is a believer in the idea that the great collaborative masses include qualified and brilliant minds that simply haven't had the opportunity to participate. Perhaps they would if they could, and the Web provides the means.


Wikinomics challenges business to think differently about their intellectual property, too. Old views suggest companies need to guard their ideas, but in the new emerging economy such thinking may in fact discourage opportunities. Sharing may have much more value than selling. Companies need to open their minds to that potential, Mr. Tapscott contends.

SOA and grids for wireless and portable devices

[Here are a couple of interesting pointers to using SOA and grids for wireless and portable devices. Thanks to Susan Baldwin and Lee McKnight for these pointers -- BSA]


NEMOS: Mobile-Agent based Service Architecture for Lightweight Devices

Abstract:
NEMOS is a framework for deploying a Service Oriented Architecture (SOA) on lightweight devices such as sensor networks and cell phones, allowing integration of such mobile networks with the SOA architectures available on the Internet. Execution and coordination of service compositions are performed using highly compact mobile agents, in a lightweight environment tailored for distributed task control. Device services and resources are described using ontological metadata, permitting rich data semantics and facilitating service compositions for interoperability between network devices. This approach can bridge web service architectures deployed on the Internet (such as the Web Services architecture) with ad-hoc networks of mobile devices.

More information found in the NEMOS paper: http://www.ugrad.cs.ubc.ca/~d0r4/NEMOS.PDF


http://wirelessgrids.net/

Wireless Grids Corporation (WGC) is just one company working in the wireless grid problem space, so I can only tell you what we are working on. We are using what is currently a proprietary middleware based on a couple of simple protocols: ZeroConf for local discovery and some simple REST-based APIs as a control channel. What are we controlling? Well, we use our software to control things like UPnP devices and Samba-based file sharing in Windows.
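The discovery half of that pattern is straightforward to sketch with the Python zeroconf package: browse mDNS for services on the home network, then drive whatever is found over a separate control channel such as REST. The service type below is an illustrative assumption, and the REST control calls themselves are WGC-proprietary, so only discovery is shown:

```python
import socket
import time

from zeroconf import ServiceBrowser, Zeroconf

# ZeroConf (mDNS) local discovery, the same pattern described above:
# find services on the home network, then control them over a separate
# channel (e.g. REST calls). "_http._tcp" is an illustrative choice of
# service type; a real deployment would browse for its own type.

class GridListener:
    def add_service(self, zc, type_, name):
        info = zc.get_service_info(type_, name)
        if info and info.addresses:
            addr = socket.inet_ntoa(info.addresses[0])
            print(f"found {name} at {addr}:{info.port}")

    def remove_service(self, zc, type_, name):
        print(f"lost {name}")

    def update_service(self, zc, type_, name):
        pass

zc = Zeroconf()
browser = ServiceBrowser(zc, "_http._tcp.local.", GridListener())
try:
    time.sleep(5)   # let discovery run briefly
finally:
    zc.close()
```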

Basically, we are trying to create a grid of resources and controllers inside the home network that will help people utilize the resources they already have as well as enable new interactions. e.g. Windows does file sharing...but it doesn't make it easy to scale file sharing up to several machines with several users in a single house. Most people will never use Windows file sharing because of this.



NSF's New Cyberinfrastructure Vision Document is Available






Cyberinfrastructure Vision for 21st Century Discovery is a sweeping call to reimagine:

1) Cyberinfrastructure resources, tools and related services, such as supercomputers, high-capacity mass-storage systems, system software suites and programming environments, scalable interactive visualization tools, productivity software libraries and tools, large-scale data repositories and digitized scientific data management systems, networks of various reach and granularity, and an array of software tools and services that hide the complexities and heterogeneity of contemporary cyberinfrastructure while seeking to provide ubiquitous access and enhanced usability; and

2) The preparation and training of current and future generations of researchers and educators to use cyberinfrastructure to further their research and education goals, while also supporting the scientific and engineering professionals who create and maintain these IT-based resources and systems and who provide essential customer services to the national science and engineering user community.

The vision document was developed by the National Science Foundation's Cyberinfrastructure Council.

DMCA Architect Admits Defeat

[From a posting on johnmacsgroup by Michael Geist -- BSA]

Though I'm not sure simply giving up on copyright and going for a 'patronage culture' is the best solution. That methodology produces art which serves the needs of those with money and power, and the independent artist who is critical of the established order is left out in the cold. Ironically, it is the hip, young, anti-commercial leftists who are leading the way to a world where all professional artists will work for commercial concerns. (And how will this work for art which does not enhance the profitability of other media? A movie company would pay for music to use in its films, for example, but who will pay for books whose only value is the book itself, not its value as a tie-in to something else? A quick look at fanfiction.net shows the quality of purely amateur writing in terrifying detail...)

Again, I state, the solution is cultural, not technological. Violating copyright should be seen as being on the same moral plane as kicking puppies or peeing in public -- something civilized people Just Don't Do.

================================

http://www.michaelgeist.ca/content/view/1826/125/

Friday March 23, 2007
McGill University hosted an interesting conference today on music and copyright reform. The conference consisted of two panels plus an afternoon of open dialogue and featured an interesting collection of speakers including Bruce Lehman, the architect of the WIPO Internet Treaties and the DMCA, Ann Chaitovitz of the USPTO, Terry Fisher of Harvard Law School, NDP Heritage critic Charlie Angus, famed music producer Sandy Pearlman, and myself. A video of the event has been posted in Windows format.

My participation focused on making the case against anti-circumvention legislation in Canada (it starts at about 54:30). I emphasized the dramatic difference between the Internet of 1997 and today, the harmful effects of the DMCA, the growing movement away from DRM, and the fact that the Canadian market has supported a range of online music services with faster digital music sales growth than either the U.S. or Europe but without anti-circumvention legislation.

The most interesting - and surprising - presentation came from Bruce Lehman, who now heads the International Intellectual Property Institute. Lehman explained the U.S. perspective in the early 1990s that led to the DMCA (i.e. greater control through TPMs), yet when reflecting on the success of the DMCA acknowledged that "our Clinton administration policies didn't work out very well" and "our attempts at copyright control have not been successful" (presentation starts around 11:00). Moreover, Lehman says that we are entering the "post-copyright" era for music, suggesting that a new form of patronage will emerge with support coming from industries that require music (webcasters, satellite radio) and government funding. While he says that teens have lost respect for copyright, he lays much of the blame at the feet of the recording industry for their failure to adapt to the online marketplace in the mid-1990s.

In a later afternoon discussion, Lehman went further, urging Canada to think outside the box on future copyright reform. While emphasizing the need to adhere to international copyright law (i.e. Berne), he suggested that Canada was well placed to experiment with new approaches. He was not impressed with Bill C-60, seemingly because he does not believe that it went far enough in reshaping digital copyright issues. Given ongoing pressure from the U.S., I'm skeptical about Canada's ability to chart a new course on copyright, yet if the architect of the DMCA is willing to admit that change is needed, then surely our elected officials should take notice.


2007 - The Big year for Broadcast TV over the Internet?


[Around the world broadcasters are preparing plans to launch delivery of prime-time TV shows over the Net. CBS is rumoured to be planning a major initiative to offer all their prime-time TV fare over the Internet a day before it is broadcast on the airwaves. The new Apple TV box may also provide a significant incentive, as it allows TV viewers to wirelessly connect their computer to the TV in order to watch movies and shows downloaded or streamed over the Internet. A good example of this trend is in New Zealand, where the national broadcaster is making all their video fare available to download over the Net (but so far they have decided not to use P2P technology like BitTorrent for file transfer). With the advent of Joost and Internet triple-play companies like Inuk, 2007 could be the big year for broadcast TV over the Internet. The biggest stumbling block, besides the last-mile bandwidth problem, still seems to be content and protection rights - I find it hard to imagine that the current content business model, which is entirely dependent on failed DRM schemes, take-down orders and lawyers, will survive. Thanks to pointers from Donald Clark and Dewayne Hendricks -- BSA]


I found a site where you can get streaming video of lots of current and old TV shows, plus a selection of other media types such as anime. It's called 'TV Links'. Worth checking out if you're interested in catching up on some shows that you've missed over the years. For instance, I've checked out some early episodes of the 'Black Adder' series from the UK. The video and sound quality is surprisingly good.

-- Dewayne


Television NZ Ltd has just launched its on-demand TV services. A limited, but fair, offering originally (I don't know if some of it is restricted by geographical location). They are working hard on the copyright for more content.

http://tvnzondemand.co.nz/content/ondemand_index/ondemand_skin


Using Akamai around the country to spread the network load - but at 200MB+ per half-hour show, this will cripple even further the limited regional backhaul NZ has, and further the growing clamour for "more fibre in the ground" and for community-backed open access passive infrastructures.
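The arithmetic behind that worry is simple: 200 MB over a half-hour works out to roughly 0.9 Mbit/s per viewer, so even modest concurrent audiences translate into serious backhaul demand. A quick sketch (the audience sizes are illustrative assumptions):

```python
# 200 MB per half-hour show, expressed as an average streaming rate,
# then scaled to some assumed audience sizes.

BITS_PER_MB = 8_000_000          # decimal megabytes
show_bits = 200 * BITS_PER_MB
seconds = 30 * 60

per_viewer_mbps = show_bits / seconds / 1e6
print(f"average rate per viewer: {per_viewer_mbps:.2f} Mbit/s")

for viewers in (1_000, 10_000, 100_000):
    gbps = viewers * per_viewer_mbps / 1000
    print(f"{viewers:>7,} concurrent viewers -> {gbps:,.1f} Gbit/s of backhaul")
```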


From David Farber's IPer list

Begin forwarded message:

Hi-def's DRM: Dead with Rigor Mortis: In a timely illustration for the camp that contends Digital Rights Management doesn't work, never did and never will (see "Jobs endorses unchained melodies"), the proud hackers over at the Doom9 forums announced their latest breakthrough in high-definition DVDs. Previously, hackers had found ways, albeit cumbersome, to uncover the "volume key" that would unlock individual Blu-ray and HD DVD discs (see "You couldn't seem to agree on one standard, so we took the liberty of hacking them both for you"). Now a hacker known as Arnezami has teased out the "processing key" that can be used to unlock, decrypt, and backup every HD DVD and Blu-ray Disc film on the market.
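The reason the processing key matters so much more than a volume key is the key hierarchy: each disc's content sits under its own key, and all of those keys are protected by one key a level up. A toy two-level scheme in Python makes the impact of the leak obvious (this uses Fernet from the cryptography package as a stand-in; it is not the actual AACS algorithm):

```python
from cryptography.fernet import Fernet

# Toy two-level key hierarchy, loosely analogous to AACS: each disc's
# content is encrypted under its own title key, and every title key is
# in turn encrypted under one shared processing key. Not the real AACS
# scheme -- just an illustration of why leaking the top key is fatal.

processing_key = Fernet.generate_key()

def master_disc(content: bytes):
    """Publisher side: encrypt content under a fresh per-disc title key."""
    title_key = Fernet.generate_key()
    wrapped_key = Fernet(processing_key).encrypt(title_key)
    ciphertext = Fernet(title_key).encrypt(content)
    return wrapped_key, ciphertext

discs = [master_disc(b"movie #%d" % i) for i in range(3)]

# An attacker who extracts the single processing key can now unwrap the
# title key of *every* disc on the market, past and future:
for wrapped_key, ciphertext in discs:
    title_key = Fernet(processing_key).decrypt(wrapped_key)
    print(Fernet(title_key).decrypt(ciphertext))
```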

As Cory Doctorow notes at BoingBoing, this is a copy-protection scheme that took years to develop and it was broken wide open in weeks. "For DRM to work, it has to be airtight. There can't be a single mistake. It's like a balloon that pops with the first prick. That means that every single product from every single vendor has to perfectly hide their keys, perfectly implement their code," Doctorow writes. "There is no future in which bits will get harder to copy. Instead of spending billions on technologies that attack paying customers, the studios should be confronting that reality and figuring out how to make a living in a world where copying will get easier and easier. They're like blacksmiths meeting to figure out how to protect the horseshoe racket by sabotaging railroads."


From Dewayne's list

Why piracy is still more common than legal video downloads
12/27/2006 9:27:38 AM, by Ryan Paul



A recent study conducted by consumer and retail analysis group NPD claims that peer-to-peer (P2P) video downloads (which in the study are synonymous with illegal downloads) are outpacing purchases from legitimate video download services five to one. The study, which was performed with NPD's VideoWatch tracking software on "the home computers of more than 12,500 U.S. households," states that 8 percent of Internet-using households downloaded video content from P2P services, whereas 2 percent paid to download video content from legitimate providers. The study also indicates that nearly 60 percent of video files downloaded from P2P sites were adult-film content, while 20 percent was TV show content and 5 percent was mainstream movie content.

Avast, matey! Opt-in!

The opt-in methodology used by NPD could lead to significant under-reporting of P2P downloading, since those who are voluntarily tracked by NPD's software are probably going to be less inclined to violate copyright law. Chances are that the ratio of "legal" to "illegal" downloading is tipped further in piracy's favor than NPD's study indicates. Nevertheless, assuming that NPD's study approximates reality, one could attribute the strength of piracy and the limited adoption of commercial and P2P-based video downloading to several factors.

First, legal movie download services are still relatively new, and the movie industry's trepidation has prevented a diverse body of content from becoming commercially available. I still don't know of any legal video download service that offers my favorite episodes of Babylon 5, for instance. If the new digital economy is all about the so-called "Long Tail," then online video stores are missing a major opportunity by not playing their cards and rapidly expanding their selection. This is doubly true since the "selection" of content available on the P2P networks is truly impressive. P2P wins the selection category hands down. This is doubly true when you consider that NPD found that 60 percent of P2P downloads were pornographic in nature.

Another obvious factor is Content Restriction Annulment and Protection (CRAP) technologies, more commonly known as DRM. Consumers who pay for digital video downloads want to be able to play those videos with the software of their choice, without a lot of trouble or the imposition of additional limitations. Consumers also want to be able to convert legitimately downloaded content to other formats so that it can be played on mobile devices. Pervasive DRM and high prices make legal video downloading much less appealing to the average consumer.