[Excerpts from Information Week article -- BSA]
http://www.informationweek.com/news/showArticle.jhtml?articleID=202601956&pgno=4&queryText=
By Andy Dornan
InformationWeek
Forget outsourcing. The real threat to IT pros could be Web 2.0. While there's a lot of hype and hubris surrounding wikis, mashups, and social networking, there's also a lot of real innovation--much of it coming from increasingly tech-savvy business users, not the IT department.
"We've cut IT staff by 20%, and we're providing a whole lot more in terms of IT services," says Ken Harris, CIO at nutritional products manufacturer Shaklee. Harris started with a mashup platform from StrikeIron; he found mashups such an effective way to integrate multiple Web services that he turned to Web-based service providers to replace in-house functions. Now, Shaklee gets its ERP from Workday and search from Visual Sciences, and it's looking at other IT functions that software as a service can replace.
Instead of passive consumers, Web surfers can become active creators.
All that interactivity ought to make Web 2.0 ideally suited for business use.
Of all Web 2.0 technologies, social networking is the one that gets vendors and venture capitalists most excited. At least 17 startups are pitching social networking technology to business customers (see table, Social Networking Technology Startups), while countless social networking Web sites are chasing individual users.
IT'S LOOSENING ITS GRIP
Loss of IT control is a consistent theme as Web 2.0 penetrates business. The greatest upheaval is likely to come from enterprise mashups, which combine the social and technical aspects of Web 2.0 by letting users develop their own applications. Though very few businesses use mashups at present, those that do see great benefits, and larger players such as BEA, IBM, and Oracle are entering the game. Cutting out the middleman--that's the IT department--can be a great way of aligning business and technology.
"Mashups have let end users do more of what used to be done by IT," says Warren Breakstone, executive VP in charge of client services at investment tools provider Thomson Financial. Although not in the IT department, Breakstone started using a hosted mashup service from Serena Software and now runs a team of business analysts who develop Web-based applications for sales, marketing, and other personnel. "Now we're moving into traditional IT services: The IT department is using apps that we built."
Breakstone says this doesn't bring his team into conflict with the IT department. "It frees IT up to do those mission-critical tasks behind the scenes," he says.
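None of the articles above shows what such a business-built mashup looks like under the hood, so here is a minimal, hypothetical sketch in Python: it joins records from an imaginary CRM web service with an imaginary geocoding service to produce the data behind a "customers on a map" page. Both endpoint URLs and field names are invented for illustration.

```python
# Toy mashup: join customer records from a (hypothetical) CRM web
# service with results from a (hypothetical) geocoding service.
# Neither URL is a real API; both are illustrative placeholders.
import json
from urllib.request import urlopen

CRM_URL = "https://crm.example.com/api/customers"      # hypothetical
GEO_URL = "https://maps.example.com/api/geocode?q={}"  # hypothetical

def fetch_json(url):
    """GET a URL and parse the JSON body."""
    with urlopen(url) as resp:
        return json.load(resp)

def customers_on_map():
    """Combine two independent web services into one composite view."""
    merged = []
    for cust in fetch_json(CRM_URL):
        geo = fetch_json(GEO_URL.format(cust["postal_code"]))
        merged.append({"name": cust["name"],
                       "lat": geo["lat"], "lon": geo["lon"]})
    return merged

if __name__ == "__main__":
    print(json.dumps(customers_on_map(), indent=2))
```

The point of the sketch is how little plumbing is involved: a business analyst who knows two service endpoints and a dozen lines of glue can build the composite view, which is exactly the work that used to queue up behind the IT department.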
Wednesday, October 31, 2007
User Driven Innovation: Profiting from mashups and Open APIs
[Here are a couple more pointers on the democratization of innovation. There are now a number of companies specializing in building platforms for mashups and open APIs. In Europe, one expression of this emerging open innovation model is the European Network of Living Labs. As Artur Serra puts it, the hope is that this new approach to open innovation can help redesign the Internet in a different way, with deeper involvement of users; for example, by engaging media companies and media users in the process. Thanks to Michael Weiss and Artur Serra for these pointers. -- BSA]
User Innovation with Open APIs: Profiting from Mashups http://www.ebusinesscluster.com/media_lib/docs/Nov_2007_Wiess.pdf
Open Living Labs
www.openlivinglabs.org
Some telcos "Get it"
[Excerpts from my post on Internet Evolution -- BSA]
http://www.internetevolution.com/author.asp?section_id=506&doc_id=136710&
Although many telcos are treated as the popular whipping boys for all that’s wrong with today’s telecom environment, there’s a small number that are starting to understand the dynamics of the new marketplace.
A good example is Netherlands-based KPN Telecom NV (NYSE: KPN), which recently announced that it's joining forces with Reggefiber to speed up the rollout of FTTH (fiber to the home) in Almere, the fifth largest city in the Netherlands. Reggefiber already owns some networks in smaller towns and in parts of cities, including the project in Amsterdam. [..]
KPN, like many other telcos, is losing many customers to the local cable companies. Cablecos can offer full triple-play services -- high-speed Internet access, television, and telephone -- quite easily, whereas most telcos can only deliver Internet and telephony. Telcos are positioning themselves to take advantage of IPTV, but it's still an open question whether this technology will succeed over DSL networks. The rollout in Almere is likely to spur more cooperation between Reggefiber and KPN in other parts of the Netherlands.
Swedish telco Telia has come to the same conclusion: municipal open access networks may not be such a bad thing. Telia used to be a vociferous opponent of municipal open networks. Over time, however, many municipal networks discovered that they don't have the necessary skills and financing to manage an open network, and many in Sweden have issued calls for proposals from third parties to operate their open access networks. Guess who won most of these deals?
[..]
What’s even more surprising is that some telcos are also starting to realize that peer-to-peer (P2P) networks, and other supposedly nefarious applications, are actually good services that their customers should be encouraged to use rather than be penalized for. Currently, most telcos treat P2P users as costly parasites on their networks, because a small number of users consume most of the bandwidth. Their knee-jerk response has been to block such traffic with Layer 7 filters from companies like Sandvine Inc. (London: SAND) or, in extreme cases, to disconnect these power users entirely.
More advanced telcos are starting to deploy technologies from P2P companies, such as BitTorrent Inc. and Joost, that enhance the P2P experience for their customers rather than trying to block it. By distributing super node servers throughout the network, telcos can reduce unbalanced P2P traffic loads and provide a much better P2P experience for their customers. And as P2P companies obtain licensing arrangements with the music and film industries, the threat of legal action and charges of aiding and abetting piracy become less of a concern.
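As a rough illustration of why operator-placed super nodes and locality-aware peer selection reduce unbalanced traffic loads, here is a toy Python sketch that simply prefers peers inside the operator's own address block. Real deployments use much richer topology data; the /16 prefix and addresses below are arbitrary examples.

```python
# Toy locality-aware peer selection: prefer peers inside the
# operator's own network (same /16 here, for simplicity) so that
# P2P traffic stays local instead of crossing costly transit links.
import ipaddress

def prefer_local(peers, client_ip, prefix_len=16):
    """Sort candidate peers so on-net peers come first."""
    client_net = ipaddress.ip_network(
        f"{client_ip}/{prefix_len}", strict=False)
    def is_local(peer_ip):
        return ipaddress.ip_address(peer_ip) in client_net
    return sorted(peers, key=lambda p: not is_local(p))

peers = ["203.0.113.7", "198.51.100.4", "203.0.113.99"]
print(prefer_local(peers, "203.0.113.25"))
# -> the two on-net 203.0.113.x peers are listed first
```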
As more telcos understand that their own survivability is at stake, the smarter ones will realize that new business models and relationships are critical to their success.
[..]
The end of Grid Computing
[I have always been skeptical of grid computing and networking when they are narrowly defined as "utility computing" or "bandwidth on demand" networks. Both concepts remind me of the bad old days of computing service bureaus and circuit-switched networks. In both cases, extensive centralized administrative processes are required to ensure adequate capacity and support. Virtualization, on the other hand, provides much more user control of both the computation and network facilities. It also enables greater convergence between cyber-infrastructure and the next generation of the Internet. A good example of this approach is 4WARD, a very exciting program on network virtualization that the EU has just launched, which is very similar to CANARIE's UCLP initiative. -- BSA]
Grids: On Demand or Virtual? http://www.canarie.ca/canet4/library/recent/Gridnets_Oct_17_2007.ppt
Convergence of Cyber-Infrastructure and Next Generation Internet http://www.canarie.ca/canet4/library/recent_presentations.html
4WARD
The Success of the Internet is also its Failure http://www.emobility.eu.org/Events/2007-09-04_PIMRC_Conference_Athens/General_4WARD_public.pdf
http://telzur.blogspot.com/2007/10/end-of-grid-computing.html
The End of Grid Computing?
In 2003, MIT Technology Review ranked "Grid Computing" among the 10 Emerging Technologies That Will Change the World [1]. Four years later, something is not going well with grid computing. An indication that there is a problem can easily be seen in the Google Trends plot for the term "Grid Computing" [plot omitted; see the original post for the current trend].
This can be compared with another buzzword, "Virtualization", which is older than "Grid Computing" and yet is gaining more and more momentum [plot omitted].
There is, however, one exception. The academic grid still enjoys plenty of glory, thanks to the huge, heavily funded European (EGEE) and US projects, and it will reach its peak importance when LHC data taking starts at CERN. But it seems that for other scientific projects grid computing is not going to be such a success. It will remain "nice to have" but will never replace high-performance computing (HPC) on the one hand, or classical distributed computing tools such as Condor [2], which has existed for more than 20 years, on the other. Once government funding is withdrawn, the hype around academic grid computing will decline very quickly as well. As Fabrizio Gagliardi pointed out in an interesting talk about the future of grid computing at the GridKa07 school, other kinds of grid infrastructure that stand on more stable financial ground may emerge as successors -- for example, Amazon's S3 and EC2 and the joint IBM-Google cloud computing initiative.
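The contrast the author draws is easy to see in code: renting storage and compute from Amazon is a couple of API calls, with no grid middleware or virtual-organization setup. A minimal sketch using the boto3 AWS SDK for Python; the bucket, file, image ID, and instance type are placeholders, and configured AWS credentials are assumed.

```python
# Minimal sketch of the "cloud" alternative: storage and compute
# rented through plain API calls. All names below are placeholders.
import boto3

# Store an experiment's output in S3.
s3 = boto3.client("s3")
s3.upload_file("results.dat", "my-experiment-bucket", "run42/results.dat")

# Rent a compute node from EC2.
ec2 = boto3.client("ec2")
ec2.run_instances(
    ImageId="ami-12345678",   # placeholder machine image
    InstanceType="m1.small",  # era-appropriate placeholder type
    MinCount=1, MaxCount=1,
)
```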
The democratization of Innovation - the Economist
[The latest issue of the Economist has an excellent report on innovation. One of the main themes of the report is that ICT and the Internet are transforming the way innovation is done, and the speed at which it happens. Innovation used to be largely the purview of "white coat" researchers at universities and industry labs. Governments have poured billions of dollars into university research but have seen little in return in terms of innovation and new products or services. Now, thanks to the Internet, SOA, Web 2.0 tools, open source business models, and so on, a much larger community can extract this knowledge from academic research and be full participants, if not leaders, in the innovation process. This will be a significant advantage for entrepreneurs and small business, and it will have a significantly transformative effect on our economy. Thanks to Craig Dobson for this pointer. Some excerpts from the Economist report -- BSA]
Democratization of Innovation
http://www.canarie.ca/canet4/library/recent/ManagementofInnovation_Carleton_Oct10_2007.ppt
Economist Report http://www.economist.com/specialreports/displayStory.cfm?story_id=9928154
How globalisation and information technology are spurring faster innovation
Now the centrally planned approach is giving way to the more democratic, even joyously anarchic, new model of innovation. Clever ideas have always been everywhere, of course, but companies were often too closed to pick them up. The move to an open approach to innovation is far more promising. An insight from a bright spark in a research lab in Bangalore or an avid mountain biker in Colorado now has a decent chance of being turned into a product and brought to market.
That is why innovation matters. With manufacturing now barely a fifth of economic activity in rich countries, the “knowledge economy” is becoming more important. Indeed, rich countries may not be able to compete with rivals offering low-cost products and services if they do not learn to innovate better and faster.
The move toward open innovation is beginning to transform entire industries
Mr Chesbrough's two books, “Open Innovation” and “Open Business Models”, have popularised the notion of looking for bright ideas outside an organisation. As the concept of open innovation has become ever more fashionable, the corporate R&D lab has become less and less relevant. Most ideas don't come from there (see chart 4 in the report).
IBM is another iconic firm that has jumped on the open-innovation bandwagon. The once-secretive company has done a sharp U-turn and embraced Linux, an open-source operating system. IBM now gushes about being part of the “open-innovation community”, yielding hundreds of software patents to the “creative commons” rather than registering them for itself. However, it also continues to take out patents at a record pace in other areas, such as advanced materials, and in the process racks up some $1 billion a year in licensing fees.
For a business that uses open and networked innovation, it matters less where ideas are invented. Managers need to focus on extracting value from ideas, wherever they come from.
Infrastructure as a web service - IaaS
[Web services and SOA have much broader applicability than linking software systems together. These architectures are now being used to represent physical devices as web services as well. A good example of infrastructure as a web service is in the telecommunications environment, where tools like UCLP expose every telecommunications element and service (including management and control plane elements) and virtualize them as web services. This allows end users to compose or orchestrate their own network solutions, linking together servers, network links, instruments, control plane knobs, and storage devices into a sophisticated articulated private network (a toy sketch of this idea follows the links below). Some excerpts from the SYS-CON SOA article -- BSA]
Agria Network Infrastructure Web Services http://www.inocybe.ca/
http://soa.sys-con.com/read/439721_1.htm
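As a rough illustration of the UCLP idea of virtualizing a network element as a web service, here is a hypothetical Python sketch exposing a toy optical switch over XML-RPC. The cross_connect operation and port names are invented for illustration; a real implementation would drive actual switch hardware.

```python
# Hypothetical, UCLP-inspired sketch: wrap a network element in a
# web-service interface so an end user can compose it into their
# own network. State is an in-memory dict standing in for hardware.
from xmlrpc.server import SimpleXMLRPCServer

class LightpathService:
    def __init__(self):
        self.connections = {}  # stand-in for real switch state

    def cross_connect(self, src_port, dst_port):
        """Patch two ports together and report the new connection."""
        self.connections[src_port] = dst_port
        return f"lightpath {src_port} -> {dst_port} established"

    def list_connections(self):
        """Return the current cross-connect table."""
        return self.connections

server = SimpleXMLRPCServer(("localhost", 8080), allow_none=True)
server.register_instance(LightpathService())
print("Exposing switch as a web service on http://localhost:8080 ...")
server.serve_forever()
```

Once an element is wrapped this way, composing a private network is just orchestration: a client calls cross_connect on several such services in sequence, with no carrier provisioning process in the loop.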
The many crucial jobs IT performs for a company are hard enough - provisioning employees and keeping their workstations up and running; protecting data to meet the stringent requirements imposed by Sarbanes-Oxley, HIPAA, PCI, and other regulations; managing data recovery and business continuity; and so on. The risks of operating with inadequate resources or of burning unnecessarily through corporate funds are unwelcome addenda to the IT department's burden.
Now imagine a world where you can scale your IT capacity up or down on command without any capital expenditure.
This world exists. It's enabled by a new business concept based on virtualizing the IT environment and is called Infrastructure as a Service. IaaS extends the remote hosting concept of Software as a Service (SaaS) to hardware.
The interest in IaaS can be attributed to significant increases in IT-enabled business models such as e-commerce, Web 2.0 and SaaS, which drive demand, and by advances in technology that enable it, including virtualization, utility computing, and data center automation.
Infrastructure as a Service is generally delivered on top of a utility computing platform. As long as you have a platform like VMware for virtualization, you look identical to your infrastructure provider. So if you wanted to push 30 or 50 machines out of your data center for 90 days, you could easily bring them back, because you're both running the same virtual platform.
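A toy sketch of that "scale on command" idea: the ProviderClient class below is a hypothetical stand-in for whatever API an IaaS vendor actually exposes, and the pool is scaled toward a target size with plain start/stop calls.

```python
# Toy "capacity on command": scale a pool of rented virtual machines
# toward a target size. ProviderClient is a hypothetical stand-in
# for a real IaaS vendor API.
class ProviderClient:
    def __init__(self):
        self._vms = []

    def start_vm(self, image):
        vm_id = f"vm-{len(self._vms)}"
        self._vms.append(vm_id)
        return vm_id

    def stop_vm(self, vm_id):
        self._vms.remove(vm_id)

    def running(self):
        return list(self._vms)

def scale_to(client, image, target):
    """Start or stop instances until the pool matches the target size."""
    while len(client.running()) < target:
        client.start_vm(image)
    while len(client.running()) > target:
        client.stop_vm(client.running()[-1])

provider = ProviderClient()
scale_to(provider, "corp-web-image", 30)  # burst 30 machines out
print(len(provider.running()))            # -> 30
scale_to(provider, "corp-web-image", 0)   # bring them all back
```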
Gartner's top 10 strategic technologies for 2008
[Some excerpts from a Network World article -- BSA]
http://www.networkworld.com/news/2007/100907-10-strategic-technologies-gartner.html?netht=101007dailynews2&&nladname=101007dailynews
1. Green IT
This one is taking on a bigger role for many reasons, including an increased awareness of environmental danger; concern about power bills; regulatory requirements; government procurement rules; and a sense that corporations should embrace social responsibility.
But IT is still responsible for 2% of all carbon releases, and those emissions come from many sources. “Fast memory is getting to be a surprisingly high energy consuming item,” Claunch said.
2. Unified Communications (UC)
3. Business Process Management
BPM is more of a business discipline than a technology, but it is necessary to make sure the technology of service-oriented architecture (SOA) delivers business value, Cearley said. It’s also important for dealing with laws like Sarbanes-Oxley that require businesses to define processes, he said.
“SOA and BPM have common objectives,” Cearley said. “They’re both focused on driving agility, driving business process improvement, flexibility and adaptability within the organization. SOA is a key mechanism that makes BPM easier.”
4. Metadata Management
Metadata is the foundation for information infrastructure and is found throughout your IT systems: in service registries and repositories, Web semantics, configuration management databases (CMDB), business service registries and in application development.
“Metadata is not just about information management,” Cearley said. “You need to look beyond that. Metadata is everywhere.”
5. Virtualization 2.0
“Virtualization 2.0” goes beyond consolidation. It simplifies the installation and movement of applications, makes it easy to move work from one machine to another, and allows changes to be made without impacting other IT systems, which tend to be rigid and interlinked, Claunch said.
There are also disaster recovery benefits, since the technology lets you restack virtual systems in different orders in recovery centers, providing more flexibility.
“Virtualization is a key enabling technology because it provides so many values,” Claunch said. “Frankly it’s the Swiss Army knife of our toolkit in IT today.”
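For instance, the "move work from one machine to another" capability looks roughly like this with the libvirt Python bindings. A minimal sketch: the host URIs and guest name are placeholders, and live migration normally also assumes shared storage between the two hosts.

```python
# Minimal sketch of moving a running workload between hosts with
# the libvirt Python bindings. URIs and the domain name below are
# placeholders, not a real deployment.
import libvirt

src = libvirt.open("qemu:///system")           # source host
dst = libvirt.open("qemu+ssh://host2/system")  # destination host

dom = src.lookupByName("webapp-vm")            # a running guest
dom.migrate(dst, libvirt.VIR_MIGRATE_LIVE, None, None, 0)
print("webapp-vm now runs on host2")
```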
6. Mashups & Composite Applications
Mashups, a Web technology that combines content from multiple sources, have gone from being virtually unknown among IT executives to being an important piece of enterprise IT systems. “Only like 18 months ago, very few people (knew what a mashup was),” Cearley said. “It’s been an enormous evolution of the market.”
U.S. Army intelligence agents are using mashups for situational awareness by bringing intelligence applications together. Enterprises can use mashups to merge the capabilities of complementary applications, but don’t go too far.
“Examine the application backlog for potential relief via mashups,” the analysts stated in their slideshow. “Investigate power users’ needs but be realistic about their capabilities to use mashups.”
7. Web Platform & WOA
Web-oriented architecture, a version of SOA geared toward Web applications, is part of a trend in which the number of IT functions being delivered as a service is greatly expanding. Beyond the well-known software-as-a-service, Cearley said over time everything could be delivered as a service, including storage and other basic infrastructure needs.
“This really is a long-term model that we see evolving from a lot of different parts of the market,” Cearley said. It’s time for IT executives to put this on their radar screens and conduct some “what-if” scenarios to see what makes sense for them, he said.
9. Real World Web
Increasingly ubiquitous network access with reasonably useful bandwidth is enabling the beginnings of what analysts are calling the “real world Web,” Claunch said. The goal is to augment reality with universal access to information specific to locations, objects or people. This might allow a vacationer to snap a picture of a monument or tourist attraction and immediately receive information about the object, instead of flipping through a travel book.
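The vacationer scenario reduces to a location lookup: coordinates in, information out. A toy Python sketch, with a hardcoded landmark table standing in for a real location-information service and naive planar distance standing in for proper geodesic math.

```python
# Toy "real world Web" lookup: given the GPS coordinates embedded in
# a snapshot, return information about the nearest known landmark.
# The table below is a hardcoded stand-in for a real service.
import math

LANDMARKS = {
    (48.8584, 2.2945): "Eiffel Tower: completed 1889, 330 m tall",
    (40.6892, -74.0445): "Statue of Liberty: dedicated 1886",
}

def nearest_landmark(lat, lon):
    """Return info for the landmark closest to the given point."""
    def dist(key):
        return math.hypot(key[0] - lat, key[1] - lon)
    return LANDMARKS[min(LANDMARKS, key=dist)]

print(nearest_landmark(48.86, 2.29))  # -> Eiffel Tower info
```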
10. Social Software
Social software like podcasts, videocasts, blogs, wikis, social bookmarks, and social networking tools, often referred to as Web 2.0, is changing the way people communicate both in social and business settings.
“It’s really been empowering people to interact in an electronic medium in a much richer fashion than we did with e-mail or corporate collaboration systems,” Cearley said.
The effectiveness of these tools for enterprise use varies, and some tools that have the potential to improve productivity aren’t yet mature enough for enterprise use, Gartner says. For example, wikis are highly valuable and mature enough for safe and effective enterprise use. Meanwhile, Gartner says prediction markets potentially have a lot of enterprise value but so far have low maturity. Podcasts, conversely, can be used safely and effectively but don’t have a lot of business value, the analyst firm said.
Mashup for the Masses - Intel Mash Maker
http://mashmaker.intel.com/
Intel® Mash Maker: Mashups for the Masses
Intel® Mash Maker is an extension to your existing web browser that allows you to easily augment the page that you are currently browsing with information from other websites. As you browse the web, the Mash Maker toolbar suggests Mashups that it can apply to the current page in order to make it more useful for you. For example: plot all items on a map, or display the leg room for all flights.
Intel® Mash Maker learns from the wisdom of the community. Any user can teach Mash Maker new mashups using a simple copy-and-paste interface, and once one user has taught Mash Maker a mashup, it will be automatically suggested to other users. Intel® Mash Maker also relies on the community to teach it about the structure and semantics of web pages, using a built-in structure editor.
Cyber-infrastructure, libraries and network science
[Richard Ackerman at CISTI maintains an excellent blog on libraries and eScience. His latest posting is well worth reading --BSA]
http://scilib.typepad.com/science_library_pad/2007/09/e-science-and-l.html
E-Science and libraries
As I said over a year and a half ago:
A lot of science today is very computation and data intense. I think there is a big role for academic libraries as custodians of data and research output.
Science Library Pad - the role of the academic library and librarian - January 13, 2006
Fortunately, there are lots of people thinking about the role of the library in relation to cyberinfrastructure, as well as e-research and publishing.
This month's D-Lib has a number of relevant articles and opinion pieces, including
Library Support for "Networked Science", by Bonita Wilson, doi:10.1045/september2007-editorial
Cyberinfrastructure, Data, and Libraries, Part 1: A Cyberinfrastructure Primer for Librarians, by Anna Gold, Massachusetts Institute of Technology, doi:10.1045/september2007-gold-pt1
Cyberinfrastructure, Data, and Libraries, Part 2: Libraries and the Data Challenge: Roles and Actions for Libraries, by Anna Gold, Massachusetts Institute of Technology, doi:10.1045/september2007-gold-pt2
The editorial by Bonita Wilson points to "The Dawn of Networked Science" in The Chronicle of Higher Education (which is not open to read, so no link for them).
I also recommend the August 2007 CTWatch Quarterly, which was on the topic "The Coming Revolution in Scholarly Communications & Cyberinfrastructure".
A reminder that there will be an E-Science track in the upcoming OECD meeting on October 3, 2007.
For more information on this topic, you can see my E-Science category.
Burlington Telecom FTTH Case Study
[For any community that is looking to deploy a community fiber network, I highly recommend taking a look at this study of the Burlington project. Although the Burlington network bills itself as an "open access" network, its architecture, like Amsterdam's, is built around home-run fiber terminating on GPON at the central office. GPON is used only to reduce interface costs, so if at a future date a customer wanted layer-one connectivity to a given service provider, bypassing the PON altogether would require only a simple fiber patch. This provides a future-proof architecture as new technologies and business models evolve. Thanks to Frank Coluccio and Tim Nulty for this pointer -- BSA]
Burlington Telecom Case Study
Christopher Mitchell, Director, Telecommunications as Commons Initiative christopher@ilsr.org | August 2007
http://www.newrules.org/info/bt.pdf