Tuesday, October 4, 2011

Commercial cloud services for universities and HPC gain momentum - Internet2 announcement

[Today’s Internet2 announcement of brokering commercial cloud services from HP, Box and SHI is the start of a major trend that will transform computing at universities and, eventually, businesses.
This is likely to be the first of many announcements on “above the net” services from Internet2. Moving university IT departments and researchers from the traditional “client-server” mindset to delivering services from the cloud will enable new applications and services at much lower cost, and will create new business opportunities for NRENs and small businesses. The NSF XSEDE program, which is spending $35m per year to develop these services and applications for researchers, is a great example of this approach. Not only could this provide a new service revenue stream for NRENs, it also has the potential to reduce overall costs to universities by several million dollars per year per institution. A CANARIE-funded study undertaken by IISD demonstrated that the energy savings alone could pay for retiring thousands of campus servers in favor of clouds. Even traditional research computing users are starting to move to clouds as more and more computation becomes data intensive. HPC has long been dominated by the “modeling” community of astrophysics, computational chemistry, fluid dynamics, etc. But now the biggest growth in research computing is data analysis and knowledge extraction in fields such as astro-informatics, computational biology and (yes) even computational history. This type of research computing is orders of magnitude larger than traditional HPC modeling and is ideally suited for commercial clouds. Some excerpts – BSA]

IISD study on how energy savings can pay for the move to cloud computing

XSEDE – Accelerating science by outsourcing the mundane

Accelerating Scientific Discovery for Research via the Cloud
Throughout the history of science, data has been scarce and precious. Indeed, the modern scientific method is defined by a careful cycle of hypothesis and experiment, which gathers experimental data to test the hypothesis. In a few short years, scientists and engineers have gone from scarcity to an incredible richness, necessitating a significant change in how they manage and extract insight from all this data. In a parallel shift, many of our scientific, engineering and societal questions increasingly lie at the intersections of traditional disciplines.
Increasing data volumes and the complexity of collaboration on interdisciplinary problems are challenging our historical approaches to discovery and innovation via computing. Most researchers and research institutions are ill-prepared for the large-scale computing infrastructure management challenges posed by large data sets and complex models. The cloud and associated applications and tools offer a possible solution to this challenge by letting scientists be scientists.

The U.S. government can accelerate this transition by encouraging the purchase of cloud services, in addition to the acquisition of local IT infrastructure, and by supporting new tools that facilitate distributed collaboration and simplify access to multidisciplinary scientific data. As I have noted before, Microsoft is acting on this belief, working in partnership with the National Science Foundation.

Fostering Continued Support for Computing Research and Education

Today's cloud technology is derived from basic computing research conducted over the past four decades. To ensure that the U.S. continues to remain at the forefront of cloud technology, continued investment in basic research is critical. There are deep and open questions in areas as diverse as the future of silicon scaling and system-on-a-chip design, energy-efficient systems, primary and secondary storage, data mining and analytics, wired and wireless networks, system resilience and reliability, privacy and security, and user interfaces and accessibility, to name just a few. Insights and innovations from this research will spawn new companies, create jobs and reshape our future.

In addition to continued research investment, it is critical to support the pipeline that produces researchers and others who will be able to invent new uses of the cloud and information technology. The U.S. Bureau of Labor Statistics estimates that the computing sector will have 1.5 million job openings over the next 10 years, yet the number of graduates receiving bachelor's, master's or Ph.D. computer science degrees falls far short of that. In addition, we must strengthen the quality of and access to computing education at all levels. Consistent with these concerns about the IT workforce and computing education, Microsoft is a founding member of the Computing

Internet2 NET+ Services To Deliver Cloud Services To University Faculty, Staff and Students Nationwide
New partnerships announced with HP, SHI International and Box
Raleigh, N.C.—Oct. 4, 2011—Internet2, the world’s most advanced networking consortium, today announced individual partnerships with HP, SHI International and Box to deliver new Internet2 NET+ Services, including cloud computing and infrastructure services, to Internet2 members’ faculty, staff and students nationwide. The announcement was made at today’s Internet2 Fall Member Meeting.

“These partnerships benefit our members by making secure and very affordable cloud computing services available anywhere, anytime,” said Dave Lambert, President and CEO, Internet2. “By leveraging existing community investments, Internet2 provides seamless access to standard and customized offerings from industry leaders for chief information officers in higher education to consider in their technology plans. Internet2 NET+ Services are the next generation of value from Internet2.”

Internet2 NET+ Services provide “above the network” services to Internet2 member organizations, including higher education, government, and industry. Cornell University; Indiana University; Penn State; University of California, Berkeley; University of Michigan; University of Notre Dame; University of Utah; and other Internet2 participating campuses will immediately begin testing and validating some of the specific offerings and services. The new services will create a platform tailored to the needs of the Internet2 community, and will leverage Internet2’s 100G network and InCommon identity management services. Higher education members who are InCommon subscribers may purchase Box and HP services in early 2012, as part of their annual membership.

Based on the HP Converged Infrastructure and HP CloudSystem solutions, HP, in conjunction with SHI, will provide a “private community cloud” suite of infrastructure services designed to meet the levels of security, performance and availability required by higher education. HP and SHI will operate these services for the Internet2 community. SHI Cloud combines best-of-breed technologies to create a secure, high-performance, industrial-grade cloud offering with private, dedicated, managed or multi-tenant options. Indiana University; Penn State; University of Notre Dame; and the University of Utah will participate in a pilot program, after which the service will be made broadly available.

“Top research institutions require flexible, affordable cloud services to be able to conduct some of the most important, compute-intensive research in the world,” said Rich Geraffo, senior vice president and managing director of HP Enterprise business. “HP and SHI are teaming up to deliver Internet2 members the reliability, performance and scalability required to further advance these innovative efforts.”

“Academic communities are driving innovation in every industry, and they require a cloud platform that’s powerful, secure, and scalable to support their computing needs,” said Thai Lee, president and CEO of SHI. “By joining forces with HP and Internet2, we’ll be able to deliver and manage an affordable private cloud that will let research teams focus on their work, not their IT department. Further, our robust industrial-grade cloud platform will support the core mission of Internet2 by delivering technology that goes beyond today and meets future needs of the computing world.”

Supercomputing center targets big, fast storage cloud at academics, industry

By Jon Brodkin
A storage cloud with 10 Gigabit Ethernet speed and scalability to hundreds of petabytes has been launched to provide virtually unlimited storage capacity to supercomputing customers.
Built by the San Diego Supercomputer Center at UC San Diego, the SDSC Cloud has 5.5PB to begin with, but “is scalable by orders of magnitude to hundreds of petabytes, with aggregate performance and capacity both scaling almost linearly with growth,” the SDSC says.
The supercomputing center believes this is the largest academic-based cloud storage system in the United States, and said it is designed for researchers, students, academics and industry users who need secure and cost-effective storage for data sets of any size. Each object stored will have a unique URL for sharing.
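Because every stored object is exposed at its own URL, sharing a data set reduces to ordinary HTTP retrieval. Below is a minimal sketch of fetching such an object with Python's standard library; the host, container and object names are hypothetical, not actual SDSC Cloud endpoints.

```python
# Minimal sketch: downloading a shared research object by its URL.
# The URL below is a made-up example; the announcement only says each
# stored object gets a unique URL, so any standard HTTP client works.
import urllib.request

OBJECT_URL = "https://cloud.example.edu/v1/project_container/genome_run_0042.tar"

def download_object(url: str, dest: str) -> int:
    """Stream the object at `url` into a local file; return bytes written."""
    written = 0
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        while True:
            chunk = resp.read(1 << 20)  # read in 1 MiB chunks
            if not chunk:
                break
            out.write(chunk)
            written += len(chunk)
    return written
```

Streaming in fixed-size chunks rather than reading the whole response matters here, since the data sets in question can run to terabytes.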
SDSC’s project is another example of cloud computing expanding the accessibility of high-performance computing (HPC) functionality once reserved for an exclusive set of institutions. Instead of being forced to build out huge clusters inside their own data centers, customers can outsource supercomputing needs to cloud vendors. Amazon offers special cluster compute instances for just such a purpose, and even built a supercomputer on the Elastic Compute Cloud that ranked among the Top 500 supercomputing sites in the world. Another project recently featured by Ars used the Amazon compute cloud to build a 30,000-core cluster for a pharmaceutical company that ran for about seven hours at a peak cost of $1,279 per hour.


The Cloud Server So Large They Should Call It "Hurricane"

As computationally heavy research gains momentum, the amount of data researchers generate is exploding—a single sequencing of DNA can require as much as 28 terabytes. So where do American researchers store their most ginormous data sets? In the largest academic cloud server in the US, at the San Diego Supercomputer Center.
The SDSC Cloud has an initial capacity of 5.5 petabytes—roughly 250 billion pages of text—and achieves sustained read rates of 8-10 GB/s—that's 250GB every 30 seconds. And that's just to start. The Cloud is scalable, on demand, up to hundreds of petabytes. "We believe that the SDSC Cloud may well revolutionize how data is preserved and shared among researchers, especially massive datasets that are becoming more prevalent in this new era of data-intensive research and computing," said Michael Norman, director of SDSC, in a press release. "The SDSC Cloud goes a long way toward meeting federal data sharing requirements, since every data object has a unique URL and could be accessed over the Web."
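The figures quoted above hold up to a quick back-of-envelope check. The per-page size is my own inference from the two numbers given, and the 8.5 GB/s figure is simply the midpoint of the quoted 8-10 GB/s range:

```python
# Sanity-checking the SDSC Cloud figures quoted in the article.
PB = 10**15  # petabyte, decimal units as storage vendors use them
GB = 10**9   # gigabyte

capacity_bytes = 5.5 * PB
pages = 250e9                       # "roughly 250 billion pages of text"
bytes_per_page = capacity_bytes / pages
print(f"{bytes_per_page / 1e3:.0f} KB per page")  # ~22 KB: plausible for a page of text

read_rate = 8.5 * GB                # midpoint of the quoted 8-10 GB/s sustained rate
per_30s_gb = read_rate * 30 / GB
print(f"{per_30s_gb:.0f} GB every 30 seconds")    # ~255 GB, matching the quoted "250 GB"
```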
Green Internet Consultant. Practical solutions to reducing GHG emissions such as free broadband and electric highways. http://green-broadband.blogspot.com/
email: Bill.St.Arnaud@gmail.com
twitter: BillStArnaud
blog: http://billstarnaud.blogspot.com/
skype: Pocketpro