Helix Nebula Cloud Contributes to Higgs Particle Discovery


The scientific community is abuzz over the recent discovery of a Higgs-like particle, based on experiments performed at the Large Hadron Collider (LHC) at CERN. What is less well known is that some of the data generated by the breakthrough experiments were processed using a pan-European cloud computing project known as Helix Nebula. The project was launched in March to provide scientists with cloud-based computing and analytics resources.

Helix Nebula – The Science Cloud is a partnership between commercial IT service providers and three research institutions: CERN, the European Molecular Biology Laboratory (EMBL), and the European Space Agency (ESA). The program is currently in a two-year pilot phase, backed by €1.8 million in funding from the European Commission. In addition to particle physics research, the cloud has also supported studies in earth observation and molecular biology. During the pilot phase, scientists have deployed applications comprising tens of thousands of jobs across multiple datacenters.

Michael Symonds, principal architect for Atos, one of the cloud resource providers, offered positive feedback on the project’s progress. “Setting up a public style cloud for very demanding research organizations is very different to providing private enterprise cloud services to companies,” he said. “It has taken a lot of effort but we are all pleased with these early results and are confident we can build on this in the future.”

In September, representatives from each of the Helix Nebula research facilities will deliver a keynote at the ISC Cloud’12 Conference in Mannheim, Germany. Wolfgang Gentzsch, general chair of the event and contributor to HPC in the Cloud, interviewed CERN’s Bob Jones, ESA’s Wolfgang Lengert, and EMBL’s Rupert Lueck about the new science cloud.

Jones explained the main difference between the science cloud and CERN’s existing LHC Computing Grid. “The [LHC Computing Grid] that has been essential to the LHC experiments’ work to observe a particle consistent with [the] long-sought Higgs boson consists of publicly-managed data centers,” he explained. “Helix Nebula is a public-private partnership.” In this case, research is processed at commercial datacenters.

When asked what benefits the cloud would provide, Lengert said the infrastructure would simplify access to data, tools, models, and a collaboration platform. ESA’s ERS/Envisat missions have produced datasets containing over two decades of information about the land, oceans, atmosphere, and cryosphere.

Looking to the future, Jones expects additional collaborators and adopters to come on board. “Assuming the pilot phase is successful,” he said, “we expect Helix Nebula to grow to include more commercial cloud services providers and public organizations as consumers.”


Full story at HPC in the Cloud


Feature Articles

DOE Primes Pump for Exascale Supercomputers

Intel, AMD, NVIDIA, and Whamcloud have been awarded tens of millions of dollars by the US Department of Energy (DOE) to kick-start research and development required to build exascale supercomputers. The work will be performed under the FastForward program, a joint effort run by the DOE Office of Science and the National Nuclear Security Administration (NNSA) that will focus on developing future hardware and software technologies capable of supporting such machines.
Read more...

Hybrid Memory Cube Angles for Exascale

Computer memory is currently undergoing something of an identity crisis. For the past eight years, multicore microprocessors have been creating a performance discontinuity, the so-called memory wall. It's now fairly clear that this widening gap between compute and memory performance will not be solved with conventional DRAM products. But there is one technology under development that aims to close that gap, and its first use case will likely be in the ethereal realm of supercomputing.
Read more...

Green500 Turns Blue

The latest Green500 rankings were announced last week, revealing that top performance and power efficiency can indeed go hand in hand. According to the latest list, the greenest machines, in fact the top 20 systems, were all IBM Blue Gene/Q supercomputers. Blue Gene/Q, of course, is the platform that captured the number one spot on the latest TOP500 list and accounts for four of the ten fastest supercomputers in the world.
Read more...

Sponsored Whitepapers

Tackling the Data Deluge: File Systems and Storage Technologies

06/25/2012 | NetApp | A single hour of data collection can result in more than 7 million files from just one camera. Collection opportunities are limited and must be successful every time. As defense and intelligence agencies seek to use the data collected to make mission-critical battlefield decisions, there is greater emphasis on smart data and imagery collection, capture, storage, and analysis to drive real-time intelligence. The data gathered must be accurately and systematically analyzed, integrated, and disseminated to those who need it: troops on the ground. This reality leads to an inevitable challenge: warfighters swimming in sensors, drowning in data. With millions, if not billions, of sensors providing all-seeing reports of the combat environment, managing the overload demands a file system and storage infrastructure that scales and performs while protecting the data collected. Part II of our whitepaper series highlights NetApp’s scalable, modular, and flexible storage solution for the demanding requirements of sophisticated ISR environments.

Sponsored Multimedia

Michael Wolfe Webinar: PGI Accelerator with OpenACC

Join Michael for a look at the first PGI Accelerator Fortran and C compilers to include comprehensive support for OpenACC, the new open standard for programming accelerators using compiler directives.
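For readers unfamiliar with the directive-based approach, the sketch below shows the general flavor of OpenACC in C: a single pragma marks a loop for offload to an accelerator, and the compiler generates the device code. This saxpy example is illustrative only, not material from the webinar; the compile command assumes a PGI compiler with OpenACC support.

#include <stdio.h>

#define N 1000000

int main(void)
{
    static float x[N], y[N];
    const float a = 2.0f;

    /* Initialize the input vectors on the host. */
    for (int i = 0; i < N; i++) {
        x[i] = (float)i;
        y[i] = 1.0f;
    }

    /* The directive asks an OpenACC compiler to parallelize this loop
       on an accelerator; a compiler without OpenACC support simply
       ignores the pragma and runs the loop serially on the CPU. */
    #pragma acc parallel loop
    for (int i = 0; i < N; i++)
        y[i] = a * x[i] + y[i];

    printf("y[42] = %f\n", y[42]);  /* expect 85.000000 */
    return 0;
}

Built with something like “pgcc -acc saxpy.c”, the loop is offloaded to a GPU; the same source compiles and runs unchanged with a host-only compiler.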
