July 10, 2012
CAMBRIDGE, UK, July 10 -- Through a combination of high-speed research networks, advanced sonification techniques and grid computing, the world can now ‘hear’ the newly discovered Higgs boson-like particle.
Research networks, including the pan-European GÉANT network, were critical components in the global infrastructure that helped find the new particle, delivering immense volumes of experimental data from the Large Hadron Collider (LHC) to thousands of scientists around the world for analysis and then providing the connectivity for them to share their results amongst the entire research community.
On Wednesday 4th July 2012, scientists at CERN announced that they had found a Higgs-like particle after analysing results from the Large Hadron Collider. Researchers detected a "bump" in their data corresponding to a particle weighing in at about 126 gigaelectronvolts (GeV), consistent with the Higgs boson, which is believed to give other particles their mass. The result lends strong support to the Standard Model, the dominant theory of how the universe works at the subatomic level.
Building on this achievement, the same research networks have now been central to turning these scientific findings into music using data sonification. Working from results supplied by the ATLAS experiment at the LHC, researchers have created melodies that make the results easier to understand.
Sonification requires enormous amounts of networking and processing power to produce results. Creating the Higgs melody consequently relied on high-speed research networks, including the pan-European GÉANT network, which operates at speeds of up to 10 Gbps, and on the EGI grid computing infrastructure. Grid computing works by linking together multiple computers in different locations via high-speed networks, combining their processing power to deliver faster results when analysing enormous volumes of data.
The project was coordinated by Domenico Vicinanza of DANTE (the UK-based organisation that operates the GÉANT network on behalf of European national research and education networks (NRENs)), in collaboration with Mariapaola Sorrentino of ASTRA Project, Cambridge, who contributed to the sonification process, and Giuseppe La Rocca of INFN Catania, who was responsible for the computing framework.
In the music, the peak of high notes in the second bar marks the appearance of the Higgs-like particle (about 3.5 seconds into the recording). The researchers created two versions: one a piano solo, the other with added bass, percussion, marimba and xylophone.
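The mapping at work here, in which an excess of events becomes a peak of high notes, can be sketched as a simple pitch mapping. This is an illustrative toy, not the team's actual sonification pipeline; the histogram values and note range below are invented.

```python
# Minimal sonification sketch (illustrative only): map histogram bin
# counts to MIDI note numbers, so that a "bump" in the data stands out
# as the highest pitch in the melody.

def sonify(counts, low_note=60, high_note=84):
    """Linearly map each bin count to a MIDI note number.

    The smallest count maps to low_note (middle C, MIDI 60) and the
    largest to high_note (two octaves up), so an excess of events
    becomes a peak of high notes.
    """
    lo, hi = min(counts), max(counts)
    span = hi - lo or 1  # avoid division by zero for flat data
    return [round(low_note + (c - lo) * (high_note - low_note) / span)
            for c in counts]

# Toy histogram: a smooth background with an excess in one bin,
# loosely analogous to the bump seen near 126 GeV.
counts = [100, 102, 105, 150, 104, 101, 99]
notes = sonify(counts)
```

Playing the resulting note numbers in sequence yields a short melody whose single high note falls exactly where the excess sits in the data, which is the effect described in the recording above.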
“The discovery of the Higgs-like particle is a major step forward in our knowledge of the world around us,” said Domenico Vicinanza, DANTE. “By using sonification we are able to make this breakthrough easier for the general public to understand, highlighting the depth and breadth of the enormous research efforts by the thousands of scientists around the world involved with the Large Hadron Collider. Neither the discovery of the particle nor this sonification process would have been possible without the high-speed research networks that connect scientists across the world, enabling them to collaborate, analyse data and share their results.”
Previous sonification projects from the team include the creation of music from volcanic activity around the world, making it easier to spot potential eruptions by listening to changes in musical pitch.
Intel Corporation has acquired Whamcloud, a startup devoted to supporting the open source Lustre parallel file system and its user community. The deal marks the latest in a line of high performance computing acquisitions that Intel has made over the past few years to expand its HPC footprint.
Intel, AMD, NVIDIA, and Whamcloud have been awarded tens of millions of dollars by the US Department of Energy (DOE) to kick-start research and development required to build exascale supercomputers. The work will be performed under the FastForward program, a joint effort run by the DOE Office of Science and the National Nuclear Security Administration (NNSA) that will focus on developing future hardware and software technologies capable of supporting such machines.
Computer memory is currently undergoing something of an identity crisis. For the past 8 years, multicore microprocessors have been creating a performance discontinuity, the so-called memory wall. It's now fairly clear that this widening gap between compute and memory performance will not be solved with conventional DRAM products. But there is one technology under development that aims to close that gap, and its first use case will likely be in the ethereal realm of supercomputing.
Jul 16, 2012: EUV lithography, the technology chipmakers are counting on to keep Moore's Law alive, is behind schedule.
Jul 12, 2012: State says supercomputing center can’t pay bills to keep machine running.
Jul 11, 2012: Computer scientist builds intelligent machine with single-core laptop and some slick algorithms.
Jul 10, 2012: Science cloud in proof-of-concept stage.
Jul 09, 2012: EU project offers software that makes datacenters more energy-efficient.