Storage Environment for UK Cluster Provided by DDN


July 17 -- A new storage environment providing 7.8PB of storage and an additional 19.5PB of backup capability is set to improve long-term data storage for hundreds of UK users of the HECToR (High-End Computing Terascale Resource) supercomputer. HECToR is hosted by EPCC at the University of Edinburgh and funded by the Engineering and Physical Sciences Research Council (EPSRC) and the Natural Environment Research Council (NERC).

The additional storage complements HECToR’s existing 1 petabyte of disk space. Although tightly integrated with HECToR, the new storage environment is built independently and, because it is designed to outlive HECToR, will be available for use with successive supercomputers.

The storage environment was designed and built by data processing, data management and storage provider OCF plc. It uses storage hardware from DataDirect Networks (DDN), and archive hardware and file management software from IBM.

“We needed a more data-centric view of high performance computing,” says Professor Arthur Trew, University of Edinburgh. “Data persists beyond any computer, including HECToR, so we’re prioritising data storage, management and analysis. Doing this enables us to upgrade HECToR and integrate its successor without fear of impacting access to research data. Our expectation is that any future computer must be able to integrate seamlessly with our storage.”

Scientists currently store highly complex simulations on site at Edinburgh – file sizes vary from user to user, but each can potentially be gigabytes in size. The passage of data for further interrogation is unique to each researcher and may involve transferring the data to other data repositories off site, moving data to different parts of the country or simply “taking it home” using portable media.    

Julian Fielden, OCF managing director, says: “There is lots of talk and consensus at the moment that the problem with big data isn’t really the capacity to store it, but how to access, use and find the data and, in doing so, make it into useful information. The collective investment of the research councils is cleverly helping to avoid this problem by making storage independent of the machine that generated it. Combined with good network access and IBM’s parallel file system GPFS, the data becomes easy to locate and use by any researcher irrespective of location.”

“As we enter the big data era, organisations in every field of endeavour are addressing the world’s most pressing scientific and medical questions – questions that would have been too complex to address just a few years ago,” says Bill Cox, DDN Vice President of Worldwide Channel Sales. “EPCC and its partner organisations have built a technologically advanced, state-of-the-art facility at the University of Edinburgh that opens a world of possibility to researchers across the UK. DDN is very pleased to join with OCF in assisting on this important project.”

The storage environment built by OCF now uses:

  • DDN Storage Fusion Architecture (SFA) 10K-X, a leading integrated storage appliance that maximises application performance while minimising total cost of ownership for big data, cloud, and content-intensive environments. The SFA 10K-X provides 7.8 Petabytes of usable storage capacity.
  • The IBM System Storage TS3500 Tape Library, which is designed to provide a highly scalable, automated tape library for mainframe and open systems backup and archive in midrange to enterprise environments. The TS3500 tape library provides 19.5 Petabytes of capacity.
  • IBM GPFS software to enable:

          seamless storage capacity expansion to handle the explosive growth of big data and digital information;

          improved efficiency through enterprise-wide, interdepartmental file sharing;

          proven commercial-grade reliability to eliminate production outages and ease information lifecycle management with policy-driven automation (a minimal policy sketch follows this list);

          cost-effective disaster recovery and business continuity;

          Active File Management to enable asynchronous access and control of local and remote files.
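
To make the "policy-driven automation" point above concrete, the following is a minimal, hypothetical sketch of a GPFS placement-and-migration policy. The pool names ('system' and 'archive'), the 90-day access threshold and the 80/70 occupancy limits are illustrative assumptions, not details of the actual HECToR configuration.

    /* Hypothetical GPFS ILM policy sketch; pool names and thresholds are
       illustrative assumptions, not the actual HECToR configuration. */

    /* Place newly created files on the fast disk pool by default. */
    RULE 'default_placement' SET POOL 'system'

    /* Once the fast pool passes 80% occupancy, migrate files not accessed
       for 90 days to the slower pool until occupancy drops back to 70%. */
    RULE 'migrate_cold' MIGRATE FROM POOL 'system'
         THRESHOLD(80,70)
         TO POOL 'archive'
         WHERE (DAYS(CURRENT_TIMESTAMP) - DAYS(ACCESS_TIME)) > 90

In practice an administrator would install placement rules with GPFS's mmchpolicy command and run migration scans with mmapplypolicy; the policy engine then carries out the data movement automatically, without per-file intervention by users or operators.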

-----

Source: OCF
