HPCwire

Since 1986 - Covering the Fastest Computers
in the World and the People Who Run Them


ARM, HP, SK hynix Join Hybrid Memory Cube Consortium


BOISE, Idaho, June 27 -- The Hybrid Memory Cube Consortium (HMCC), led by Micron Technology and Samsung Electronics Co., Ltd., today announced that new members ARM, HP, and SK hynix, Inc. have joined the global effort to accelerate widespread industry adoption of Hybrid Memory Cube (HMC) technology. The HMCC is a collaboration of original equipment manufacturers (OEMs), enablers and integrators who are cooperating to develop and implement an open interface standard for the innovative new memory technology.

Micron and Samsung, the initial developing members of the HMCC, are working closely with Altera, IBM, Microsoft, Open-Silicon, Xilinx, and now ARM, HP, and SK hynix to draft an industry-wide specification that should pave the way for a wide range of electronic advances.

“The strong collection of companies who have joined the consortium – representing a broad range of technology interests – reflects the perceived high value of HMC as the next standard for high-performance memory applications,” said Robert Feurle, Micron’s vice president for DRAM marketing. “With the addition of ARM, HP and SK hynix as developers, who will help to determine the specific features, the consortium is well positioned to provide a new open standard for next-gen electronics.”

HMC features will enable highly efficient memory solutions for applications ranging from industrial products to high-performance computing and large-scale networking. The HMCC’s team of developers plans to deliver a draft interface specification to the growing number of “adopters” joining the consortium. Then, the combined team of developers and adopters will refine the draft and release a final interface specification, currently targeted for the end of this year.

As envisioned, HMC capabilities will leap beyond current and near-term memory architectures in the areas of performance, packaging and power efficiencies, offering a major alternative to present memory technology.

One of the primary challenges facing the industry -- and a key motivation for forming the HMCC -- is that the memory bandwidth required by high-performance computers and next-generation networking equipment has grown beyond what conventional memory architectures can provide. The term “memory wall” describes this challenge. Breaking through the memory wall requires an architecture such as HMC that delivers increased density and bandwidth at significantly lower power consumption.
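The bandwidth gap behind the "memory wall" can be illustrated with simple arithmetic. The sketch below uses illustrative numbers only (a hypothetical 1 TFLOP/s node needing 0.1 bytes of memory traffic per flop, and a nominal ~12.8 GB/s peak for a conventional DDR3-1600 channel); none of these figures are HMC specifications or claims from the consortium.

```python
# Back-of-envelope illustration of the "memory wall".
# All numbers below are illustrative assumptions, not HMC or vendor specs.

def required_bandwidth_gbs(flops_per_sec, bytes_per_flop):
    """Memory bandwidth (GB/s) needed to keep a compute node fed."""
    return flops_per_sec * bytes_per_flop / 1e9

# Assume a 1 TFLOP/s node that needs 0.1 bytes of memory traffic per flop.
need = required_bandwidth_gbs(1e12, 0.1)   # 100 GB/s

# Assume a conventional DDR3-1600 channel peaks at ~12.8 GB/s.
ddr3_channel_gbs = 12.8
channels = need / ddr3_channel_gbs

print(f"required: {need:.0f} GB/s -> ~{channels:.0f} DDR3 channels")
```

Even under these modest assumptions, a single node would need roughly eight conventional memory channels, which is the kind of pin-count and power cost that motivates denser, higher-bandwidth designs such as HMC.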

Adopter membership in the HMCC is available to any company interested in joining the consortium and participating in the specification development. Already, the HMCC has responded to interest from more than 90 prospective adopters. 

Additional information, technical specifications, tools and support for adopting the technology can be found at www.hybridmemorycube.org.

 

About the HMCC

Founded by leading members of the world’s semiconductor community, the Hybrid Memory Cube Consortium (HMCC) is dedicated to the development and establishment of an industry-standard interface specification for Hybrid Memory Cube technology. Members of the consortium include Altera, ARM, HP, IBM, SK hynix, Micron, Microsoft, Open-Silicon, Samsung, and Xilinx. More than 90 prospective adopters are exploring consortium membership. To learn more about the HMCC, visit www.hybridmemorycube.org.

-----

Source: Hybrid Memory Cube Consortium
