July 13, 2012
This statement appeared on Whamcloud's website on Friday afternoon:
Dear Valued Customer:
On July 13, 2012, Intel Corporation acquired Whamcloud. On behalf of Intel Corporation, I want to take this opportunity to assure you of Intel’s commitment to our current and future customers as well as a seamless transition.
The Whamcloud acquisition extends Intel’s software and service portfolio in the high performance computing space in addition to reinforcing Intel’s position in the open source community. Working as one company, we are now in a stronger position to advance our mutual goals and continue providing vendor neutral solutions, delivering greater value to our customers, and moving the industry to exascale performance.
In the near term, please continue contacting the same Whamcloud representative you have been working with prior to the acquisition until directed otherwise through future communications.
The sales support and order management system provided by Whamcloud prior to the acquisition will continue in a ‘business as usual mode’ for the near future. Please be assured that as transition plans progress, we will provide you with detailed updates. Our goal is to make the transition as easy as possible for our customers.
As former president and CEO of Whamcloud and now General Manager of the High Performance Data Division at Intel, I can assure you Intel is excited about this acquisition and values your business. We look forward to working with you during this transition and in the future. Thank you in advance for your support.
Sincerely,
Brent Gorda
GM, High Performance Data Division
Intel Corporation
-----
Source: Whamcloud
Intel Corporation has acquired Whamcloud, a startup devoted to supporting the open source Lustre parallel file system and its user community. The deal marks the latest in a line of high performance computing acquisitions that Intel has made over the past few years to expand its HPC footprint.
Intel, AMD, NVIDIA, and Whamcloud have been awarded tens of millions of dollars by the US Department of Energy (DOE) to kick-start research and development required to build exascale supercomputers. The work will be performed under the FastForward program, a joint effort run by the DOE Office of Science and the National Nuclear Security Administration (NNSA) that will focus on developing future hardware and software technologies capable of supporting such machines.
Computer memory is currently undergoing something of an identity crisis. For the past 8 years, multicore microprocessors have been creating a performance discontinuity, the so-called memory wall. It's now fairly clear that this widening gap between compute and memory performance will not be solved with conventional DRAM products. But there is one technology under development that aims to close that gap, and its first use case will likely be in the ethereal realm of supercomputing.
Jul 17, 2012 | Co-creator of Gordon supercomputer suffers fatal heart attack.
Jul 16, 2012 | EUV lithography, the technology chipmakers are counting on to keep Moore's Law alive, is behind schedule.
Jul 12, 2012 | State says supercomputing center can’t pay bills to keep machine running.
Jul 11, 2012 | Computer scientist builds intelligent machine with single-core laptop and some slick algorithms.
Jul 10, 2012 | Science cloud in proof-of-concept stage.
06/25/2012 | NetApp | A single hour of data collection can result in 7+ million files from just one camera. Collection opportunities are limited and must be successful every time. As defense and intelligence agencies seek to use the data collected to make mission-critical battlefield decisions, there’s greater emphasis on smart data and imagery collection, capture, storage and analysis to drive real-time intelligence. The data gathered must be accurately and systematically analyzed, integrated, and disseminated to those who need it – troops on the ground. This reality leads to an inevitable challenge – warfighters swimming in sensors, drowning in data. With the millions, if not billions, of sensors providing all-seeing reports of the combat environment, managing the overload demands a file system and storage infrastructure that scales and performs while protecting the data collected. Part II of our whitepaper series highlights NetApp’s scalable, modular, and flexible storage solution to handle the demanding requirements of sophisticated ISR environments.
Join Michael for a look at the first PGI Accelerator Fortran and C compilers to include comprehensive support for OpenACC, the new open standard for programming accelerators using compiler directives.
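The OpenACC model the webinar covers is directive-based: ordinary loops are annotated with pragmas and the compiler generates the accelerator code. As a minimal sketch only (the SAXPY kernel, array names, and sizes below are illustrative assumptions, not material from the webinar), a C version compiled with a directive-aware compiler such as PGI's pgcc using the -acc flag might look like this:

/* Minimal OpenACC sketch: a SAXPY loop offloaded via compiler directives.
 * The kernel, names, and sizes are illustrative, not taken from the webinar. */
#include <stdio.h>

#define N 1000000

int main(void)
{
    static float x[N], y[N];
    float a = 2.0f;

    /* Initialize input vectors on the host. */
    for (int i = 0; i < N; i++) {
        x[i] = (float)i;
        y[i] = 1.0f;
    }

    /* With OpenACC enabled (e.g. `pgcc -acc saxpy.c`), the compiler offloads
     * this loop to an accelerator; without it, the pragma is ignored and the
     * loop simply runs on the CPU. */
    #pragma acc parallel loop copyin(x[0:N]) copy(y[0:N])
    for (int i = 0; i < N; i++) {
        y[i] = a * x[i] + y[i];
    }

    printf("y[42] = %f\n", y[42]);
    return 0;
}

The appeal of the directive approach is that the same source builds and runs unchanged on a plain CPU, which is why it is positioned as an open, portable alternative to rewriting kernels in a separate accelerator language.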