July 03, 2012
July 3 -- The FIT4Green project concentrated on finding new ways to save energy in data centres. The project designed and implemented an energy-aware plug-in that runs on top of existing data centre management tools, orchestrating the allocation of ICT resources and turning off unused equipment. The project achieved its goal: 20% direct energy savings in ICT equipment without compromising compliance with Service Level Agreement (SLA) and Quality of Service (QoS) metrics. The savings in CO2 emissions were on the same scale as the energy savings, and the direct energy savings in ICT equipment also bring considerable additional savings, for example through reduced cooling needs.
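The article describes the plug-in's approach only at a high level: consolidate work onto fewer machines and switch the rest off. As a rough illustration of that idea, here is a minimal C sketch of a greedy consolidation pass; the server count, load figures, and the 80% utilization cap standing in for SLA headroom are all hypothetical, and this is not the FIT4Green code itself.

```c
/* Illustrative sketch only -- not the FIT4Green plug-in.
 * Greedily packs server load onto as few machines as possible,
 * then reports how many could be powered off. The utilization
 * cap is a hypothetical stand-in for SLA/QoS headroom. */
#include <stdio.h>

#define NSERVERS 6
#define UTIL_CAP 0.80  /* never fill a target server past 80% */

int main(void) {
    /* current CPU utilization of each powered-on server */
    double load[NSERVERS] = {0.30, 0.10, 0.45, 0.05, 0.20, 0.15};
    double packed[NSERVERS] = {0.0};
    int used = 0;

    /* first-fit: move each load onto the first target with headroom */
    for (int i = 0; i < NSERVERS; i++) {
        int placed = 0;
        for (int j = 0; j < used && !placed; j++) {
            if (packed[j] + load[i] <= UTIL_CAP) {
                packed[j] += load[i];
                placed = 1;
            }
        }
        if (!placed)
            packed[used++] = load[i];  /* open another server */
    }

    printf("servers on before: %d, after: %d (%d powered off)\n",
           NSERVERS, used, NSERVERS - used);
    return 0;
}
```

With the sample loads above, six lightly used servers collapse onto two, which is the kind of reallocation that lets the remaining machines be turned off or put to sleep.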
The FIT4Green plug-in is designed to be applicable to any type of data centre. It was validated in three representative data centres: a service/enterprise portal at ENI, a supercomputing data centre at Jülich Supercomputing Centre with a federated site at VTT Technical Research Centre of Finland, and a cloud computing platform at HP. VTT's work in the project concentrated on optimizations in the supercomputing scenario. The 20% target was reached in each test bed, and in some cases the savings were as high as 50%. The baseline for all savings was the same system without any energy optimizations.
All 16 public deliverables of the project are freely available on the project web site at http://www.fit4green.eu. The plug-in code has also been released as open source software.
FIT4Green was coordinated by GFI Informática with HP Italy Innovation Centre as the technological leader. Other partners besides VTT were University of Passau, Jülich Supercomputing Centre, Imperial College London, University of Mannheim, Create-Net, Eni S.p.A., and Almende BV.
-----
Source: FIT4Green
Intel, AMD, NVIDIA, and Whamcloud have been awarded tens of millions of dollars by the US Department of Energy (DOE) to kick-start research and development required to build exascale supercomputers. The work will be performed under the FastForward program, a joint effort run by the DOE Office of Science and the National Nuclear Security Administration (NNSA) that will focus on developing future hardware and software technologies capable of supporting such machines.
Computer memory is currently undergoing something of an identity crisis. For the past 8 years, multicore microprocessors have been creating a performance discontinuity, the so-called memory wall. It's now fairly clear that this widening gap between compute and memory performance will not be solved with conventional DRAM products. But there is one technology under development that aims to close that gap, and its first use case will likely be in the ethereal realm of supercomputing.
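To see why more cores do not help a memory-bound workload, consider a STREAM-style triad: it performs roughly two floating-point operations per 24 bytes of memory traffic, so its throughput is pinned to memory bandwidth rather than core count. The following C sketch is illustrative only and is not drawn from the article.

```c
/* A STREAM-style triad: so little arithmetic per byte moved that
 * performance is limited by memory bandwidth, not compute.
 * Array sizes are illustrative. */
#include <stdio.h>
#include <stdlib.h>

#define N (1L << 24)  /* ~16M doubles per array, far larger than cache */

int main(void) {
    double *a = malloc(N * sizeof *a);
    double *b = malloc(N * sizeof *b);
    double *c = malloc(N * sizeof *c);
    if (!a || !b || !c) return 1;

    for (long i = 0; i < N; i++) { b[i] = 1.0; c[i] = 2.0; }

    /* Per iteration: 2 flops vs. 24 bytes of traffic (read b and c,
     * write a). At that arithmetic intensity, extra cores add little
     * once the memory bus is saturated. */
    for (long i = 0; i < N; i++)
        a[i] = b[i] + 3.0 * c[i];

    printf("a[0] = %f\n", a[0]);  /* expect 7.0 */
    free(a); free(b); free(c);
    return 0;
}
```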
The latest Green500 rankings were announced last week, revealing that top performance and power efficiency can indeed go hand in hand. According to the latest list, the greenest machines, in fact the top 20 systems, were all IBM Blue Gene/Q supercomputers. Blue Gene/Q, of course, is the platform that captured the number one spot on the latest TOP500 list, and is represented by four of the ten fastest supercomputers in the world.
Jul 12, 2012
State says supercomputing center can’t pay bills to keep machine running.
Jul 11, 2012
Computer scientist builds intelligent machine with single-core laptop and some slick algorithms.
Jul 10, 2012
Science cloud crunched data that helped build the case for the historic announcement.
Jul 09, 2012
EU project offers software that makes datacenters more energy-efficient.
Jul 05, 2012
Processor speed and power consumption are now at odds, which will force chipmakers to rethink their designs.
06/25/2012 | NetApp | A single hour of data collection can result in more than 7 million files from just one camera. Collection opportunities are limited and must be successful every time. As defense and intelligence agencies seek to use the data collected to make mission-critical battlefield decisions, there is greater emphasis on smart data and imagery collection, capture, storage, and analysis to drive real-time intelligence. The data gathered must be accurately and systematically analyzed, integrated, and disseminated to those who need it: troops on the ground. This reality leads to an inevitable challenge: warfighters swimming in sensors, drowning in data. With millions, if not billions, of sensors providing all-seeing reports of the combat environment, managing the overload demands a file system and storage infrastructure that scales and performs while protecting the data collected. Part II of our whitepaper series highlights NetApp's scalable, modular, and flexible storage solution to handle the demanding requirements of sophisticated ISR environments.
Join Michael for a look at the first PGI Accelerator Fortran and C compilers to include comprehensive support for OpenACC, the new open standard for programming accelerators using compiler directives.
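For readers new to the directive approach, the short C example below shows the flavor of OpenACC: a single pragma asks the compiler to offload a loop to an accelerator, with data movement handled implicitly. The directive is standard OpenACC; the program around it is an illustrative sketch, not PGI sample code.

```c
/* Minimal OpenACC example: directive-based offload of a SAXPY loop.
 * The pragma is standard OpenACC; everything else is illustrative. */
#include <stdio.h>

#define N 1000000

int main(void) {
    static float x[N], y[N];
    const float a = 2.0f;

    for (int i = 0; i < N; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    /* One directive marks the loop for execution on an accelerator;
     * the compiler generates the device code and the data transfers. */
    #pragma acc parallel loop
    for (int i = 0; i < N; i++)
        y[i] = a * x[i] + y[i];

    printf("y[0] = %f\n", y[0]);  /* expect 4.0 */
    return 0;
}
```

With an OpenACC-capable compiler such as PGI's, a program like this is built by enabling the accelerator flag (for pgcc, something like pgcc -acc saxpy.c); without that flag the pragma is simply ignored and the loop runs on the host.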