supercomputer

Titan steals No. 1 spot on Top500 supercomputer list

Predictions that Oak Ridge National Laboratory's Titan supercomputer had become the most powerful machine in the world have turned out to be right.

The machine, powered by Nvidia graphics processors and Advanced Micro Devices computer chips, stole the No. 1 spot on the Top500 list from another U.S. machine, Lawrence Livermore National Laboratory's Sequoia.

Sequoia, which uses processors from IBM, became the top computer in June with a performance of 16.32 petaflops. Titan beat that showing with a result of 17.59 petaflops, sending Sequoia to second place on the list. … Read more

Titan supercomputer debuts for open scientific research

Forecasting weather like this week's "Frankenstorm" may become a lot more accurate with the help of the Department of Energy's Titan supercomputer, a system that launched this month for open scientific research.

The computer, an update to the Jaguar system, is operated in Tennessee by Oak Ridge National Laboratory, part of the DOE's network of research labs. Researchers from academia, government labs, and various industries will be able to use Titan -- believed to be one of the two most powerful machines in the world -- to research things such as climate change and … Read more

Supercomputer clicked together from Legos and Raspberry Pis

The flexible, affordable Raspberry Pi Linux computer system has been hacked, tinkered with, and transformed into all sorts of creations since its introduction. There's a Raspberry Pi Apple TV, a Raspberry Pi ocean explorer, and Raspberry Pi smart glasses.

Now there's a Raspberry Pi supercomputer. How do you turn a 700MHz mini system into a supercomputer? You use 64 of them and mount them in a rack made out of Legos.… Read more
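The build reportedly runs standard message-passing (MPI) jobs across its 64 boards over Ethernet. As a rough, hypothetical illustration of the kind of job such a Lego-racked cluster actually computes, here is a minimal MPI sketch using Python's mpi4py bindings; the package choice, script name, and hostfile are assumptions, not details from the project.

```python
# hello_cluster.py -- minimal MPI sketch of the kind of job a 64-node
# Raspberry Pi cluster can run (hypothetical example, not the project's code).
from mpi4py import MPI

comm = MPI.COMM_WORLD      # communicator spanning every board in the rack
rank = comm.Get_rank()     # this node's ID, 0..63 on a 64-board cluster
size = comm.Get_size()     # total number of nodes the job was launched on

# Each Pi computes a partial sum over its own slice of the work...
partial = sum(range(rank * 100000, (rank + 1) * 100000))

# ...and rank 0 gathers the combined result from all nodes.
total = comm.reduce(partial, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} nodes computed a combined sum of {total}")
```

Launched with something like `mpiexec -n 64 -hostfile pis.txt python hello_cluster.py` (the hostfile listing the boards' addresses is assumed), each 700MHz Pi contributes one process and the final reduction stitches the partial answers together -- the same divide-and-combine pattern full-scale supercomputers use.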

3D computer model helps screen millions of chemo drugs

Researchers have long used still images of proteins known to be related to recurring cancers in an attempt to understand exactly why these proteins make some chemotherapies fail.

Now, biochemists at Southern Methodist University are using a 3D computer model of the human protein P-glycoprotein -- believed to play a pivotal role in the failure of chemotherapy in many recurring cancers -- to screen more than 8 million potential drug compounds in the hunt for one that will help stop this failure.

"This has been a good proof-of-principle," biochemist John G. Wise said in a school news release. &… Read more

U.S. retakes Top500 supercomputer crown

Sequoia, an IBM Blue Gene/Q supercomputer at the Lawrence Livermore National Laboratory, reached 16.32 petaflops, while previous leader K Computer trailed with 10.5 petaflops, according to the Top500 list. The list was published today at the International Supercomputing Conference in Hamburg.

The latest edition of the list, which is published twice a year, shows that Intel is slipping and IBM is recapturing lost ground, while the U.S. is back on top after losing its lead three years ago. New technologies reign, from updated IBM chips to Fujitsu's novel interconnect technology.

Intel processors … Read more

Crave visits the Cray-1, a true museum piece

LOS ALAMOS, N.M. -- Many great masterpieces reside in museums. There's the "Mona Lisa" at the Louvre. "Nighthawks" graces the wall at the Art Institute of Chicago. And the Cray-1 sits at the Bradbury Science Museum here in Los Alamos.

The first Cray-1 was installed at Los Alamos National Laboratory in 1976 at a cost of $8.8 million. It set a new world record speed of 160 million floating-point operations per second and boasted 8MB of main memory. According to the museum, it was the first computer to break the megaflop barrier.

By today's hardware standards, the Cray-1 is a great lumbering beast. The dramatic lighting shining on it at the Bradbury exhibit shows off its curves and hulking size. But by 1976 standards, it was a svelte creation whose circular shape kept the complex wiring compact. … Read more

End of an era: NASA shuts down its last mainframe

There was a time when IBM's mainframes were cutting-edge machines for scientific and engineering calculations.

Those days began in the 1960s, before humans walked on the moon, when IBM's System/360 rewrote the rules of computing. Big Blue has long since moved its high-performance technical computing effort toward its high-end Blue Gene systems, more conventional Linux servers using Intel and AMD x86 chips, and Unix servers with its own Power processor. IBM's System z mainframe line is now geared for commercial customers who are willing to pay a premium for reliability and high performance for … Read more

Intel's QLogic deal pumps up InfiniBand's future

Intel apparently believes there's life beyond Ethernet and USB.

Those industry-standard interfaces are taking over an ever larger number of jobs connecting one digital device to another. Intel's work with Apple to develop and promote Thunderbolt shows that the company doesn't think USB is the only way to plug a device into a PC, and a deal to acquire InfiniBand assets from QLogic shows that it sees limits to Ethernet, too.

Intel didn't disclose terms of the deal but said it should close this quarter. Along with the InfiniBand product lines and related assets, Intel said it … Read more

Amazon takes supercomputing to the cloud

You may not need to use the 42nd fastest supercomputer on Earth, but if you want to, you can for just $1,279 per hour.

As reported by Wired, Amazon Web Services' latest salvo in the computing-on-demand landscape is a cluster built on its Elastic Compute Cloud, which at $1,279 per hour, or roughly $11 million a year if run full time, is probably on a par with the time, effort, and expense of procuring the same level of compute power in your own data center.

Amazon's virtual supercomputer is capable of running 240 trillion calculations … Read more
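Those cost figures are easy to sanity-check: $1,279 per hour running around the clock comes to roughly $11 million a year. A quick back-of-the-envelope calculation (an illustration, not figures from Amazon):

```python
# Back-of-the-envelope check of the quoted cloud-cluster cost.
hourly_rate = 1279           # dollars per hour, as quoted above
hours_per_year = 24 * 365    # running full time, ignoring leap years

annual_cost = hourly_rate * hours_per_year
print(f"${annual_cost:,} per year")   # -> $11,204,040, i.e. roughly $11 million
```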

Supercomputer network blasts torrent of data

The petabytes of data being generated at the Large Hadron Collider and other research bodies are facing a physical bottleneck: the network.

A team of physicists, computer scientists, and network engineers has demonstrated a network capable of blasting 186 gigabits per second of data between two supercomputers. They call this new bandwidth record a crucial tool for scientific inquiry and a signal of where commercial networking products are going.

During the SuperComputing 2011 conference last month, researchers installed high-end servers and 100-gigabit-per-second networking gear to connect a supercomputer at the conference in Seattle to another in Victoria, British Columbia.

They … Read more
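To put that 186-gigabit-per-second record next to the petabytes mentioned above, a rough calculation (an illustration, not a figure from the researchers) shows that even at the record rate, a single petabyte takes about half a day to move:

```python
# Rough estimate: time to move one petabyte over the record 186 Gbps link.
petabyte_bits = 8 * 10**15       # one petabyte (decimal) expressed in bits
link_rate_bps = 186 * 10**9      # 186 gigabits per second

seconds = petabyte_bits / link_rate_bps
print(f"{seconds / 3600:.1f} hours per petabyte")   # about 11.9 hours
```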