Building Titan: The ‘world’s fastest’ supercomputer

But where video game physics only have to look real enough to a distracted teenager, supercomputer simulations have to be scientifically accurate down to the level of individual atoms - which is why Titan needs tens of thousands of GPUs all working together on the same problem, not to mention enough Random Access Memory (RAM) to hold the entire simulation in memory at once. (Titan has 710 terabytes of RAM, about as much as a stack of iPads 7km high.)
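(Where does that 7km figure come from? Assuming a 2012-era iPad with about 1GB of RAM in a case roughly 9.4mm thick - our own working numbers, not the article's - the back-of-the-envelope arithmetic runs like this:)

#include <stdio.h>

int main(void) {
    /* Assumed figures, not from the article: a 2012-era iPad with
       roughly 1 GB of RAM, in a case about 9.4 mm thick. */
    const double titan_ram_gb      = 710000.0;  /* 710 terabytes */
    const double ipad_ram_gb       = 1.0;
    const double ipad_thickness_mm = 9.4;

    double ipads_needed    = titan_ram_gb / ipad_ram_gb;         /* ~710,000 iPads */
    double stack_height_km = ipads_needed * ipad_thickness_mm / 1e6;
    printf("Stack height: about %.1f km\n", stack_height_km);    /* roughly 6.7 km */
    return 0;
}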

But supercomputers have been getting along without GPUs for decades. A CPU chip - the same general-purpose silicon "brain" inside your laptop, your smartphone, and every computer at Google or Facebook - can run high-performance scientific calculations, too, if you chain enough of them together. The fastest of these CPU-only machines, IBM's "Sequoia" system at Lawrence Livermore National Laboratory in California, contains over 98,000 CPUs, each with 16 compute cores.
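What "chaining them together" means in practice is message passing: the problem is carved up across many CPU cores, each core works on its own slice, and the partial answers are stitched back into one result. The toy program below is a minimal sketch of that pattern using MPI, the standard message-passing library on machines like Sequoia and Jaguar - it is illustrative only, not code from either system.

#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv) {
    MPI_Init(&argc, &argv);

    int rank, ncores;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* which core am I? */
    MPI_Comm_size(MPI_COMM_WORLD, &ncores); /* how many cores in total? */

    /* Each core sums its own slice of a long series - a stand-in for
       the real physics a supercomputer node would be crunching. */
    const long n = 100000000L;
    double local = 0.0;
    for (long i = rank; i < n; i += ncores)
        local += 1.0 / (double)(i + 1);

    /* Every core's partial answer is combined into a single result. */
    double total = 0.0;
    MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

    if (rank == 0)
        printf("Result from %d cores: %f\n", ncores, total);

    MPI_Finalize();
    return 0;
}

Launched with, say, mpirun -np 16 ./sum, the same program spreads itself across 16 cores; the same pattern is what lets a machine like Sequoia spread a single problem across more than a million.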

What GPUs offer that CPUs can't is a blast of relatively cheap, energy-efficient horsepower. Scaling up the Jaguar supercomputer from 1.75 petaflops to 20 could have been done by adding more cabinets stuffed full of CPUs. But those take up space, and more importantly, suck up power. Off-the-shelf GPUs, meanwhile, aren't designed to act self-sufficiently like normal chips - they're add-ons "that accelerate a CPU like a turbo engine," says Gupta - so they consume much less energy than a CPU would to do the same amount of calculating. By bolting a GPU onto each one of the 18,688 AMD Opteron CPU chips already in Jaguar, the DoE was able to create a next-generation supercomputer without scrapping the one they already had - or blowing up their electric bill.
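To make the "turbo engine" image concrete, here is a bare-bones sketch of that division of labour written in CUDA, a programming model for Nvidia GPUs like Titan's. The calculation and the names in it are purely illustrative, not Titan's actual code: the CPU sets the problem up, ships the data to the GPU, and lets the GPU's thousands of threads grind through the arithmetic.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

/* The "accelerated" part: a trivial kernel the GPU runs across thousands
   of threads at once. Real Titan codes offload far heavier physics. */
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main(void) {
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(float);

    /* The CPU (host) sets up the problem... */
    float *x = (float *)malloc(bytes), *y = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { x[i] = 1.0f; y[i] = 2.0f; }

    /* ...copies it over to the GPU (device)... */
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, x, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, y, bytes, cudaMemcpyHostToDevice);

    /* ...hands the heavy arithmetic to the GPU "turbo engine"... */
    saxpy<<<(n + 255) / 256, 256>>>(n, 3.0f, dx, dy);
    cudaDeviceSynchronize();

    /* ...and copies the answer back to get on with the rest of the job. */
    cudaMemcpy(y, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f\n", y[0]);   /* expect 5.0 */

    cudaFree(dx); cudaFree(dy); free(x); free(y);
    return 0;
}

The pattern scales: on Titan, each of the 18,688 CPUs orchestrates its own attached GPU in roughly this way, while message passing ties all of those CPU-GPU pairs together into one machine.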

Bigger is better

The new machine, like any supercomputer, is all about speed: "time to solution," as Jack Wells, director of science for Oak Ridge’s computing facility, puts it. "It's about solving problems that are so important that you can't wait," he says. "If you can afford to wait, you're not doing supercomputing." Competition among research projects for "core hours" on Titan is intense. Of the 79 new-project proposals received by Oak Ridge's selection panel, only 19 will run on Titan in 2013.

Winning proposals will apply Titan's computational might to problems in areas such as astrophysics (simulating Type Ia supernovae and core collapses), biology (modeling human skin and blood flow at a molecular level), earth science (global climate simulations and seismic hazard analysis of the San Andreas fault in California), and chemistry (optimizing biofuels and engine combustion turbulence). According to Buddy Bland, project director of the Oak Ridge computing facility, Titan will typically run four or five of these supercomputing "jobs" at once.

But some jobs are so complex that they'll take over Titan entirely. The Princeton Plasma Physics Laboratory, for example, will use all of Titan's computing cores to help design components for the International Thermonuclear Experimental Reactor (Iter), a prototype nuclear fusion project in France. "Their goal is to have this reactor online by 2017," Bland says. "It'll use magnetic fields to circulate plasma through a big donut-shaped reactor at 100 million degrees Fahrenheit. How do you contain that kind of energy? That's what they need Titan to help them figure out."

As fast as Titan is, these simulations can still take days, weeks, or even months to complete. And the very idea of "fast" has a different meaning to computational scientists than it does to users of consumer apps like Photoshop or Final Cut Pro. "It's not so much about running our applications and calculations faster - we want to run them bigger," says Tom Evans, a scientist at Oak Ridge who uses the supercomputer to model nuclear reactor systems. "Maybe that means adding four times more spatial resolution in our simulations, or replacing approximations with more accurate physics. Of course we always like to go faster. But it's less interesting to do the same science faster than it is to do something new that you couldn't even do before."
