HPCwire

Since 1986 - Covering the Fastest Computers
in the World and the People Who Run Them

Indiana University Team Accelerates Gene Expression Software


BLOOMINGTON, Ind., July 17 -- Key software used to study gene expression now runs four times faster, thanks to performance improvements put in place by a team from the Indiana University Pervasive Technology Institute (PTI), the Broad Institute of MIT and Harvard, and Technische Universität Dresden.

The time-saving improvements will allow bioinformaticians and biologists who study RNA sequences to analyze more data in a shorter amount of time. This will speed the understanding of biological processes in fields as diverse as ecology, evolution, biofuels and medicine.

Robert Henschel and Richard D. LeDuc, of PTI and IU's National Center for Genome Analysis Support (NCGAS), announced the findings today at the XSEDE12 conference in Chicago. Henschel and LeDuc, along with partners from the Broad Institute and the Center for Information Services and High Performance Computing (ZIH) at Technische Universität Dresden, teamed up to announce this advance in a fast-growing area of computational biology.

The software, known as Trinity, was developed by researchers at the Broad Institute and Hebrew University. It produces high-quality RNA sequence assemblies used by scientists studying gene expression. These RNA sequence assemblies allow scientists to know which genes are active within a living creature. Trinity is especially useful for studying organisms without a complete genome sequence, such as agricultural pests, ecological indicator species and human parasites.

The software has long been considered a leader in the field, but it needed some fine-tuning.

"IU research technologists strive to deliver tools and services that accelerate discoveries for scientists all over the world. By collaborating with our counterparts at Broad and ZIH, we were able to do just that with Trinity. This is just one example of how the various centers affiliated with PTI—such as NCGAS—improve the capabilities of scientists at home and abroad," said Craig Stewart, executive director of IU's Pervasive Technology Institute and principal investigator of the National Science Foundation grant that funds NCGAS.

"In the past, Trinity was a high quality tool but the run time was too long," said Henschel. "Now with our performance improvements, it runs as fast as the competition—if not faster—and still produces superior quality sequence assemblies."

The partners first used standard high performance computing techniques to improve the software's speed. Specifically, this involved building Trinity with an optimizing compiler for the Intel® Xeon® architecture and using optimizing compiler flags. In addition, the team properly configured the application to take full advantage of multicore, multisocket compute nodes in today's clusters.

Next, the team fine-tuned each part of the Trinity package to improve the overall scalability of the application. They used Vampir performance analysis tools, developed at ZIH, to gain insights into the software's performance. The optimizations included improving and parallelizing input/output, simplifying data structures for better performance and optimizing parallel regions in the application.

Henschel is hopeful that IU's work with Trinity will continue. "We are working on establishing a continued collaboration between IU, Broad and ZIH to further optimize Trinity," said Henschel. "We hope these performance improvements are just the beginning of a longer term relationship that will continue to benefit biological research."

About XSEDE12 and XSEDE

XSEDE12 is the first conference of the Extreme Science and Engineering Discovery Environment (XSEDE), a national collaboration that provides cyberinfrastructure services and resources to support scientific discovery in fields such as medicine, engineering, earthquake science, epidemiology, genomics, astronomy and biology.

XSEDE is funded through a five-year, $121 million National Science Foundation (NSF) grant. For more, see http://www.xsede.org.

About Indiana University Pervasive Technology Institute

The Pervasive Technology Institute is IU's flagship initiative for advanced information technology research, development and delivery in support of research, scholarship and artistic performances. The National Center for Genome Analysis Support (which includes LeDuc) and the Research Technologies division (which includes Robert Henschel) are both Service and CyberInfrastructure Centers affiliated with PTI. For more, see http://pti.iu.edu.

About the Broad Institute of MIT and Harvard

The Eli and Edythe L. Broad Institute of Harvard and MIT was launched to empower creative scientists to transform medicine. The Broad Institute seeks to describe all the molecular components of life and their connections; discover the molecular basis of major human diseases; develop effective new approaches to diagnostics and therapeutics; and disseminate discoveries, tools, methods and data openly to the entire scientific community. For more, see http://www.broadinstitute.org.

About the Center for Information Services and High Performance Computing at Technische Universität Dresden

The Center for Information Services and High Performance Computing (ZIH) at Technische Universität Dresden in Germany supports other departments and institutions in their research and education in all matters related to information technology and computer science. For more, see http://www.tu-dresden.de/zih.

-----

Source: Indiana University

