• Professor of Geophysics Mark Simons stands in front of Caltech's newly upgraded high-performance computing cluster, which is housed in the basement of the South Mudd Building. The cluster occupies a floor area that is roughly 23 feet long and 3 feet wide, and weighs about 20,000 pounds. It has five times the computational power of a previous cluster that occupied this space.
  • Caltech's CITerra supercomputer, which can perform 150 trillion calculations per second, is a large network of ordinary computers that work together to solve challenging problems in the geosciences.
December 6, 2012

Calculated science

A new supercomputer helps Caltech researchers tackle more complicated problems

One of the most powerful computer clusters available to a single department in the academic world just got stronger.

The California Institute of Technology's CITerra supercomputer, a high-performance computing cluster of the type popularly known as a Beowulf cluster, was replaced this year with a faster and more efficient system. The new cluster capitalizes on improvements in fiber-optic networking and in graphics processing chips, the kind found in many gaming devices and mobile phones, to increase processing capacity and calculation speeds. With access to this improved supercomputer, Caltech's researchers are able to use advanced algorithms to analyze and simulate everything from earthquakes to global climate and weather to the atmospheres of other planets.

The new $2 million supercomputer, which is administered by the Division of Geological and Planetary Sciences (GPS), delivers five times the computational power of the previous cluster while using roughly half the energy. It has 150 teraflops of computing capacity, meaning it can perform 150 trillion calculations per second. The upgrade was made possible in part by the private support of many individuals, including members of GPS's chair's council, a volunteer leadership board.
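To put those numbers in perspective, here is a minimal back-of-the-envelope sketch of what a five-fold speedup means for turnaround time. The workload size below is hypothetical, and real applications never sustain a machine's peak rate, so these are illustrative figures only:

```python
# 150 teraflops = 150e12 floating-point operations per second (stated capacity).
# The previous cluster is described as one-fifth as powerful.
NEW_FLOPS = 150e12
OLD_FLOPS = NEW_FLOPS / 5

ops = 1e18  # hypothetical workload: a quintillion floating-point operations

time_new = ops / NEW_FLOPS  # ~6,667 seconds, just under two hours
time_old = ops / OLD_FLOPS  # ~33,333 seconds, more than nine hours

print(f"new cluster: {time_new:.0f} s, old cluster: {time_old:.0f} s")
```

At peak rates, a job that once tied up the machine for most of a working day would finish before lunch, which is what makes it practical to attempt correspondingly larger problems.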

So what does a faster, more energy-efficient supercomputer mean for Caltech's geoscientists and geophysicists?

"There is a whole new class of problems that we can now address," says Professor of Geophysics Mark Simons, who oversees the cluster. "We can not only solve a given problem faster, but because it takes less time to solve those problems, we can choose to work on harder problems."

Simons, for instance, is developing models of what happens underground after an earthquake, and what is likely to occur in the months and years that follow, by analyzing ground motion observed at the surface. In 2011, a Caltech research team led by Simons used data from GPS satellites and broadband seismographic networks to develop a comprehensive mechanical model of how the ground moved after Japan's magnitude-9.0 earthquake.

"Mark's team developed the framework allowing them to do millions of computations where seismologists had only been able to do hundreds before," says Michael Gurnis, the John E. and Hazel S. Smits Professor of Geophysics and director of the Seismological Laboratory. "The ability to routinely compute at a level that is so much higher than anyone else had previously done—to have the computational resources immediately available during the hectic days after a devastating earthquake—was an amazing advance for geophysics."

Simons is not alone in using advanced computation to unlock Earth's greatest mysteries. He and Gurnis, who studies the forces driving Earth's tectonic plates, are among a group of 15 GPS faculty members who, with their students, routinely use the cluster. The division is unusual among academic departments in providing its faculty with access to such a large computational facility, giving almost any of its researchers the ability to crunch numbers whenever they need to, and for extended periods of time.

Research done on the previous cluster led to more than 140 published papers spanning atmospheric science, planetary science, Earth science, seismology, and earthquake engineering.

One of the biggest users of the new cluster is Andrew Thompson, an assistant professor of environmental science and engineering, who uses it to simulate complex ocean currents and ocean eddies. Capturing the dynamics of these small ocean storms requires large simulations that need to run for weeks.

Thanks to the size of Caltech's cluster, Thompson has been able to simulate large regions of the ocean, in particular the currents around Antarctica, at high resolution. These models have led to a better understanding of how changes in offshore currents, related to changing climate conditions, affect ocean-heat transport toward and under the ice shelves. Ocean-driven warming is believed to play a critical role in the melting of the West Antarctic Ice Sheet.

"Oceanography without modeling and simulations would be really challenging science," says Thompson, who arrived at Caltech in the fall of last year. "These models indicate where we need improved or more frequent observations and help us to understand how the ice sheets might respond to future ocean circulation variability. It is remarkable to have these resources at Caltech. Access to the Caltech cluster eliminates some of the need to apply for time on federal computing facilities, and has allowed my research group to hit the ground running."

Written by Shayna Chabner McKinney