"Like It or Not, We Are Living on This Planet"

The number of large destructive earthquakes in 2010, plus a flurry of medium-magnitude quakes in California, led many people to ask, Are we in a period of heightened temblor activity, and is it likely to continue? These events have also raised questions among both scientists and laypeople about whether they are related—and if so, how. The eruption of an Icelandic volcano, which disrupted air traffic in Europe for weeks, serves as an additional reminder that we live on a volatile planet. What, if anything, does this apparent uptick in geological activity portend, and how does it compare to events in Earth's past? E&S sat down with Hiroo Kanamori, the Smits Professor of Geophysics, Emeritus, and Joe Kirschvink, the Van Wingen Professor of Geobiology, to hear their thoughts.

Between February and September 2010, earthquakes ranging from magnitude 6.8 to 8.8 occurred in regions as far-flung as Sumatra, China, Chile, New Zealand, and Baja California. Are we in fact seeing more large quakes than usual?

Hiroo Kanamori: There are a couple of ways to answer this question. If you look at very major earthquakes, we are not seeing as much activity as between 1950 and 1965, when there were three events of magnitude 9 or greater in which an enormous amount of energy was released.

However, if we total up the number of quakes of magnitude 8 or greater that have occurred since the first great Sumatran quake of 2004, we do find that these numbers really have increased. On average, about one quake per year worldwide is magnitude 8 or larger; since 2004, we have averaged two quakes of that size annually.

Is this statistically significant?

HK: We don't really know! Thanks to research that's been going on for about the last 18 years, we do know a great deal more than we used to about triggering in earthquakes. We now know that every large earthquake sends out seismic waves that can travel considerable distances and potentially trigger seismic activity elsewhere.

How well do scientists understand the physical mechanisms that might touch off a quake cascade like this?

HK: We have several different models and theories. The most straightforward mechanism would be one in which the seismic waves increase stress on other faults that they're passing through. If those faults are already close to rupture, this seismic impact may be enough to push things over the edge.

There are also cases in which this activity is delayed. This appears to be what happened this summer when the magnitude 7.2 quake that had occurred in Baja California in April touched off two moderate quakes in June and July on the San Jacinto fault in Southern California.

Joe, you've made in-depth studies of ancient geological upheavals. Can you put these recent events in perspective for us?

JK: Just to take volcanoes, the Icelandic eruption that we saw this spring was tiny compared to eruptions that have happened previously in Earth's history. In California alone, about three-quarters of a million years ago—which geologically is nothing—the Long Valley Caldera, between Mono Lake and Mammoth, blew its top. The eruption covered the southwestern United States with a blanket of ash that extended all the way to the Mississippi. The sediments that washed off from the Mississippi delta produced deposits that in some places were hundreds of meters thick. That episode was far, far worse than anything in human memory. There was a similar eruption about two million years ago in what is today Yellowstone. 

One question we often hear from both the public and the media is, will we ever be able to predict earthquakes the way we can—more or less—forecast the weather? What are your views?

HK: There are fundamental differences between weather forecasting and earthquake prediction. With weather, the situation changes on an almost daily basis. With quakes, we are dealing with long-term processes in which the timescale for stress buildup and release is very long—100 to 1,000 years or more—while the quakes themselves are over in a matter of seconds to minutes.

As I said earlier, we have made major advances in our understanding of how these seismic processes operate over these lengthy timescales. But to be able to say there's a strong likelihood that a magnitude 8 earthquake will occur in some specific area within the next hundred years or so is not necessarily very useful for the average layperson. You simply can't handle it like a weather forecast. If the forecast says, "rain tomorrow," you may take your umbrella, and either it rains or it doesn't. However, in the case of earthquakes, if you say that something big is going to happen tomorrow and nothing happens, that can be a problem. And that's really a key difference between meteorology and seismology.

JK: I agree with Hiroo. If you want to see how completely distinct the two areas are, just turn the analogy around. Certainly, meteorology averaged over a very long period of time gives you climate. Or, to put it another way, climate is just long-term weather. But I certainly wouldn't advocate analyzing ancient climates to determine whether you'll have a thunderstorm next Tuesday.

To read the full interview, go to Engineering and Science online.

Heidi Aspaturian

Caltech Geobiologists Uncover Links between Ancient Climate Change and Mass Extinction

PASADENA, Calif.—About 450 million years ago, Earth suffered the second-largest mass extinction in its history—the Late Ordovician mass extinction, during which more than 75 percent of marine species died. Exactly what caused this tremendous loss in biodiversity remains a mystery, but now a team led by researchers at the California Institute of Technology (Caltech) has discovered new details supporting the idea that the mass extinction was linked to a cooling climate.

"While it’s been known for a long time that the mass extinction is intimately tied to climate change, the precise mechanism is unclear," says Seth Finnegan, a postdoctoral researcher at Caltech and the first author of the paper published online in Science on January 27. The mass extinction coincided with a glacial period, during which global temperatures cooled and the planet saw a marked increase in glaciers. At this time, North America was on the equator, while most of the other continents formed a supercontinent known as Gondwana that stretched from the equator to the South Pole.

By using a new method to measure ancient temperatures, the researchers have uncovered clues about the timing and magnitude of the glaciation and how it affected ocean temperatures near the equator. "Our observations imply a climate system distinct from anything we know about over the last 100 million years," says Woodward Fischer, assistant professor of geobiology at Caltech and a coauthor.

The fact that the extinction struck during a glacial period, when huge ice sheets covered much of what's now Africa and South America, makes it especially difficult to evaluate the role of climate. "One of the biggest sources of uncertainty in studying the paleoclimate record is that it’s very hard to differentiate between changes in temperature and changes in the size of continental ice sheets," Finnegan says. Both factors could have played a role in causing the mass extinction: with more water frozen in ice sheets, the world’s sea levels would have been lower, reducing the availability of shallow water as a marine habitat. But differentiating between the two effects is a challenge because until now, the best method for measuring ancient temperatures has also been affected by the size of ice sheets.

The conventional method for determining ancient temperature requires measuring the ratios of oxygen isotopes in minerals precipitated from seawater. The ratios depend on both temperature and the concentration of isotopes in the ocean, so the ratios reveal the temperature only if the isotopic concentration of seawater is known. But ice sheets preferentially lock up one isotope, which reduces its concentration in the ocean. Since no one knows how big the ice sheets were, and these ancient oceans are no longer available for scientists to analyze, it's hard to determine this isotopic concentration. As a result of this "ice-volume effect," there hasn’t been a reliable way to know exactly how warm or cold it was during these glacial periods.
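The circularity Finnegan describes can be made concrete with a toy calculation. The sketch below is illustrative only: it uses one classic carbonate-water oxygen-isotope calibration (the Epstein equation in Shackleton's form, with approximate coefficients), and the shell and seawater values are hypothetical. It shows how the same measured shell composition implies different temperatures depending on what is assumed about the isotopic composition of seawater, which is exactly what the ice-volume effect leaves unknown.

```python
# Illustrative sketch of the "ice-volume effect" ambiguity.
# temperature_c() is one classic carbonate-water oxygen-isotope
# calibration (Epstein/Shackleton form); coefficients are approximate.

def temperature_c(delta_calcite, delta_seawater):
    """Estimated temperature (deg C) from shell and seawater
    delta-18-O values, both in per mil."""
    d = delta_calcite - delta_seawater
    return 16.9 - 4.38 * d + 0.10 * d * d

measured_shell = 2.0  # per mil; a hypothetical fossil measurement

# The same shell implies different temperatures under different
# assumptions about seawater composition (i.e., about ice volume):
t_small_ice = temperature_c(measured_shell, delta_seawater=0.0)
t_large_ice = temperature_c(measured_shell, delta_seawater=1.0)
# For this example, the two scenarios differ by roughly 4 deg C.
```

An independent thermometer, like the clumped-isotope method described below, pins down the temperature directly, so the seawater composition, and hence the ice volume, can be solved for instead of assumed.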

Rock strata on Anticosti Island, Quebec, Canada, one of the sites from which the researchers collected fossils.

But by using a new type of paleothermometer developed in the laboratory of John Eiler, Sharp Professor of Geology and professor of geochemistry at Caltech, the researchers have determined the average temperatures during the Late Ordovician—marking the first time scientists have been able to overcome the ice-volume effect for a glacial episode that happened hundreds of millions of years ago. To make their measurements, the researchers analyzed the chemistry of fossilized marine animal shells collected from Quebec, Canada, and from the midwestern United States.

The Eiler lab’s method, which does not rely on the isotopic concentration of the oceans, measures temperature by looking at the "clumpiness" of heavy isotopes found in fossils. Higher temperatures cause the isotopes to bond in a more random fashion, while low temperatures lead to more clumping.

"By providing independent information on ocean temperature, this new method allows us to know the isotopic composition of 450-million-year-old seawater," Finnegan says. "Using that information, we can estimate the size of continental ice sheets through this glaciation." And with a clearer idea of how much ice there was, the researchers can learn more about what Ordovician climate was like—and how it might have stressed marine ecosystems and led to the extinction.

"We have found that elevated rates of climate change coincided with the mass extinction," says Aradhna Tripati, a coauthor from UCLA and visiting researcher in geochemistry at Caltech.

The team discovered that even though tropical ocean temperatures were higher than they are now, moderately sized glaciers still existed near the poles before and after the mass extinction. But during the extinction intervals, glaciation peaked. Tropical surface waters cooled by five degrees Celsius, and the ice sheets on Gondwana grew to be as large as 150 million cubic kilometers—bigger than the glaciers that covered Antarctica and most of the Northern Hemisphere during the modern era's last ice age 20,000 years ago.

"Our study strengthens the case for a direct link between climate change and extinction," Finnegan says. "Although polar glaciers existed for several million years, they only caused cooling of the tropical oceans during the short interval that coincides with the main pulse of mass extinction."

In addition to Finnegan, Eiler, Tripati, and Fischer, the other authors on the Science paper, "The magnitude and duration of Late Ordovician-Early Silurian glaciation," are Kristin Bergmann, a graduate student at Caltech; David Jones of Amherst College; David Fike of Washington University in St. Louis; Ian Eisenman, a postdoctoral scholar at Caltech and the University of Washington; and Nigel Hughes of the University of California, Riverside.

This research was funded by the Agouron Institute and the National Science Foundation.

Marcus Woo

New Flume On the Block

Last Wednesday morning, Caltech received a rather large delivery. About 50 feet long and 5 feet wide, a big black chunk of metal was unloaded from a truck and slowly pushed into the Central Engineering Services Building.

The 30,000-pound shipment, arriving from Wisconsin, was a giant chute, the showpiece of the new Earth Surface Dynamics Laboratory—also called the Flume Lab—that’s being assembled in the warehouse. Michael Lamb, assistant professor of geology, along with a team of engineers and geologists, had been planning the new facility for more than three years. And after more than two years of designing, building, and testing, the main component finally arrived, putting the “flume” in Flume Lab.

Once completed, the lab will be a state-of-the-art facility used by Lamb and his colleagues to study how water and sediment flow in carefully controlled settings—a near-impossible task in nature. Only a handful of labs like this exist in the world, Lamb says. From these experiments, the researchers hope to learn more about the dangers of mudslides and other debris flows—hopefully saving lives as a result. They will also study the role of river flows and erosion in climate change and try to understand the fundamental processes that shape Earth’s landscape.

Work on the lab space has been under way for more than a year. First, the space, formerly a machine shop, had to be cleared out. Then, a large hole—about 16 by 45 feet in area and about 15 feet deep—was dug out to make room for a massive water pump and reservoir. About a dozen 30-foot, corkscrew-shaped posts were inserted underground to hold the weight of the new equipment. As one of the last steps before installing the flume, the steel support columns were erected.

“I came in last night and saw these columns for the first time, and I thought, holy tamales—this is going to be cool,” Lamb says.

When the flume arrived, crews put it on rollers and gently nudged it along with a crane, working into the early afternoon to maneuver it into place at the back of the warehouse. Prior to the flume’s arrival, workers from Facilities cut an eight-foot opening in a concrete wall to make way for the new addition. After negotiating the narrow passageway, the flume had to squeeze between the steel columns that supported the building. It was a tight fit, to say the least. (Watch a time-lapse video of the flume fitting into place.)

After a long ride from Wisconsin, the flume is unloaded from the truck.
Credit: Marcus Woo

The flume will be able to tilt up to 15 degrees, allowing water, pebbles, rocks, and other sediments to stream toward the bottom, where the water is collected in a reservoir that can hold 20,000 gallons. An experiment will be able to run continuously, as a conveyor belt and water pump—capable of forcing 8,000 gallons per minute—bring the sediment and water, respectively, back to the top.

More parts of the lab will come in over the next few weeks, as it will take a couple of months to finish construction. Lamb says he’s hopeful that the first experiments will run in March. “It’s going to be really exciting,” he adds. “We’re going to observe things that nobody has ever seen before.”

For more photos of the construction in progress, see the laboratory page.

Marcus Woo

Thomas J. Ahrens, 74

Thomas J. Ahrens, the Fletcher Jones Professor of Geophysics, Emeritus, at Caltech, died at his home in Pasadena on November 24. He was 74.

An expert in the behavior and properties of rocks and minerals undergoing shock compression, Ahrens studied the dynamics of high-pressure materials inside Earth and other planets. His research also included planetary impacts and the formation of craters and planets.

"Tom was both a highly productive and broadly knowledgeable scientist and a dedicated mentor to dozens of students, postdocs, and visitors who now fill the ranks of mineral physics positions at universities around the world," says Professor of Geology and Geochemistry Paul Asimow, who credits Ahrens as his most important mentor while Asimow was a junior faculty member at Caltech. Together, they ran the Lindhurst Laboratory of Experimental Geophysics, which Ahrens built in 1974 when the Seismological Laboratory moved to South Mudd. "Our relationship was symbiotic," Asimow says. "Tom wanted to ensure beyond his retirement the ongoing productivity of the remarkable lab that he built, and I wanted to learn from his accumulated wisdom and to carry out experiments in a lab far beyond the scale and expense of anything I would have considered building from scratch."

Some of Ahrens's most important contributions, according to Asimow, were in developing experimental methods for measuring shock temperatures and the density of liquids at high pressure. When applied to iron, the first technique allowed researchers to determine the temperature structure of Earth's core. The second method is so far the only way to measure the density of molten rocks that might form in Earth's mantle at depths greater than a few hundred kilometers. Ahrens's work has led to a basic understanding of how objects—such as meteorites and comets—carrying volatile materials smash into planets. His research has provided insight into the source and origin of water on Earth and into the environmental effects of meteorite collisions such as the one that struck Earth 65 million years ago and likely led to the extinction of the dinosaurs.

Born in Germany, Ahrens received his BS from the Massachusetts Institute of Technology in 1957, his MS from Caltech in 1958, and his PhD from Rensselaer Polytechnic Institute in 1962. He was a geophysicist with the Pan American Petroleum Corporation from 1958 to 1959, worked as a second lieutenant for the U.S. Army in the Ballistics Research Laboratory from 1959 to 1960, and was the head of the geophysics section in the Poulter Laboratory of the Stanford Research Institute from 1962 to 1967. He joined Caltech in 1967 as an associate professor of geophysics. He became professor of geophysics in 1976 and was the W. M. Keck Foundation Professor of Earth Sciences from 1996 to 2001; he was named the Fletcher Jones Professor of Geophysics in 2004 and became Jones Professor, Emeritus, in 2005.  

Ahrens published more than 375 papers, held three U.S. patents, and received numerous honors and awards for his research. He was a member of the National Academy of Sciences and the American Academy of Arts and Sciences, and was a Foreign Associate of the Russian Academy of Sciences. He won the 1995 Arthur L. Day Medal of the Geological Society of America, the 1996 Harry H. Hess Medal of the American Geophysical Union, and the 1997 Barringer Medal of the Meteoritical Society, and he had an asteroid named after him.

He is survived by his wife, Earleen; children Earl, Eric, and Dawn; and grandchildren Greta, Violet, Jacqueline, and Samuel.


Marcus Woo

Recent News on the Debate over Pluto's Planethood

Earlier this month, Eris—the distant world first discovered by Caltech's Mike Brown and colleagues back in 2005, paving the way for the eventual demotion of Pluto from planet to dwarf planet—passed fortuitously in front of a faint star in the constellation Cetus. That passage, or occultation, allowed the first direct measurement of Eris's size. And it produced a surprising result that reignited—in the media, at least—the debate over Pluto's planethood: Eris and Pluto are, within the uncertainties, essentially the same size. But since Eris is 27 percent more massive than Pluto, Eris must be substantially denser. The two objects, once thought to be twins of slightly different size, are in fact very different.

But does this really mean that Pluto's demotion was unjustified?

Certainly not, Brown says. Pluto was not demoted simply because it was thought to be smaller than Eris, he explains, so even though the two are now known to be essentially the same size, the logic behind keeping both of them out of the planetary club remains the same. What is different, he says, is how much more interesting this discovery makes Eris.

"When we first discovered Eris, we thought it was just a slightly larger copy of Pluto. Finding a slightly larger copy doesn't teach you much more than the original, so even though Eris was always important to the public, it never garnered that much attention from astronomers, " Brown says. "Now that we know it has a substantially different composition from Pluto, we are scrambling to figure out ways to understand how a planetary system can produce such seemingly different objects out of what is supposed to be the same material."

Read more about these new observations in Brown's blog, "Mike Brown's Planets."

Kathy Svitil

Caltech Receives $10 Million in Gifts to Help Launch New Terrestrial Hazard Center

Center will focus on developing innovative ways to reduce the risks and costs of natural hazards

PASADENA, Calif.—In an effort to find ways to minimize the damage caused by natural hazards, the California Institute of Technology (Caltech) has established the Terrestrial Hazard Observation and Reporting Center (THOR), funded by $6.7 million from Foster and Coco Stanback of Irvine, California, and $3.35 million from the Gordon and Betty Moore matching program.

THOR will have the unique mandate of bringing together—under one program—innovative efforts to reduce the risks and costs associated with natural hazards. The center will span two divisions at Caltech, Geological and Planetary Sciences (GPS) and Engineering and Applied Science (EAS).

The study of natural hazards and solutions is ordinarily undertaken in separate academic disciplines with little intellectual interaction. THOR will provide a new focal point that will unify these efforts and allow investigators to focus on critical societal issues.

"From the current flooding in Pakistan, to the recent earthquake in Haiti, to the constant threat of wildfires in our own backyard, we are consistently reminded of the devastating impact natural hazards can have on society," says Caltech president Jean-Lou Chameau. "Now, with the generous support of Foster and Coco Stanback, Caltech scientists and engineers will be able to study these critical issues in a unique interdisciplinary environment.  THOR will help communities around the world determine how to best prepare for, anticipate, and respond to various natural hazards, hopefully saving lives in the process." 

Natural hazards that will fall under THOR's purview include global climate change, earthquakes, tsunamis, landslides, wildfires, and extreme weather events such as droughts, among others.

By providing support for the development of techniques and physical inventions, THOR will focus on practical societal aspects of natural hazards and their public policy implications.

For instance, THOR may help guide the distribution of limited resources following a major hazard such as an earthquake or tsunami, or lead to early-warning systems.

"The interdisciplinary and interactive nature of engineering at Caltech allows us to translate scientific knowledge and discovery into applications with direct societal impact," says Ares Rosakis, the von Kármán Professor of Aeronautics, professor of mechanical engineering, and chair of the division of Engineering and Applied Science. "One of the areas of pioneering research and innovation made possible by THOR is seismo-engineering. The boundaries of seismo-engineering are fuzzy ones and lie exactly in the interface between seismology and earthquake engineering.  We are delighted to have the opportunity to explore these boundaries."

Caltech has a number of highly visible areas of expertise that already touch on natural-hazard issues, including the Seismological Laboratory, the Linde Center for Global Environmental Science, missions of the Caltech-managed Jet Propulsion Laboratory (JPL) that provide critical high-precision data on Earth's climate and environment, multiple studies supported by the Keck Institute for Space Studies (KISS) focused on future Earth-observing missions, and the Resnick Institute for Science, Energy, and Sustainability.   

"The THOR center will provide a unique platform for collaboration among scientists, students, and policymakers, empowering them with the extensive resources of Caltech and the Jet Propulsion Laboratory," says THOR donor Foster Stanback. "By linking our eyes in the sky with the many eyes on the ground, we will be far better prepared to anticipate, mitigate, and eliminate many environmental hazards."

THOR's attention and resources will be applied in several ways: disseminating the results of work supported by THOR; supporting efforts to transfer ideas and technologies that show promise of practical implementation; and prioritizing, seeding, and nurturing ideas that encompass both research activities and the invention of technologies.

"THOR will give faculty in GPS and EAS the opportunity to develop innovative new ways to help mitigate the consequences and costs of the natural hazards society faces, from climate change to earthquakes to water scarcity," says Ken Farley, the W.M. Keck Foundation Professor of Geochemistry and chair of the division of Geological and Planetary Sciences. "This very applied research is difficult to support from federal sources, so my hope is that the gift will catalyze entirely new endeavors. THOR will also allow us to bring to our educational program a new focus on the societal and policy implications and relevance of our work."

The center will be housed within the newly renovated Linde + Robinson Laboratory on the Caltech campus.

Jon Weiner

Caltech Mineral Physicists Find New Scenery at Earth's Core-Mantle Boundary

PASADENA, Calif.—Using a diamond-anvil cell to recreate the high pressures deep within the earth, researchers at the California Institute of Technology (Caltech) have found unusual properties in an iron-rich magnesium- and iron-oxide mineral that may explain the existence of several ultra-low velocity zones (ULVZs) at the core-mantle boundary. A paper about their findings was published in a recent issue of Geophysical Research Letters (GRL).

ULVZs—which were first discovered in the early 1990s by researchers at Caltech led by Donald V. Helmberger, Smits Family Professor of Geological and Planetary Sciences—are found in a patchwork distribution just above the core-mantle boundary, which is located at a depth of 2,900 kilometers. In the ULVZs, which range from a few kilometers to tens of kilometers in thickness and are up to 100 kilometers across, the velocities of seismic waves slow by up to 30 percent.

Previously, geophysicists had suggested that the ULVZs might be composed of liquid-bearing, partially melted materials; in this region of the lower mantle, the earth is solid—but plastic and flowing—rock. The idea was that if these rocks contained some liquid, seismic waves would propagate more slowly through them. And, indeed, "the area is very hot and close to the core," says Jennifer M. Jackson, assistant professor of mineral physics at Caltech and coauthor of the GRL paper. The catch, she says, "is that the temperature and composition of this region are not very well known." In addition, she says, "these ULVZs are not always associated with surface hot spots" such as the Hawaiian islands.

Instead of being patches of partially melted (i.e., liquid-bearing) rock, the ULVZs—Jackson and her colleagues believe—are composed of an entirely solid, compositionally distinctive rock type with unusual properties that cause sound waves to slow down. "What we're suggesting here is that many ULVZs—it need not be all of them—are likely to be solid, not molten," Jackson says. 

In reaching this conclusion, Jackson and her team studied the properties of iron-rich magnesium-iron oxides [(Mg, Fe)O]—similar to the mineral periclase known at the earth's surface—using specially prepared diamond-anvil cells. Within the piston-like chamber of the 4-inch-tall cell, two semiflawless natural diamonds—a quarter of a carat each—were squeezed together, sandwiching a small sample of the oxide and proportionally increasing its pressure.

After the pressurized samples were created, they were taken to the Advanced Photon Source at Argonne National Laboratory in Illinois and exposed to X-rays, causing the scattering of photons at energies related to the speed at which sound would travel through the sample.

Jackson and her colleagues conducted the measurements at pressures ranging from ambient pressure (at Earth's surface) up to 121 gigapascals (GPa)—over 17 million pounds per square inch, equivalent to a depth of about 2,700 kilometers. "The measurements stopped at pressures where the diamond anvils broke," Jackson says.
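As a quick back-of-the-envelope check of the unit conversion quoted above (a sketch for the reader, not part of the study itself):

```python
# Convert the peak experimental pressure from gigapascals to psi.
PASCALS_PER_PSI = 6894.757      # standard definition of the psi
peak_pressure_pa = 121e9        # 121 GPa, the highest pressure reached

peak_pressure_psi = peak_pressure_pa / PASCALS_PER_PSI
# About 1.75e7 psi, i.e. "over 17 million pounds per square inch."
```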

Normally, solid materials increase in stiffness under increasing pressure, causing sound waves to travel at higher and higher velocities. But in iron-rich (Mg,Fe)O, the sound velocities took a surprising and significant dip of about 10 percent at just under 28 GPa. The sound velocities in the minerals did not return to their ambient-pressure levels until pressures of 50 to 60 GPa were reached.

To their surprise, the researchers found they could compress the iron-rich mineral to very high pressures and very high densities, "and yet it is still highly compressible," Jackson says. In fact, even at 121 GPa, sound velocities in the mineral were still much lower than in other known mantle materials. "It's quite unusual to have a solid this compressible under these pressures. Compared to silicate"—the main constituent of Earth's crust—"at the same pressures, it's like squishing a pat of butter between two bricks."

Jackson and her colleagues suspect that the velocity drops in this particular mineral are related to magnetic transitions that can make iron-rich oxides more compressible than many silicates. "Silicate acts more like a very stiff spring; iron-rich oxide is like a very weak spring," she explains. With increasing pressure, "the iron-oxide goes through a series of complex magnetic transitions that couple to its sound waves"—and thus explain the unusually low velocities, even at extremely high pressures. "But," she adds, "because the mineral is so iron-rich, it is likely only to exist near the core."

While the new result does not rule out partial melting as a cause of some ULVZs, Jackson says, "iron-rich periclase certainly provides one of the most robust explanations so far."

"The composition of the boundary layer between the earth's liquid outer core and silicate mantle not only influences the current thermal and chemical evolution of the earth's interior, but may hold the key to unlocking the planet's past thermo-chemical evolution," she says. "The iron-rich periclase patches may be fossil remnants of large-scale melting events occurring billions of years ago inside the earth. Having an iron-rich oxide in contact with the earth's core would facilitate propagation of the planet's magnetic field and contribute to the dynamic coupling between the core and the mantle."

The paper, "Very low sound velocities in iron-rich (Mg,Fe)O: Implications for the core-mantle boundary," was coauthored by Caltech graduate student June K. Wicks and Wolfgang Sturhahn, formerly at Argonne National Laboratory and now at the Jet Propulsion Laboratory. The work was funded by the National Science Foundation and the U.S. Department of Energy's Office of Basic Energy Sciences.

Kathy Svitil

New View of Tectonic Plates

PASADENA, Calif.—Computational scientists and geophysicists at the University of Texas at Austin and the California Institute of Technology (Caltech) have developed new computer algorithms that for the first time allow for the simultaneous modeling of the earth's mantle flow, large-scale tectonic plate motions, and the behavior of individual fault zones, to produce an unprecedented view of plate tectonics and the forces that drive it.

A paper describing the whole-earth model and its underlying algorithms will be published in the August 27 issue of the journal Science and also featured on the cover.

The work "illustrates the interplay between making important advances in science and pushing the envelope of computational science," says Michael Gurnis, the John E. and Hazel S. Smits Professor of Geophysics, director of the Caltech Seismological Laboratory, and a coauthor of the Science paper.

To create the new model, computational scientists at Texas's Institute for Computational Engineering and Sciences (ICES)—a team that included Omar Ghattas, the John A. and Katherine G. Jackson Chair in Computational Geosciences and professor of geological sciences and mechanical engineering, and research associates Georg Stadler and Carsten Burstedde—pushed the envelope of a computational technique known as Adaptive Mesh Refinement (AMR).

Partial differential equations such as those describing mantle flow are solved by subdividing the region of interest (such as the mantle) into a computational grid. Ordinarily, the resolution is kept the same throughout the grid. However, many problems feature small-scale dynamics that are found only in limited regions. "AMR methods adaptively create finer resolution only where it's needed," explains Ghattas. "This leads to huge reductions in the number of grid points, making possible simulations that were previously out of reach."
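Ghattas's description can be illustrated with a toy example. The sketch below is not the team's production code (which runs in parallel on petascale machines); it simply bisects a one-dimensional grid wherever a test function changes sharply, so that fine cells appear only near the steep feature, as AMR does:

```python
import math

def refine(cells, f, tol, max_levels=5):
    """Adaptively bisect 1-D cells where f varies rapidly across a cell."""
    for _ in range(max_levels):
        new_cells = []
        for a, b in cells:
            # Refine only where the solution jumps sharply across the cell
            if abs(f(b) - f(a)) > tol:
                mid = 0.5 * (a + b)
                new_cells += [(a, mid), (mid, b)]
            else:
                new_cells.append((a, b))
        if len(new_cells) == len(cells):
            break  # no cell was flagged; the mesh has converged
        cells = new_cells
    return cells

# A steep feature near x = 0.5 stands in for a narrow fault zone.
f = lambda x: math.tanh(50 * (x - 0.5))
coarse = [(i / 10, (i + 1) / 10) for i in range(10)]
fine = refine(coarse, f, tol=0.2)
# Fine cells cluster near x = 0.5; smooth regions keep the coarse spacing.
```

The same logic, extended to three dimensions and hundreds of thousands of processor cores, is what lets the simulations resolve kilometer-scale fault zones inside a whole-earth model.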

"The complexity of managing adaptivity among thousands of processors, however, has meant that current AMR algorithms have not scaled well on modern petascale supercomputers," he adds. Petascale computers are capable of one million billion operations per second. To overcome this long-standing problem, the group developed new algorithms that, Burstedde says, "allow for adaptivity in a way that scales to the hundreds of thousands of processor cores of the largest supercomputers available today."

With the new algorithms, the scientists were able to simulate global mantle flow and how it manifests as plate tectonics and the motion of individual faults. According to Stadler, the AMR algorithms reduced the size of the simulations by a factor of 5,000, permitting them to fit on fewer than 10,000 processors and run overnight on the Ranger supercomputer at the National Science Foundation (NSF)-supported Texas Advanced Computing Center.  

A key to the model was the incorporation of data on a multitude of scales. "Many natural processes display a multitude of phenomena on a wide range of scales, from small to large," Gurnis explains. For example, at the largest scale—that of the whole earth—the movement of the surface tectonic plates is a manifestation of a giant heat engine, driven by the convection of the mantle below. The boundaries between the plates, however, are composed of many hundreds to thousands of individual faults, which together constitute active fault zones. "The individual fault zones play a critical role in how the whole planet works," he says, "and if you can't simulate the fault zones, you can't simulate plate movement"—and, in turn, you can't simulate the dynamics of the whole planet.

In the new model, the researchers were able to resolve the largest fault zones, creating a mesh with a resolution of about one kilometer near the plate boundaries. Included in the simulation were seismological data as well as data pertaining to the temperature of the rocks, their density, and their viscosity—or how strong or weak the rocks are, which affects how easily they deform. That deformation is nonlinear—with simple changes producing unexpected and complex effects.

"Normally, when you hit a baseball with a bat, the properties of the bat don't change—it won't turn to Silly Putty. In the earth, the properties do change, which creates an exciting computational problem," says Gurnis. "If the system is too nonlinear, the earth becomes too mushy; if it's not nonlinear enough, plates won't move. We need to hit the 'sweet spot.'"

After crunching through the data for 100,000 hours of processing time per run, the model returned an estimate of the motion of both large tectonic plates and smaller microplates—including their speed and direction. The results were remarkably close to observed plate movements.

In fact, the investigators discovered that anomalous rapid motion of microplates emerged from the global simulations. "In the western Pacific," Gurnis says, "we have some of the most rapid tectonic motions seen anywhere on Earth, in a process called 'trench rollback.' For the first time, we found that these small-scale tectonic motions emerged from the global models, opening a new frontier in geophysics."

One surprising result from the model relates to the energy released from plates in earthquake zones. "It had been thought that the majority of energy associated with plate tectonics is released when plates bend, but it turns out that's much less important than previously thought," Gurnis says. "Instead, we found that much of the energy dissipation occurs in the earth's deep interior. We never saw this when we looked on smaller scales."

The paper, "The Dynamics of Plate Tectonics and Mantle Flow: From Local to Global Scales," was also coauthored by Lucas C. Wilcox of the University of Texas at Austin and Laura Alisic of Caltech. The work was supported by the NSF, the Department of Energy's Office of Science, and—at the Caltech Tectonics Observatory—by the Gordon and Betty Moore Foundation. 

Kathy Svitil

NRC Recommends Three Astronomy/Astrophysics Projects with Potential Major Caltech Roles

PASADENA, Calif.—In an announcement August 13 at the National Academies in Washington, D.C., the National Research Council (NRC) recommended three space- and ground-based astronomy and astrophysics projects with potential major roles for researchers at the California Institute of Technology (Caltech): CCAT, a submillimeter telescope to be erected in the Chilean Andes, which will help unravel the cosmic origins of stars, planets, and galaxies; the Laser Interferometer Space Antenna (LISA), designed to detect gravitational waves, ripples in the fabric of space and time formed by the most violent events in the universe; and the development of a Giant Segmented Mirror Telescope (GSMT)—the Thirty Meter Telescope (TMT) being one of two such telescopes under development—which will yield the clearest and deepest view of the universe.

The recommendations were the result of Astro2010, a decadal survey in which a panel of experts was convened by the NRC to look at the coming decade and prioritize research activities in astronomy and astrophysics, as well as activities at the interface of these disciplines with physics.

"It is particularly gratifying to see that Caltech faculty are prepared to play leading roles in three of the major projects identified by the Astro2010 study," says Tom Soifer, professor of physics, director of the Spitzer Science Center, and chair of the Division of Physics, Mathematics and Astronomy at Caltech. "Many of the important discoveries in the coming decades will be based on CCAT, TMT, and LISA, and our deep involvement in these projects will ensure that Caltech remains a vibrant, exciting center for astrophysics."


CCAT was recommended for an immediate start. CCAT is a collaboration among Caltech; the Jet Propulsion Laboratory (JPL), which is managed by Caltech for NASA; Cornell University; the University of Colorado; and consortia in Germany and Canada. It will be a 25-meter telescope located on a mountain site in Chile at an elevation of 18,500 feet above sea level. Taking advantage of recent advances in detector technology, CCAT will employ cameras and spectrometers to survey the sky at millimeter and submillimeter wavelengths, providing an unprecedented combination of sensitivity and resolution across a wide field of view. CCAT will reveal young galaxies, stars, and solar systems enshrouded in clouds of dust that make these objects very faint or invisible at other wavelengths.

"We are making rapid progress on all fronts—in detectors, instruments, and new facilities—and this is leading to important scientific discoveries," says Jonas Zmuidzinas, Merle Kingsley Professor of Physics at Caltech, director of the Microdevices Laboratory at JPL, and a CCAT project scientist. "With CCAT, we will gain real insight into the evolution of stars and galaxies."

According to Riccardo Giovanelli, CCAT director and professor of astronomy at Cornell University, "CCAT will allow us to explore the process of formation of galaxies, which saw its heyday about a billion years after the Big Bang, some 13 billion years ago; to peek into the interior of the dusty molecular clouds within which stars and planets form; and to survey the pristine chunks of material left intact for billions of years on the outskirts of our solar system." 

CCAT is a natural complement to the international Atacama Large Millimeter Array (ALMA) project now under construction in Chile, which will provide detailed, high-resolution images of individual objects over narrow fields of view. This complementarity provides a strong impetus to begin construction of CCAT as soon as possible. Indeed, as Nobel Laureate and CCAT Design Review Committee Chair Robert W. Wilson noted, "CCAT is very timely and cannot wait."

This sentiment was echoed in the Decadal Review Committee's report, which stated that "CCAT is called out to progress promptly to the next step in its development because of its strong science case, its importance to ALMA, and its readiness." The Committee recommended federal support for 33 percent of the construction cost for CCAT, as well as support in the operations phase.


The decadal survey recommended LISA as one of NASA's next major space missions, to start in 2016 in collaboration with the European Space Agency (ESA). In the U.S. the LISA project is managed by the NASA Goddard Space Flight Center and includes significant participation by JPL, which is managed by Caltech for NASA.

LISA is designed to be complementary to the ground-based observatories (such as the Laser Interferometer Gravitational Wave Observatory, or LIGO) that currently are actively searching for signs of gravitational waves, which carry with them information about their origins and about the nature of gravity that cannot be obtained using conventional astronomical tools.

The LISA instrument will consist of three spacecraft in a triangular configuration with 5-million-kilometer arms moving in an Earth-like orbit around the sun. Gravitational waves from sources throughout the universe will produce slight oscillations in the arm lengths (changes as small as about 10 picometers, or 10 trillionths of a meter). LISA will capture these motions using laser links to monitor the displacements of gold–platinum test masses floating inside the spacecraft. It is slated for launch in the early 2020s. The instrument will observe gravitational waves in a lower frequency band (0.1 milliHertz to 1 Hertz) than that detectable by LIGO and other ground-based instruments and will sense ripples coming simultaneously from tens of thousands of sources in every direction.
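The numbers in that paragraph imply an extraordinary fractional sensitivity. A quick back-of-the-envelope check, using only the figures quoted above:

```python
arm_length_m = 5.0e9      # 5-million-kilometer arms, in meters
delta_l_m = 10.0e-12      # ~10-picometer measurable change in arm length
strain = delta_l_m / arm_length_m   # dimensionless fractional length change
print(f"LISA must resolve fractional length changes of about {strain:.0e}")
# prints "LISA must resolve fractional length changes of about 2e-21"
```

A fractional change of a few parts in 10^21 is why the measurement must be done with laser links over millions of kilometers rather than with any mechanical gauge.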

The survey recommended LISA because of the expectation that observations of gravitational waves in space will answer key scientific questions about the astrophysics of the cosmic dawn and the physics of the universe.

"We are very pleased with the NRC's recognition of LISA's revolutionary research opportunities in astrophysics and fundamental physics and we are looking forward to unveiling a new window on the universe by observing thousands of gravitational wave sources," says Tom Prince, professor of physics at Caltech, senior research scientist at the Jet Propulsion Laboratory (JPL), and the U.S. chair of the LISA International Science Team.


The decadal survey rated the development of a Giant Segmented Mirror Telescope as a high priority. The TMT is one of two such telescopes under development by consortia with major involvement by private and public entities in the U.S., including Caltech.

Building on the success of the twin Keck telescopes, the core technology of TMT will be a 30-meter segmented primary mirror. This will give TMT nine times the collecting area of today's largest optical telescopes and three-times-sharper images. The TMT has begun full-scale polishing of the 1.4-meter mirror blanks that will make up the primary mirror. TMT also has developed many of the essential prototype components for the telescope, including key adaptive optics technologies and the support and control elements for the 492 mirror segments.
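The aperture arithmetic behind those figures is simple: collecting area scales with the square of the primary mirror's diameter, while diffraction-limited sharpness scales linearly with it. Taking the 10-meter Keck telescopes as the baseline:

```python
tmt_d = 30.0    # TMT primary mirror diameter, meters
keck_d = 10.0   # Keck, among today's largest optical telescopes
area_ratio = (tmt_d / keck_d) ** 2   # light-gathering power scales as D^2
sharpness_ratio = tmt_d / keck_d     # angular resolution scales as D
print(area_ratio, sharpness_ratio)   # prints "9.0 3.0"
```

That factor of nine in collecting area and three in resolution is exactly the gain the article quotes.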

The TMT project is an international partnership among Caltech, the University of California, and the Association of Canadian Universities for Research in Astronomy, joined by the National Astronomical Observatory of Japan, the National Astronomical Observatories of the Chinese Academy of Sciences, and the Department of Science and Technology of India.

Kathy Svitil

Caltech, Canadian Space Agency Awarded NASA Project to Develop Spectrometer Headed to Mars

Instrument will look for evidence of life and volcanic activity on Mars; will fly aboard ExoMars Trace Gas Orbiter in 2016

PASADENA, Calif.—The California Institute of Technology (Caltech) and the Canadian Space Agency (CSA) announced today that they will be partnering on the development of the Mars Atmospheric Trace Molecule Occultation Spectrometer (MATMOS) instrument to be flown aboard the ExoMars Trace Gas Orbiter when it launches in 2016.

The project will be funded by a grant from NASA, with additional support coming from the CSA.

NASA participation in the ExoMars Trace Gas Orbiter is managed by the Jet Propulsion Laboratory, a division of Caltech, in partnership with the European Space Agency.

"The ExoMars investigation is designed to study the composition of Mars's atmosphere, with a focus on biogenically or volcanically derived trace gases," says Paul Wennberg, the R. Stanton Avery Professor of Atmospheric Chemistry and Environmental Science and Engineering at Caltech and director of the Ronald and Maxine Linde Center for Global Environmental Science. Wennberg is the principal investigator on the MATMOS team.

The ExoMars Orbiter's circular path around Mars will point the MATMOS telescope at the center of the sun as the spacecraft goes into orbital sunrise and sunset. During these periods, as the sun sets and rises through the atmosphere, MATMOS will take spectra of the sunlight, recording the absorption of numerous gases. The sun's long path length through the Martian atmosphere will allow MATMOS to measure the trace gases with very high sensitivity, notes Wennberg.

"If you take the spectra fast," says Geoffrey Toon, senior research scientist at JPL and a visiting associate in planetary sciences at Caltech, "you can measure the gas abundance at many different heights above the planet—70 measurements as the sun rises, and 70 as it sets."

Among the gases of interest to the team are those "diagnostic of active geological and biogenic activity," says Wennberg—gases like methane; other carbon-, sulfur-, and nitrogen-containing molecules; sulfur dioxide; and hydrogen sulfide.

MATMOS will be so exquisitely sensitive, says Wennberg, that it will be able to measure the concentrations of these gases down to parts per trillion.

"We did a calculation that shows the microbial community found in three cows' bellies would produce an amount of methane that, in the Mars atmosphere, would be observable by MATMOS," says Mark Allen, principal scientist at JPL and a visiting associate in planetary sciences at Caltech.

MATMOS is based on the Atmospheric Trace Molecule Spectroscopy (ATMOS) experiment, developed by JPL, and the Atmospheric Chemistry Experiment–Fourier Transform Spectrometer (ACE–FTS), pioneered by University of Waterloo and the Canadian Space Agency. The ATMOS instrument has flown four times on the Space Shuttle since 1985. ACE–FTS was launched in 2003 and is still operational.

"MATMOS is an excellent instrument and an opportunity for a great partnership," says CSA Senior Planetary Scientist Victoria Hipkin, co-principal investigator on the MATMOS project. "We have taken the optical systems of one of Canada's flagship satellite instruments (Scisat ACE–FTS) and combined them with state-of-the-art data processing from the US (JPL's ATMOS and MkIV). Our team—with US and Canadian leadership and Canadian, US, French, and UK members—is looking forward to providing fundamental data about this fascinating planet."

Lori Oliwenstein