Xena Awarded "Dwarf Planet" Status, IAU Rules; Solar System Now Has Eight Planets

PASADENA, Calif.—The International Astronomical Union (IAU) today downgraded the status of Pluto to that of a "dwarf planet," a designation that will also be applied to the spherical body discovered last year by California Institute of Technology planetary scientist Mike Brown and his colleagues. The decision means that only the rocky worlds of the inner solar system and the gas giants of the outer system will hereafter be designated as planets.

The ruling effectively settles a year-long controversy about whether the spherical body announced last year and informally named "Xena" would rise to planetary status. Somewhat larger than Pluto, the body has been informally known as Xena since the formal announcement of its discovery on July 29, 2005, by Brown and his co-discoverers, Chad Trujillo of the Gemini Observatory and David Rabinowitz of Yale University. Xena will now be known as the largest dwarf planet.

"I'm of course disappointed that Xena will not be the tenth planet, but I definitely support the IAU in this difficult and courageous decision," said Brown. "It is scientifically the right thing to do, and is a great step forward in astronomy.

"Pluto would never be considered a planet if it were discovered today, and I think the fact that we've now found one Kuiper-belt object bigger than Pluto underscores its shaky status."

Pluto was discovered in 1930. Because of its size and distance from Earth, astronomers had no idea of its composition or other characteristics at the time. But having no reason to think that many other similar bodies would eventually be found in the outer reaches of the solar system—or that a new type of body even existed in the region—they assumed that designating the new discovery as the ninth planet was a scientifically accurate decision.

However, about two decades later, the famed astronomer Gerard Kuiper postulated that a region in the outer solar system could house a gigantic number of comet-like objects too faint to be seen with the telescopes of the day. The Kuiper belt, as it came to be called, was demonstrated to exist in the 1990s, and astronomers have been finding objects of varying size in the region ever since.

Few if any astronomers had previously argued that Kuiper-belt objects should be designated planets, because most were significantly smaller than Pluto. But the announcement of Xena's discovery created a new need for a more precise definition of which objects are planets and which are not.

According to Brown, the decision will pose a difficulty for a public that has been accustomed to thinking for the last 75 years that the solar system has nine planets.

"It's going to be a difficult thing to accept at first, but we will accept it eventually, and that's the right scientific and cultural thing to do," Brown says.

In fact, the public has had some experience with the demotion of a planet in the past, although not in living memory. Astronomers discovered the asteroid Ceres on January 1, 1801—literally at the turn of the 19th century. Having no reason to suspect that a new class of celestial object had been found, scientists designated it the eighth planet (Uranus having been discovered some 20 years earlier).

Soon several other asteroids were discovered, and these, too, were summarily designated as newly found planets. But when astronomers continued finding numerous other asteroids in the region (there are thought to be hundreds of thousands), the astronomical community in the early 1850s demoted Ceres and the others and coined the new term "minor planet."

Xena was discovered on January 8, 2005, at Palomar Observatory with the NASA-funded 48-inch Samuel Oschin Telescope. Xena is about 2,400 kilometers in diameter. A Kuiper-belt object like Pluto, but slightly less reddish-yellow, Xena is currently visible in the constellation Cetus to anyone with a top-quality amateur telescope.

Brown and his colleagues in late September announced that Xena has at least one moon. This body has been nicknamed Gabrielle, after Xena's sidekick on the television series.

Xena is currently about 97 astronomical units from the sun (an astronomical unit is the distance between the sun and Earth), which means that it is some nine billion miles away at present. Xena is on a highly elliptical 560-year orbit, sweeping in as close to the sun as 38 astronomical units. Currently, however, it is nearly as far away as it ever gets.
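    
For readers who want to check the arithmetic, the conversion is straightforward; the sketch below (in Python) uses the standard mile value of the astronomical unit and the two distances given in this article:

    AU_IN_MILES = 92.96e6                # one astronomical unit, in miles
    for label, au in [("current distance", 97), ("closest approach", 38)]:
        print(f"{label}: {au * AU_IN_MILES / 1e9:.1f} billion miles")
    # current distance: 9.0 billion miles -- the "some nine billion" above
    # closest approach: 3.5 billion miles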

Pluto's own elliptical orbit takes it as far away as 50 astronomical units from the sun during its 250-year revolution. This means that Xena is sometimes much closer to Earth than Pluto—although never closer than Neptune.

Gabrielle is about 250 kilometers in diameter and reflects only about 1 percent of the sunlight that its parent reflects. Because of its small size, Gabrielle could be oddly shaped.

Brown says that the study of Gabrielle's orbit around Xena hasn't yet been completed. But once it is, the researchers will be able to derive the mass of Xena itself from Gabrielle's orbit. This information will lead to new insights on Xena's composition.

Based on spectral data, the researchers think Xena is covered with a layer of methane that has seeped from the interior and frozen on the surface. As in the case of Pluto, the methane has undergone chemical transformations, probably due to the faint solar radiation, that have caused the methane layer to redden. But the methane surface on Xena is somewhat more yellowish than the reddish-yellow surface of Pluto, perhaps because Xena is farther from the sun.

Brown and Trujillo first photographed Xena with the 48-inch Samuel Oschin Telescope on October 31, 2003. However, the object was so far away that its motion was not detected until they reanalyzed the data in January of 2005.

The search for new planets and other bodies in the Kuiper belt is funded by NASA. For more information on the program, see the Samuel Oschin Telescope's website at http://www.astro.caltech.edu/palomarnew/sot.html.

For more information on Mike Brown's research, see http://www.gps.caltech.edu/~mbrown.

Writer: 
Robert Tindol

Study of 8.7-Magnitude Earthquake Lends New Insight into Post-Shaking Processes

PASADENA, Calif.—Although the magnitude 8.7 Nias-Simeulue earthquake of March 28, 2005, was technically an aftershock, the temblor nevertheless killed more than 2,000 people in an area that had been devastated just three months earlier by the magnitude 9.1 earthquake of December 2004. Now, data returned from instruments in the field provide constraints on the behavior of dangerous faults in subduction zones, fueling a new understanding of the basic mechanics controlling slip on faults and, in turn, improved estimates of regional seismic risk.

In the June 30 issue of the journal Science, a team including Ya-Ju Hsu, Mark Simons, and others from the California Institute of Technology's new Tectonics Observatory and the University of California, San Diego, reports that its analysis of Global Positioning System (GPS) data taken at the time of the earthquake and during the following 11 months provides insights into how fault slippage and aftershock production are related.

"In general, the largest earthquakes occur in subduction zones, such as those offshore of Indonesia, Japan, Alaska, Cascadia, and South America," says Hsu, a postdoctoral researcher at the Tectonics Observatory and lead author of the paper. "Of course, these earthquakes can be extremely damaging either directly, or by the resulting tsunami.

"Therefore, understanding what causes the rate of production of aftershocks is clearly important to earthquake physics and disaster response," Hsu adds.

The study finds that the regions on the fault surrounding the area that slipped during the 8.7 earthquake experienced accelerated rates of slip following the March shock. The boundary between the area that slipped during the earthquake and the area that has slipped since is clearly demarcated by a band of intense aftershocks.

A primary conclusion of the paper is that there is a strong relationship between the production of aftershocks and post-earthquake fault slip; in other words, the frequency and location of aftershocks on a subduction megathrust are related to the amount and location of fault slip in the months following the main earthquake. Hsu and her colleagues believe that the aftershocks are controlled by the rate of aseismic fault slip after the earthquake.
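    
To make the proposed relationship concrete: if aftershocks are driven by afterslip, the cumulative number of aftershocks should track the cumulative afterslip through time. The toy calculation below illustrates that proportionality with invented constants; it is a cartoon of the idea, not the team's analysis:

    import math

    # If aftershock count is proportional to afterslip, both share one time curve.
    # The logarithmic decay law and all constants here are invented for illustration.
    for t_days in (1, 10, 30, 100, 300):
        afterslip_cm = 40.0 * math.log(1.0 + t_days / 2.0)   # hypothetical afterslip
        aftershocks = 150.0 * math.log(1.0 + t_days / 2.0)   # count scaled to slip
        print(f"day {t_days:3d}: afterslip ~ {afterslip_cm:5.1f} cm, "
              f"aftershocks ~ {aftershocks:5.0f}")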

"One conjecture is that, if the aseismic fault slip occurs quickly, then lots of aftershocks are produced," says Simons, an associate professor of geophysics at Caltech. "But there are other arguments suggesting that both the aftershocks and the post-earthquake aseismic fault slip are caused by some third underlying process."

In any case, Simons and Hsu say that the study demonstrates that placing additional remote sensors in subduction zones leads to better modeling of earthquake hazards. In particular, the study shows that the rheology, or mechanical properties, of the region can be inferred from the accumulation of postseismic data.

A map of the region constructed from the GPS data reveals that different areas slip in different ways, apparently because some parts of the fault are more "sticky" than others. Because it is the sticky, locked patches that generate damaging seismic waves when they finally break, the manner in which the fault slips in the months following a large earthquake has huge implications for human habitation.

"An important question is how slip on a fault varies as a function of time," Simons explains. "The extent to which an area slips is related to the risk, because you have a finite budget. Whether all the stress is released during earthquakes or whether it creeps is important for us to know. We would be very happy if all faults slipped as a slow creep, although I guess seismologists would be out of work."

The fact that the postseismic slip following the March 28, 2005, Nias-Simeulue earthquake can be modeled so intricately shows that other subduction zones can also be modeled, Hsu says. "In general, understanding the whole seismic cycle is very important. Most of the expected hazards of earthquakes occur in subduction zones."

The Tectonics Observatory is establishing a network of sensors in areas of active plate-boundary deformation such as Chile and Peru, the Kuril Islands off Japan, and Nepal. The observatory is supported by the Gordon and Betty Moore Foundation.

The other authors of the paper are Jean-Philippe Avouac, a professor of geology at Caltech and director of the Tectonics Observatory; Kerry Sieh, the Sharp Professor of Geology at Caltech; John Galetzka, a professional staff member at Caltech; Mohamed Chlieh, a postdoctoral scholar at Caltech; Danny Natawidjaja of the Indonesian Institute of Sciences; and Linette Prawirodirdjo and Yehuda Bock, both of the University of California at San Diego's Institute of Geophysics and Planetary Physics.

Writer: 
Robert Tindol

Hubble Space Telescope Obtains Best-Ever Size Measurement of Xena; Still Larger Than Pluto

PASADENA, Calif.—To paraphrase a certain young lady from literature, the tenth planet Xena is getting curiouser and curiouser. Data released today by the Space Telescope Science Institute reveals that Xena is only about 5 percent larger than Pluto, which, given the planet's brightness, means that it must be the most reflective planet in the solar system.

According to Mike Brown, a California Institute of Technology planetary scientist who codiscovered Xena last year, the Hubble Space Telescope measurement shows that Xena is about 2,400 kilometers in diameter, give or take 100 kilometers. The planet's smaller-than-expected size, together with its distance from Earth and its brightness, means that Xena must reflect about 86 percent of the sunlight that strikes it.
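    
That reflectivity figure follows from a standard relation linking a body's diameter, its absolute magnitude, and its geometric albedo. The sketch below shows the arithmetic; the absolute magnitude of about -1.1 is an assumed value for illustration, not a number from this article:

    def geometric_albedo(diameter_km, abs_mag):
        # Standard small-body relation: D = (1329 km / sqrt(p)) * 10**(-H/5)
        return (1329.0 / diameter_km * 10 ** (-abs_mag / 5.0)) ** 2

    # 2,400 km is the Hubble measurement; H ~ -1.1 is assumed for illustration
    print(f"albedo ~ {geometric_albedo(2400.0, -1.1):.2f}")
    # ~0.84 with these inputs, close to the 86 percent quoted above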

"This makes it more reflective than the nine known planets, and more reflective than everything else other than Enceladus," explains Brown. Enceladus is a moon orbiting Saturn.

Brown and his colleagues, Chad Trujillo of the Gemini Observatory and David Rabinowitz of Yale University, publicly announced the discovery of Xena on July 29, 2005. Their argument from the beginning has been that Xena deserves to be formally declared the tenth planet because it is larger than Pluto. If the designation is approved by the International Astronomical Union, Xena will assume a formal new name that will presumably be taken from Greek or Roman mythology, and will become only the fourth planet to have been discovered in historical times.

Also designated as 2003 UB313, Xena was originally found by Brown, Trujillo, and Rabinowitz with the NASA-funded 48-inch Samuel Oschin Telescope at Caltech's Palomar Observatory. Like its nine siblings, Xena independently orbits the sun, and like Pluto, is a Kuiper-belt object.

Brown says the high reflectivity of Xena is surprising, but perhaps explainable. "I think what is going on is that the temperature of Xena, which is about minus 400 degrees Fahrenheit, causes the atmosphere to be frozen to the surface. This frozen atmosphere would make a nice, bright layer a few inches thick.

"When Xena gets closer to the sun and heats up to a sultry 370 degrees below zero, the frozen atmosphere probably re-evaporates and the surface probably looks more like Pluto for a time. But Xena is currently 10 billion miles from the sun, and because of its distance is about as cold as it ever gets."

The situation will change, however, when Xena approaches closer to the sun on its 560-year elliptical orbit. Though we won't live to see it, Xena will eventually come within 38 astronomical units of the sun (in other words, 38 times the distance between the sun and Earth), and things will heat up significantly.
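    
The two temperatures quoted above are roughly consistent with the rule of thumb that an airless body's equilibrium temperature scales as the inverse square root of its distance from the sun. A rough check, assuming that simple scaling holds:

    def f_to_k(f): return (f - 32.0) * 5.0 / 9.0 + 273.15
    def k_to_f(k): return (k - 273.15) * 9.0 / 5.0 + 32.0

    t_far = f_to_k(-400.0)                     # about 33 kelvin out near 97 AU
    t_near = t_far * (97.0 / 38.0) ** 0.5      # warmer by sqrt(97/38) at perihelion
    print(f"{k_to_f(t_near):.0f} F")           # about -364 F, close to the
                                               # "370 degrees below zero" quoted above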

"Xena is about to undergo the worst case of global warming of any planet in the solar system," says Brown. "The change will be equivalent to Earth's heating up to an average temperature of 400 degrees! But then the cycle will repeat and Xena will get cold again."

Xena's closest planetary neighbor is Pluto, which Xena resembles in various ways. Pluto's own elliptical orbit takes it as far away as 50 astronomical units from the sun during its 250-year trip around the sun. This means that Xena is sometimes much closer to Earth than Pluto, although never closer than Neptune.

Since the discovery, Brown and his colleagues have intensely studied the planet with a variety of instruments, and have already learned many of its characteristics. In the course of their investigations, they have also discovered that the new planet has a small moon. This body, unofficially nicknamed "Gabrielle" after Xena's sidekick on the television show, will also be formally named at a later date. Gabrielle is about 250 kilometers in diameter and reflects only about 1 percent of the sunlight that its parent reflects. Because of its small size, Gabrielle may very well be oddly shaped.

Brown says that Gabrielle's orbit around Xena hasn't yet been fully determined. But once it is, the researchers will be able to derive the mass of the planet itself. That's because the entire mass of the system (planet and moon together) orbits a common center of gravity. Therefore, once the researchers figure out the distance between moon and planet, how fast the moon revolves around the planet, and how much the moon makes the planet wobble, then they'll know how much Xena weighs. And this information will lead to new insights on its composition.
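    
The physics behind that derivation is Kepler's third law: the total mass of planet and moon follows from the size and period of the moon's orbit. A minimal sketch, with hypothetical orbital numbers for illustration since Gabrielle's orbit had not yet been pinned down:

    import math

    G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2

    def system_mass_kg(semi_major_axis_m, period_s):
        # Kepler's third law: M_total = 4 * pi**2 * a**3 / (G * T**2)
        return 4.0 * math.pi ** 2 * semi_major_axis_m ** 3 / (G * period_s ** 2)

    a = 37_000e3          # hypothetical semi-major axis: 37,000 km
    T = 16.0 * 86400.0    # hypothetical orbital period: 16 days
    print(f"system mass ~ {system_mass_kg(a, T):.1e} kg")   # ~1.6e22 kg with these inputs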

Based on spectral data, the researchers think the planet is covered with a layer of methane that has seeped from the interior. As in the case of Pluto, the methane has undergone chemical transformations, probably due to the faint solar radiation, that have caused the methane layer to redden. But the methane surface on Xena is somewhat more yellowish than the reddish-yellow surface of Pluto, perhaps because Xena is farther from the sun.

Brown and Trujillo first photographed the new planet with the 48-inch Samuel Oschin Telescope on October 31, 2003. However, the object was so far away that its motion was not detected until they reanalyzed the data in January of last year.

The search for new planets and other bodies in the Kuiper belt is funded by NASA. For more information on the program, see the Samuel Oschin Telescope's website at http://www.astro.caltech.edu/palomarnew/sot.html.

For more information on Mike Brown's research, see http://www.gps.caltech.edu/~mbrown.

Writer: 
Robert Tindol

Watson Lecture: Bacterial Biofilms

PASADENA, Calif.—Next time you're brushing your teeth in the morning, give a thought to biofilms, the complex communities of bacteria that form the slippery scum you're scouring off your teeth, along with the slime on river rocks, the gunk in clogged drains, and filmy coatings on just about any surface, anywhere, that's exposed to water.

Biofilms—and the bacteria that comprise them—"get a bad rap," says Dianne K. Newman, "but pathogenic bacteria are only a very minor fraction of those in nature. The vast majority of bacteria do really wonderful and important things."

On Wednesday, April 12, Newman, professor of geobiology at the California Institute of Technology and investigator at the Howard Hughes Medical Institute, will discuss how biofilms and bacteria in general are essential for our health and how they have sustained and shaped our environment throughout Earth's history. "I hope that people come to appreciate that bacteria are more than just germs and are, in fact, remarkably metabolically sophisticated," she says. Her talk, "Bacterial Biofilms: Far More Than a Collection of Germs," is the second program of the Winter/Spring 2006 Earnest C. Watson Lecture Series.

The lecture will take place at 8 p.m. in Beckman Auditorium, 332 S. Michigan Avenue, south of Del Mar Boulevard, on the Caltech campus in Pasadena. Seating is available on a free, no-ticket-required, first-come, first-served basis. Caltech has offered the Watson Lecture Series since 1922, when it was conceived by the late Caltech physicist Earnest Watson as a way to explain science to the local community.

For more information, call 1(888) 2CALTECH (1-888-222-5832) or (626) 395-4652.

###

Contact: Kathy Svitil (626) 395-8022 ksvitil@caltech.edu

Visit the Caltech Media Relations website at: http://pr.caltech.edu/media

Writer: 
KS

Fault That Produced Largest Aftershock Ever Recorded Still Poses Threat to Sumatra

PASADENA, Calif.—A mere three months after the giant Sumatra-Andaman earthquake and tsunami of December 2004, tragedy struck again when another great earthquake shook the area just to the south, killing over 2,000 Indonesians. Although technically an aftershock of the 2004 event, the 8.7-magnitude Nias-Simeulue earthquake just over a year ago was itself one of the most powerful earthquakes ever recorded. Only six others have had greater magnitudes.

In the March 31 issue of the journal Science, a team of researchers led by Richard Briggs and Kerry Sieh of the California Institute of Technology reconstruct the fault rupture that caused the March 28, 2005, event from detailed measurements of ground displacements. Their analysis shows that the fault broke along a 400-kilometer length, and that the length of the break was limited by unstrained sections of the fault on either end.

The researchers continue to express concern that another section of the great fault, south of the 2005 rupture, is likely to cause a third great earthquake in the not-too-distant future. The surface deformation they observed in the 2005 rupture area may well be similar to what will occur when the section to the south ruptures.

Briggs, a postdoctoral scholar in Caltech's new Tectonics Observatory, and his colleagues determined the vertical displacements of the Sumatran islands that are directly over the deeply buried fault whose rupture generated the 2005 earthquake. The main technique they used was the examination of coral heads growing near the shore. The tops of these heads stay just at the waterline, so if they move higher or lower, it indicates that there has been uplift or subsidence.

The researchers also obtained data on ground displacements from GPS stations that they had rushed into place after the 2004 earthquake. "We were fortunate to have installed the geodetic instruments right above the part that broke," says Kerry Sieh, who leads the Sumatran project of Caltech's Tectonics Observatory. "This is the closest we've ever gotten to such a large earthquake with continuously recording GPS instruments."

From the coral and GPS measurements, the researchers found that the 2005 earthquake was associated with uplift of up to three meters over a 400-kilometer stretch of the Sunda megathrust, the giant fault where Southeast Asia is overriding the Indian and Australian plates. This stretch lies to the south of the 1,600-kilometer section of the fault that ruptured in 2004.

Actual slippage on the megathrust surface (about 25 kilometers below the islands) was over 11 meters. The data permitted calculation of the earthquake's magnitude at 8.6, nearly the same as estimates based on seismological recordings.
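    
That magnitude estimate reflects standard seismic-moment bookkeeping: moment equals rigidity times rupture area times average slip, and the moment maps onto magnitude through a logarithmic scale. In the sketch below, the rupture width and average slip are assumptions chosen for illustration; the article itself gives only the 400-kilometer length and the peak slip of over 11 meters:

    import math

    mu = 3.0e10                 # crustal rigidity in pascals, a typical assumed value
    area = 400e3 * 125e3        # 400 km length x an assumed 125 km width, in m^2
    avg_slip = 5.0              # assumed average slip in meters (peak was over 11 m)

    M0 = mu * area * avg_slip                       # seismic moment, N*m
    Mw = (2.0 / 3.0) * (math.log10(M0) - 9.05)      # moment magnitude
    print(f"M0 = {M0:.1e} N*m  ->  Mw = {Mw:.1f}")  # Mw = 8.6 with these inputs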

Most of the deaths in the 2005 earthquake were the direct result of shaking and the collapse of buildings. The earthquake did not trigger a disastrous tsunami comparable to the one that followed the 2004 event. In part, this was because the 2005 rupture was smaller, about one-quarter the length and one-half the slip.

In addition, the largest uplift lay under offshore islands, where there was no water to be displaced. Finally, by rising during the earthquake, the islands gained some instant, extra protection for when the tsunami reached them tens of minutes later.

The scientists were surprised to find that the southern end of the 2004 rupture and the northern end of the 2005 rupture did not quite abut each other, but were separated by a short segment under the island of Simeulue on which the amount of slip was nearly zero. They infer that this segment had not accumulated enough strain to rupture during either event; perhaps, they speculate, it slips frequently and therefore relieves strain without generating large earthquakes.

Thus, this segment might act as a barrier to rupture propagation. A similar 170-kilometer "creeping" section of the San Andreas fault, between San Francisco and Los Angeles, separates the long section that produced Northern California's great 1906 earthquake from the long section that ruptured during Southern California's great 1857 earthquake.

The southern end of the 2005 rupture was at another short "creeping" segment or weak patch. "Both ends of the 2005 rupture seem to have been at the edges of a weak patch," Sieh explains. The 2005 event therefore probably represents a "characteristic earthquake" that has recurred often over geological time. In fact, old historical records suggest that a very similar earthquake was caused by a rupture of this segment in 1861.

Sieh suggests that installing GPS instruments along the world's other subduction megathrusts could help define more clearly which sections creep stably and which are locked and thus more likely to break in infrequent, but potentially devastating, ruptures.

Previous work by the Caltech group and their Indonesian colleagues has shown that south of the southern creeping segment lies another locked segment, about 600 kilometers long, which has not broken since a magnitude 9.0 earthquake in 1833. Corals and coastlines along the southern segment record decades of continual, pronounced subsidence, similar to the behavior of the northern region prior to its abrupt uplift during the 2005 fault rupture.

"This southern part is very likely about ready to go again," Sieh says. "It could devastate the coastal communities of southwestern Sumatra, including the cities of Padang and Bengkulu, with a combined population of well over a million people. It could happen tomorrow, or it could happen 30 years from now, but I'd be surprised if it were delayed much beyond that."

Sieh and his colleagues are engaged in efforts to increase public awareness and preparedness for future great earthquakes and tsunamis in Sumatra.

The Science paper is titled "Deformation and slip along the Sunda megathrust in the great 2005 Nias-Simeulue earthquake." The other authors are Aron Meltzner, John Galetzka, Ya-ju Hsu, Mark Simons, and Jean-Philippe Avouac, all at Caltech's Tectonics Observatory; Danny Natawidjaja, Bambang Suwargadi, Nugroho Hananto, and Dudi Prayudi, all at the Indonesian Institute of Sciences; Imam Suprihanto of Jakarta; and Linette Prawirodirdjo and Yehuda Bock at the Scripps Institution of Oceanography.

The research was funded by the Gordon and Betty Moore Foundation, the National Science Foundation, and NASA.

Writer: 
Robert Tindol

Study of 2004 Tsunami Disaster Forces Rethinking of Theory of Giant Earthquakes

PASADENA, Calif.—The Sumatra-Andaman earthquake of December 26, 2004, was one of the worst natural disasters in recent memory, mostly on account of the devastating tsunami that followed it. A group of geologists and geophysicists, including scientists at the California Institute of Technology, has delineated the full dimensions of the fault rupture that caused the earthquake.

Their findings, reported in the March 2 issue of the journal Nature, suggest that previous ideas about where giant earthquakes are likely to occur need to be revised. Regions of the earth previously thought to be immune to such events may actually be at high risk of experiencing them.

Like all giant earthquakes, the 2004 event occurred on a subduction megathrust: in this case, the Sunda megathrust, a giant fault along which the Indian and Australian tectonic plates are diving beneath the margin of Southeast Asia. The fault surface that ruptured cannot be seen directly because it lies several kilometers deep in the Earth's crust, largely beneath the sea.

Nevertheless, the rupture of the fault caused movements at the surface as long-accumulating elastic strain was suddenly released. The researchers measured these surface motions by three different techniques. In one, they measured the shift in position of GPS stations whose locations had been accurately determined prior to the earthquake.

In the second method, they studied giant coral heads on island reefs: the top surfaces of these corals normally lie right at the water surface, so the presence of corals with tops above or below the water level indicated that the Earth's crust rose or fell by that amount during the earthquake.

Finally, the researchers compared satellite images of island lagoons and reefs taken before and after the earthquake: changes in the color of the seawater or reefs indicated a change in the water's depth and hence a rise or fall of the crust at that location.

On the basis of these measurements, the researchers found that the 2004 earthquake was caused by rupture of a 1,600-kilometer-long stretch of the megathrust, by far the longest rupture of any recorded earthquake. The breadth of the contact surface that ruptured ranged up to 150 kilometers. Over this huge contact area, the surfaces of the two plates slid against each other by up to 18 meters.

On the basis of these data, the researchers calculated that the so-called moment-magnitude of the earthquake (a measure of the total energy released) was 9.15, making it the third largest earthquake of the past 100 years and the largest yet recorded in the few decades of modern instrumentation.

"This earthquake didn't just break all the records, it also broke some of the rules," says Kerry Sieh, who is the Sharp Professor of Geology at Caltech and one of the authors of the Nature paper.

According to previous understanding, subduction megathrusts can only produce giant earthquakes if the oceanic plate is young and buoyant, so that it locks tightly against the overriding continental plate and resists rupture until an enormous amount of strain has accumulated.

Another commonly accepted idea is that the rate of relative motion between the colliding plates must be high for a giant earthquake to occur. Both these conditions are true off the southern coast of Chile, where the largest earthquake of the past century occurred in 1960. They are also true off the Pacific Northwest of the United States, where a giant earthquake occurred in 1700 and where another may occur before long.

But at the site of the 2004 Sumatra-Andaman earthquake the oceanic crust is old and dense, and the relative motion between the plates is quite slow. Yet another factor that should have lessened the likelihood of a giant earthquake in the Indian Ocean is the fact that the oceanic crust is being stretched by formation of a so-called back-arc basin off the continental margin.

"For all these reasons, received wisdom said that the giant 2004 earthquake should not have occurred," says Jean-Philippe Avouac, a Caltech professor of geology, who is also a contributor to the paper. "But it did, so received wisdom must be wrong. It may be, for example, that a slow rate of motion between the plates simply causes the giant earthquakes to occur less often, so we didn't happen to have seen any in recent times-until 2004."

Many subduction zones that were not considered to be at risk of causing giant earthquakes may need to be reassessed as a result of the 2004 disaster. "For example, the Ryukyu Islands between Taiwan and Japan are in an area where a large rupture would probably cause a tsunami that would kill a lot of people along the Chinese coast," says Sieh.

"And in the Caribbean, it could well be an error to assume that the entire subduction zone from Trinidad to Barbados and Puerto Rico is aseismic. The message of the 2004 earthquake to the world is that you shouldn't assume that your subduction zone, even though it's quiet, is incapable of generating great earthquakes."

According to Sieh, it's not that all subduction zones should now be assigned a high risk of giant earthquakes, but that better monitoring systems (networks of continuously recording GPS stations, for example) should be put in place to assess their seismic potential.

"For most subduction zones, a $1 million GPS system would be adequate," says Sieh. "This is a small price to pay to assess the level of hazard and to monitor subduction zones with the potential to produce a calamity like the Sumatra-Andaman earthquake and tsunami. Caltech's Tectonics Observatory has, for example, begun to monitor the northern coast of Chile, where a giant earthquake last occurred in 1877."

In addition to Sieh and Avouac, the other authors of the Nature paper are Cecep Subarya of the National Coordinating Agency for Surveys and Mapping in Cibinong, Indonesia; Mohamed Chlieh and Aron Meltzner, both of Caltech's Tectonics Observatory; Linette Prawirodirdjo and Yehuda Bock, both of the Scripps Institution of Oceanography; Danny Natawidjaja of the Indonesian Institute of Sciences; and Robert McCaffrey of Rensselaer Polytechnic Institute.

Writer: 
Robert Tindol

Watson Lecture: The 10th Planet

PASADENA, Calif.—In 2005, after seven years scanning half the sky for planets in our solar system beyond Pluto and discovering dozens of large new objects, Michael E. Brown and his colleagues finally found 2003 UB313, aka "Xena," the first object larger than Pluto, and the first that might be called a new planet.

The discovery of 2003 UB313 inspired "a new avalanche of scientific questions," says Brown, professor of planetary astronomy at the California Institute of Technology. Perhaps more importantly for planetary science, it forced into the spotlight the lingering debate over what constitutes a planet.

On Wednesday, February 22, Brown will discuss the discovery of 2003 UB313 and the planet controversy. "By the end of the talk, listeners will know if 2003 UB313 is a planet, what a planet is and how to find one, and how many more planets might be hiding out there," he says. His talk, "Beyond Pluto: Discovery of the 10th Planet," is the first program of the Winter/Spring 2006 Earnest C. Watson Lecture Series.

The talk will take place at 8 p.m. in Beckman Auditorium, 332 S. Michigan Avenue, south of Del Mar Boulevard, on the Caltech campus in Pasadena. Seating is available on a free, no-ticket-required, first-come, first-served basis. Caltech has offered the Watson Lecture Series since 1922, when it was conceived by the late Caltech physicist Earnest Watson as a way to explain science to the local community.

Upcoming lectures in the Winter/Spring 2006 series include

o Dianne K. Newman, associate professor of geobiology and environmental science and engineering, Caltech, and investigator at the Howard Hughes Medical Institute, on "Bacterial Biofilms: Far More Than a Collection of Germs," April 12.

o R. Preston McAfee, J. Stanley Johnson Professor of Business Economics and Management and Executive Officer for the Social Sciences, Caltech, on "Why Are Prices So Bizarre?," May 3.

o Paul H. Patterson, Anne P. and Benjamin F. Biaggini Professor of Biological Sciences, Caltech, on "Can One Make a Mouse Model of Mental Illness, and Why Try?," May 17.

For more information, call 1(888) 2CALTECH (1-888-222-5832) or (626) 395-4652.

###

Contact: Kathy Svitil (626) 395-8022 ksvitil@caltech.edu

Visit the Caltech Media Relations website at: http://pr.caltech.edu/media

Writer: 
KS

Dust Found in Earth Sediment Traced to Breakup of the Asteroid Veritas 8.2 Million Years Ago

PASADENA, Calif.—In a new study that provides a novel way of looking at our solar system's past, a group of planetary scientists and geochemists announce that they have found evidence on Earth of an asteroid breakup or collision that occurred 8.2 million years ago.

Reporting in the January 19 issue of the journal Nature, scientists from the California Institute of Technology, the Southwest Research Institute (SwRI), and Charles University in the Czech Republic show that core samples of oceanic sediment are consistent with computer simulations of the breakup of a 100-mile-wide body in the asteroid belt between Mars and Jupiter. The larger fragments of this asteroid still orbit within the asteroid belt, and their inferred parent body has been known for years as the asteroid "Veritas."

Ken Farley of Caltech discovered a spike in the concentration of a rare isotope known as helium 3 that began 8.2 million years ago and gradually decreased over the next 1.5 million years. This pattern suggests that Earth was dusted by debris from an extraterrestrial source.

"The helium 3 spike found in these sediments is the smoking gun that something quite dramatic happened to the interplanetary dust population 8.2 million years ago," says Farley, the Keck Foundation Professor of Geochemistry at Caltech and chair of the Division of Geological and Planetary Sciences. "It's one of the biggest dust events of the last 80 million years."

Interplanetary dust is composed of bits of rock from a few to several hundred microns in diameter produced by asteroid collisions or ejected from comets. Interplanetary dust migrates toward the sun, and en route some of this dust is captured by the Earth's gravitational field and deposited on its surface.

Presently, more than 20,000 tons of this material accumulates on Earth each year, but the accretion rate should fluctuate with the level of asteroid collisions and changes in the number of active comets. By looking at ancient sediments that include both interplanetary dust and ordinary terrestrial sediment, the researchers for the first time have been able to detect major dust-producing solar system events of the past.

Because interplanetary dust particles are so small and so rare in sediment (significantly less than a part per million), they are difficult to detect using direct measurements. However, these particles are extremely rich in helium 3 in comparison with terrestrial materials. Over the last decade, Farley has measured helium 3 concentrations in sediments formed over the last 80 million years to create a record of the interplanetary dust flux.
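    
The bookkeeping behind that record is simple: the extraterrestrial helium-3 flux to the seafloor equals the measured helium-3 concentration multiplied by the rate at which sediment mass accumulates. The numbers below are order-of-magnitude placeholders, not Farley's measurements:

    he3_per_gram = 1.0e-12      # cc STP of 3He per gram of sediment (illustrative)
    mass_accum = 0.5            # sediment accumulation, g per cm^2 per kyr (illustrative)

    he3_flux = he3_per_gram * mass_accum
    print(f"3He flux ~ {he3_flux:.1e} cc STP per cm^2 per kyr")
    # A spike in this flux at the same age in cores from two different oceans
    # is what signals a sudden increase in the interplanetary dust supply.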

To assure that the peak was not a fluke present at only one site on the seafloor, Farley studied two different localities: one in the Indian Ocean and one in the Atlantic. The event is recorded clearly at both sites.

To find the source of these particles, William F. Bottke and David Nesvorny of the SwRI Space Studies Department in Boulder, Colorado, along with David Vokrouhlicky of Charles University, studied clusters of asteroid orbits that are likely the consequence of ancient asteroidal collisions.

"While asteroids are constantly crashing into one another in the main asteroid belt," says Bottke, "only once in a great while does an extremely large one shatter."

The scientists identified one cluster of asteroid fragments whose size, age, and remarkably similar orbits made it a likely candidate for the Earth-dusting event. Tracking the orbits of the cluster backwards in time using computer models, they found that, 8.2 million years ago, all of its fragments shared the same orbital orientation in space. This event defines when the 100-mile-wide asteroid called Veritas was blown apart by impact and coincides with the spike in the interplanetary seafloor sediments Farley had found.

"The Veritas disruption was extraordinary," says Nesvorny. "It was the largest asteroid collision to take place in the last 100 million years."

As a final check, the SwRI-Czech team used computer simulations to follow the evolution of dust particles produced by the 100-mile-wide Veritas breakup event. Their work shows that the Veritas event could produce the spike in extraterrestrial dust raining on the Earth 8.2 million years ago as well as a gradual decline in the dust flux.

"The match between our model results and the helium 3 deposits is very compelling," Vokrouhlicky says. "It makes us wonder whether other helium 3 peaks in oceanic cores can also be traced back to asteroid breakups."

This research was funded by NASA's Planetary Geology & Geophysics program and received additional financial support from the Grant Agency of the Czech Republic and the National Science Foundation's COBASE program. The Nature paper is titled "A late Miocene dust shower from the breakup of an asteroid in the main belt."

Writer: 
Robert Tindol

Caltech Researchers Invent New Technique for Studying the Thermal History of Rocks

PASADENA, Calif.—The beautiful valleys of the southern Coast Mountains of British Columbia exist for us to enjoy today because of glacial action in the past. Geologists know, for example, that a giant glacier carved a deep groove in the mountain range to form the present-day Klinaklini Valley. But how fast the cutting actually took place, and when, has hitherto been conjecture.

Now, a 2005 graduate of the California Institute of Technology and his supervising professor have successfully employed a new technique for measuring erosion rates by determining how fast rocks cooled off after being churned up by the glacier. Reporting in the December 9 issue of the journal Science, David Shuster (now at the Berkeley Geochronology Center) and Ken Farley, chair of the Division of Geological and Planetary Sciences at Caltech, along with researchers from Occidental College and the University of Michigan, describe their success in determining how quickly the Klinaklini Valley came into being.

The results show that the two kilometers of overlying rock were removed by the glacier at a rate exceeding five millimeters per year, meaning that the glacial valley deepening occurred six times faster than the normal erosion rate before the glacier came along. The study experimentally confirms the hypothesis that glaciers erode material faster than rivers and are responsible for the topographic relief of mountains. According to Shuster, the study also provides an additional tool for the study of how global climate variations may have influenced mountainous topography.

"This study was possible because of a technique Ken and I developed called helium-helium thermochronometry," Shuster said in a phone interview. "It's an unwieldy name, but it gives us a new way to study the rate at which rocks approached Earth's surface in the past."

The new technique rests on three facts: one, that rocks on the surface have often come from beneath the surface; two, that the ground gets steadily warmer as depth increases; and three, that helium leaks out of a warm rock faster than out of a cold one. Therefore, if one can figure out how fast the helium leaked out of a rock, then it's also possible to determine how fast the rock cooled and, ultimately, how deeply it was buried, as well as when and how fast it was uncovered.

Helium-helium thermochronometry—or more specifically, 4He/3He thermochronometry—is a novel technique because it requires a rare isotope known as helium-3 to be artificially created in the sample. For this particular study, the researchers collected samples of a calcium phosphate mineral called apatite. As a natural consequence of the decay of uranium and thorium, which exist as trace elements in apatite, the rock already contains helium-4, but no helium-3.

However, the helium-4 is not dispersed evenly throughout the rock. The dispersal would indeed be even if a rock could just sit for millions of years unheated and unperturbed, but such is not likely in the real world. Rather, the helium-4 tends to leak out during times when the rock is heated. As a consequence, the apatite specimens from the Klinaklini Valley have uneven distributions of helium-4 because the rocks themselves were once beneath the surface. In fact, the ground temperature in the region increases about 25 degrees Celsius for each kilometer of depth.
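    
Those figures can be combined into the cooling rate at the heart of the method: rock being exhumed at a given erosion rate cools at that rate multiplied by the geothermal gradient. A quick check using values from this article:

    gradient = 25.0     # degrees C per km of depth, from the article
    erosion = 5.0       # mm per year, numerically the same as km per million years
    removed = 2.0       # km of overlying rock stripped off by the glacier

    print(f"cooling rate ~ {erosion * gradient:.0f} deg C per million years")  # 125
    print(f"total cooling ~ {removed * gradient:.0f} deg C")                   # 50
    print(f"time required ~ {removed / erosion * 1000:.0f} kyr")               # 400,
    # the same order as the roughly 300,000-year carving time reported below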

But while the helium-4 distribution in the apatite samples is a record of how fast the rocks cooled, the researchers cannot determine the rate by looking at helium-4 by itself. For the data necessary to arrive at the rate of cooling, the researchers took the apatite samples to a medical lab in Massachusetts that operates a cyclotron for treating cancer with proton bombardment. By hitting each 100-micron apatite crystal with energetic protons, the researchers managed to create an even dispersal of helium-3 in the samples. The helium-3 comes about as a natural consequence of the proton bombardment.

With both helium-4 and helium-3 now in each apatite sample, the researchers could then compare the ratio of the two isotopes by heating each sample at progressively higher temperatures, making more and more helium leak out, and measuring how much of each isotope was released at each temperature. This allowed them to figure out how the helium-4 was dispersed in each sample and, thus, how fast the rocks had cooled.
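    
For the curious, here is a deliberately simplified numerical cartoon of the step-heating idea; it is not the authors' actual method, and every profile and diffusivity in it is invented. A one-dimensional crystal carries uniform proton-produced helium-3 and rim-depleted radiogenic helium-4 (the signature of slow natural cooling); degassing it in hotter and hotter steps yields an isotope ratio that climbs from rim-like to core-like values, which is the measurable record of cooling:

    import numpy as np

    nx = 101
    x = np.linspace(0.0, 1.0, nx)        # normalized half-width; x = 1 is the crystal rim
    dx = x[1] - x[0]

    c3 = np.ones(nx)                     # proton-produced 3He: uniform everywhere
    c4 = np.tanh(4.0 * (1.0 - x))        # radiogenic 4He: depleted near the rim (invented)

    def degas(c, diffusivity, nsub=40000):
        # Diffuse the profile for one heating step with the rim held at zero
        # concentration; return the new profile and the amount of gas released.
        c = c.copy()
        dt = 1.0 / nsub
        assert diffusivity * dt / dx ** 2 < 0.5    # explicit-scheme stability
        before = c.sum() * dx
        for _ in range(nsub):
            c[1:-1] += diffusivity * dt * (c[2:] - 2.0 * c[1:-1] + c[:-2]) / dx ** 2
            c[0] = c[1]                  # no-flux symmetry at the crystal core
            c[-1] = 0.0                  # the rim vents helium to the vacuum line
        return c, before - c.sum() * dx

    # Diffusivity rises steeply with temperature (an Arrhenius law), so four
    # progressively hotter steps are represented by four larger diffusivities.
    for D in (1e-3, 1e-2, 1e-1, 1.0):
        c3, r3 = degas(c3, D)
        c4, r4 = degas(c4, D)
        print(f"step with D = {D:g}: 4He/3He of released gas = {r4 / r3:.2f}")
    # The ratio climbs step by step toward the core value: early, cool steps
    # sample the 4He-poor rim, and later, hot steps sample the 4He-rich core.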

"Our technique is for temperatures between about 30 and 70 degrees Celsius," says Farley. "This is the last cooling before a rock comes to the surface, and no other technique accesses this information."

As a result, the team showed that the cooling of the rock happened very quickly about 1.8 million years ago, that the entire valley was carved out in about 300,000 years, and that the total lowering of the area by glacial action was about two kilometers.

"We can say that the glacier was ripping out a huge amount of material and dumping it into the ocean," adds Farley, who is also the Keck Foundation Professor of Geochemistry. "And rather than taking evidence from a single instant, we can for the first time see an integral of hundreds of thousands of years. So this is a new way to get at the rate at which glaciers do their work."

"Why this intense erosion occurred 1.8 million years ago is not well understood, but it seems to coincide with some very interesting changes that took place in Earth's climate system at that time," says Shuster.

According to Shuster, various minerals can be used in the proton-bombardment procedure, although apatite was ideal for the study of the Coast Mountains.

In addition to Shuster and Farley, the other authors are Todd Ehlers, a former postdoctoral researcher at Caltech who is now a member of the University of Michigan faculty, and Margaret Rusmore, a geology professor at Occidental College.

The title of the paper is "Rapid Glacial Erosion at 1.8 Ma Revealed by 4He/3He Thermochronometry."

Writer: 
Robert Tindol

Powerful New Supercomputer Analyzes Earthquakes

PASADENA, Calif.- One of the most powerful computer clusters in the academic world has been created at the California Institute of Technology in order to unlock the mysteries of earthquakes.

The Division of Geological and Planetary Sciences' new Geosciences Computational Facility will feature a 2,048-processor supercomputer, housed in the basement of the Seeley G. Mudd Building of Geophysics and Planetary Science on campus.

Computer hardware fills long rows of black racks in the facility, each of which contains about 35 compute nodes. Massive air-conditioning units line an entire wall of the 20-by-80-foot room to recirculate and chill the air. Miles of optical-fiber cables tie the processors together into a working cluster that went online in September.

The $5.8 million parallel computing project was made possible by gifts from Dell, Myricom, Intel, and the National Science Foundation.

According to Jeroen Tromp, McMillan Professor of Geophysics and director of the Institute's Seismology Lab, who spearheaded the project, "the other crucial ingredient was Caltech's investment in the infrastructure necessary to house the new machine." Some 500 kilowatts of power and 90 tons of air conditioning are needed to operate and cool the hardware.

David Kewley, the project's systems administrator, explained that this is enough power for 350 average households.

Tromp's research group will share use of the cluster with other division professors and their research groups, while a job-scheduling system will make sure the facility runs at maximum possible capacity.

Tromp, who came to Caltech in 2000 from Harvard, is known as one of the world's leading theoretical seismologists. Until now, he and his Institute colleagues have used a smaller version of the machine, popularly known as a Beowulf cluster. Helping revolutionize the field of earthquake study, Tromp has created 3-D simulations of seismic events. He and former Caltech postdoctoral scholar Dimitri Komatitsch designed a computer model that divides the earth into millions of elements. Each element can be divided into slices that represent the earth's geological features.

In simulations involving tens of millions of operations per second, the seismic waves are propagated from one slice to the next, as they speed up, slow down, and change direction according to the earth's characteristics. The model is analogous to a CAT scan of the earth, allowing scientists to track seismic wave paths. "Much like a medical doctor uses a CAT scan to make an image of the brain, seismologists use earthquake-generated waves to image the earth's interior," Tromp says, adding that the earthquake's location, origin time, and characteristics must also be determined.
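    
To give a flavor of what such a simulation does, here is a toy one-dimensional wave propagation loop; it is a far cry from the 3-D model described above, but it shows the essence: a pulse is stepped through a velocity model and speeds up when it crosses into stiffer material. All numbers are invented for illustration:

    import numpy as np

    nx, nt = 600, 1800
    dx, dt = 100.0, 0.005                 # grid spacing (m) and time step (s)
    c = np.full(nx, 3000.0)               # wave speed in m/s; left half is "slow" rock
    c[300:] = 5000.0                      # a stiffer layer on the right half

    assert (c * dt / dx).max() < 1.0      # CFL stability condition

    x = np.arange(nx) * dx
    u = np.exp(-(((x - 150.0 * dx) / (5.0 * dx)) ** 2))   # smooth initial "earthquake" pulse
    u_prev = u.copy()                     # zero initial velocity: pulse splits left/right

    r2 = (c * dt / dx) ** 2
    for _ in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
        u, u_prev = 2.0 * u - u_prev + r2 * lap, u

    # After crossing into the fast layer, the front races ahead; find it.
    print("wavefront near grid cell", int(np.abs(u[310:]).argmax()) + 310)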

Tromp will now be able to deliver better, more accurate models in less time. "We hope to use the new machine to do much more detailed mapping. In addition to improving the resolution of our images of the earth's interior, we will also quantitatively assess the devastating effects associated with earthquakes based upon numerical simulations of strong ground motion generated by hypothetical earthquakes."

"One novel way in which we are planning to use the new machine is for near real-time seismology," Tromp adds. "Every time an earthquake over magnitude 3.5 occurs anywhere in California we will routinely simulate the motions associated with the event. Scientific products that result from these simulations are 'synthetic' seismograms that can be compared to actual seismograms."

The "real" seismograms are recorded by the Southern California Seismic Network (SCSN), operated by the Seismo Lab in conjunction with the U.S. Geological Survey. Of interest to the general public, Tromp expects that the collaboration will produce synthetic ShakeMovies of recent quakes, and synthetic ShakeMaps which can be compared to real ShakeMaps derived from the data. "These products should be available within an hour after the earthquake," he says. The Seismology Lab Media Center will be renovated with a large video wall on which scientists can show the results of simulations and analysis.

The new generation of seismic knowledge may also help scientists, engineers, and others lessen the potentially catastrophic effects of earthquakes.

"Intel is proud to be a sponsor of this premier system for seismic research which will be used by researchers and scientists," said Les Karr, Intel Corporate Business Development Manager. "The project reflects Caltech's growing commitment, in both research and teaching, to a broadening range of problems in computational geoscience. It is also a reflection of the growing use of commercial, commodity computing systems to solve some of the world's toughest problems."

The Dell equipment consists of 1,024 dual Dell PowerEdge 1850 servers that were pre-assembled for easy implementation. Dell Services representatives came to campus to complete the installation.

"CITerra, as this new research tool is known on the TOP500 Supercomputer list, is a proud accomplishment both for Caltech and for Myricom," said Charles Seitz, founder and CEO of Myricom, and a former professor of computer science at Caltech. "The talented technical team of Myricom about half of whom are Caltech alumni/ae, are eager for people to know that the architecture, programming methods, and technology of cluster computing was pioneered at Caltech 20 years ago. Those of us at Myricom who have drawn so much inspiration from our Caltech years are delighted to give some of the results of our efforts back to Caltech."

About Myricom: Founded in 1994, Myricom, Inc. created Myrinet, the high-performance computing (HPC) interconnect technology used in thousands of computing clusters in more than 50 countries worldwide. With its next-generation Myri-10G solutions, Myricom is bridging the gap between the rigorous demands of traditional HPC applications and the growing need for affordable computing speed in mainstream enterprises. Privately held, Myricom achieved and has sustained profitability since 1995 with 42 consecutive profitable quarters through September 2005. Based in Arcadia, California, Myricom solutions are sold direct and through channels. Myrinet clusters are supplied by OEM computer companies including IBM, HP, Dell, and Sun, and by other leading cluster integrators worldwide.

About Intel: Intel, the world's largest chipmaker, is also a leading manufacturer of computer, networking, and communications products. Intel processors, platform architectures, interconnects, networking technology, software tools, and services power some of the fastest computers in the world at price points that have expanded high performance computing beyond the confines of elite supercomputer centers and into the broad community of customers in mainstream industries. Those industries span automotive, aerospace, electronics manufacturing, energy and oil and gas in addition to scientific, research and academic organizations.

About the National Science Foundation: The NSF is an independent federal agency created by Congress in 1950 "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense..." With an annual budget of about $5.5 billion, it is the funding source for approximately 20 percent of all federally supported basic research conducted by America's colleges and universities. In many fields such as mathematics, computer science, and the social sciences, NSF is the major source of federal backing.

###

Contact: Jill Perry (626) 395-3226 jperry@caltech.edu

Visit the Caltech Media Relations website at: http://pr.caltech.edu/media

Writer: 
JP
