Don Anderson Will Be Awarded the 1998 Crafoord Prize

PASADENA—The Royal Swedish Academy of Sciences is to award the 1998 Crafoord Prize in geosciences, with special emphasis upon "the dynamics of the deeper parts of the Earth," to Don L. Anderson of the California Institute of Technology and to Adam M. Dziewonski of Harvard University for their fundamental contributions to our knowledge of the structures and processes in Earth's interior. The 1998 Crafoord Prize is valued at $500,000 and will be presented to the prizewinners at a ceremony on September 16 in Sweden.

On hearing the news that he had been awarded the 1998 Crafoord Prize, Caltech's Eleanor and John R. McMillan Professor of Geophysics Don Anderson said, "I think it's very significant that deep-Earth geophysics is being honored by this award. It is rare for our field to be acknowledged in this way. I am really delighted that Adam Dziewonski, a close colleague of mine, is also being honored for his work. Most people, when they think of geophysics, think of earthquakes, but seismologists do other things, such as x-raying Earth using seismic tomography to see what is going on in the deep Earth."

Caltech president David Baltimore congratulated Professor Anderson and noted that "the Institute is very proud and pleased that Don will be receiving the Crafoord. It is exciting news. Don's work is truly deserving of this great prize. He is one of the world's most prominent scientists in the area."

According to the Royal Academy, Anderson and Dziewonski have together developed a generally accepted standard model of how Earth is organized and of the dynamics of the processes at its core and in its mantle that govern continental drift, volcanism, and earthquakes.

Anderson and his team have researched changes arising from the pressure deep down in Earth's mantle. Sudden changes in the rock types at depths of 400 kilometers and 660 kilometers are explained by conversions undergone by the rock types, so that they contain minerals entirely unknown at Earth's surface. At 400 kilometers, the mineral olivine, common in lava, changes to spinel, a high-pressure mineral. At 660 kilometers, the mineral perovskite is formed, a mineral otherwise only produced in the laboratory at very high pressures and temperatures. Anderson's research has shown that such changes in composition of the mantle may explain the occurrence of tensions in Earth's crust that can lead to earthquakes. Anderson and his research team have also used seismic data to study convection currents in the mantle, important for understanding continental drift and volcanism. Recently, Anderson has also used geochemical and chemical-isotope methods not only for mapping Earth's development, but also for understanding the development of the moon and the planets Mars and Venus.

Anderson was born in 1933 in Maryland and received his doctorate in geophysics from Caltech in 1962. He has been a leading figure in "deep Earth" research since the 1960s. He was director of the Seismological Laboratory at Caltech from 1967 to 1989. In 1989 he published his "Theory of the Earth," a remarkable synthesis of his broad and provocative research and a guide for geo-researchers from different fields for future exploration of the dynamics of the deep parts of Earth.

The Crafoord Prize is awarded at a ceremony held on September 16, Crafoord Day. On this occasion, the prizewinner gives a public lecture and the Royal Academy organizes an international scientific symposium on a subject from the chosen discipline of the year.

The Anna-Greta and Holger Crafoord Fund was established in 1980 to promote basic research in mathematics, astronomy, the biosciences (particularly ecology), the geosciences, and polyarthritis. Both an international prize and research grants to Swedish scientists are awarded in the scientific fields mentioned above.

Sue Pitts McHugh

Geologists find more evidence for an active fault beneath downtown and east Los Angeles

LONG BEACH—Geologists report new evidence for a fault beneath Los Angeles that could cause damaging earthquakes in the future.

According to Michael Oskin, a graduate student at the California Institute of Technology (Caltech), the new study concerns an 11-mile-long, previously known geologic fold that runs through the hills north and east of Downtown Los Angeles. This fold provides indirect evidence for an underlying fault.

"Our evidence from the surface is that the fold is still growing," says Oskin. "This indicates that the fault that lies beneath it must also be active."

The fold, first associated with earthquakes at the time of the 1987 Whittier Narrows earthquake, is formally known as the Elysian Park Anticlinorium and runs northwest-southeast from Hollywood to Whittier Narrows. Three smaller "wrinkles" formed on the southwest-facing flank of this fold have been investigated in detail by the Caltech scientists.

Their studies of sediment deposited by the Los Angeles River and its tributaries indicate that these small folds have been active during the past 60,000 years. During that time, the area has been contracting north-south at a rate of at least a half-millimeter per year.

"Our evidence that this structure is active does not increase the overall hazard in the metropolitan region," says coauthor Kerry Sieh, a professor of geology at Caltech. "Rather, it allows us to be more specific about how, where, and how fast deformation is occurring in the area."

The length of the surface features suggests that the underlying fault is about 11 miles in length and may extend 10 or so miles into the earth. Such a fault, if it ruptured all at once, could produce a 6.5- to 6.8-magnitude earthquake.
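The jump from fault dimensions to magnitude can be sanity-checked with a widely used empirical regression. The sketch below uses the Wells and Coppersmith (1994) surface-rupture-length relation as a generic plausibility check; it is not the method the Caltech researchers used.

```python
import math

# Wells & Coppersmith (1994), all fault types:
#   M = 5.08 + 1.16 * log10(L), with rupture length L in kilometers.
def magnitude_from_length_km(length_km):
    return 5.08 + 1.16 * math.log10(length_km)

# An 11-mile (about 18 km) rupture:
print(round(magnitude_from_length_km(18.0), 1))  # prints 6.5
```

The result lands at the low end of the 6.5 to 6.8 range quoted above; rupture area and slip assumptions push the estimate toward the higher figure.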

The rate of deformation suggests that such events might occur, on average, about once every one to three thousand years.
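The recurrence estimate follows from simple arithmetic: average repeat time is the slip released per earthquake divided by the rate at which the fault is loaded. The slip-per-event range below is an assumed, order-of-magnitude figure typical of earthquakes this size, not a number from the study; the 0.5 mm/yr loading rate is the minimum contraction rate quoted above.

```python
# Average repeat time = slip per event / fault loading rate.
def recurrence_years(slip_per_event_m, loading_rate_mm_per_yr):
    return slip_per_event_m * 1000.0 / loading_rate_mm_per_yr

# Assumed 0.5-1.5 m of slip per event, loading at 0.5 mm/yr:
for slip in (0.5, 1.5):
    print(f"{slip} m of slip -> one event every ~{recurrence_years(slip, 0.5):.0f} years")
```

This reproduces the quoted one- to three-thousand-year range.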

Also contributing data and resources to the study were the Southern California Earthquake Center, Metropolitan Transit Authority (MTA), Engineering Management Consultant, and Earth Technology Corporation.

Oskin and Sieh reported on their work at the Geological Society of America meeting in Long Beach April 7, 1998.

Robert Tindol

Yucca Mountain Is Possibly More Seismically Active Than Once Believed, Geologists Discover

PASADENA—Recent geodetic measurements using Global Positioning System (GPS) satellites show that the Yucca Mountain area in southern Nevada is straining roughly 10 to 100 times faster than expected on the basis of the geologic history of the area. And for the moment at least, geologists are at a loss to explain the anomaly.

In the March 28 issue of the journal Science, Brian Wernicke of the California Institute of Technology (Caltech) and his colleagues at the Smithsonian Astrophysical Observatory in Cambridge, Massachusetts, report on Global Positioning System surveys they conducted from 1991 to 1997. Those surveys show that the Yucca Mountain area is stretching apart at about one millimeter per year east-southeastward.

"The question is, why are the predicted geological rates of stretching so much lower than what we are measuring?" asks Wernicke. "That's something we need to think through and understand."

The answer is likely to be of interest to quite a few people, because Yucca Mountain has been proposed as a site for the permanent disposal of high-level radioactive waste. Experts believe that the waste-disposal site can accommodate a certain amount of seismic activity, but they nonetheless would like any site to have a certain amount of stability over the next 10,000 to 100,000 years.

Yucca Mountain was already known to have both seismic and volcanic activity, Wernicke says. An example of the former is the 5.4-magnitude "Little Skull Mountain" earthquake that occurred in 1992. And an example of the latter is the 80,000-year-old volcano to the south of the mountain. The volcano is inactive, but still must be studied according to Department of Energy regulations.

The problem the new study poses is that the strain is building up in the crust at a rate about one-fourth that of the most rapidly straining areas of the earth's crust, such as near the San Andreas fault, Wernicke says. But there could be other factors at work.

"There are three possibilities that we outline in the paper as to why the satellite data doesn't agree with the average predicted by the geological record," he says. "Either the average is wrong, or we are wrong, or there's some kind of pulse of activity going on and we just happened to take our data during the pulse."

The latter scenario, Wernicke believes, could turn out to be the case. But if Yucca Mountain is really as seismically active as a face-value reading of the current data indicates, the likelihood of magmatic and tectonic events could be 10 times higher than once believed.


Robert Tindol

Mars Global Surveyor already bringing in scientific payoff

PASADENA—Despite a 12-month delay in aerobraking into a circular orbit, the Mars Global Surveyor is already returning a wealth of data about the atmosphere and surface of the Red Planet.

According to mission scientist Arden Albee of the California Institute of Technology, all scientific instruments on the Mars probe are fully functioning and providing good data. Early results from data collected during the 18 elliptical orbits in October and November are being reported in this week's issue of Science.

"For the first time, a spacecraft has captured the start of a major dust storm on Mars and has followed it through its development and demise," Albee says. "Also, we've received a number of narrow-angle high-resolution images that are enough to put any planetary geologist into a state of ecstasy."

These accomplishments are especially noteworthy when considering that the probe developed a glitch when it first began tightening up its orbit last September. For various reasons having to do with design and cost, Global Surveyor was set on a course that took it initially into a huge sweeping elliptical orbit of Mars.

On its near approach in each orbit, the probe was to dip into the upper atmosphere of Mars in a maneuver known as "aerobraking," which would effectively slow the probe down and eventually place it into a near-circular orbit.

But a solar-panel damper failed early in the mission, and damage to the solar panel forced the team to slow down the aerobraking. At the current rate of aerobraking, Mars Global Surveyor will enter its circular mapping orbit in March 1999.

This has delayed the systematic mapping of Mars, but Albee says that the new mission plan nonetheless permits the collection of science data in a 12-hour elliptical orbit, from March to September of this year.

"Another exciting discovery is that the Martian crust exhibits layering to much greater depths than would have been expected," Albee says. "Steep walls of canyons, valleys, and craters show the Martian crust to be stratified at scales of a few tens of meters.

"At this point, it is simply not known whether these layers represent piles of volcanic flows or sedimentary rocks that might have formed in a standing body of water," he adds.

The Mars Global Surveyor team has previously announced that the on-board magnetometer shows Mars to have a more complex magnetic field than once thought. Of particular interest is the fact that the magnetic field was apparently once about the same strength as that of present-day Earth.

Many experts think that a strong magnetic field may be crucial for the evolution of life on a planet. Without a magnetic field, a planet tends to have its atmospheric particles stripped away by the solar wind in a process known as "sputtering."

And finally, the Mars Orbiter Laser Altimeter (MOLA) has already sent back 18 very good topographic profiles of the northern hemisphere. "Characterization of these features is leading to a new understanding of the surface processes of Mars," Albee says.

Robert Tindol

Biological Activity the Likely Culprit of Atmospheric Nitrous Oxide Increases

PASADENA—Nitrous oxide (N2O) is an atmospheric gas known to contribute both to global warming and ozone depletion. New research suggests that its changing concentration in the atmosphere is largely a result of biological activity.

In the December 5 issue of the journal Science, Yuk Yung of the California Institute of Technology and Charles Miller of Caltech's Jet Propulsion Lab (now at the Haverford College Department of Chemistry) describe their work on isotopes of N2O, a greenhouse gas that is of increasing concern because its concentration in the atmosphere has been rising for several decades.

N2O is a molecule with two atoms of nitrogen and a single atom of oxygen. It is created in the decay of organic material, principally plants, but is also generated in the manufacture of nylon.

Scientists have known for years that N2O enters the nitrogen cycle, but the ultimate sources and sinks of the gas have been unclear. By contrast, the rising concentration of carbon dioxide, another greenhouse gas, is known to be a direct consequence of industrial activity.

"Nitrous oxide is less important as a greenhouse molecule than carbon dioxide, and slightly less important than methane," says Yung, a professor of planetary science at Caltech. "But the concentrations have been increasing since good measurements began 20 years ago, and ice core samples in Greenland and Antarctica suggest that it has been increasing since the Industrial Revolution began."

Yung and Miller specifically looked at isotopes of nitrous oxide once it enters the stratosphere. Isotopes are variations of a chemical element that have the same number of protons (and thus the same atomic number), but a different number of neutrons. Thus, a 15N atom has nitrogen's usual seven protons, but eight neutrons instead of the usual seven.

A careful analysis of isotopic variations is an effective way of tracing substances to their sources. If the nitrogen-based fertilizer in agriculture has a known isotopic makeup and that same percentage is found in the stratosphere, for example, then it can be concluded that agricultural fertilization is a contributor.

Yung and Miller examined theoretically how isotopes of nitrous oxide interact with ultraviolet light. They predict that, as N2O is destroyed by light, heavier isotopes survive preferentially because molecules containing slightly heavier isotopes require a bit more energy for the atoms to separate.

From their theory and related atmospheric measurements presented in the same issue by researchers at the Scripps Institution of Oceanography and the University of California at San Diego, Yung and Miller conclude that new chemical sources do not need to be introduced to account for the isotopic concentrations that are indeed observed in the stratosphere.

Thus, sources such as the decay of plant life and the burning of rainforests and other biomass can account for the signatures that are seen. Experimental verification of the predictions is now under way in the laser spectroscopy lab of Caltech cosmochemist Geoff Blake.

Understanding the sources can give society a better grip on the possibilities of dealing with the problem, Yung says.

"I think the most reasonable explanation for the increase is that we are accelerating biological activity globally," he says. "Because of global warming, the use of agricultural fertilizers, and nitrogen made from pollution that acts just like a fertilizer, the biosphere has been stimulated. This fosters the growth/decay cycle which leads to N2O release."

The next step for the researchers is to pin down the precise isotopic signatures of various biological and atmospheric processes. But there may be little that can realistically or politically be done if biology on a planetary scale is responsible, Geoff Blake says.

"We may just have to live with it."

Robert Tindol

Geophysicists Develop Model to Describe Huge Gravity Anomaly of Hudson Bay Region

PASADENA—While the gravity field of Earth is commonly thought of as constant, in reality there are small variations in the gravitational field as one moves around the surface of the planet.

These variations have typical magnitudes of about one ten-thousandth of the average gravitational attraction, which is approximately 9.8 meters per second per second. A global map of these variations shows large undulations at a variety of length scales. These undulations are known as gravity anomalies.
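For scale, the one-part-in-ten-thousand figure can be converted to the milligal (mGal), the unit customarily used in gravity surveys. This is illustrative arithmetic only, not a calculation from the paper:

```python
# 1 mGal = 1e-5 m/s^2. A variation of one ten-thousandth of the mean
# surface gravity (about 9.8 m/s^2) is therefore on the order of 100 mGal.
g = 9.8                 # mean surface gravitational acceleration, m/s^2
anomaly = 1e-4 * g      # typical anomaly magnitude from the text, m/s^2
mgal = anomaly / 1e-5
print(f"{anomaly:.2e} m/s^2 is about {mgal:.0f} mGal")
```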

There are many such anomalies in Earth's gravity field, but one of the largest negative gravity anomalies (implying that the pull of gravity is slightly weaker than average, or in other words, a mass deficit) is centered over Hudson Bay, Canada. Using a new approach to analyzing planetary gravity fields, two geophysicists, Mark Simons at the California Institute of Technology and Bradford Hager at M.I.T., have shown that incomplete glacial rebound can account for a substantial portion of the Hudson Bay gravity anomaly.

With this new information, Simons and Hager were able to place new constraints on the variations in strength of the materials that constitute the outer layers of Earth's interior (the crust and mantle). Their work appears in the December 4 issue of the journal Nature.

About 18,000 years ago, Hudson Bay was at the center of a continental-sized glacier. Known as the Laurentide ice sheet, this glacier had a thickness of several kilometers. The weight of the ice bowed the surface of Earth down. The vast majority of the ice eventually melted at the end of the Ice Age, leaving a depression in its wake.

While this depression has endured for thousands of years, it has been gradually recovering or "flattening itself out." The term "glacial rebound" refers to this exact behavior, whereby the land in formerly glaciated areas rises after the ice load has disappeared.

Evidence of this is seen in coastlines located near the center of the former ice sheet. These coastlines have already risen several hundred meters and will continue to rebound.

"The rate at which the area rebounds is a function of the viscosity of Earth," says Simons. "By looking at the rate of rebound going on, it's possible to learn about the planet's viscosity."

Simons says that geophysicists have known for some time about the Hudson Bay gravity anomaly, but have hitherto been uncertain how much of the gravity anomaly is a result of glacial rebound and how much is due to mantle convection or other processes.

The gravity anomaly is measured from both the ground and from space. Simons and Hager use a gravity data set developed by researchers at the Goddard Space Flight Center.

However, knowing how much of an anomaly exists at a certain site on Earth is not sufficient to determine the pliability of the materials beneath it. For this, Simons and his former M.I.T. colleague Hager have developed a new mathematical tool that looks at the spatial variations of the spectrum of the gravity field.

In many instances, this approach allows one to separate the signatures of geologic processes that occur at different locations on Earth. In particular, Simons and Hager were able to isolate the glacial rebound signature from signatures of other processes, such as manifestations of plate tectonics, that dominate the gravity field but are concentrated at other geographic locations.

Having an estimate of incomplete postglacial rebound allowed Simons and Hager to derive a model of how the viscosity of the mantle changes with depth. Simons and Hager propose one such model that explains both the gravity anomaly and the uplift rates estimated from the coastlines.

Their favored model suggests that underneath the oldest parts of continents (some of which are over 4 billion years old) the viscosity of the outer 400 kilometers of Earth is much higher than under the oceans. Therefore, these continental keels can resist erosion by the convective flow that drives plate tectonics.



Robert Tindol

Caltech Scientists Find Evidence For Massive Ice Age When Earth Was 2.4 Billion Years Old

PASADENA—Those who think the winter of '97 was rough should be relieved that they weren't around 2.2 billion years ago. Scientists have discovered evidence for an ice age at the time that was severe enough to partially freeze over the equator. In today's issue of Nature, California Institute of Technology geologists Dave Evans and Joseph Kirschvink report evidence that glaciers came within a few degrees of the equator's latitude when the planet was about 2.4 billion years old. They base their conclusion on glacial deposits discovered in present-day South Africa, plus magnetic evidence showing where South Africa's crustal plate was located at that time.

Based on that evidence, the Caltech researchers think they have documented the extremely rare "Snowball Earth" phenomenon, in which virtually the entire planet may have been covered in ice and snow. According to Kirschvink, who originally proposed the Snowball Earth theory, there have probably been only two episodes in which glaciation of the planet reached such an extent — one less than a billion years ago during the Neoproterozoic Era, and the one that has now been discovered from the Paleoproterozoic Era 2.2 billion years ago.

"The young Earth didn't catch a cold very often," says Evans, a graduate student in Kirschvink's lab. "But when it did, it seems to have been pretty severe."

The researchers collected their data by drilling rock specimens in South Africa and carefully recording the magnetic directions of the samples. From this information, the researchers then computed the direction and distance to the ancient north and south poles.

The conclusion was that the place in which they were drilling was 11 degrees (plus or minus five degrees) from the equator when Earth was 2.4 billion years old. Plate tectonic motions since that time have caused South Africa to drift all over the planet, to its current position at about 30 degrees south latitude. Additional tests showed that the samples were from glacial deposits, and further, were characteristic of a widespread region.
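The link between the measured magnetic directions and ancient latitude comes from the standard geocentric axial dipole relation, tan(I) = 2 tan(latitude), where I is the magnetic inclination frozen into the rock. A minimal sketch; the inclination value below is an illustrative assumption, not a measurement from the paper:

```python
import math

# Geocentric axial dipole: tan(inclination) = 2 * tan(latitude).
# Inverting gives the paleolatitude from a measured inclination.
def paleolatitude_deg(inclination_deg):
    return math.degrees(math.atan(math.tan(math.radians(inclination_deg)) / 2.0))

# A shallow inclination of about 21 degrees implies a near-equatorial site:
print(round(paleolatitude_deg(21.0), 1))  # prints 10.9
```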

Kirschvink and Evans say that the preliminary implications are that Earth can somehow manage to pull itself out of a period of severe glaciation. Because ice and snow reflect sunlight much better than land and water, Earth would normally be expected to have a hard time reheating itself in order to leave an ice age. Thus, one would expect a Snowball Earth to remain frozen forever.

Yet, the planet obviously recovered both times from the severe glaciation. "We think it is likely that the intricacies of global climate feedback are not yet completely understood, especially concerning major departures from today's climate," says Evans. "If the Snowball Earth model is correct, then our planet has a remarkable resilience to abrupt shifts in climate.

"Somehow, the planet recovered from these ice ages, probably as a result of increased carbon dioxide — the main greenhouse gas."

Evans says that an asteroid or comet impact could have caused carbon dioxide to pour into the atmosphere, allowing Earth to trap solar energy and reheat itself. But evidence of an impact during this age, such as a remote crater, is lacking.

Large volcanic outpourings could also have released a great deal of carbon dioxide, and other factors, such as sedimentary processes and biological activity, may have contributed as well.

At any rate, the evidence for the robustness of the planet and the life that inhabits it is encouraging, the researchers say. Not only did Earth pull itself out of both periods of severe glaciation, but many of the single-celled organisms that existed at the time managed to persevere.

Robert Tindol

State-of-the-Art Seismic Network Gets First Trial-by-Fire During This Morning's 5.4-magnitude Earthquake

PASADENA—Los Angeles reporters and camera crews responding to a 5.4-magnitude earthquake this morning got their first look at the new Caltech/USGS earthquake monitoring system.

The look was not only new but almost instantaneous. Within 15 minutes of the earthquake, Caltech seismologists had already printed out a full-color poster-sized map of the region to show on live TV, and had already posted the contour map on the Internet. Moreover, they were able to determine the magnitude of the event within five minutes — a tremendous improvement over the time it once took to confirm data.

"Today, we had a much better picture of how the ground responded to the earthquake than we've ever had in the past," said Dr. Lucile Jones, a U.S. Geological Survey seismologist who is stationed at Caltech. "This was the largest earthquake we've had since September of 1995, and was the first time we've been able to use the new instruments that we're still installing."

The new instruments are made possible by the TriNet Project, a $20.75-million initiative for providing a state-of-the-art monitoring network for Southern California. A scientific collaboration between Caltech, the USGS and the California Department of Conservation's Division of Mines and Geology, the project is designed to provide real-time earthquake monitoring and, ultimately, to lead to early-warning technology to save lives and mitigate urban damage after earthquakes occur.

"The idea of TriNet was to get quick locations and magnitudes out, to get quick estimates of the distribution of the ground shaking, and a prototype early-warning system," Caltech seismic analyst Egill Hauksson said an hour after this morning's earthquake. "The first two of those things are already in progress. We are in the midst of deploying hardware in the field and developing data-processing software."

TriNet was announced earlier this year when funding was approved by the Federal Emergency Management Agency. The new system relies heavily on recent advances in computer communications technology and data processing.

The map printed out this morning (the ShakeMap) is just a preview of future TriNet products. Caltech seismologist Kate Hutton gave a number of TV interviews in front of the map this morning. The map was noteworthy not only for the speed with which it was produced, but also for the manner in which information about the earthquake was relayed.

Instead of charting magnitudes, the map was drawn so that the velocity at which the ground moved was shown with contour lines. The most rapid movement in this morning's 5.4-magnitude earthquake was about two inches per second at the epicenter, and this was clearly indicated by the innermost circle on the color map. Moving outward from the epicenter of the earthquake, the velocity of ground movement decreased, and this was indicated by lower velocity numbers in the outer circles.

The maps can also be printed out to show ground accelerations, which are especially useful for ascertaining likely damage in an earthquake area, Hutton said.

Later, TriNet will provide prototype early warnings to distant locations in the Los Angeles area that potentially damaging ground shaking is on the way. After an earthquake occurs, the seismic waves travel at a few kilometers per second, while communications travel at the speed of light. Thus, Los Angeles could eventually receive warning of a major earthquake on the San Andreas fault some 30 to 60 seconds before the heavy shaking actually begins in the city.
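The quoted warning window can be checked with back-of-the-envelope arithmetic. The wave speed, distances, and processing delay below are illustrative assumptions, not TriNet specifications:

```python
# Warning time = (distance / seismic wave speed) - alert processing delay.
# The alert itself travels at effectively the speed of light, so its
# transit time is negligible over these distances.
def warning_time_s(distance_km, wave_speed_km_s=3.5, alert_delay_s=5.0):
    return distance_km / wave_speed_km_s - alert_delay_s

# A San Andreas rupture roughly 120-230 km from downtown Los Angeles:
for d_km in (120, 230):
    print(f"{d_km} km -> ~{warning_time_s(d_km):.0f} s of warning")
```

With these assumptions the window comes out at roughly 29 to 61 seconds, consistent with the 30 to 60 seconds cited above.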

The total cost of the project is $20.75 million. FEMA will provide $12.75 million, the USGS has provided $4.0 million, and the balance is to be matched by Caltech ($2.5 million) and the DOC ($1.75 million). Several private sector partners, including GTE and Pacific Bell, are assisting Caltech with matching funds for its portion of the TriNet balance.

The TriNet Project is being built upon existing networks and collaborations. Southern California's first digital network began with the installation of seismographs known as TERRAscope, and was made possible by a grant from the L.K. Whittier Foundation and the ARCO Foundation. Also, Pacific Bell through its CalREN Program has provided new frame-relay digital communications technology.

A major step in the modernization came in response to the Northridge earthquake, when the USGS received $4.0 million from funds appropriated by Congress to the National Earthquake Hazard Reduction Program. This money was the first step in the TriNet project and the USGS has been working with Caltech for the last 27 months to begin design and implementation. Significant progress has already been made and new instrumentation is now operational:

o Thirty state-of-the-art digital seismic stations are operating with continuous communication to Caltech/USGS

o Twenty strong-motion sites have been installed near critical structures

o Two high-rise buildings have been instrumented

o Alarm and processing software has been designed and implemented

o Automated maps of contoured ground shaking are available on the Web within a few minutes after felt and damaging earthquakes

DOC's strong motion network in Southern California is a key component of the TriNet Project, contributing 400 of the network's 650 sensing stations. DOC's network expansion and upgrade through the funding of this project will allow much better information about strong shaking than was possible for the Northridge earthquake. This data is the key to improving building codes for more earthquake-resistant structures.

Robert Tindol

Researchers Establish Upper Limit of Temperature at the Core-Mantle Boundary of Earth

PASADENA— Researchers at the California Institute of Technology have determined that Earth's mantle reaches a maximum temperature of 4,300 degrees Kelvin. The results are reported in the March 14, 1997, issue of the journal Science.

According to geophysics professor Tom Ahrens and graduate student Kathleen Holland, the results are important for setting very reliable bounds on the temperature of Earth's interior. Scientists need to know very precisely the temperature at various depths in order to better understand large-scale processes such as plate tectonics and volcanic activity, which involve movement of molten rock from the deep interior of the Earth to the surface.

"This nails down the maximum temperature of the lower-most mantle, a rocky layer extending from a depth of 10 to 30 kilometers to a depth of 2900 kilometers, where the molten iron core begins," Ahrens says. "We know from seismic data that the mantle is solid, so it has to be at a lower temperature than the melting temperature of the materials that make it up."

In effect, the research establishes the melting temperature of the high-pressure form of the crystal olivine. At normal pressures, olivine is known by the formula (Mg,Fe)2SiO4, and is a semiprecious translucent green gem. At very high pressures, olivine breaks down into magnesiowüstite and a mineral with the perovskite structure. Together these two minerals are thought to make up the bulk of the materials in the lower mantle.

The researchers achieved these ultra-high pressures in their samples by propagating a shock wave into them, using a high-powered cannon apparatus called a light-gas gun. This gun launches projectiles at speeds of up to 7 km/sec. Upon impact with the sample, a strong shock wave causes ultra-high pressures to be achieved for only about half a millionth of a second. The researchers have established the melting temperature at a pressure of 1.3 million atmospheres. This is the pressure at the boundary of the solid lower mantle and liquid outer core.
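For readers used to SI units, the quoted pressure converts as follows; this is illustrative arithmetic only:

```python
# 1 standard atmosphere = 101,325 Pa; 1 GPa = 1e9 Pa.
ATM_TO_PA = 101_325
p_atm = 1.3e6                       # pressure quoted in the article, atmospheres
p_gpa = p_atm * ATM_TO_PA / 1e9     # same pressure in gigapascals
print(f"{p_atm:.1e} atm is about {p_gpa:.0f} GPa")
```

The conversion gives roughly 132 gigapascals.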

"We have replicated the melting which we think occurs in the deepest mantle of the Earth," says Holland, a doctoral candidate in geophysics at Caltech. "This study shows that material in the deep mantle can melt at a much lower temperature than had been previously estimated. It is exciting that we can measure phase transitions at these ultra-high pressures."

The researchers further note that the temperature of 4,300 degrees would allow partial melting in the lowest 40 kilometers or so of the lower mantle. This agrees well with seismic analysis of waveforms conducted in 1996 by Caltech professor of seismology Donald Helmberger and his former graduate student Edward Garnero. Their research suggests that at the very lowest reaches of the mantle there is a partially molten layer, called the Ultra-Low-Velocity Zone.

"We're getting into explaining how such a thin layer of molten rock could exist at great depth," says Ahrens. "This layer may be the one that feeds mantle plumes, which build volcanic edifices such as the Hawaiian island chain and Iceland. We want to understand how Earth works."

Robert Tindol

Caltech Geologists Find New Evidence That Martian Meteorite Could Have Harbored Life

PASADENA—Geologists studying Martian meteorite ALH84001 have found new support for the possibility that the rock could once have harbored life.

Moreover, the conclusions of California Institute of Technology researchers Joseph L. Kirschvink and Altair T. Maine, and McGill University's Hojatollah Vali, also suggest that Mars had a substantial magnetic field early in its history.

Finally, the new results suggest that any life on the rock existing when it was ejected from Mars could have survived the trip to Earth.

In an article appearing in the March 13 issue of the journal Science, the researchers report that their findings have effectively resolved a controversy about the meteorite that has raged since evidence for Martian life was first presented in 1996. Other scientists had suggested that the carbonate globules containing the possible Martian fossils formed at temperatures far too hot for life to survive. If so, any structures found in the meteorite would have to be inorganic.

However, based on magnetic evidence, Kirschvink and his colleagues say that the rock has certainly not been hotter than 350 degrees Celsius in the past four billion years—and probably has not been above the boiling point of water. At these low temperatures, bacterial organisms could conceivably survive.

"Our research doesn't directly address the presence of life," says Kirschvink. "But if our results had gone the other way, the high-temperature scenario would have been supported."

Kirschvink's team began their research on the meteorite by sawing a tiny sample in two and then determining the direction of the magnetic field held by each half. This work required the use of an ultrasensitive superconducting magnetometer system, housed in a unique, nonmagnetic clean lab facility. The team's results showed that the sample in which the carbonate material was found had two magnetic directions, one on each side of the fractures.

The distinct magnetic directions are critical to the findings, because any weakly magnetized rock reorients its magnetization to align with the local field direction once it has been heated to a high enough temperature and cooled. If two attached rock fragments initially carry separate magnetic directions but are then heated past a critical temperature, they will cool with a single, uniform direction.

The igneous rock (called pyroxenite) that makes up the bulk of the meteorite contains small inclusions of magnetic iron sulfide minerals that entirely realign their field directions at about 350 degrees Celsius, and partially realign them at much lower temperatures. Thus, the researchers concluded that the rock has never been heated substantially since it last cooled some four billion years ago.

"We should have been able to detect even a brief heating event over 100 degrees Celsius," Kirschvink says. "And we didn't."

These results also imply that Mars must have had a magnetic field similar in strength to that of the present Earth when the rock last cooled. This is very important for the evolution of life, as the magnetic field will protect the early atmosphere of a planet from being sputtered away into space by the solar wind. Mars has since lost its strong magnetic field, and its atmosphere is nearly gone.

The fracture surfaces on the meteorite formed after it cooled, during an impact event on Mars that crushed the interior portion. The carbonate globules that contain the putative evidence for life formed later on these fracture surfaces, and thus were never exposed to high temperatures, even during their ejection from the Martian surface nearly 15 million years ago, presumably by another large asteroid or comet impact.

A further conclusion one can reach from Kirschvink's work is that the inside of the meteorite never reached high temperatures when it entered Earth's atmosphere. This means, in effect, that any remaining life on the Martian meteorite could have survived the trip from Mars to Earth, which some dynamical studies suggest can take as little as a year. Sheltered in the interior cracks of the rock, such organisms could have ridden the meteorite down through the atmosphere and been deposited safely on Earth.

"An implication of our study is that you could get life from Mars to Earth periodically," Kirschvink says. "In fact, every major impact could do it." Kirschvink's suggested history of the rock is as follows:

The rock crystallized from an igneous melt some 4.5 billion years ago and spent about half a billion years on the primordial planet, being subjected to a series of impact-related metamorphic events, which included formation of the iron sulfide minerals.

After final cooling in the ancient Martian magnetic field about four billion years ago, the rock would have had a single magnetic field direction. Following this, another impact crushed parts of the meteorite without heating it, and caused some of the grains in the interior to rotate relative to each other. This led to a separation of their magnetic directions and produced a set of fracture cracks. Aqueous fluids later percolated through these cracks, perhaps providing a substrate for the growth of Martian bacteria. The rock then sat more or less undisturbed until a huge asteroid or comet smacked into Mars 15 million years ago. The rock wandered in space until about 13,000 years ago, when it fell on the Antarctic ice sheet.

Robert Tindol

