Yucca Mountain Is Possibly More Seismically Active Than Once Believed, Geologists Discover

PASADENA—Recent geodetic measurements using Global Positioning System (GPS) satellites show that the Yucca Mountain area in southern Nevada is straining roughly 10 to 100 times faster than expected on the basis of the geologic history of the area. And for the moment at least, geologists are at a loss to explain the anomaly.

In the March 28 issue of the journal Science, Brian Wernicke of the California Institute of Technology (Caltech) and his colleagues at the Smithsonian Astrophysical Observatory in Cambridge, Massachusetts, report on GPS surveys they conducted from 1991 to 1997. Those surveys show that the Yucca Mountain area is stretching apart at about one millimeter per year east-southeastward.

"The question is, why are the predicted geological rates of stretching so much lower than what we are measuring?" asks Wernicke. "That's something we need to think through and understand."

The answer is likely to be of interest to quite a few people, because Yucca Mountain has been proposed as a site for the permanent disposal of high-level radioactive waste. Experts believe that the waste-disposal site can accommodate a certain amount of seismic activity, but they nonetheless would like any site to have a certain amount of stability over the next 10,000 to 100,000 years.

Yucca Mountain was already known to have both seismic and volcanic activity, Wernicke says. An example of the former is the 5.4-magnitude "Little Skull Mountain" earthquake that occurred in 1992. And an example of the latter is the 80,000-year-old volcano to the south of the mountain. The volcano is inactive, but still must be studied according to Department of Energy regulations.

The problem the new study poses is that the strain is building up in the crust at a rate about one-fourth that of the most rapidly straining areas of the earth's crust, such as near the San Andreas fault, Wernicke says. But there could be other factors at work.

"There are three possibilities that we outline in the paper as to why the satellite data doesn't agree with the average predicted by the geological record," he says. "Either the average is wrong, or we are wrong, or there's some kind of pulse of activity going on and we just happened to take our data during the pulse."

The latter scenario, Wernicke believes, could turn out to be the case. But if Yucca Mountain really is as seismically active as a face-value reading of the current data indicates, the likelihood of magmatic and tectonic events could be 10 times higher than once believed.


Robert Tindol

Physicists create first nanometer-scale mechanical charge detector

PASADENA—Wristwatch cellular phones and space probes the size of baseballs would certainly have some eager customers, but both are still the stuff of science fiction.

Nonetheless, physicists are making strides these days in the sort of miniaturization that could someday make tiny electromechanical devices a reality. One such milestone, the first nanometer-scale mechanical charge detector, is reported in the current issue of Nature.

According to Michael Roukes, professor of physics at Caltech and coinventor of the device, the new electrometer is among the most sensitive charge detectors in existence, and definitely the first based upon nanomechanical principles.

"One compelling reason for doing this sort of thing is to explore entirely new avenues for making small, ultralow power electronic devices," says Roukes.

"Making new types of electronic devices that involve moving elements, which we call nanoelectromechanical systems, will open up a huge variety of new technological applications in areas such as telecommunications, computation, magnetic resonance imaging, and space exploration. And the physics is exciting, besides."

The device fabricated at Caltech by Roukes and his former postdoctoral collaborator, Andrew Cleland (now an assistant professor at UC Santa Barbara), is a good example of the type of advances in solid-state devices loosely gathered these days under the rubric "nanotechnology." Roukes says he generally avoids using the term. "Rather, this is the kind of science that is building the foundation for real nanotechnology, not the stuff of fiction. Right now Mother Nature is really the only true nanotechnologist."

A nanometer is one-billionth of a meter, which is about a hundred-thousandth the width of a human hair. A few atoms stacked side-by-side span about a nanometer.

To give an idea of the scale, Roukes points out that the devices are far smaller than cellular organisms; a clear picture of the device's inner workings can only be taken with an electron microscope.

The scale is especially noteworthy when one considers that the electrometer is actually a mechanical device, in the same manner as an old-fashioned clock. In other words, there are moving parts at its heart. In the Caltech devices, movement is induced by tiny wires that exert forces on the nanomechanical elements when a minute external electrical current is applied to them.

"The simplest kinds of mechanical structures are resonators, for example, cantilevers—in other words, a structure like a diving board—or thin clamped beams, something like a thick guitar string attached at both ends," Roukes explains. "They really are mechanical structures—you 'pluck' them to get them to vibrate."

"What's fascinating is that, if you can get these things small enough, they'll vibrate billions of times per second—which gives them the same frequency as the microwaves used in telecommunications," he says. "That's because their mass is very small, which means there's less inertia for internal forces to overcome.

There is a second important aspect to nanomechanical systems, Roukes adds. "Because the distances involved are very small, the amplitudes of their vibrations are very small. For this reason, the amount of energy you would have to put into such devices to get them going is extremely minute.

"This means that for certain critical applications—like small communicators and miniaturized satellites—you would not have to carry along nearly as much energy to run the device."

The latter would be a boon in any circumstances where carrying along power is difficult. Transistors in the best receiving devices today can run on a few thousandths of a watt, but with nanotechnology, they could run on a few billionths of a watt, or less. Thus, planetary space probes (which employ such devices in spades) could be much smaller, since they could get by with a much smaller energy source.

At the center of the Caltech nanoelectromechanical charge detection device are small rods that vibrate something like a nanoscale tuning fork. In their ultimate incarnation, which Roukes believes his lab can achieve in the next few years, these rods will be about 100 nanometers long, 10 nanometers wide, and 10 nanometers thick.

Roukes indicates that a silicon beam of such small dimensions would vibrate at about 7 gigahertz (or 7 billion times per second) if it is clamped down at both ends. When one considers that a top-of-the-line personal computer these days has a clock speed about twenty times slower, the advantages become apparent.
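The roughly 7-gigahertz figure can be reproduced with textbook beam mechanics. The sketch below uses the standard Euler-Bernoulli formula for the fundamental mode of a doubly clamped beam; the silicon constants and the prefactor are assumed round values, not numbers taken from the Nature paper.

```python
import math

# Fundamental frequency of a doubly clamped beam (Euler-Bernoulli theory):
# f = 1.03 * (t / L**2) * sqrt(E / rho), where t is the beam thickness in
# the direction of vibration. Silicon constants are assumed round values.
E = 169e9      # Young's modulus of silicon, Pa (assumed)
rho = 2330.0   # density of silicon, kg/m^3

L = 100e-9     # beam length: 100 nanometers
t = 10e-9      # beam thickness: 10 nanometers

f = 1.03 * (t / L**2) * math.sqrt(E / rho)
print(f"fundamental frequency: {f / 1e9:.1f} GHz")
```

The result comes out in the vicinity of 10 gigahertz, consistent with the figure quoted above; the exact value depends on crystal orientation, clamping conditions, and the mode constant used.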

But it's not necessarily the replacement of conventional computer components that Roukes is after. It turns out that the small resonators his group is currently able to manufacture on campus—if cooled to temperatures a few tenths of a degree above absolute zero—sit right at the border where the quantum effects governing individual atoms and particles take over.

Working with these quantum effects is a daunting technological challenge, but success could lead to devices such as quantum computers.

"There is a natural dividing line that depends on the temperature and the frequency. Basically, if you can get the temperature low enough and the frequency is high enough, then you can operate at the quantum level.

"We could do this today," Roukes says. "In my laboratories we can get to temperatures a few thousandths of a degree above absolute zero. We also have the sizes small enough to give us sufficiently high frequencies.

"But what we don't yet know how to do is to probe these structures optimally."

In fact, one of the main themes of work in Roukes's group on nanoscale electromechanical devices is pretty much "how to talk to the devices and how to listen to them," he says. To measure a system is to probe it somehow, and to probe it is to interact with it.

The problem is that interacting with the system is, in essence, to alter its properties. In the worst case, which is easy to do, one could actually heat it sufficiently to raise its energy above the point at which it would cease functioning as a quantum-limited mechanical device.

"But there are lots of different physical processes on which we can base signal transducers. We are looking for the right approach that will allow us to listen to and hear from these devices at the scale of the quantum limit," he says.

"There's lots of interesting physics, and practical applications that we are learning about in the process."

As far as the device reported in Nature is concerned, Roukes says that the scales involved set a milestone—that of submicron mechanical structures—that is encouraging for scientists and technologists in the field. In addition to possibilities for telecommunications, techniques on which the experimental prototype is based should also lead to significant improvements in magnetic resonance detection.

These, in turn, could lead to imaging with a thousand times better resolution than that currently available.

Roukes's group, in close collaboration with P. Chris Hammel's group at Los Alamos National Laboratory, is already hard at work on these possibilities.

Robert Tindol

Mars Global Surveyor already bringing in scientific payoff

PASADENA—Despite a 12-month delay in aerobraking into a circular orbit, the Mars Global Surveyor is already returning a wealth of data about the atmosphere and surface of the Red Planet.

According to mission scientist Arden Albee of the California Institute of Technology, all scientific instruments on the Mars probe are fully functioning and providing good data. Early results from data collected during the 18 elliptical orbits in October and November are being reported in this week's issue of Science.

"For the first time, a spacecraft has captured the start of a major dust storm on Mars and has followed it through its development and demise," Albee says. "Also, we've received a number of narrow-angle high-resolution images that are enough to put any planetary geologist into a state of ecstasy."

These accomplishments are especially noteworthy when considering that the probe developed a glitch when it first began tightening up its orbit last September. For various reasons having to do with design and cost, Global Surveyor was set on a course that took it initially into a huge sweeping elliptical orbit of Mars.

On its near approach in each orbit, the probe was to dip into the upper atmosphere of Mars in a maneuver known as "aerobraking," which would effectively slow the probe down and eventually place it into a near-circular orbit.

But a solar-panel damper failed early in the mission, and damage to the solar panel forced the team to slow down the aerobraking. At the current rate of aerobraking, Mars Global Surveyor will enter its circular mapping orbit in March 1999.

This has delayed the systematic mapping of Mars, but Albee says that the new mission plan nonetheless permits the collection of science data in a 12-hour elliptical orbit, from March to September of this year.

"Another exciting discovery is that the Martian crust exhibits layering to much greater depths than would have been expected," Albee says. "Steep walls of canyons, valleys, and craters show the Martian crust to be stratified at scales of a few tens of meters.

"At this point, it is simply not known whether these layers represent piles of volcanic flows or sedimentary rocks that might have formed in a standing body of water," he adds.

The Mars Global Surveyor team has previously announced that the on-board magnetometer shows Mars to have a more complex magnetic field than once thought. Of particular interest is the fact that the magnetic field was apparently once about the same strength as that of present-day Earth.

Many experts think that a strong magnetic field may be crucial for the evolution of life on a planet. Without a magnetic field, a planet tends to have its atmospheric particles stripped away by the solar wind in a process known as "sputtering."

And finally, the Mars Orbiter Laser Altimeter (MOLA) has already sent back 18 very good topographic profiles of the northern hemisphere. "Characterization of these features is leading to a new understanding of the surface processes of Mars," Albee says.

Robert Tindol

Researchers develop new plastic recording material that can be used to see through tissue without X rays

PASADENA--Researchers have recently achieved a certain amount of success in using laser light to see through scattering media such as human tissue. The new technology could eventually have medical applications in situations where X rays are ineffective or downright dangerous.

According to Seth Marder of the California Institute of Technology, his team and a group of collaborators from the University of Arizona have developed a photorefractive polymer that is highly sensitive to near infrared light.

Using this polymer, the group is now able to see through about half an inch of 2 percent milk. They have achieved this by custom-designing a polymer for a state-of-the-art laser holography setup.

Marder, a chemist at Caltech's Beckman Institute, says the work capitalizes on the fact that certain low-energy wavelengths can indeed penetrate solid substances to a certain extent. Visible light, radio and TV waves, infrared heat from an oven, and X rays are all manifestations of electromagnetic radiation that differ only in wavelength and energy.

"If you hold your hand up to a bright light source with your fingers closed tightly, you can actually see light coming through the skin itself," he explains. "In your body there are various things that absorb light, such as amino acids, DNA, and hemoglobin—which means that you can't see through them with ultraviolet or visible light.

"But many of these tissues stop absorbing light at a low enough energy," he says. "It turns out that, in the near infrared, your body is relatively transparent."

The goal, then, is to analyze this light that has penetrated tissue, and this is where Bernard Kippelen and Nasser Peyghambarian and their team at the Optical Sciences Center of the University of Arizona come in. Using extremely fast lasers, the Arizona team is able to look only at photons of light referred to as "ballistic photons," while filtering out scattered photons that arrive an instant later. The scattered photons lead to tremendous distortions of the image, rendering it useless. By filtering out the scattered photons (leaving only the ballistic photons) it is possible to recapture the original image.

The filtering technique, known as holographic time gating, keys on the ability of a femtosecond laser (with light pulses lasting a millionth of a billionth of a second) to isolate the information carried by the ballistic photons.

First, a laser pulse is shot at the tissue to be imaged while a reference beam originating from the same laser source is also introduced into the optical system from another angle. The ballistic photons then interact with laser pulses from the reference beam and record a hologram in the photorefractive polymer developed by Caltech and the University of Arizona.

That hologram contains only the image from the ballistic photons. The delayed scattered photons do not form an interference pattern with the reference pulses because they arrive later and consequently do not write a hologram.

The ballistic photon hologram can then be reconstructed by sending a third laser beam from the opposite side (parallel to the original beam hitting the tissue). This beam is altered by the hologram recorded in the polymer. This altered probe beam then is isolated by the optical system to reconstruct a nearly undistorted image of the object.
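The scale of the time gating described above can be sketched with simple numbers. The refractive index and gate width below are assumed typical values, not parameters from the Science paper; the sample thickness is the half inch of milk mentioned earlier.

```python
# Back-of-the-envelope numbers for holographic time gating.
c = 2.998e8   # speed of light in vacuum, m/s
n = 1.4       # assumed refractive index of the scattering medium
L = 0.0127    # half an inch of sample, in meters

v = c / n                    # speed of light inside the medium
t_ballistic = L / v          # transit time of the unscattered (ballistic) photons
print(f"ballistic transit: {t_ballistic * 1e12:.0f} ps")

gate = 100e-15               # assumed gate width: 100 femtoseconds
path_window = v * gate       # extra path length still accepted by the gate
print(f"accepted extra path: {path_window * 1e6:.0f} micrometers")
```

Ballistic photons cross the sample in tens of picoseconds, and a femtosecond-scale gate accepts only photons whose paths are longer by a few tens of micrometers at most—which is why multiply scattered light, with much longer paths, fails to write the hologram.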

The result is an image built of near-infrared light that has looked a short way inside the object—an image obtained without any use of high-energy radiation.

An application that Kippelen sees for the future is the imaging of human retinas. The onset of glaucoma can be detected quite early if certain structural changes just beneath the surface of the retina can be imaged.

Marder cautions that the techniques are still rudimentary, and even in their final form may not see through nearly as much tissue as X rays. But the object is not to replace X rays in all applications, he says—just certain ones in which the conditions are appropriate.

A recent technical report on the research can be found in the January 2 issue of the journal Science. In addition to Kippelen, Marder, and Peyghambarian, the other authors of the paper are Eric Hendrickx, Jose-Luis Maldonado, Gregoire Guillemet, Boris L. Volodin, Derek D. Steele, Yasufumi Enami, Sandalphon, Yonj-Jing Yao, Jiafu F. Wang, Harald Roeckel, and Lael Erskine.


Robert Tindol

ACE Satellite Now In Place Between Earth and Sun; Will Seek To Determine What Sun Is Made Of

PASADENA—Tanning aficionados, beach bums, surfers, and other solar enthusiasts may not realize it yet, but there is a new satellite tracing a huge looping halo orbit around a point between Earth and the sun. And it's a satellite that's going to be a benefit to weather forecasters in predicting solar flares as well as to astrophysicists in understanding the nature of the universe.

The satellite is called the Advanced Composition Explorer, or ACE for short. Launched August 25, the satellite has reached its destination about a million miles from Earth toward the sun at a position known as L1. That's the point at which the gravitational pull from Earth and sun, plus centrifugal effects, exactly balance each other.

"So, a spacecraft can orbit this invisible point, maintaining a fixed distance from Earth as Earth orbits the sun," says Ed Stone, Morrisroe Professor of Physics at Caltech and principal investigator of the ACE science mission.
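The "about a million miles" distance to L1 follows from a standard approximation for the balance point Stone describes. The masses and Earth-sun distance below are textbook values, not mission data.

```python
# Distance from Earth to the L1 point, using the standard approximation
# r ~ R * (m / (3 M))**(1/3), valid when m << M.
R = 1.496e11        # mean Earth-sun distance, meters
m_earth = 5.972e24  # mass of Earth, kg
m_sun = 1.989e30    # mass of the sun, kg

r = R * (m_earth / (3.0 * m_sun)) ** (1.0 / 3.0)
print(f"L1 distance: {r / 1e9:.2f} million km "
      f"({r / 1609.34 / 1e6:.2f} million miles)")
```

The estimate lands at about 1.5 million kilometers, or a bit under a million miles sunward of Earth, matching the figure in the article.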

Stone and Caltech physicist Dick Mewaldt are leading the satellite's science mission at the ACE Science Center at Caltech. There, they obtain spacecraft telemetry from the flight operations team at the Goddard Space Flight Center, and process the data for the astrophysics community.

The satellite is designed to collect a wide range of information on the matter it encounters. Its mission can broadly be classified in two phases:

• The satellite incorporates a real-time solar wind system that will provide around-the-clock coverage of interplanetary conditions that affect Earth. This is especially of benefit to those living at high northern and southern latitudes, because Earth's magnetic field is such that a coronal mass ejection can more easily disrupt power systems close to the poles.

While the ACE can do nothing to prevent this phenomenon from occurring, the satellite can at least provide an hour of warning that a coronal mass ejection may create a magnetic storm. The warning could help minimize and perhaps even eliminate some of the outages.
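The hour of warning follows directly from ACE's position: a disturbance in the solar wind sweeps past the satellite at L1 while it still has about 1.5 million kilometers to travel before reaching Earth. A sketch, assuming a typical solar wind speed (real speeds range from roughly 300 to 800 km/s):

```python
# Warning time = distance from L1 to Earth / solar wind speed.
distance_km = 1.5e6        # L1-to-Earth distance, km
wind_speed_km_s = 450.0    # assumed typical solar wind speed, km/s

warning_s = distance_km / wind_speed_km_s
print(f"warning time: {warning_s / 60:.0f} minutes")
```

At the assumed speed this gives a bit under an hour; faster ejections shorten the warning, slower ones lengthen it.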

The National Oceanic and Atmospheric Administration (NOAA) will analyze the data and issue forecasts and warnings of solar storms. According to NOAA, it will be possible to issue geomagnetic storm alerts with virtually 100 percent accuracy.

• The ACE science mission is designed to measure and compare the composition of three samples of matter that can be found in interplanetary space. These are the solar material in the form of the solar wind and energetic particles accelerated by violent eruptions of the sun, the gas from the nearby space between the stars, and high-energy cosmic rays that come from more distant regions in the Milky Way.

Understanding the nature of this matter can help researchers provide answers to fundamental questions about the origin of matter. Additional information on the precise mix of elements in the solar wind, for example, will also serve as a benchmark for understanding the composition of other bodies in the solar system.

The ACE satellite is carrying nine scientific instruments that were developed by a team of scientists representing 10 institutions in the United States and Europe. These instruments are an array of mass spectrometers that measure the mass of individual ions. The satellite is already collecting data, and is expected to do so for at least five years.

"Our first look at the data tells us that the performance of the instruments is excellent," says Stone. "We should be learning what the sun is made of in the months ahead."

[Note to editors: See http://www.srl.caltech.edu/ACE/ for more on the ACE science mission. Also, NOAA on Jan. 23 issued a press release on the ACE satellite's space weather forecasting capabilities.]

Robert Tindol

Caltech Question of the Month: How does a high or low water table affect our ability to feel earthquakes?

Submitted by Susan Rogers, Azusa, California, and answered by Dave Wald, visiting associate in geophysics at Caltech, and U.S. Geological Survey geophysicist.

When considering earthquake damage, it is important to distinguish between the effects of ground shaking and ground deformation (landslides, liquefaction, ground settling). In the Northridge earthquake, most of the damage was due to the strong shaking of buildings and their contents, but there was also a substantial amount of damage to foundations of buildings and residences caused by landslides and permanent ground settling.

Now, ground shaking is only marginally affected by the level of the water table. Whether or not the near-surface sediments are saturated with water does not significantly change the passing of seismic waves. So for shaking, for the most part, the level of the water table is not that important.

However, if the shallow sediments are water saturated (shallow water table) there is an increased chance of settling and landslides. If in addition the sediment is very sandy, the possibility of liquefaction is greatly increased and, therefore, the damage in such an area may be more widespread than if the water table had been deeper.

Robert Tindol

Black Hole That Periodically Ejects Its Inner Disk As Jets Discovered

WASHINGTON—Astronomers observing a disk of matter spiraling into a black hole in our galaxy have discovered that the black hole periodically hurls the inner portion of the disk into space as jets traveling at near the speed of light.

According to Stephen Eikenberry, an astrophysicist at the California Institute of Technology, the superhot gas in the inner disk shines brightly in X-rays, and dramatic dips in the X-ray emission suggest that the inner disk vanishes every 20 to 40 minutes. Infrared and radio observations at the same time show huge flares which indicate that matter is being thrown out of the system.

Eikenberry and colleagues from the Massachusetts Institute of Technology and NASA's Goddard Space Flight Center will discuss their findings at a 9:30 a.m. press conference on Wednesday, January 7, during the winter meeting of the American Astronomical Society.

The scientists observed the disappearance of the inner portion of the disk, known as an accretion disk, at the same time that glowing plasma is ejected from the black hole system. In August, Eikenberry and his collaborators at Caltech observed infrared flares from the black hole system, known as GRS 1915+105, using the Mt. Palomar 200-inch telescope.

At the same time, Ronald Remillard and his collaborators at MIT monitored X-ray dips from the same black hole using NASA's Rossi X-ray Timing Explorer (RXTE) satellite, and Jean Swank and her collaborators at NASA/GSFC observed similar dips. "The connection between the disappearance of the inner disk and the jet ejection has never been seen until now," Eikenberry says.

"This work is also exciting because it may help us understand many other types of systems with jets," notes Robert Nelson, who works with Eikenberry at Caltech. "Astronomers have found jets in a wide range of objects, from quasars—incredibly powerful objects seen out to the edge of the observable universe—to young protostars."

The half-hour spacing between the ejections may be telling researchers that what they had thought were smooth, continuous outflows may in fact be intermittent explosions.

"There are many fine details in the X-ray dips that we may now seriously investigate to better understand the ejection mechanism," adds Edward R. Morgan, who works with Remillard at MIT. "In particular, there is a very unusual X-ray flash at the bottom of these dips in which the X-ray spectrum changes significantly. This may be the trigger for the rapid acceleration of the disk material."

The black hole in GRS 1915+105 became known to astronomers in 1992 as an X-ray nova, which is believed to signify the sudden flow of hot gases into a black hole from a companion star in a binary system. The black hole in GRS 1915+105 is thought to have a mass equal to ten Suns or more, all crushed by its own gravity into a tiny sphere contained within an "event horizon," which itself has a radius of about 20 km.

When a black hole pulls gas from the atmosphere of a companion star, the matter spirals in toward the event horizon like water going down a drain, and the swirling disk created by the flow is known to astronomers as an "accretion disk." The gas in the disk heats up dramatically due to the large acceleration and friction. Just before entering the event horizon, the gas reaches temperatures of millions of degrees, causing it to glow in X-rays.

In 1994, Felix Mirabel and Luis Rodriguez observed radio emission from jets in GRS 1915+105, and they determined that the speed of the jets was greater than 90 percent of the speed of light, or roughly 600 million miles per hour. Since RXTE began observing the X-ray sky in early 1996, the exceptionally chaotic behavior of GRS 1915+105 in X-rays has been chronicled on many occasions.

The new results gained by Eikenberry's team bring together these phenomena by showing that the jet ejections and the pattern of X-ray variations are synchronized in an organized way.

"The repeated ejections are really amazing," says Craig Markwardt, a member of the NASA/GSFC team. "The system behaves like a celestial version of Old Faithful. At fairly regular intervals, the accretion disk is disrupted and a fast-moving jet is produced."

"This jet is staggeringly more powerful than a geyser," adds Swank. "Every half hour, the black hole GRS 1915+105 throws off the mass of an asteroid at near the speed of light. This process clearly requires a lot of energy; each cycle is equivalent to 6 trillion times the annual energy consumption of the entire United States."
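Swank's comparison can be turned around to estimate the ejected mass. The sketch below assumes an annual U.S. energy consumption of about 10^20 joules and a jet speed of 92 percent of light speed; both figures are outside assumptions for illustration, not numbers supplied by the team.

```python
import math

# Invert the relativistic kinetic energy E = (gamma - 1) * m * c**2
# to find the mass implied by the quoted energy per ejection cycle.
c = 2.998e8             # speed of light, m/s
us_annual_j = 1e20      # assumed annual U.S. energy consumption, joules
E = 6e12 * us_annual_j  # quoted energy per cycle: 6 trillion times that

beta = 0.92             # assumed jet speed as a fraction of c
gamma = 1.0 / math.sqrt(1.0 - beta**2)

m = E / ((gamma - 1.0) * c**2)
print(f"implied ejected mass: {m:.1e} kg per cycle")
```

The answer comes out at a few times 10^15 kilograms—roughly the mass of an asteroid several kilometers across, consistent with the comparison in the quote.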

"Since the disk-jet interaction is so poorly understood, we're hoping that further analysis of these observations will show us more details of what is happening so close to the black hole," Eikenberry says. "We're planning more detailed studies for the coming year which should give us even more clues as to the nature of these incredibly powerful events.

"Right now, we still aren't even sure why these dips and ejections occur every half hour or so—why not every week or every 30 seconds, for instance?"

Robert Tindol

Biological Activity the Likely Culprit of Atmospheric Nitrous Oxide Increases

PASADENA—Nitrous oxide (N2O) is an atmospheric gas known to contribute both to global warming and ozone depletion. New research suggests that its changing concentration in the atmosphere is largely a result of biological activity.

In the December 5 issue of the journal Science, Yuk Yung of the California Institute of Technology and Charles Miller of Caltech's Jet Propulsion Lab (now at the Haverford College Department of Chemistry) describe their work on isotopes of N2O, a greenhouse gas that is of increasing concern because its concentration in the atmosphere has been rising for several decades.

N2O is a molecule with two atoms of nitrogen and a single atom of oxygen. It is created in the decay of organic material, principally plants, but is also generated in the manufacture of nylon.

Scientists have known for years that N2O enters the nitrogen cycle, but the ultimate sources and sinks of the gas have been unclear. By contrast, carbon dioxide, another greenhouse gas, is known to be a direct consequence of industrial activity.

"Nitrous oxide is less important as a greenhouse molecule than carbon dioxide, and slightly less important than methane," says Yung, a professor of planetary science at Caltech. "But the concentrations have been increasing since good measurements began 20 years ago, and ice core samples in Greenland and Antarctica suggest that it has been increasing since the Industrial Revolution began."

Yung and Miller specifically looked at isotopes of nitrous oxide once it enters the stratosphere. Isotopes are variations of a chemical element that have the same number of protons (and thus the same atomic number), but a different number of neutrons. Thus, a 15N atom of nitrogen has the regular seven protons, but eight neutrons rather than the seven found in the far more common 14N.

A careful analysis of isotopic variations is an effective way of tracing substances to their sources. If the nitrogen-based fertilizer in agriculture has a known isotopic makeup and that same percentage is found in the stratosphere, for example, then it can be concluded that agricultural fertilization is a contributor.

Yung and Miller examined theoretically how isotopes of nitrous oxide interact with ultraviolet light energy. They predict that, as N2O is destroyed by light, heavier isotopes survive preferentially because molecules containing slightly heavier isotopes require a bit more energy for the atoms to separate.

From their theory and related atmospheric measurements presented in the same issue by researchers at the Scripps Institution of Oceanography and the University of California at San Diego, Yung and Miller conclude that new chemical sources do not need to be introduced to account for the isotopic concentrations that are indeed observed in the stratosphere.

Thus, sources such as the decay of plant life and the burning of rainforests and other biomass can account for the signatures that are seen. Experimental verification of the predictions is now under way in the laser spectroscopy lab of Caltech cosmochemist Geoff Blake.

Understanding the sources can give society a better grip on the possibilities of dealing with the problem, Yung says.

"I think the most reasonable explanation for the increase is that we are accelerating biological activity globally," he says. "Because of global warming, the use of agricultural fertilizers, and nitrogen made from pollution that acts just like a fertilizer, the biosphere has been stimulated. This fosters the growth/decay cycle which leads to N2O release."

The next step for the researchers is to pin down the precise isotopic signatures of various biological and atmospheric processes. But there may be little that can realistically or politically be done if biology on a planetary scale is responsible, Geoff Blake says.

"We may just have to live with it."

Robert Tindol

Geophysicists Develop Model to Describe Huge Gravity Anomaly of Hudson Bay Region

PASADENA—While the gravity field of Earth is commonly thought of as constant, in reality there are small variations in the gravitational field as one moves around the surface of the planet.

These variations have typical magnitudes of about one ten-thousandth of the average gravitational attraction, which is approximately 9.8 meters per second per second. A global map of these variations shows large undulations at a variety of length scales. These undulations are known as gravity anomalies.

There are many such anomalies in Earth's gravity field, but one of the largest negative gravity anomalies (meaning the pull of gravity there is slightly less than average, or in other words, a mass deficit) is centered over Hudson Bay, Canada. Using a new approach to analyzing planetary gravity fields, two geophysicists, Mark Simons at the California Institute of Technology and Bradford Hager at M.I.T., have shown that incomplete glacial rebound can account for a substantial portion of the Hudson Bay gravity anomaly.

With this new information, Simons and Hager were able to place new constraints on the variations in strength of the materials that constitute the outer layers of Earth's interior (the crust and mantle). Their work appears in the December 4 issue of the journal Nature.

About 18,000 years ago, Hudson Bay was at the center of a continent-sized glacier. Known as the Laurentide ice sheet, this glacier was several kilometers thick. The weight of the ice bowed the surface of Earth down. The vast majority of the ice eventually melted at the end of the Ice Age, leaving a depression in its wake.

While this depression has endured for thousands of years, it has been gradually recovering or "flattening itself out." The term "glacial rebound" refers to this exact behavior, whereby the land in formerly glaciated areas rises after the ice load has disappeared.

Evidence of this is seen in coastlines located near the center of the former ice sheet. These coastlines have already risen several hundred meters and will continue to rebound.

"The rate at which the area rebounds is a function of the viscosity of Earth," says Simons. "By looking at the rate of rebound going on, it's possible to learn about the planet's viscosity."
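The dependence Simons describes can be illustrated with the textbook relaxation model for a surface load on a uniform viscous half-space (this is a standard geodynamics cartoon with assumed parameter values, not the layered model of the Nature paper): a depression of wavelength lambda decays exponentially, with a time constant proportional to the viscosity eta.

```python
import math

# Textbook relaxation time for a harmonic surface load on a viscous
# half-space: tau = 4 * pi * eta / (rho * g * lam).
# All parameter values below are illustrative assumptions.
rho = 3300.0   # mantle density, kg/m^3
g = 9.8        # surface gravity, m/s^2
lam = 3.0e6    # load wavelength, m (roughly the scale of the Laurentide sheet)

def tau_years(eta):
    """e-folding time of the depression, in years, for viscosity eta (Pa*s)."""
    tau_s = 4 * math.pi * eta / (rho * g * lam)
    return tau_s / (365.25 * 24 * 3600)

print(round(tau_years(1e21)))   # a few thousand years
print(round(tau_years(1e22)))   # ten times longer: tau scales linearly with eta
```

With a conventional mantle viscosity of about 10^21 Pa·s, the depression would relax within a few thousand years; the persistence of a sizable depression (and gravity anomaly) 18,000 years after the melting is exactly the kind of observation that constrains the viscosity structure.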

Simons says that geophysicists have known for some time about the Hudson Bay gravity anomaly, but have hitherto been uncertain how much of the gravity anomaly is a result of glacial rebound and how much is due to mantle convection or other processes.

The gravity anomaly is measured from both the ground and from space. Simons and Hager use a gravity data set developed by researchers at the Goddard Space Flight Center.

However, knowing how much of an anomaly exists at a certain site on Earth is not sufficient to determine the pliability of the materials beneath it. For this, Simons and his former M.I.T. colleague Hager have developed a new mathematical tool that looks at the spatial variations of the spectrum of the gravity field.

In many instances, this approach allows one to separate the signatures of geologic processes that occur at different locations on Earth. In particular, Simons and Hager were able to isolate the glacial rebound signature from signatures of other processes, such as manifestations of plate tectonics, that dominate that gravity field but are concentrated at other geographic locations.
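A one-dimensional cartoon conveys the spatio-spectral idea (this is entirely our illustration; the actual method operates on spherical harmonic expansions of the global gravity field): by windowing the data before taking its spectrum, one can tell which wavelengths are concentrated in which regions, and so separate signals that overlap in a global spectrum.

```python
import numpy as np

# Two synthetic "geologic signals" living at different places and scales:
# a long-wavelength component on the left half of the domain,
# a short-wavelength component on the right half.
n = 1024
x = np.arange(n)
signal = np.where(x < n // 2,
                  np.sin(2 * np.pi * 5 * x / n),    # long wavelength
                  np.sin(2 * np.pi * 60 * x / n))   # short wavelength

def dominant_mode(window):
    """Index of the strongest nonzero Fourier mode of a windowed segment."""
    spec = np.abs(np.fft.rfft(window * np.hanning(len(window))))
    return int(np.argmax(spec[1:]) + 1)

left, right = signal[: n // 2], signal[n // 2:]
print(dominant_mode(left))    # small mode number -> long wavelength lives here
print(dominant_mode(right))   # large mode number -> short wavelength lives here
```

A single spectrum of the full signal would show both peaks with no indication of where each one originates; the windowed spectra recover that spatial information.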

Having an estimate of incomplete postglacial rebound allowed Simons and Hager to derive a model of how the viscosity of the mantle changes with depth. Simons and Hager propose one such model that explains both the gravity anomaly as well as the uplift rates estimated from the coastlines.

Their favored model suggests that underneath the oldest parts of the continents (some of which are more than 4 billion years old), the outer 400 kilometers of Earth is much stiffer (has a far higher viscosity) than it is under the oceans. These stiff continental keels can therefore resist erosion by the convective flow that drives plate tectonics.



Robert Tindol

Caltech Question of the Month

Question of the Month Submitted by John Propst, Fullerton, California.

Answered by Ken Libbrecht, Caltech professor of physics.

Light always travels at the speed of light; it is created already moving at that speed, and never has to accelerate up to it. When Einstein formulated the theory of special relativity, he postulated that the speed of light is a constant.

If you carry a flashlight on a moving train, the photons leaving it travel at the same constant speed whether you measure that speed from the ground or from the train. This fact cannot be derived from anything more basic, so it stands as a fundamental law of physics.

We don't know why nature chooses to operate this way, but many measurements have shown that Einstein's postulate is very accurately followed.



