North Atlantic Corals Could Lead to Better Understanding of the Nature of Climate Change

PASADENA, Calif.—The deep-sea corals of the North Atlantic are now recognized as "archives" of Earth's climatic past. Not only are they sensitive to changes in the mineral content of the water during their 100-year lifetimes, but they can also be dated very accurately.

In a new paper appearing in Science Express, the online publication of the American Association for the Advancement of Science (AAAS), environmental scientists describe their recent advances in "reading" the climatic history of the planet by looking at the radiocarbon of deep-sea corals known as Desmophyllum dianthus.

According to lead author Laura Robinson, a postdoctoral scholar at the California Institute of Technology, the work shows in principle that coral analysis could help solve some outstanding puzzles about the climate. In particular, environmental scientists would like to know why Earth's temperature has held so steady for the last 10,000 years or so, after having previously been so variable.

"These corals are a new archive of climate, just like ice cores and tree rings are archives of climate," says Robinson, who works in the Caltech lab of Jess Adkins, assistant professor of geochemistry and global environmental science, and also an author of the paper.

"One of the significant things about this study is the sheer number of corals we now have to work with," says Adkins, "We've now collected 3,700 corals in the North Atlantic, and have been able to study about 150 so far in detail. Of these, about 25 samples were used in the present study.

"To put this in perspective, I wrote my doctoral dissertation with two dozen corals available," Adkins adds.

The corals that are needed to tell Earth's climatic story are typically found at depths of a few hundred to thousands of meters. Scuba divers, by contrast, can go only about 50 to 75 meters below the surface. Besides, the water is bitterly cold and the seas are choppy. And to complicate matters further, the corals can be hard to find.

The researchers' solution has been to take a submarine out to harvest the coral. The star of the ventures so far has been the deep-submergence vehicle Alvin, famed for its dives to the wreck of the Titanic some years back. In a 2003 expedition several hundred miles off the coast of New England, Alvin brought back the aforementioned 3,700 corals from the New England Seamounts.

D. dianthus is especially useful because it lives a long time, can be dated very precisely through uranium dating, and also records the variations in carbon-14 (or radiocarbon) caused by changing ocean currents. The carbon-14 all originally came from the atmosphere and decays at a precisely known rate, whether it is found in the water itself or in the skeleton of a coral. The less carbon-14 found, the "older" the water. Because the coral builds its skeleton from the carbon dissolved in that water, the carbon-14 age of the coral comes out "older" than its uranium age. The larger the age difference, the older the water that bathed the coral in the past.
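
A rough sketch of the arithmetic behind that comparison is below; the numbers are placeholders for illustration, not values from the study. The apparent radiocarbon age follows from the surviving carbon-14 fraction and the isotope's known decay rate, and subtracting the independently measured uranium-series (calendar) age leaves the apparent "age" of the water that bathed the coral:

    import math

    # Conventional radiocarbon ages are quoted using the Libby mean life (~8033 yr).
    LIBBY_MEAN_LIFE_YR = 8033.0

    def radiocarbon_age(fraction_c14_remaining):
        """Apparent age implied by the surviving fraction of carbon-14."""
        return -LIBBY_MEAN_LIFE_YR * math.log(fraction_c14_remaining)

    # Hypothetical coral sample (illustrative numbers, not from the paper):
    calendar_age_yr = 15000.0           # independent uranium-series age
    c14_age_yr = radiocarbon_age(0.14)  # ~15,800 yr implied by measured C-14

    # The excess of the radiocarbon age over the calendar age reflects how long
    # the water bathing the coral had been away from the surface ("older" water).
    ventilation_age_yr = c14_age_yr - calendar_age_yr
    print(f"apparent water age: {ventilation_age_yr:.0f} years")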

In a perfectly tame and orderly ocean, the deepest water would be the most depleted in carbon-14, because water at that depth has been away from the surface, the source of new carbon-14, the longest, giving the isotope the most time to decay. Sampling the carbon-14 content at various depths would therefore allow a graph to be constructed in which the maximum carbon-14 content is found at the surface.

In the real world, however, the oceans circulate. As a result, an "older" mass of water can actually sit on top of a "younger" mass. What's more, the way ocean waters circulate is tied to climatic variations. A more realistic graph plotting carbon-14 content against depth would thus be rather wavy, with steeper curves meaning a faster rate of new water flushing in, and flatter curves corresponding to relatively unperturbed water.

The researchers can get this information by cutting up the individual corals and measuring their carbon-14 content. During the animals' 100-year life spans, they take in minerals from the water and use the minerals to build their skeletons. The calcium carbonate fossil we see, then, is a skeleton of an animal that may have just died or may have lived thousands of years ago. But in any case, the skeleton is a 100-year record of how much carbon-14 was washing over the creature's body during its lifetime.

An individual coral can tell a story of the water it lived in because the amount of variation in different parts of the growing skeleton is an indication of the kind of water that was present. If a coral sample shows a big increase in carbon-14 about midway through life, then one can assume that a mass of younger water suddenly bathed the coral. On the other hand, if a huge decrease of carbon-14 is observed, then an older water mass must have suddenly moved in.

A coral with no change in the amount of carbon-14 observed in its skeleton means that things were pretty steady during its 100-year lifetime, but the story may be different for a coral at a different depth, or one that lived at a different time.

In sum, the corals tell how the waters were circulating, which in turn is profoundly linked to climatic change, Adkins explains.

"The last 10,000 years have been relatively warm and stable-perhaps because of the overturning of the deep ocean," he says. "The deep ocean has nearly all the carbon, nearly all the heat, and nearly all the mass of the climate system, so how these giant masses of water have sloshed back and forth is thought to be tied to the period of the glacial cycles."

Details of glaciation can be studied in other ways, but getting a history of ocean currents is a lot trickier, Adkins adds. And if the currents themselves are implicated in climatic change, then knowing precisely how the rules work would be a great advance in our knowledge of the planet.

"These guys provide us with a powerful new way of looking into Earth's climate," he says. "They give us a new way to investigate how the rate of ocean overturning has changed in the past."

Robinson says that the corals collected so far all come from the North Atlantic. Future plans call for an expedition to the area southeast of the southern tip of South America to collect more corals. Adding that second collection would give a more comprehensive picture of the global history of ocean overturning, she says.

In addition to Robinson and Adkins, the other authors of the paper are Lloyd Keigwin of the Woods Hole Oceanographic Institution; John Southon of the University of California at Irvine; Diego Fernandez and Shin-Ling Wang of Caltech; and Dan Scheirer of the U.S. Geological Survey office at Menlo Park.

The Science Express article will be published in a future issue of the journal Science.

Writer: 
Robert Tindol

Geologists Uncover New Evidence About the Rise of Oxygen

PASADENA, Calif.—Scientists believe that oxygen first showed up in the atmosphere about 2.7 billion years ago. They think it was put there by one-celled organisms called cyanobacteria, which had recently become the first living things on Earth to make oxygen from water and sunlight.

The rock record provides a good bit of evidence that this is so. But one of these rocks has just gotten a great deal more slippery, so to speak.

In an article appearing in the Geological Society of America's journal Geology, investigators from the California Institute of Technology, the University of Tübingen in Germany, and the University of Alberta describe their new findings about the origin of the mineral deposits known as banded-iron formations, or "BIFs." A rather attractive mineral that is often cut and polished for paperweights and other decorative items, a BIF typically has alternating bands of iron oxide and silica. How the iron got into the BIFs to begin with is thought to be a key to knowing when molecular oxygen first was produced on Earth.

The researchers show that purple bacteria—primitive organisms that have thrived on Earth without producing oxygen since before cyanobacteria first evolved—could also have laid down the iron oxide deposits that make up BIFs. Further, the research shows that the newer cyanobacteria, which suddenly evolved the ability to make oxygen through photosynthesis, could have even been floating around when the purple bacteria were making the iron oxides in the BIFs.

"The question is what made the BIFs," says Dianne Newman, who is associate professor of geobiology and environmental science and engineering at Caltech and an investigator with the Howard Hughes Medical Institute. "BIFs are thought to record the history of the rise of oxygen on Earth, but this may not be true for all of them."

The classical view of how the BIFs were made is that cyanobacteria began putting oxygen in the atmosphere about 2.7 billion years ago. At the same time, hydrothermal sources beneath the ocean floors caused ferrous iron (that is, "nonrusted" iron) to rise in the water. This iron then reacted with the new oxygen in the atmosphere, which caused the iron to change into ferric iron. In other words, the iron literally "rusted" at the surface of the ocean waters, and then ultimately settled on the ocean floor as sediments of hematite (Fe2O3) and magnetite (Fe3O4).

The problem with this scenario was that scientists in Germany about 10 years ago discovered a way that the more ancient purple bacteria could oxidize iron without oxygen. Instead, these anaerobic bacteria could have used a photosynthetic process in which light and carbon dioxide are used to turn the ferrous iron into ferric iron, throwing the mechanism of BIF formation into question.
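
In rough outline, the two routes to "rust" differ only in what drives the oxidation. The commonly cited net reactions, shown below for oxidation by free oxygen and for anoxygenic photosynthesis by iron-oxidizing purple bacteria, are textbook forms of the chemistry, not equations taken from the Geology paper itself:

    4 Fe(2+) + O2 + 10 H2O          ->  4 Fe(OH)3 + 8 H+      (oxidation by free oxygen)
    4 Fe(2+) + CO2 + 11 H2O + light ->  CH2O + 4 Fe(OH)3 + 8 H+   (anoxygenic photosynthesis)

In both cases the insoluble ferric hydroxide settles out of the water column and is later converted in the sediments to minerals such as hematite and magnetite, so the end product alone does not reveal which pathway made it.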

Newman's postdoctoral researcher Andreas Kappler (now an assistant professor at the University of Tübingen) expanded on this discovery by doing some lab experiments to measure the rate at which purple bacteria could form ferric iron under light conditions relevant for different depths within the ocean.

Kappler's results showed that iron could indeed have been oxidized by these bacteria, in amounts matching what would have been necessary to form one of the Precambrian iron deposits in Australia.

Another of the paper's Caltech authors, Claudia Pasquero, determined the thickness of the purple bacterial layer that would have been needed for complete iron oxidation. Her results showed that the thickness of the bacterial layer could have been on the order of 17 meters, below wave base, which compares favorably to what is seen today in stratified water bodies such as the Black Sea.

Also, the results show that, in principle, the purple bacteria could have oxidized all the iron seen in the BIFs, even if the cyanobacteria had been present in overlying waters.

However, Newman says that the rock record contains various other kinds of evidence that oxygen was indeed absent in the atmosphere earlier than 2.7 billion years ago. Therefore, the goal of better understanding the history of the rise of oxygen could come down to finding out if there are subtle differences between BIFs that could have been produced by cyanobacteria and/or purple bacteria. And to do this, it's best to look at the biology of the organisms.

"The hope is that we'll be able to find out whether some organic compound is absolutely necessary for anaerobic anoxygenic photosynthesis to occur," Newman says. "If we can know how they work in detail, then maybe we'll be fortunate enough to find one molecule really necessary."

A good candidate is an organic molecule with high geological preservation potential that would have existed in the purple bacteria three billion years ago and still exists today. If the Newman team could find such a molecule that is definitely involved in the changing of iron to iron oxide, and is not present in cyanobacteria, then some of the enigmas of oxygen on the ancient earth would be solved.

"The goals are to get at the types of biomolecules essential for different types of photosynthesis-hopefully, one that is preservable," Newman says.

"I guess one interesting thing from our findings is that you can get rust without oxygen, but this is also about the history of metabolic evolution, and the ability to use ancient rock to investigate the history of life."

Better understanding microbial metabolism could also be of use in NASA's ambitious goal of looking for life on other worlds. The question of which organisms made the BIFs on Earth, therefore, could be useful for astrobiologists who may someday find evidence in rock records elsewhere.

Writer: 
Robert Tindol

Cracks or Cryovolcanoes? Surface Geology Creates Clouds on Titan

PASADENA, Calif.—Like the little engine that could, geologic activity on the surface of Saturn's moon Titan (maybe outgassing cracks, perhaps icy cryovolcanoes) is belching puffs of methane gas into the atmosphere of the moon, creating clouds.

This is the conclusion of planetary astronomer Henry G. Roe, a postdoctoral researcher, and Michael E. Brown, professor of planetary astronomy at the California Institute of Technology. Roe, Brown, and their colleagues at Caltech and the Gemini Observatory in Hawaii based their analysis on new images of distinctive clouds that sporadically appear in the middle latitudes of the moon's southern hemisphere. The research will appear in the October 21 issue of the journal Science.

The clouds provide the first explanation for a long-standing Titan mystery: where does the atmosphere's copious methane gas keep coming from? That methane is continuously destroyed by the sun's ultraviolet rays, in a process called photolysis. The breakdown products form the thick blanket of haze enveloping the moon, and photolysis should have removed all of Titan's atmospheric methane billions of years ago.

Clearly, something is replenishing the gas, and that something, say Roe and his colleagues, is geologic activity on the surface. "This is the first strong evidence for currently active methane release from the surface," Roe says.

Adds Brown: "For a long time we've wondered why there is methane in the atmosphere of Titan at all, and the answer is that it spews out of the surface. And what is tremendously exciting is that we can see it, from Earth; we see these big clouds coming from above these methane vents, or methane volcanoes. Everyone had thought that must have been the answer, but until now, no one had found the spewing gun."

Roe, Brown, and their colleagues made the discovery using images obtained during the past two years by adaptive optics systems on the 10-meter telescope at the W. M. Keck Observatory on Mauna Kea in Hawaii and the neighboring 8-meter telescope at the Gemini North Observatory. Adaptive optics is a technique that removes the blurring of atmospheric turbulence, creating images as sharp as would be obtained from space-based telescopes.

"These results came about from a collaborative effort between two very large telescopes with adaptive optics capability, Gemini and Keck," says astronomer Chadwick A. Trujillo of the Gemini Observatory, a co-author of the paper. "At both telescopes, the science data were collected from only about a half an hour of images taken over many nights. Only this unusual 'quick look' scheduling could have produced these unique results. At most telescopes, the whole night is given to a single observer, which could not have produced this science."

The two telescopes observed Titan on 82 nights. On 15 nights, the images revealed distinctive bright clouds-two dozen in all-at midlatitudes in the southern hemisphere. The clouds usually popped up quickly, and generally had disappeared by the next day. "We have several observations where on one night, we don't see a cloud, the next night we do, and the following night it is gone," Roe says.

Some of the clouds stretched as much as 2,000 km across the 5,150-km-diameter moon. "An equivalent cloud on Earth would cover from the east coast to the west coast of the United States," Roe says. Although the precise altitude of the clouds is not known, they fall somewhere between 10 km and 35 km above the surface, within Titan's troposphere (most cloud activity on Earth is also within its troposphere).

Notably, all of the clouds were located within a relatively narrow band at around 40 degrees south latitude, and most were clustered tightly near 350 degrees west longitude. Both their sporadic appearance and their specific geographic location led the researchers to conclude that the clouds were not arising from the regular convective overturn of the atmosphere due to its heating by the sun (which produces the cloud cover across the moon's southern pole) but, rather, that some process on the surface was creating the clouds.

"If these clouds were due only to the global wind pattern, what we call general circulation, there's no reason the clouds should be linked to a single longitude. They'd be found in a band around the entire moon," Roe says.

Another possible explanation for the clouds' patchy formation is variation in the albedo, or brightness, of the surface. Darker surfaces absorb more sunlight than lighter ones. The air above those warmer spots would be heated, then rise and form convective clouds, much like thunderstorms on a summer's day on Earth. Roe and his colleagues, however, found no differences in the brightness of the surface at 40 degrees south latitude. Clouds can also form over mountains when prevailing winds force air upward, but in that case the clouds should always appear in the identical locations. "We see the clouds regularly appear in the same geographic region, but not always in the exact same location," says Roe.

The other way to make a cloud on Titan is to raise the humidity by directly injecting methane into the atmosphere, and that, the scientists say, is the most likely explanation here.

Exactly how the methane is being injected is still unknown. It may seep out of transient cracks on the surface, or bubble out during the eruption of icy cryovolcanoes.

Although no such features have yet been observed on the moon, Roe and his colleagues believe they may be common. "We think there are numerous sources all over the surface, of varying size, but most below the size that we could see with our instruments," he says.

One large feature near 350 degrees west longitude is probably creating the clump of clouds that forms in that region, while also humidifying the band at 40 degrees latitude, Roe says, "so you end up creating areas where the humidity is elevated by injected methane, making it easier for another, smaller source to also generate clouds. They are like weather fronts that move through. So we are seeing weather, on another planet, with something other than water. With methane. That's cool. It's better than science fiction."

Images are available upon request. For advance copies of the embargoed paper, contact the AAAS Office of Public Programs, (202) 326-6440 or scipak@aaas.org. ###

Contact: Dr. Henry G. Roe (626) 395-8708 hroe@gps.caltech.edu Kathy Svitil Caltech Media Relations (626) 395-8022 ksvitil@caltech.edu

Visit the Caltech Media Relations Web site at: http://pr.caltech.edu/media

Writer: 
KS

NASA Grant Will Fund New Research on Mars with the Spirit and Opportunity Rovers

PASADENA, Calif.—When it comes to longevity, the Spirit and Opportunity rovers on Mars are giving some real competition to the pink bunny from those battery advertisements. In a couple of months, the two rovers will celebrate their second anniversary on the red planet, even though their original missions were slated to last only 90 days.

With no end to the rover missions in sight, NASA has selected a planetary scientist at the California Institute of Technology to see if he and his team can learn new things about the ground the rovers are currently rolling on. With any luck, the researchers will uncover further evidence about water or water vapor once present on the planet's surface.

Oded Aharonson, assistant professor of planetary science at Caltech, was chosen as part of the Mars Exploration Rover Participating Scientist Program. Aharonson and seven other investigators have been selected from 35 applicants. According to NASA, the eight successful proposals were chosen on the basis of merit, relevance, and cost-effectiveness. Aharonson and the seven other finalists will become official members of the Mars Exploration Rovers science team, according to Michael Meyer, lead scientist for the Mars Exploration Program.

"Spirit and Opportunity have exceeded all expectations for their longevity on Mars, and both rovers are in good position to continue providing even more great science," said Meyer. "Because of this, we want to add to the rover team that collectively chooses how to use the rover's science instruments each day."

Aharonson's proposal is formally titled "Soil Structure and Stratification as Indicators of Aqueous Transport at the MER Landing Sites." In nontechnical talk, that means the researchers will be using the rovers to look at Martian dirt and rocks to see if liquid water has ever altered them.

The search for evidence of running water on Mars has been a "Holy Grail" for the entire exploratory program. Although the details of how life originally evolved are still largely conjectural, experts think that liquid water is required for the sort of chemistry thought to be conducive to the emergence of life as we know it.

Although there is no liquid water on the Martian surface at present, Opportunity has found geological evidence that water formerly flowed there. Thus, Aharonson will be looking for the telltale signatures of ancient as well as more recent aqueous transport and alteration.

"My experiments would normally take a couple of weeks, but it's not clear exactly how much time we'll devote to them," Aharonson said. "If we find something interesting, it could be much longer. But we might also cut the time shorter if, for example, we come upon an interesting rock we want to look at more closely."

Aharonson will work with a new Caltech faculty member, John Grotzinger, who comes from MIT as the Fletcher Jones Professor of Geology and is already a member of the rovers' science team. In addition, Caltech postdoctoral researcher Deanne Rogers will be involved in the research.

Spirit and Opportunity have been exploring sites on opposite sides of Mars since January 2004. They have found geological evidence of ancient environmental conditions that were wet and possibly habitable. They completed their primary missions three months later and are currently in the third extension of those missions. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, manages the Mars Exploration Rover project for NASA's Science Mission Directorate.

 

Writer: 
Robert Tindol

Interdisciplinary Scientists Propose Paradigm Shift in Robotic Space Exploration

PASADENA, Calif.—Just ask any geologist. If you're studying the history of a planet and the life forms that may have lived on it, the really good places to look are rugged terrains like canyons and other areas where water, igneous activity, wind, and seismic rumblings have left their respective marks. Flat is not so good.

But when it comes to exploring other worlds, like Mars, the strategy for ground-based reconnaissance thus far has been to land in relatively smooth places so the spacecraft won't slam into something vertical as it touches down or as it rolls to a stop in its protective airbags. In the cases of the Mars landings—and all soft landings on other planets and moons, for that matter—flat is definitely good.

To address this disconnect, a team of interdisciplinary scientists from the California Institute of Technology, the University of Arizona, and the U.S. Geological Survey has unveiled a proposal to make core changes in the robotic exploration of the solar system. In addition to spaceborne orbiters, the "new paradigm" would involve sending orbiter-guided blimps (or other airborne agents) carrying instruments such as optical and thermal cameras, ground-penetrating radar, and gas and humidity sensors to chosen areas of a planet, as well as using herds of small, robotic, ground-based explorers.

The ground explorers would communicate with the airborne and/or spaceborne agents, and the whole system would be coupled with innovative software for identification, characterization, and integration of various types of spatial and temporal information, supporting in-transit comparative analysis, hypothesis formulation, and target selection. This would yield a "tier-scalable perspective," akin to the approach a field geologist uses to solve a complicated geological puzzle.

Writing in an upcoming issue of the journal Planetary and Space Science, the researchers propose "a fundamentally new scientific mission concept for remote planetary surface and subsurface reconnaissance." The new approach would be cost-effective, in that it can include greater redundancy and thus greater assurance of mission success, while allowing largely unconstrained, science-driven missions to uncover transient events (for example, evidence of liquid water) and possible signs of life on other worlds.

"We're not trying to take anything away from the successful landings on Mars, Venus, and Titan, nor the orbital-based successes to most of the planetary bodies of the solar system," says Wolfgang Fink, a physicist who is serving a multiyear appointment as a visiting associate at Caltech. "But we think our tier-scalable mission concept will afford greater opportunity and freedom to identify and home in on geological and potential astrobiological 'sweet spots.'"

The new paradigm is spearheaded by Fink and by James Dohm, a planetary geologist in the Department of Hydrology and Water Resources at the University of Arizona. The team effort includes Mark Tarbell, who is Fink's associate in Caltech's Visual and Autonomous Exploration Systems Research Lab; Trent Hare of the U.S. Geological Survey office in Flagstaff; and Victor Baker, also of the University of Arizona.

"The paradigm-changing mission concept is by no means accidental," Dohm explains. "Our interdisciplinary team of scientists has evolved the concept through the profound realization of the requirement to link the various disciplines to optimally go after prime targets such as those environments that have high potential to contain life or far-reaching geological, hydrological, and climatological records."

Fink, for his part, is an expert in imaging systems, autonomous control, and science analysis systems for space missions. Dohm is a planetary and terrestrial field geologist, who, based on his experience, has a keen sense of how and where to study a terrain, be it earthly or otherworldly.

Dohm, who has performed geological investigations of Mars from local to global scales for nearly twenty years, says the study of the geology of other planets has been fruitful yet frustrating. "You're not able to verify the remote-based information in person and uncover additional information that would lead to an improved understanding of the geologic, water, climate, and possible biologic history of Mars.

"Ideally, you'd want to look at remote-based geological information while you walked with a rock hammer in hand along the margin that separates a lava flow from putative marine deposits, exploring possible water seeps and moisture embankments within the expansive canyon system of Valles Marineris that would extend from Los Angeles to New York, characterizing the sites of potential ancient and present hydrothermal activity, climbing over the ancient mountain ranges, gathering diverse rock types for lab analysis, and so on.

"We think we've devised a way to perform the geologic approach on other planets in more or less the way geologists do here on Earth."

Even though orbiting spacecraft have successfully collected significant data through instrument suites, working hypotheses are yet to be confirmed. In the case of Mars, for example, it is unknown whether the mountain ranges contain rock types other than volcanic, or whether sites of suspected hydrothermal activity are indeed hydrothermal environments, or whether the most habitable sites actually contain signs of life. These questions may be addressed through the "new paradigm."

The interdisciplinary collaboration provides the wherewithal for thinking out of the box because the researchers are, well, out of the box. "We're looking at a new way to cover lots of distance, both horizontally and vertically, and new, automated ways to put the gathered information together and analyze it, perhaps before it even comes back to Earth," Fink says.

Just how innovative would the missions be? The tier-scalable paradigm would vary according to the conditions of the planet or moon to be studied, and, significantly, to the specific scientific goals. "I realize that several missions in the past have been lost during orbital insertion, but we think that the worst perils for a robotic mission are in getting the instruments to the ground successfully," Fink says. "So our new paradigm involves missions that are not crippled if a single rover is lost."

In the case of Mars, a typical mission would deploy maneuverable airborne agents, such as blimps, equipped with existing multilayered information (geologic, topographic, geomorphic, geophysical, hydrologic, elemental, spectral, etc.), and these agents would acquire and ingest new information while in transit, from various altitudes.

While floating and performing smart reconnaissance (that is, in-transit analysis of both the existing and newly acquired spatial and temporal information in order to formulate working hypotheses), the airborne agents would migrate toward sweet spots, all the while communicating with the orbiter or orbiters. Once the sweet spots are identified, the airborne agents would be in position to deploy or help guide orbiter-based deployment of ground-based agents for further analysis and sampling.

"Knowing where you are in the field is extremely critical to the geologic reconnaissance approach," Dohm says.

"Thanks to the airborne perspective and control, this would be less of a concern within our tier-scalable mission concept, as opposed to, for example, the case of an autonomous long-range rover on Mars that is dependent on visible landmarks to account for its current location," Fink adds.

The robotic ground-based agents would be simpler and smaller than the rovers currently being sent to Mars. Though not necessarily any more robust than the current generation, the agents would be able to take on rocky and steep terrains simply because there would be several of them, with varying degrees of intelligence, and each would therefore be somewhat expendable. If a single rover turned over on a steep terrain, for example, the mission would not be lost.

The stationary sensors would be even more expendable. The information they collect would be transmitted back to the airborne units and the orbiter or orbiters for comparative analysis of the spatial and temporal data, which could guide redeployment of the mobile units or additional deployments, such as placing a drill rig at a prime sweet spot in order to sample possible near-surface groundwater or even living organisms.

And these are just the ideas for planets such as Mars. The researchers also have multi-tier mission plans for other planets and moons, where the conditions could be much more harsh.

In the case of Titan, the Saturnian moon whose atmosphere is one and a half times as thick as Earth's, autonomously controlled airships would be ideal for exploration. That makes Titan perfectly suited for a three-tiered system consisting of orbiters, blimps, and both mobile and immobile ground-agents, especially in light of the communication time lag, which is even longer than in the case of Mars.

On Io, Jupiter's extremely volcanically active and essentially airless moon, orbiter-guided deployment of mobile ground-agents and immobile sensors would, by contrast, be a productive way of performing ground-based reconnaissance, capturing and studying active volcanism beyond Earth. In this case, the three tiers of spaceborne, airborne, and ground level would be reduced to two: spaceborne and ground level.

The advantages of redundancy and data compilation would still provide huge operational advantages on Io: if some of the ground-agents are lost to volcanic hazards, for example, those that remain can still identify, map out, and transmit back significant information, thus rendering the overall mission a success. The tier-scalable concept is equally applicable to investigating active processes on Earth that may put scientists in harm's way, including volcanism, flooding, and other hazardous natural and human-induced environmental conditions.

The researchers are already in the test-bed stage of trying out advanced hardware and software designs. But Fink says that much of the technology is already available, and even that which is not currently available (the software, primarily) is quite attainable. Further design, testing, and "ground-truthing" are required in diverse environments on Earth. Fink and Dohm envision field camps where the international community of scientists, engineers, mission architects, and others can design and test optimal tier-scalable reconnaissance systems for the various planetary bodies and scientific objectives.

"After all, the question is what do you do with your given payload," Fink says. "Do you land one big rover, or would you rather deploy airborne agents and multiple smaller ground-agents that are commanded centrally from above, approximating a geologist performing tier-scalable reconnaissance?

"We think the 'tunable' range of autonomy with our tier-scalable mission concept will be of interest to NASA and the other agencies worldwide that are looking at the exploration of the Solar System."

The title of the Planetary and Space Science article is "Next-generation robotic planetary reconnaissance missions: A paradigm shift."

Writer: 
Robert Tindol

Tenth Planet Has a Moon

PASADENA, Calif.--The newly discovered 10th planet, 2003 UB313, is looking more and more like one of the solar system's major players. It has the heft of a real planet (latest estimates put it at about 20 percent larger than Pluto), a catchy code name (Xena, after the TV warrior princess), and a Guinness Book-ish record of its own (at about 97 astronomical units, or 9 billion miles, from the sun, it is the solar system's farthest detected object). And, astronomers from the California Institute of Technology and their colleagues have now discovered, it has a moon.

The moon, 100 times fainter than Xena and orbiting the planet once every couple of weeks, was spotted on September 10, 2005, with the 10-meter Keck II telescope at the W.M. Keck Observatory in Hawaii by Michael E. Brown, professor of planetary astronomy, and his colleagues at Caltech, the Keck Observatory, Yale University, and the Gemini Observatory in Hawaii. The research was partly funded by NASA. A paper about the discovery was submitted on October 3 to Astrophysical Journal Letters.

"Since the day we discovered Xena, the big question has been whether or not it has a moon," says Brown. "Having a moon is just inherently cool-and it is something that most self-respecting planets have, so it is good to see that this one does too."

Brown estimates that the moon, nicknamed "Gabrielle" after the fictional Xena's fictional sidekick, is at least one-tenth the size of Xena, which is thought to be about 2700 km in diameter (Pluto is 2274 km), and may be around 250 km across.

To know Gabrielle's size more precisely, the researchers need to know the moon's composition, which has not yet been determined. Most objects in the Kuiper Belt, the massive swath of miniplanets that stretches from beyond Neptune out into the distant fringes of the solar system, are about half rock and half water ice. Since a half-rock, half-ice surface reflects a fairly predictable amount of sunlight, a general estimate of the size of an object with that composition can be made. Very icy objects, however, reflect a lot more light, and so will appear brighter, and thus bigger, than similarly sized rocky objects.
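
How strongly the inferred size swings with the assumed reflectivity can be sketched with the standard size-from-brightness relation used for small solar-system bodies, D = (1329 km / sqrt(albedo)) x 10^(-H/5), where H is the object's absolute magnitude. The magnitude and albedo values below are illustrative assumptions, not measurements of Gabrielle:

    import math

    def diameter_km(abs_magnitude, geometric_albedo):
        """Standard size-from-brightness relation for minor solar-system bodies."""
        return (1329.0 / math.sqrt(geometric_albedo)) * 10 ** (-abs_magnitude / 5.0)

    H = 5.0  # hypothetical absolute magnitude for a faint satellite
    for albedo in (0.04, 0.10, 0.50):  # dark rock, half rock/half ice, bright ice
        print(f"albedo {albedo:.2f}: diameter ~{diameter_km(H, albedo):.0f} km")
    # Same measured brightness, very different sizes: the icier (more reflective)
    # the surface, the smaller the body needs to be to look that bright.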

Further observations of the moon with NASA's Hubble Space Telescope, planned for November and December, will allow Brown and his colleagues to pin down Gabrielle's exact orbit around Xena. With that data, they will be able to calculate Xena's mass, using a formula first devised some 300 years ago by Isaac Newton.

"A combination of the distance of the moon from the planet and the speed it goes around the planet tells you very precisely what the mass of the planet is," explains Brown. "If the planet is very massive, the moon will go around very fast; if it is less massive, the moon will travel more slowly. It is the only way we could ever measure the mass of Xena-because it has a moon."

The researchers discovered Gabrielle using Keck II's recently commissioned Laser Guide Star Adaptive Optics system. Adaptive optics is a technique that removes the blurring of atmospheric turbulence, creating images as sharp as would be obtained from space-based telescopes. The new laser guide star system allows researchers to create an artificial "star" by bouncing a laser beam off a layer of the atmosphere about 75 miles above the ground. Bright stars located near the object of interest are used as the reference point for the adaptive optics corrections. Since no bright stars are naturally found near Xena, adaptive optics imaging would have been impossible without the laser system.

"With Laser Guide Star Adaptive Optics, observers not only get more resolution, but the light from distant objects is concentrated over a much smaller area of the sky, making faint detections possible," says Marcos van Dam, adaptive optics scientist at the W.M. Keck Observatory, and second author on the new paper.

The new system also allowed Brown and his colleagues to observe, in January, a small moon around 2003 EL61, code-named "Santa," another large new Kuiper Belt object. No moon was spotted around 2005 FY9, or "Easterbunny," the third of the three big Kuiper Belt objects recently discovered by Brown and his colleagues using the 48-inch Samuel Oschin Telescope at Palomar Observatory. But the presence of moons around three of the Kuiper Belt's four largest objects, Xena, Santa, and Pluto, challenges conventional ideas about how worlds in this region of the solar system acquire satellites.

Previously, researchers believed that Kuiper Belt objects obtained moons through a process called gravitational capture, in which two formerly separate objects moved too close to one another and became entrapped in each other's gravitational embrace. This was thought to be true of the Kuiper Belt's small denizens, but not of Pluto. Pluto's massive, closely orbiting moon, Charon, broke off the planet billions of years ago, after Pluto was smashed by another Kuiper Belt object. Xena's and Santa's moons appear best explained by a similar origin.

"Pluto once seemed a unique oddball at the fringe of the solar system," Brown says. "But we now see that Xena, Pluto, and the others are part of a diverse family of large objects with similar characteristics, histories, and even moons, which together will teach us much more about the solar system than any single oddball ever would."

Brown's research is partly funded by NASA.

For more information on the discovery and on Xena, visit www.gps.caltech.edu/~mbrown/planetlila

###

Contact: Kathy Svitil (626) 395-8022 ksvitil@caltech.edu

Visit the Caltech Media Relations Web site at: http://pr.caltech.edu/media

 

Writer: 
KS

Caltech and Cisco Team to Advance Development of FAST Network

PASADENA, Calif.—Most people, even broadband users, are now familiar with the relatively slow speed of downloading large files off the Internet. Imagine, then, the frustration of scientists working with files a million times larger than the average user ever encounters.

This was the difficulty physicists posed to Steven Low, associate professor of computer science and electrical engineering at the California Institute of Technology, three years ago.

Low, an expert in the field of advanced networking, and his team devised a scheme for moving immense amounts of data across high-speed networks at breakneck speed. Called FAST TCP, for Fast Active queue management Scalable Transmission Control Protocol, the new algorithm has been used by physicists to shatter data transfer speed records for the last two years.

In order to push this work further, the Low group needs an experimental facility to test their ideas. Available test beds are either production wide-area networks that connect physics laboratories around the world, such as CERN in Geneva, the Stanford Linear Accelerator Center at Stanford, and Fermilab near Chicago, or an emulated network in a laboratory.

The first option offers a realistic environment for developing network protocols, but is almost impossible to reconfigure and monitor closely for experimental purposes; the second is the exact opposite.

The Low group proposed to build the "WAN in Lab," a wide-area network with all equipment contained in a laboratory, so that the network would be under the complete control of the experimenters. This unique facility might be described as a "wind tunnel" for networking. And, like FAST TCP, it is a collaboration with physicist Harvey Newman and mathematician John Doyle of Caltech.

Thanks to a $1.1 million list value equipment donation from Cisco's Academic Research and Technology Initiatives (ARTI) group, this new lab is quickly becoming a reality. Last-minute work is being done, and the team expects the new facility to be ready this fall.

The ARTI group is part of Cisco's Chief Development Office and is focused on engaging with the research and education community in fostering innovation, research and development opportunities for advanced networking infrastructure in research and education networks and related projects around the world.

The relationship between Caltech's FAST TCP project and Cisco's ARTI group began in the spring of 2002, when Low received a research grant from Cisco's University Research Program (URP). The relationship continued with additional URP funding, which eventually led to this most recent equipment donation for the WAN in Lab project. Throughout the development process for the WAN in Lab, Cisco engineers worked directly with Caltech researchers in designing and implementing the WAN in Lab infrastructure.

"This is an exciting time for us," says Low. "Seeing how scientists actually use our protocol will help us refine our approach. The WAN in Lab is a unique experimental infrastructure critical for exploring many new ideas."

According to Bob Aiken, director of the ARTI group, FAST TCP is designed to address the difficult challenge of meeting the needs of the next generation of networking and computational science researchers. Multidisciplinary research is increasing on a global basis, and the networking requirements of these data-intensive applications require global-scale grids and networks.

"FAST TCP and WAN in Lab, in conjunction with existing research networks with which they connect and peer, such as National Lambda Rail, allow for the measurement of real optical networks and protocols in a tightly controlled environment with true global WAN distances," said Aiken. "Cisco is pleased to have been able to work with Steven Low and Caltech on these projects over the last several years, and look forward to an ongoing research relationship."

The remarkable thing about FAST TCP is that it runs over the existing Internet. The secret is in software at the sending computer, which continuously senses how congested the path is, from the round-trip delays the data experience, and paces the flow of packets accordingly, so that they keep moving at high speed rather than piling up at congested points on the way to their ultimate destination.
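
In broad strokes, the published FAST TCP window-update rule compares the smallest round-trip time the sender has observed (an estimate of pure propagation delay) with the most recent one (which also includes queueing delay), and nudges the congestion window toward an equilibrium that keeps only a small, fixed number of packets buffered in the network. The sketch below is a simplified rendering of that rule with illustrative parameter values, not Caltech's production code:

    def fast_window_update(w, base_rtt, current_rtt, alpha=200, gamma=0.5):
        """One step of a simplified FAST TCP congestion-window update.

        w           -- current window, in packets
        base_rtt    -- smallest observed round-trip time (propagation delay)
        current_rtt -- latest round-trip time (propagation plus queueing delay)
        alpha       -- target number of packets kept queued along the path
        gamma       -- smoothing between the old window and the target
        """
        target = (base_rtt / current_rtt) * w + alpha
        return min(2 * w, (1 - gamma) * w + gamma * target)

    # As queueing delay builds, current_rtt grows and the window levels off near
    # an equilibrium that keeps roughly alpha packets queued; congestion is
    # sensed from rising delay rather than from packet loss.
    w = 1000.0
    for rtt in (0.100, 0.104, 0.108, 0.110):  # seconds; illustrative samples
        w = fast_window_update(w, base_rtt=0.100, current_rtt=rtt)
        print(f"rtt={rtt * 1000:.0f} ms  window={w:.0f} packets")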

In a demonstration at the Supercomputing Bandwidth Challenge last November, FAST TCP set a new world record for sustained data transfer of 101 gigabits per second from Pittsburgh to various research facilities around the world. This phenomenal rate is equivalent to transmitting the entire contents of the Library of Congress in 15 minutes.
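
A back-of-the-envelope check of that comparison, assuming the oft-quoted figure of roughly 10 terabytes for the Library of Congress's print collection (an assumption for illustration, not a number from the release):

    rate_bits_per_second = 101e9   # sustained record rate: 101 gigabits per second
    seconds = 15 * 60              # fifteen minutes
    terabytes_moved = rate_bits_per_second * seconds / 8 / 1e12
    print(f"~{terabytes_moved:.0f} TB moved in 15 minutes")  # about 11 TB
    # Comparable to the ~10 TB often cited for the Library's digitized print holdings.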

Future work in the new WAN in Lab funded by the Cisco equipment donation, the National Science Foundation, the Army Research Office, and Corning will focus on making FAST TCP more robust. The WAN in Lab will also be useful for testing new network theories developed at Caltech and other universities.

Harvey Newman, Caltech physics professor and board chair of the U.S. CMS (the collaborative Compact Muon Solenoid experiment being conducted by U.S. scientists at CERN), expects joint research to lead to even greater performance.

"This new facility will allow scientists and network engineers to collaborate on designing networks that will revolutionize data-intensive grid computing and other forms of scientific computing over the next decade," said Newman. "This will be an enabling force in the next round of scientific discoveries expected when the Large Hadron Collider begins operation in 2007."

The creation of an interdisciplinary research infrastructure based on information is a major focus of the Information Science and Technology (IST) initiative at Caltech. The $1.1 million Cisco equipment donation is part of a $100 million effort to make IST a reality.

### Contact: Jill Perry (626) 395-3226 jperry@caltech.edu Visit the Caltech Media Relations website at: http://pr.caltech.edu/media

Writer: 
JP

Survey of Early Universe Uncovers Mature Galaxy Eight Times More Massive Than Milky Way

PASADENA, Calif.--A massive galaxy seen when the universe was only 800 million years old has been discovered by teams of astronomers using NASA's Spitzer and Hubble Space Telescopes.

The galaxy's large mass and maturity come as a surprise, because experts previously thought that early galaxies in the young universe should be less prominent agglomerations of stars, not giant collections of hundreds of billions of stars, as populous as the Milky Way or more so. The researchers are particularly intrigued by the fact that star formation in the galaxy seems to have already been completed. This implies that the bulk of the activity that built up the galaxy had occurred even earlier.

"This is truly a significant object," says Richard Ellis, who is the Steele Family Professor of Astronomy at the California Institute of Technology and a member of the discovery team. "Although we are looking back to when the universe was only 6 percent of its present age, this galaxy has already built up a mass in stars eight times that of the Milky Way.

"If the distance measurement to this object holds up to further scrutiny, the fact such a galaxy has already completed its star formation implies a yet earlier period of intense activity," Ellis adds. "It's like crossing the ocean and meeting a lone seagull, a forerunner of land ahead. There is now every reason to search beyond this object for the cosmic dawn when the first such systems switched on!"

The galaxy was pinpointed among approximately 10,000 others in a small patch of sky called the Hubble Ultra Deep Field (UDF). It is believed to be about as far away as the most distant galaxies known.

Bahram Mobasher of the Space Telescope Science Institute, leader of the science team, explains, "We found this galaxy in Hubble's infrared images of the UDF and expected it to be young and small, like other known galaxies at similar distances. Instead, we found evidence that it is remarkably mature and much more massive. This is the surprising discovery."

The galaxy's great distance was deduced from the fact that Hubble does not see the galaxy in visible light (despite the fact that the UDF is the deepest image ever taken in optical light). This indicates that the galaxy's blue light has been absorbed by traveling billions of light-years through intervening hydrogen gas. The galaxy was detected using Hubble's Near Infrared Camera and Multi-Object Spectrometer (NICMOS), and with an infrared camera on the Very Large Telescope (VLT) at the European Southern Observatory. At those near-infrared wavelengths it is very faint and red.

The big surprise is how much brighter the galaxy is in images at slightly longer infrared wavelengths from the Spitzer Space Telescope. Spitzer is sensitive to the light from older, redder stars, which should make up most of the mass in a galaxy. The infrared brightness of the galaxy suggests that it is very massive.

Two other Spitzer observations, one reported earlier by Ellis and his colleagues at the University of Exeter, UK, and the other by Haojing Yan of the Spitzer Science Center, had already revealed evidence for mature stars in more ordinary, less massive galaxies at similar distances, when the universe was less than one billion years old. However, the new observation extends this notion of surprisingly mature galaxies to an object which is perhaps ten times more massive, and which seemed to form its stars even earlier in the history of the universe.

The team estimated the distance to this galaxy by combining the information provided by the Hubble, Spitzer, and VLT observations. The relative brightness of the galaxy at different wavelengths is influenced by the expanding universe, and allows astronomers to estimate its distance. At the same time, they can also get an idea of the make-up of the galaxy in terms of the mass and age of its stars.

Efforts by Dan Stark, a graduate student at Caltech, using both the giant 10-meter Keck and 8-meter Gemini telescopes failed to pinpoint the galaxy's distance via spectroscopy, the astronomers' conventional tool for estimating cosmic distances. "We have to admit," says Stark, "that we have now reached the point where we are studying sources which lie beyond the spectroscopic capabilities of our current ground-based facilities. It may take the next generation of telescopes, such as the James Webb Space Telescope and Caltech's proposed Thirty Meter Telescope, to confirm the galaxy's distance."

While astronomers generally believe most galaxies were built up piecewise by mergers of smaller galaxies, the discovery of this object suggests that at least a few galaxies formed quickly and in their entirety long ago. For such a large galaxy, this would have been a tremendously explosive event of star birth.

The findings will be published in the December 20, 2005, issue of the Astrophysical Journal.

JPL manages the Spitzer Space Telescope mission for NASA. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. JPL is a division of Caltech. Spitzer's infrared array camera, which took the picture of the galaxy, was built by NASA Goddard Space Flight Center, Greenbelt, Md.

Electronic images and additional information are available at

http://hubblesite.org/news/2005/28 http://www.spitzer.caltech.edu/Media/releases/ssc2005-19/

Further information relating to the James Webb Space Telescope and the proposed Thirty Meter Telescope (a collaboration between the California Institute of Technology, the University of California, the Association of Universities for Research in Astronomy, and the Association of Canadian Universities for Research in Astronomy) can be found at:

http://www.jwst.nasa.gov/ http://www.tmt.org/

Contacts:

Professor Richard Ellis (cell) 626-676-5530 rse@astro.caltech.edu

Daniel Stark (cell) 626-315-2939 dps@astro.caltech.edu

Dr Bahram Mobasher (cell) 443-812-8149 mobasher@stsci.edu

Robert Tindol (Media Relations, Caltech): (office) 626-395-3631 tindol@cal

Writer: 
RT

Scientists Uncover Rules that Govern the Rate of Protein Evolution

PASADENA, Calif.--Humans and insects and pond scum, and all other living things on Earth, are constantly evolving. The tiny proteins these living things are built from are also evolving, accumulating mutations mostly one at a time over billions of years. But for reasons that hitherto have been a mystery, some proteins evolve quickly, while others take their sweet time, even when they reside in the same organism.

Now, a team of researchers at the California Institute of Technology, applying novel data-mining methods to the now-completed sequence of the yeast genome, has uncovered a surprising reason why different proteins evolve at different rates.

Reporting in the September 19 edition of the journal Proceedings of the National Academy of Sciences (PNAS), lead author Allan Drummond and his coauthors from Caltech and the Keck Graduate Institute show that the evolution of proteins is governed by their ability to tolerate mistakes during their production. This finding disputes the longstanding assumption that functionally important proteins evolve slowly, while less-important proteins evolve more quickly.

"The reason proteins evolve at different rates has been a mystery for decades in biology," Drummond explains. But with the recent flood of sequenced genomes and inventories of all the pieces and parts making up cells, the mystery deepened. Researchers discovered that the more of a protein that was produced, the slower it evolved, a trend that applies to all living things. But the reason for this trend remained obscure, despite many attempts to explain it.

Biologists have long known that the production machinery that translates the genetic code into proteins is sloppy. So much so, in fact, that on average about one in five proteins in yeast is mistranslated, the equivalent of translating the Spanish word "Adios" as "Goofbye." The more copies of a protein produced, the more potential errors. And mistakes can be costly: some translation errors turn proteins into useless junk that can even be harmful (like miscopying a digit in an important phone number), while other errors can be tolerated. So the more protein copies per cell, the more potential harm, unless those abundant proteins themselves can evolve to tolerate more errors.
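
The "one in five" figure is simply a small per-residue error rate compounding over the length of a protein. A minimal sketch of that arithmetic, using an assumed order-of-magnitude mistranslation rate rather than a value measured in the study:

    # Chance that a single protein molecule carries at least one mistranslated
    # amino acid, given an assumed per-residue error rate (illustrative value).
    ERROR_PER_RESIDUE = 5e-4

    def fraction_with_errors(length_residues, rate=ERROR_PER_RESIDUE):
        return 1 - (1 - rate) ** length_residues

    print(f"400-residue protein: {fraction_with_errors(400):.0%} of copies flawed")
    # Roughly one in five for a typical-length protein. A protein churned out in
    # 1.26 million copies per cell therefore produces vastly more junk molecules,
    # in absolute terms, than a rare protein with the same error rate.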

"That was the 'Aha!'" says Drummond. "We knew from our experiments with manipulating proteins in the lab that some had special properties that allowed them to tolerate more changes than other proteins. They were more robust." So, what if proteins could become robust to translation errors? That would mean fewer harmful errors, and thus a more fit organism.

To test predictions of this hypothesis, the team turned to the lowly baker's yeast, a simple one-celled organism that likes to suck up the nutrients in bread dough, and then expels gas to give baked bread its fluffy texture. Baker's yeast is not only a simple organism, it is also extraordinarily well understood. Just as biologists have now sequenced the human genome, they have also sequenced the yeast genome. Moreover, the numbers of every type of protein in the yeast cell have been painstakingly measured.

For example, there's a protein in the yeast cell called PMA1 that acts as a transformer, converting stored energy into more useful forms. Since nothing living can do without energy, this is a very fundamental and important component of the yeast cell. And every yeast cell churns out about 1.26 million individual PMA1 molecules, making it the second-most abundant cellular protein.

The old assumption was that PMA1 changed slowly because its energy-transforming function was so fundamental to survival. But the Caltech team's new evidence suggests that the sheer number of PMA1 molecules produced is the reason that the protein doesn't evolve very quickly.

"The key insight is that natural selection targets the junk proteins, not the functional proteins," says Drummond. "If translation errors turned 5 percent of the PMA1 proteins in a yeast cell into junk, those junk proteins would be more abundant than 97 percent of all the other proteins in the cell. That's a huge amount of toxic waste to dispose of."

So instead, Darwinian evolution favors yeast cells with a version of PMA1 that continues to function despite errors, producing less junk. That version of PMA1 evolves slowly because the slightest changes destroy its crucial ability to withstand errors.

Consider two competing computer factories. Both make the same number of mistakes on their assembly lines, but one company's computers are designed such that the inevitable mistakes result in computers that still work, while with the other company's design, one mistake and the computer must be tossed on the recycling heap. In the cutthroat marketplace, the former company, with lower costs and higher output, will quickly outcompete the latter.

Likewise, viewing yeast cells as miniature factories, the yeast whose most-abundant proteins are least likely to be destroyed by production mistakes will outcompete its less-efficient rivals. The more optimized those high-abundance proteins are (the more rigid the specifications that make them so error-resistant), the slower they evolve. Hence, high abundance means slow evolution.

The team is now exploring other predictions of this surprising hypothesis, such as what specific chemical changes allow proteins to resist translation errors. "It's the tip of the iceberg," Drummond says.

Drummond is a graduate student in Caltech's interdisciplinary Computation and Neural Systems program. The other authors of the paper include his two advisors: Frances Arnold, the Dickinson Professor of Chemical Engineering and Biochemistry at Caltech, and Chris Adami, an expert in population genetics who is now at the Keck Graduate Institute in Claremont, California. The other authors are Jesse D. Bloom, a graduate student in chemistry at Caltech; and Claus Wilke, a former postdoctoral researcher of Adami's who has recently joined the University of Texas at Austin as an assistant professor.

The title of the PNAS paper is "Why highly expressed proteins evolve slowly."

 

Writer: 
Robert Tindol

Most Distant Explosion in Universe Detected; Smashes Previous Record

WASHINGTON, D.C.--Scientists using the NASA Swift satellite and several ground-based telescopes, including Palomar Observatory's robotic 60-inch telescope, have detected the most distant explosion yet, a gamma-ray burst from the edge of the visible universe.

This powerful burst, likely marking the death of a massive star as it collapsed into a black hole, was detected on September 4. It comes from an era soon after stars and galaxies first formed, about 500 million to 1 billion years after the Big Bang. The science team cannot yet determine the nature of the exploded star; a detailed analysis is forthcoming.

The 60-inch telescope at Palomar Observatory, which is owned and operated by the California Institute of Technology, observed the burst at visible wavelengths. At about the same time, a team led by Daniel Reichart of the University of North Carolina undertook near-infrared observations with the SOAR (Southern Observatory for Astrophysical Research) telescope, located in Chile. A bright near-infrared source was detected in the SOAR observations but completely absent in the Palomar data.

Building upon these pieces of information, a team led by Nobuyuki Kawai of the Tokyo Institute of Technology used the Subaru Observatory on Mauna Kea, in Hawaii, to confirm the distance and fine-tune the redshift measurement to 6.29 via a technique called spectroscopy. A redshift of 6.29 translates to a distance of about 13 billion light-years from Earth. The universe is thought to be 13.7 billion years old.
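
The conversion from a redshift of 6.29 to a distance and an age depends on the assumed cosmological parameters. A minimal sketch using the astropy library with a modern consensus parameter set, which differs slightly from the cosmology the teams would have assumed in 2005:

    from astropy.cosmology import Planck18  # consensus cosmological parameters

    z = 6.29
    print(f"light-travel (lookback) time: {Planck18.lookback_time(z):.1f}")
    print(f"age of the universe at the burst: {Planck18.age(z):.2f}")
    # Roughly 12.9 billion years of light travel, with the burst going off when
    # the universe was somewhat under a billion years old.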

An ordinary afterglow would have been easily detected by both facilities. The fact that this afterglow was not seen by the Palomar 60-inch visible-light imager, yet was readily detected by the SOAR infrared imager, alerted Reichart to the possibility that it lay at the edge of the universe. For such distant objects, hydrogen in the intergalactic medium absorbs visible light but not infrared light. Hence, invisibility in the visible spectrum indicated that the object was extremely far away.

"This is uncharted territory," said Reichart, who spearheaded the distance measurement. "This burst smashes the old distance record by 500 million light-years. We are finally starting to see the remnants of some of the oldest objects in the universe."

The Caltech team, led by Derek Fox, until recently a postdoctoral fellow and now a professor at the Pennsylvania State University, and Caltech graduate student Brad Cenko, did the actual observations with the Palomar telescope.

"The key step in identifying these much sought-after gamma-ray bursts is the dual combination of detection in the near-infrared and non-detection in the visible," says Fox. "We sincerely hope that the 60-inch will continue not detecting more gamma-ray bursts!"

"I am elated that the Palomar 60-inch telescope contributed to the first vital step in inferring the great distance to this burst," added Cenko, who spent the last two years robotocizing the 60-inch telescope. "The existence of such distant GRBs has been postulated for quite some time, but the detection makes me feel secure that I have a sound thesis topic."

Shri Kulkarni, the principal investigator of the Palomar 60-inch robotic telescope and the MacArthur Professor of Astronomy and Planetary Science at Caltech, noted that "the discovery has highlighted the important niche for smaller telescopes, especially robotic telescopes, in undertaking cutting-edge research for objects literally at the edge of the universe. We have hit paydirt, thanks to Caltech's investment in the 60-inch telescope."

The only object ever discovered at a greater distance was a quasar at a redshift of 6.4. Whereas quasars are supermassive black holes containing the mass of billions of stars, this burst comes from a single star. Yet distant gamma-ray bursts like this one might be plentiful, according to Donald Lamb of the University of Chicago.

Scientists measure cosmic distances via redshift, the extent to which light is "shifted" towards the red (lower energy) part of the electromagnetic spectrum during its long journey across the universe. The greater the distance, the higher the redshift.
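
Concretely, a spectral feature emitted at rest wavelength lambda_rest arrives stretched to lambda_obs = (1 + z) x lambda_rest. The sketch below takes hydrogen's Lyman-alpha line as the marker, the usual choice at these distances, though the specific numbers here are an illustration rather than the teams' reported measurement:

    LYMAN_ALPHA_REST_NM = 121.567   # rest wavelength of hydrogen Lyman-alpha
    z = 6.29                        # redshift measured for GRB 050904

    observed_nm = (1 + z) * LYMAN_ALPHA_REST_NM
    print(f"observed at ~{observed_nm:.0f} nm")  # about 886 nm, in the near-infrared
    # Light blueward of this is soaked up by intergalactic hydrogen, which is why
    # the afterglow showed up in SOAR's infrared images but not at Palomar.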

The September 4 burst is named GRB 050904, for the date it was detected. The previous most distant gamma-ray burst had a redshift of 4.5.

"We designed Swift to look for faint bursts coming from the edge of the universe," said Neil Gehrels of NASA's Goddard Space Flight Center, who is the Swift principal investigator. "Now we've got one, and it's fascinating. For the first time we can learn about individual stars from near the beginning of time. There are surely many more out there."

Writer: 
Robert Tindol
