NASA Grant Will Fund New Research on Mars with the Spirit and Opportunity Rovers

PASADENA, Calif.—When it comes to longevity, the Spirit and Opportunity rovers on Mars are giving some real competition to the pink bunny from those battery advertisements. In a couple of months, the two rovers will celebrate their second anniversary on the red planet, even though their original missions were designed to last only 90 days.

With no end to the rover missions in sight, NASA has selected a planetary scientist at the California Institute of Technology to see if he and his team can learn new things about the ground the rovers are currently rolling on. With any luck, the researchers will uncover further evidence about water or water vapor once present on the planet's surface.

Oded Aharonson, assistant professor of planetary science at Caltech, was chosen as part of the Mars Exploration Rover Participating Scientist Program. Aharonson and seven other investigators have been selected from 35 applicants. According to NASA, the eight successful proposals were chosen on the basis of merit, relevance, and cost-effectiveness. Aharonson and the seven other finalists will become official members of the Mars Exploration Rovers science team, according to Michael Meyer, lead scientist for the Mars Exploration Program.

"Spirit and Opportunity have exceeded all expectations for their longevity on Mars, and both rovers are in good position to continue providing even more great science," said Meyer. "Because of this, we want to add to the rover team that collectively chooses how to use the rover's science instruments each day."

Aharonson's proposal is formally titled "Soil Structure and Stratification as Indicators of Aqueous Transport at the MER Landing Sites." In nontechnical talk, that means the researchers will be using the rovers to look at Martian dirt and rocks to see if liquid water has ever altered them.

The search for evidence of running water on Mars has been a "Holy Grail" for the entire exploratory program. Although the details of how life originally evolved are still largely conjectural, experts think that liquid water is required for the sort of chemistry thought to be conducive to the emergence of life as we know it.

Although there is no liquid water on the Martian surface at present, Opportunity has found geological evidence that water formerly flowed there. Thus, Aharonson will be looking for the telltale signatures of ancient as well as more recent aqueous transport and alteration.

"My experiments would normally take a couple of weeks, but it's not clear exactly how much time we'll devote to them," Aharonson said. "If we find something interesting, it could be much longer. But we might also cut the time shorter if, for example, we come upon an interesting rock we want to look at more closely."

Aharonson will work with a new Caltech faculty member, John Grotzinger, who comes from MIT as the Fletcher Jones Professor of Geology and is already a member of the rovers' science team. In addition, Caltech postdoctoral researcher Deanne Rogers will be involved in the research.

Spirit and Opportunity have been exploring sites on opposite sides of Mars since January 2004. They have found geological evidence of ancient environmental conditions that were wet and possibly habitable. They completed their primary missions three months later and are currently in the third extension of these missions. NASA's Jet Propulsion Laboratory, a division of the California Institute of Technology, Pasadena, manages the Mars Exploration Rover project for NASA's Science Mission Directorate.


Robert Tindol

Interdisciplinary Scientists Propose Paradigm Shift in Robotic Space Exploration

PASADENA, Calif.—Just ask any geologist. If you're studying the history of a planet and the life forms that may have lived on it, the really good places to look are rugged terrains like canyons and other areas where water, igneous activity, wind, and seismic rumblings have left their respective marks. Flat is not so good.

But when it comes to exploring other worlds, like Mars, the strategy for ground-based reconnaissance thus far has been to land in relatively smooth places so the spacecraft won't slam into something vertical as it touches down or as it rolls to a stop in its protective airbags. In the cases of the Mars landings—and all soft landings on other planets and moons, for that matter—flat is definitely good.

To address this disconnect, a team of interdisciplinary scientists from the California Institute of Technology, the University of Arizona, and the U.S. Geological Survey has unveiled a proposal to make core changes in the robotic exploration of the solar system. In addition to spaceborne orbiters, the "new paradigm" would involve sending orbiter-guided blimps (or other airborne agents) carrying instruments such as optical and thermal cameras, ground-penetrating radar, and gas and humidity sensors to chosen areas of a planet, as well as using herds of small, robotic, ground-based explorers.

The ground explorers would communicate with the airborne and/or spaceborne agents, all tied together by innovative software for identification, characterization, and integration of various types of spatial and temporal information for in-transit comparative analysis, hypothesis formulation, and target selection. This would lead to a "tier-scalable perspective," akin to the approach used by field geologists to solve a complicated geological puzzle.
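The division of labor among the tiers can be made concrete with a small sketch. The Python below is purely illustrative and is not the authors' software; the class names, sensors, and scores are hypothetical stand-ins for the identification, characterization, and integration steps described above.

```python
# Toy sketch of the tier-scalable idea (illustrative only; not the authors'
# software). An orbiter supplies a coarse score for each candidate region,
# airborne agents refine it with their own sensor readings in transit, and
# the best-scoring "sweet spots" become targets for ground agents.

from dataclasses import dataclass


@dataclass
class Region:
    name: str
    orbital_score: float          # e.g., derived from orbital spectral maps
    airborne_score: float = 0.0   # refined in transit by blimps


@dataclass
class AirborneAgent:
    sensors: list                 # each sensor maps a Region to a 0-1 score

    def survey(self, region: Region) -> float:
        # Average the (hypothetical) humidity, thermal, and radar readings.
        return sum(sensor(region) for sensor in self.sensors) / len(self.sensors)


def select_sweet_spots(regions, agents, top_n=2):
    """Fuse orbital and airborne evidence, then rank targets for ground agents."""
    for region in regions:
        region.airborne_score = max(agent.survey(region) for agent in agents)
    ranked = sorted(regions,
                    key=lambda r: r.orbital_score + r.airborne_score,
                    reverse=True)
    return ranked[:top_n]


if __name__ == "__main__":
    # Hypothetical sensors: a canyon wall looks wetter and rougher than a plain.
    humidity = lambda r: 0.8 if "canyon" in r.name else 0.2
    radar = lambda r: 0.6 if "canyon" in r.name else 0.4
    regions = [Region("canyon wall", orbital_score=0.7),
               Region("flat plain", orbital_score=0.9)]
    blimp = AirborneAgent(sensors=[humidity, radar])
    print([r.name for r in select_sweet_spots(regions, [blimp], top_n=1)])
    # -> ['canyon wall']: the airborne tier overrides the orbital preference.
```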

Writing in an upcoming issue of the journal Planetary and Space Science, the researchers propose "a fundamentally new scientific mission concept for remote planetary surface and subsurface reconnaissance." The new approach would be cost-effective, in that it can include greater redundancy and thus greater assurance of mission success, while allowing largely unconstrained, science-driven missions to uncover transient events (for example, evidence of liquid water) and possible signs of life on other worlds.

"We're not trying to take anything away from the successful landings on Mars, Venus, and Titan, nor the orbital-based successes to most of the planetary bodies of the solar system," says Wolfgang Fink, a physicist who is serving a multiyear appointment as a visiting associate at Caltech. "But we think our tier-scalable mission concept will afford greater opportunity and freedom to identify and home in on geological and potential astrobiological 'sweet spots.'"

The new paradigm is spearheaded by Fink and by James Dohm, a planetary geologist in the Department of Hydrology and Water Resources at the University of Arizona. The team effort includes Mark Tarbell, who is Fink's associate in Caltech's Visual and Autonomous Exploration Systems Research Lab; Trent Hare of the U.S. Geological Survey office in Flagstaff; and Victor Baker, also of the University of Arizona.

"The paradigm-changing mission concept is by no means accidental," Dohm explains. "Our interdisciplinary team of scientists has evolved the concept through the profound realization of the requirement to link the various disciplines to optimally go after prime targets such as those environments that have high potential to contain life or far-reaching geological, hydrological, and climatological records."

Fink, for his part, is an expert in imaging systems, autonomous control, and science analysis systems for space missions. Dohm is a planetary and terrestrial field geologist, who, based on his experience, has a keen sense of how and where to study a terrain, be it earthly or otherworldly.

Dohm, who has performed geological investigations of Mars from local to global scales for nearly twenty years, says the study of the geology of other planets has been fruitful yet frustrating. "You're not able to verify the remote-based information in person and uncover additional information that would lead to an improved understanding of the geologic, water, climate, and possible biologic history of Mars.

"Ideally, you'd want to look at remote-based geological information while you walked with a rock hammer in hand along the margin that separates a lava flow from putative marine deposits, exploring possible water seeps and moisture embankments within the expansive canyon system of Valles Marineris that would extend from Los Angeles to New York, characterizing the sites of potential ancient and present hydrothermal activity, climbing over the ancient mountain ranges, gathering diverse rock types for lab analysis, and so on.

"We think we've devised a way to perform the geologic approach on other planets in more or less the way geologists do here on Earth."

Even though orbiting spacecraft have successfully collected significant data through instrument suites, working hypotheses are yet to be confirmed. In the case of Mars, for example, it is unknown whether the mountain ranges contain rock types other than volcanic, or whether sites of suspected hydrothermal activity are indeed hydrothermal environments, or whether the most habitable sites actually contain signs of life. These questions may be addressed through the "new paradigm."

The interdisciplinary collaboration provides the wherewithal for thinking out of the box because the researchers are, well, out of the box. "We're looking at a new way to cover lots of distance, both horizontally and vertically, and new, automated, ways to put the gathered information together and analyze it-perhaps before it even comes back to Earth," Fink says.

Just how innovative would the missions be? The tier-scalable paradigm would vary according to the conditions of the planet or moon to be studied, and, significantly, to the specific scientific goals. "I realize that several missions in the past have been lost during orbital insertion, but we think that the worst perils for a robotic mission are in getting the instruments to the ground successfully," Fink says. "So our new paradigm involves missions that are not crippled if a single rover is lost."

In the case of Mars, a typical mission would deploy maneuverable airborne agents, such as blimps, preloaded with existing multilayered information (geologic, topographic, geomorphic, geophysical, hydrologic, elemental, spectral, etc.); the agents would acquire and ingest new information while in transit at various altitudes.

While floating and performing smart reconnaissance (that is, in-transit analysis of both the existing and newly acquired spatial and temporal information in order to formulate working hypotheses), the airborne agents would migrate toward sweet spots, all the while communicating with the orbiter or orbiters. Once the sweet spots are identified, the airborne agents would be in position to deploy or help guide orbiter-based deployment of ground-based agents for further analysis and sampling.

"Knowing where you are in the field is extremely critical to the geologic reconnaissance approach," Dohm says.

"Thanks to the airborne perspective and control, this would be less of a concern within our tier-scalable mission concept, as opposed to, for example, the case of an autonomous long-range rover on Mars that is dependent on visible landmarks to account for its current location," Fink adds.

The robotic ground-based agents would be simpler and smaller than the rovers currently being sent to Mars. Though not necessarily any more robust than the current generation, the agents would be able to take on rocky and steep terrains simply because there would be several of them, with varying degrees of intelligence, and each would therefore be somewhat expendable. If a single rover turned over on steep terrain, for example, the mission would not be lost.

The stationary sensors would be even more expendable. The information they collect could be transmitted back to the airborne units and the orbiter or orbiters for comparative analysis of the spatial and temporal data, which would guide redeployment of the mobile units or additional deployments, such as placing a drill rig in a prime sweet spot to sample possible near-surface groundwater or even living organisms.

And these are just the ideas for planets such as Mars. The researchers also have multi-tier mission plans for other planets and moons, where conditions could be much harsher.

In the case of Titan, a moon of Saturn with an atmosphere about one and a half times as thick as Earth's, autonomously controlled airships would be ideal for exploration. Titan is thus perfectly suited for deployment of a three-tiered system consisting of orbiters, blimps, and both mobile and immobile ground-agents, especially in light of a communication time lag even longer than that for Mars.

On Io, Jupiter's extremely volcanically active and essentially airless moon, the orbiter-guided deployment of mobile ground-agents and immobile sensors would, in contrast, be a productive way of performing ground-based reconnaissance, capturing and studying active volcanism beyond Earth. In this case, the three-tiered system of spaceborne, airborne, and ground levels would be reduced to two tiers: spaceborne and ground.

Redundancy and data compilation would still provide huge operational advantages on Io: if some of the ground-agents were lost to volcanic hazards, for example, those that remain could still identify, map out, and transmit back significant information, thus rendering the overall mission a success. The tier-scalable concept is equally applicable to investigating active processes on Earth that may put scientists in harm's way, including volcanism, flooding, and other hazardous natural and human-induced environmental conditions.

The researchers are already in the test-bed stage of trying out advanced hardware and software designs. But Fink says that much of the technology is already available, and even that which is not currently available (the software, primarily) is quite attainable. Further design, testing, and "ground-truthing" are required in diverse environments on Earth. Fink and Dohm envision field camps where the international community of scientists, engineers, mission architects, and others can design and test optimal tier-scalable reconnaissance systems for the various planetary bodies and scientific objectives.

"After all, the question is what do you do with your given payload," Fink says. "Do you land one big rover, or would you rather deploy airborne agents and multiple smaller ground-agents that are commanded centrally from above, approximating a geologist performing tier-scalable reconnaissance?

"We think the 'tunable' range of autonomy with our tier-scalable mission concept will be of interest to NASA and the other agencies worldwide that are looking at the exploration of the Solar System."

The title of the Planetary and Space Science article is "Next-generation robotic planetary reconnaissance missions: A paradigm shift."

Robert Tindol

Tenth Planet Has a Moon

PASADENA, Calif. --The newly discovered 10th planet, 2003 UB313, is looking more and more like one of the solar system's major players. It has the heft of a real planet (latest estimates put it at about 20 percent larger than Pluto), a catchy code name (Xena, after the TV warrior princess), and a Guinness Book-ish record of its own (at about 97 astronomical units-or 9 billion miles from the sun-it is the solar system's farthest detected object). And, astronomers from the California Institute of Technology and their colleagues have now discovered, it has a moon.

The moon, 100 times fainter than Xena and orbiting the planet once every couple of weeks, was spotted on September 10, 2005, with the 10-meter Keck II telescope at the W.M. Keck Observatory in Hawaii by Michael E. Brown, professor of planetary astronomy, and his colleagues at Caltech, the Keck Observatory, Yale University, and the Gemini Observatory in Hawaii. The research was partly funded by NASA. A paper about the discovery was submitted on October 3 to Astrophysical Journal Letters.

"Since the day we discovered Xena, the big question has been whether or not it has a moon," says Brown. "Having a moon is just inherently cool-and it is something that most self-respecting planets have, so it is good to see that this one does too."

Brown estimates that the moon, nicknamed "Gabrielle"-after the fictional Xena's fictional sidekick-is at least one-tenth of the size of Xena, which is thought to be about 2700 km in diameter (Pluto is 2274 km), and may be around 250 km across.

To know Gabrielle's size more precisely, the researchers need to know the moon's composition, which has not yet been determined. Most objects in the Kuiper Belt, the massive swath of miniplanets that stretches from beyond Neptune out into the distant fringes of the solar system, are about half rock and half water ice. Since a half-rock, half-ice surface reflects a fairly predictable amount of sunlight, a general estimate of the size of an object with that composition can be made. Very icy objects, however, reflect a lot more light, and so will appear brighter-and thus bigger-than similarly sized rocky objects.

Further observations of the moon with NASA's Hubble Space Telescope, planned for November and December, will allow Brown and his colleagues to pin down Gabrielle's exact orbit around Xena. With that data, they will be able to calculate Xena's mass, using a formula first devised some 300 years ago by Isaac Newton.

"A combination of the distance of the moon from the planet and the speed it goes around the planet tells you very precisely what the mass of the planet is," explains Brown. "If the planet is very massive, the moon will go around very fast; if it is less massive, the moon will travel more slowly. It is the only way we could ever measure the mass of Xena-because it has a moon."

The researchers discovered Gabrielle using Keck II's recently commissioned Laser Guide Star Adaptive Optics system. Adaptive optics is a technique that removes the blurring of atmospheric turbulence, creating images as sharp as would be obtained from space-based telescopes. Ordinarily, a bright star located near the object of interest serves as the reference point for the adaptive optics corrections; the new laser guide star system allows researchers to create an artificial "star" instead, by bouncing a laser beam off a layer of the atmosphere about 75 miles above the ground. Since no bright stars are naturally found near Xena, adaptive optics imaging would have been impossible without the laser system.

"With Laser Guide Star Adaptive Optics, observers not only get more resolution, but the light from distant objects is concentrated over a much smaller area of the sky, making faint detections possible," says Marcos van Dam, adaptive optics scientist at the W.M. Keck Observatory, and second author on the new paper.

The new system also allowed Brown and his colleagues to observe a small moon in January around 2003 EL61, code-named "Santa," another large new Kuiper Belt object. No moon was spotted around 2005 FY9-or "Easterbunny"-the third of the three big Kuiper Belt objects recently discovered by Brown and his colleagues using the 48-inch Samuel Oschin Telescope at Palomar Observatory. But the presence of moons around three of the Kuiper Belt's four largest objects-Xena, Santa, and Pluto-challenges conventional ideas about how worlds in this region of the solar system acquire satellites.

Previously, researchers believed that Kuiper Belt objects obtained moons through a process called gravitational capture, in which two formerly separate objects moved too close to one another and became entrapped in each other's gravitational embrace. This was thought to be true of the Kuiper Belt's small denizens-but not, however, of Pluto. Pluto's massive, closely orbiting moon, Charon, broke off the planet billions of years ago, after Pluto was smashed by another Kuiper Belt object. Xena's and Santa's moons appear best explained by a similar origin.

"Pluto once seemed a unique oddball at the fringe of the solar system," Brown says. "But we now see that Xena, Pluto, and the others are part of a diverse family of large objects with similar characteristics, histories, and even moons, which together will teach us much more about the solar system than any single oddball ever would."

Brown's research is partly funded by NASA.

For more information on the discovery and on Xena, visit


Contact: Kathy Svitil (626) 395-8022


Caltech and Cisco Team to Advance Development of FAST Network

PASADENA, Calif.- Most people-even broadband users-are now familiar with the relatively slow speed of downloading large files off the Internet. Imagine, then, the frustration of scientists working with files a million times larger than the average user ever encounters.

This was the difficulty physicists posed to Steven Low, associate professor of computer science and electrical engineering at the California Institute of Technology, three years ago.

Low, an expert in the field of advanced networking, and his team devised a scheme for moving immense amounts of data across high-speed networks at breakneck speed. Called FAST TCP, for Fast Active queue management Scalable Transmission Control Protocol, the new algorithm has been used by physicists to shatter data transfer speed records for the last two years.

In order to push this work further, the Low group needs an experimental facility to test their ideas. Available test beds are either production wide-area networks that connect physics laboratories around the world (such as CERN in Geneva, the Stanford Linear Accelerator Center, and Fermilab near Chicago) or emulated networks in a laboratory.

The first option offers a realistic environment for developing network protocols, but is almost impossible to reconfigure and monitor closely for experimental purposes; the second is the exact opposite.

The Low group proposed to build the "WAN in Lab," a wide-area network with all equipment contained in a laboratory, so that the network would be under the complete control of the experimenters. This unique facility might be described as a "wind tunnel" for networking. Like FAST TCP, it is a collaboration with physicist Harvey Newman and mathematician John Doyle of Caltech.

Thanks to an equipment donation with a list value of $1.1 million from Cisco's Academic Research and Technology Initiatives (ARTI) group, this new lab is quickly becoming a reality. Last-minute work is being done, and the team expects the new facility to be ready this fall.

The ARTI group is part of Cisco's Chief Development Office and focuses on engaging with the research and education community to foster innovation and research-and-development opportunities for advanced networking infrastructure in research and education networks and related projects around the world.

The relationship between Caltech's FAST TCP project and Cisco's ARTI group began in the spring of 2002, when Low received a research grant from Cisco's University Research Program (URP). The relationship continued with additional URP funding, which eventually led to this most recent equipment donation for the WAN in Lab project. Throughout the development process for the WAN in Lab, Cisco engineers worked directly with Caltech researchers in designing and implementing the WAN in Lab infrastructure.

"This is an exciting time for us," says Low. "Seeing how scientists actually use our protocol will help us refine our approach. The WAN in Lab is a unique experimental infrastructure critical for exploring many new ideas."

According to Bob Aiken, director of the ARTI group, FAST TCP is designed to address the difficult challenge of meeting the needs of the next generation of networking and computational science researchers. Multidisciplinary research is increasing on a global basis, and the networking requirements of these data-intensive applications require global-scale grids and networks.

"FAST TCP and WAN in Lab, in conjunction with existing research networks with which they connect and peer, such as National Lambda Rail, allow for the measurement of real optical networks and protocols in a tightly controlled environment with true global WAN distances," said Aiken. "Cisco is pleased to have been able to work with Steven Low and Caltech on these projects over the last several years, and look forward to an ongoing research relationship."

The remarkable thing about FAST TCP is that it runs over the existing Internet. The secret is in software at the sending end that continuously measures network delays and paces the outgoing stream of data packets accordingly, so that transfers avoid typical Internet congestion as they weave their way to their ultimate destination.
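Concretely, FAST TCP is a delay-based congestion-control scheme: the sender compares the round-trip time it currently measures with the smallest round-trip time it has observed and nudges its congestion window toward an equilibrium. The sketch below illustrates that style of window update in simplified form; it is not Caltech's implementation, and the parameter values are made up for the example.

```python
def fast_window_update(w: float, base_rtt: float, rtt: float,
                       alpha: float = 100.0, gamma: float = 0.5) -> float:
    """One congestion-window update in the spirit of delay-based FAST TCP.

    w        -- current congestion window (packets)
    base_rtt -- smallest round-trip time observed (propagation-delay estimate)
    rtt      -- most recent round-trip time measurement
    alpha    -- target number of this flow's packets queued in the network
    gamma    -- smoothing factor between the old and proposed window

    With no queueing (rtt == base_rtt) the window grows; as queueing delay
    builds, growth slows and the window settles near an equilibrium that
    keeps about `alpha` packets in the bottleneck queue.
    """
    proposed = (base_rtt / rtt) * w + alpha
    return min(2 * w, (1 - gamma) * w + gamma * proposed)


# Illustrative use: the window ramps up as long as queueing delay stays small.
w, base_rtt = 10.0, 0.050          # 50 ms propagation delay (made up)
for step in range(5):
    rtt = base_rtt + 0.001 * step  # pretend queueing delay grows slowly
    w = fast_window_update(w, base_rtt, rtt)
    print(f"step {step}: window = {w:.1f} packets")
```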

In a demonstration at the Supercomputing Bandwidth Challenge last November, FAST TCP set a new world record for sustained data transfer of 101 gigabits per second from Pittsburgh to various research facilities around the world. This phenomenal rate is equivalent to transmitting the entire contents of the Library of Congress in 15 minutes.
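A back-of-the-envelope check of that comparison, assuming the commonly cited figure of roughly 10 terabytes for the Library of Congress's digitized print collection (a number not given in the release):

```python
rate_bits_per_s = 101e9        # 101 gigabits per second
seconds = 15 * 60              # 15 minutes
bytes_transferred = rate_bits_per_s * seconds / 8
print(f"{bytes_transferred / 1e12:.1f} TB in 15 minutes")  # about 11 TB
```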

Future work in the new WAN in Lab funded by the Cisco equipment donation, the National Science Foundation, the Army Research Office, and Corning will focus on making FAST TCP more robust. The WAN in Lab will also be useful for testing new network theories developed at Caltech and other universities.

Harvey Newman, Caltech physics professor and board chair of the U.S. CMS (the collaborative Compact Muon Solenoid experiment being conducted by U.S. scientists at CERN), expects joint research to lead to even greater performance.

"This new facility will allow scientists and network engineers to collaborate on designing networks that will revolutionize data-intensive grid computing and other forms of scientific computing over the next decade," said Newman. "This will be an enabling force in the next round of scientific discoveries expected when the Large Hadron Collider begins operation in 2007."

The creation of an interdisciplinary research infrastructure based on information is a major focus of the Information Science and Technology (IST) initiative at Caltech. The $1.1 million Cisco equipment donation is part of a $100 million effort to make IST a reality.

Contact: Jill Perry (626) 395-3226


Survey of Early Universe Uncovers Mature Galaxy Eight Times More Massive Than Milky Way

PASADENA, Calif.--A massive galaxy seen when the universe was only 800 million years old has been discovered by teams of astronomers using NASA's Spitzer and Hubble Space Telescopes.

The galaxy's large mass and maturity come as a surprise, because experts previously thought that galaxies in the young universe should be relatively modest agglomerations of stars, not giant collections of hundreds of billions of stars as populous as, or more populous than, the Milky Way. The researchers are particularly intrigued by the fact that star formation in the galaxy seems to have already been completed. This implies that the bulk of the activity that built up the galaxy had occurred even earlier.

"This is truly a significant object," says Richard Ellis, who is the Steele Family Professor of Astronomy at the California Institute of Technology and a member of the discovery team. "Although we are looking back to when the universe was only 6 percent of its present age, this galaxy has already built up a mass in stars eight times that of the Milky Way.

"If the distance measurement to this object holds up to further scrutiny, the fact such a galaxy has already completed its star formation implies a yet earlier period of intense activity," Ellis adds. "It's like crossing the ocean and meeting a lone seagull, a forerunner of land ahead. There is now every reason to search beyond this object for the cosmic dawn when the first such systems switched on!"

The galaxy was pinpointed among approximately 10,000 others in a small patch of sky called the Hubble Ultra Deep Field (UDF). It is believed to be about as far away as the most distant galaxies known.

Bahram Mobasher of the Space Telescope Science Institute, leader of the science team, explains, "We found this galaxy in Hubble's infrared images of the UDF and expected it to be young and small, like other known galaxies at similar distances. Instead, we found evidence that it is remarkably mature and much more massive. This is the surprising discovery."

The galaxy's great distance was deduced from the fact that Hubble does not see the galaxy in visible light (despite the fact that the UDF is the deepest image ever taken in optical light). This indicates that the galaxy's blue light has been absorbed on its billions-of-light-years journey through intervening hydrogen gas. The galaxy was detected using Hubble's Near Infrared Camera and Multi-Object Spectrometer (NICMOS), and with an infrared camera on the Very Large Telescope (VLT) at the European Southern Observatory. At those near-infrared wavelengths it is very faint and red.

The big surprise is how much brighter the galaxy is in images at slightly longer infrared wavelengths from the Spitzer Space Telescope. Spitzer is sensitive to the light from older, redder stars, which should make up most of the mass in a galaxy. The infrared brightness of the galaxy suggests that it is very massive.

Two other Spitzer observations, one reported earlier by Ellis and his colleagues at the University of Exeter, UK, and the other by Haojing Yan of the Spitzer Science Center, had already revealed evidence for mature stars in more ordinary, less massive galaxies at similar distances, when the universe was less than one billion years old. However, the new observation extends this notion of surprisingly mature galaxies to an object which is perhaps ten times more massive, and which seemed to form its stars even earlier in the history of the universe.

The team estimated the distance to this galaxy by combining the information provided by the Hubble, Spitzer, and VLT observations. The relative brightness of the galaxy at different wavelengths is influenced by the expanding universe, and allows astronomers to estimate its distance. At the same time, they can also get an idea of the make-up of the galaxy in terms of the mass and age of its stars.

Efforts by Dan Stark, a graduate student at Caltech, using both the giant 10 m Keck and 8 m Gemini telescopes failed to pinpoint the galaxy's distance via spectroscopic methods-the astronomers' conventional tool for estimating cosmic distances. "We have to admit," says Stark, "that we have now reached the point where we are studying sources which lie beyond the spectroscopic capabilities of our current ground-based facilities. It may take the next generation of telescopes, such as the James Webb Space Telescope and Caltech's proposed Thirty Meter Telescope, to confirm the galaxy's distance."

While astronomers generally believe most galaxies were built up piecewise by mergers of smaller galaxies, the discovery of this object suggests that at least a few galaxies formed quickly and in their entirety long ago. For such a large galaxy, this would have been a tremendously explosive event of star birth.

The findings will be published in the December 20, 2005, issue of the Astrophysical Journal.

JPL manages the Spitzer Space Telescope mission for NASA. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. JPL is a division of Caltech. Spitzer's infrared array camera, which took the picture of the galaxy, was built by NASA Goddard Space Flight Center, Greenbelt, Md.

Electronic images and additional information are available at

Further information relating to the James Webb Space Telescope and the proposed Thirty Meter Telescope (a collaboration between the California Institute of Technology, the University of California, the Association of Universities for Research in Astronomy, and the Association of Canadian Universities for Research in Astronomy) can be found at:


Professor Richard Ellis (cell) 626-676-5530

Daniel Stark (cell) 626-315-2939

Dr Bahram Mobasher (cell) 443-812-8149

Robert Tindol (Media Relations, Caltech): (office) 626-395-3631 tindol@cal


Scientists Uncover Rules that Govern the Rate of Protein Evolution

PASADENA, Calif.--Humans and insects and pond scum-and all other living things on Earth-are constantly evolving. The tiny proteins these living things are built from are also evolving, accumulating mutations mostly one at a time over billions of years. But for reasons that hitherto have been a mystery, some proteins evolve quickly, while others take their sweet time-even when they reside in the same organism.

Now, a team of researchers at the California Institute of Technology, applying novel data-mining methods to the now-completed sequence of the yeast genome, has uncovered a surprising reason why different proteins evolve at different rates.

Reporting in the September 19 edition of the journal Proceedings of the National Academy of Sciences (PNAS), lead author Allan Drummond and his coauthors from Caltech and the Keck Graduate Institute show that the evolution of proteins is governed by their ability to tolerate mistakes during their production. This finding disputes the longstanding assumption that functionally important proteins evolve slowly, while less-important proteins evolve more quickly.

"The reason proteins evolve at different rates has been a mystery for decades in biology," Drummond explains. But with the recent flood of sequenced genomes and inventories of all the pieces and parts making up cells, the mystery deepened. Researchers discovered that the more of a protein that was produced, the slower it evolved, a trend that applies to all living things. But the reason for this trend remained obscure, despite many attempts to explain it.

Biologists have long known that the production machinery that translates the genetic code into proteins is sloppy. So much so, in fact, that on average about one in five proteins in yeast is mistranslated, the equivalent of translating the Spanish word "Adios" as "Goofbye." The more copies of a protein produced, the more potential errors. And mistakes can be costly: some translation errors turn proteins into useless junk that can even be harmful (like miscopying a digit in an important phone number), while other errors can be tolerated. So the more protein copies per cell, the more potential harm-unless those abundant proteins themselves can evolve to tolerate more errors.
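The one-in-five figure is what you get by compounding a tiny per-codon error rate over the length of a typical protein. A rough illustration, using ballpark values rather than numbers taken from the paper:

```python
error_rate_per_codon = 5e-4   # ballpark mistranslation rate per codon (assumed)
protein_length = 450          # roughly an average yeast protein, in residues (assumed)

p_at_least_one_error = 1 - (1 - error_rate_per_codon) ** protein_length
print(f"Fraction of protein copies with >= 1 error: {p_at_least_one_error:.0%}")
# prints about 20%, i.e., roughly one protein in five
```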

"That was the 'Aha!'" says Drummond. "We knew from our experiments with manipulating proteins in the lab that some had special properties that allowed them to tolerate more changes than other proteins. They were more robust." So, what if proteins could become robust to translation errors? That would mean fewer harmful errors, and thus a more fit organism.

To test predictions of this hypothesis, the team turned to the lowly baker's yeast, a simple one-celled organism that sucks up the nutrients in bread dough and then expels gas to give baked bread its fluffy texture. Baker's yeast is not only a simple organism, it is also extraordinarily well understood. Just as biologists have now sequenced the human genome, they have also sequenced the yeast genome. Moreover, the numbers of every type of protein in the yeast cell have been painstakingly measured.

For example, there's a protein in the yeast cell called PMA1 that acts as a transformer, converting stored energy into more useful forms. Since nothing living can do without energy, this is a very fundamental and important component of the yeast cell. And every yeast cell churns out about 1.26 million individual PMA1 molecules, making it the second-most abundant cellular protein.

The old assumption was that PMA1 changed slowly because its energy-transforming function was so fundamental to survival. But the Caltech team's new evidence suggests that the sheer number of PMA1 molecules produced is the reason that the protein doesn't evolve very quickly.

"The key insight is that natural selection targets the junk proteins, not the functional proteins," says Drummond. "If translation errors turned 5 percent of the PMA1 proteins in a yeast cell into junk, those junk proteins would be more abundant than 97 percent of all the other proteins in the cell. That's a huge amount of toxic waste to dispose of."

So instead, Darwinian evolution favors yeast cells with a version of PMA1 that continues to function despite errors, producing less junk. That version of PMA1 evolves slowly because the slightest changes destroy its crucial ability to withstand errors.

Consider two competing computer factories. Both make the same number of mistakes on their assembly lines, but one company's computers are designed such that the inevitable mistakes result in computers that still work, while with the other company's design, one mistake and the computer must be tossed on the recycling heap. In the cutthroat marketplace, the former company, with lower costs and higher output, will quickly outcompete the latter.

Likewise, viewing yeast cells as miniature factories, the yeast whose most-abundant proteins are least likely to be destroyed by production mistakes will outcompete its less-efficient rivals. The more optimized those high-abundance proteins are--the more rigid the specifications that make them so error-resistant-the slower they evolve. Hence, high abundance means slow evolution.

The team is now exploring other predictions of this surprising hypothesis, such as what specific chemical changes allow proteins to resist translation errors. "It's the tip of the iceberg," Drummond says.

Drummond is a graduate student in Caltech's interdisciplinary Computation and Neural Systems program. The other authors of the paper are his two advisors, Frances Arnold, the Dickinson Professor of Chemical Engineering and Biochemistry at Caltech, and Chris Adami, an expert in population genetics who is now at the Keck Graduate Institute in Claremont, California; Jesse D. Bloom, a graduate student in chemistry at Caltech; and Claus Wilke, a former postdoctoral researcher of Adami's who recently joined the University of Texas at Austin as an assistant professor.

The title of the PNAS paper is "Why highly expressed proteins evolve slowly."


Robert Tindol

Most Distant Explosion in Universe Detected; Smashes Previous Record

WASHINGTON, D.C.--Scientists using the NASA Swift satellite and several ground-based telescopes, including Palomar Observatory's robotic 60-inch telescope, have detected the most distant explosion yet, a gamma-ray burst from the edge of the visible universe.

This powerful burst, likely marking the death of a massive star as it collapsed into a black hole, was detected on September 4. It comes from an era soon after stars and galaxies first formed, about 500 million to 1 billion years after the Big Bang. The science team cannot yet determine the nature of the exploded star; a detailed analysis is forthcoming.

The 60-inch telescope at Palomar Observatory, which is owned and operated by the California Institute of Technology, observed the burst at visible wavelengths. At about the same time, a team led by Daniel Reichart of the University of North Carolina undertook near-infrared observations with the SOAR (Southern Observatory for Astrophysical Research) telescope, located in Chile. A bright near-infrared source was detected in the SOAR observations but completely absent in the Palomar data.

Building upon these pieces of information, a team led by Nobuyuki Kawai of the Tokyo Institute of Technology used the Subaru Observatory on Mauna Kea, in Hawaii, to confirm the distance and fine-tune the redshift measurement to 6.29 via a technique called spectroscopy. A redshift of 6.29 translates to a distance of about 13 billion light-years from Earth. The universe is thought to be 13.7 billion years old.

An ordinary afterglow would be easily detected by both facilities. The fact that this afterglow was not seen by the Palomar 60-inch visible-light imager, yet was readily detected by the SOAR infrared imager, alerted Reichart to the possibility that it was located at the edge of the universe. For such distant objects, hydrogen in the intergalactic medium absorbs visible light but not infrared light. Hence, invisibility in the visible spectrum indicated that the object was extremely far away.

"This is uncharted territory," said Reichart, who spearheaded the distance measurement. "This burst smashes the old distance record by 500 million light-years. We are finally starting to see the remnants of some of the oldest objects in the universe."

The Caltech team, led by Derek Fox, until recently a postdoctoral fellow and now a professor at the Pennsylvania State University, and Caltech graduate student Brad Cenko, did the actual observations with the Palomar telescope.

"The key step in identifying these much sought-after gamma-ray bursts is the dual combination of detection in the near-infrared and non-detection in the visible," says Fox. "We sincerely hope that the 60-inch will continue not detecting more gamma-ray bursts!"

"I am elated that the Palomar 60-inch telescope contributed to the first vital step in inferring the great distance to this burst," added Cenko, who spent the last two years robotocizing the 60-inch telescope. "The existence of such distant GRBs has been postulated for quite some time, but the detection makes me feel secure that I have a sound thesis topic."

Shri Kulkarni, the principal investigator of the Palomar 60-inch robotic telescope and the MacArthur Professor of Astronomy and Planetary Science at Caltech, noted that "the discovery has highlighted the important niche for smaller telescopes, especially robotic telescopes, in undertaking cutting-edge research for objects literally at the edge of the universe. We have hit paydirt, thanks to Caltech's investment in the 60-inch telescope."

The only object ever discovered at a greater distance was a quasar at a redshift of 6.4. Whereas quasars are supermassive black holes containing the mass of billions of stars, this burst comes from a single star. Yet gamma-ray bursts might be plentiful, according to Donald Lamb of the University of Chicago.

Scientists measure cosmic distances via redshift, the extent to which light is "shifted" towards the red (lower energy) part of the electromagnetic spectrum during its long journey across the universe. The greater the distance, the higher the redshift.
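To make the redshift concrete: at z = 6.29, every wavelength is stretched by a factor of 1 + z = 7.29. Applying that to hydrogen's well-known Lyman-alpha line at 121.6 nanometers (a detail not given in the release) shows why the afterglow vanished in visible images but appeared in the infrared:

```python
z = 6.29
lyman_alpha_nm = 121.6                  # rest wavelength of hydrogen Lyman-alpha
observed_nm = (1 + z) * lyman_alpha_nm  # observed wavelength = (1 + z) * rest wavelength
print(f"Observed wavelength: {observed_nm:.0f} nm")
# about 886 nm -- beyond the visible band (~400-700 nm), in the near-infrared,
# so the light shows up in infrared imagers but not in visible-light ones.
```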

The September 4 burst is named GRB 050904, for the date it was detected. The previous most distant gamma-ray burst had a redshift of 4.5.

"We designed Swift to look for faint bursts coming from the edge of the universe," said Neil Gehrels of NASA's Goddard Space Flight Center, who is the Swift principal investigator. "Now we've got one, and it's fascinating. For the first time we can learn about individual stars from near the beginning of time. There are surely many more out there."

Robert Tindol

Work Continues on the Solar System's Three Recently Discovered Objects

CAMBRIDGE, England--When planetary scientists announced on July 29 that they had discovered a new planet larger than Pluto, the news overshadowed the two other objects the group had also found. But all three objects are odd additions to the solar system, and as such could revolutionize our understanding of how our part of the celestial neighborhood evolved.

To the discoverers, the objects still go by the unofficial code-names "Santa," "Easterbunny," and "Xena," though they are officially known to the International Astronomical Union as 2003 EL61, 2005 FY9, and 2003 UB313. The three objects were all detected with the 48-inch Samuel Oschin Telescope at Palomar Observatory by a team composed of planetary scientists from the California Institute of Technology, the Gemini Observatory, and Yale University. Xena is the object the group describes as one of sufficient size to be called the tenth planet.

"All three objects are nearly Pluto-sized or larger, and all are in elliptical orbits tilted out of the plane of the solar system," says Mike Brown, a professor of planetary astronomy at Caltech and leader of the effort.

"We think that these orbital characteristics may mean that they were all formed closer to the sun, and then were tossed around by the giant planets before they ended up with the odd orbits they currently have," Brown adds.

The other two members of the team are Chad Trujillo, a former postdoctoral researcher at Caltech and currently an astronomer at the Gemini Observatory in Hawaii, and David Rabinowitz of Yale University. Trujillo has led the spectrographic studies of the discoveries, while Rabinowitz is one of the builders of the instrument affixed to the Oschin Telescope for the study, and has led the effort to understand the color and spin of the objects.

Santa, Easterbunny, and Xena are all members of the Kuiper belt, a region beyond the orbit of Neptune that for decades was merely a hypothetical construct based on the behavior of comets, among other factors. But astronomers began detecting objects there in the mid-1990s, and the Kuiper belt was suddenly a reality rather than a hypothesis.

Xena, which is currently about 97 astronomical units from the sun (an astronomical unit being the 93-million-mile distance between the sun and Earth), is at least the size of Pluto and almost certainly significantly larger. The researchers are able to determine its smallest possible size because, thanks to the laws of motion, they know very accurately the distance of the planet from the sun. And because they also know very precisely how much light the planet reflects toward Earth, they can calculate how big it would have to be if it reflected sunlight as a uniformly white ball in the sky. Even a perfectly reflective ball at that distance would have to be the size of Pluto to appear as bright as Xena does.

However, the question remains how well the new planet reflects light. The less reflective its surface, the bigger it must be to put out enough light to be detected here on Earth.

At any rate, the researchers hope that infrared data returned by the Spitzer Space Telescope over the weekend of August 27-28, in addition to recently obtained data from the 30-meter IRAM telescope in Spain, will help nail down Xena's size. In much the same way that the detected visible light sets a lower limit on the diameter, the infrared radiation detected by the Spitzer will ideally set an upper limit. That's because the Spitzer is capable of measuring the total amount of heat given off by the planet; and because the researchers know the likely surface temperature is about 405 degrees below zero Fahrenheit, they can infer the overall size of the body.

Brown predicts that Xena will likely be highly reflective, because the spectrographic data gathered by his colleague and codiscoverer Chad Trujillo at the Gemini Observatory show the surface to have a similar composition to that of the highly reflective Pluto. If indeed Xena reflects 70 percent of the sunlight reaching it, as does Pluto, then Xena is about 2700 kilometers in diameter.
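The size estimates follow from a simple scaling: at a fixed distance, reflected brightness is proportional to albedo times diameter squared, so for the same measured brightness the diameter goes as one over the square root of the albedo. A sketch of that scaling, anchored to the Pluto-sized perfect-reflector limit described above:

```python
import math

pluto_diameter_km = 2274          # diameter implied for a perfect reflector (albedo = 1)

def diameter_for_albedo(albedo: float) -> float:
    """Diameter giving the same reflected brightness at the same distance.

    Reflected flux ~ albedo * diameter**2, so at fixed brightness
    diameter ~ 1 / sqrt(albedo)."""
    return pluto_diameter_km / math.sqrt(albedo)

for albedo in (1.0, 0.7, 0.4):
    print(f"albedo {albedo:.1f}: about {diameter_for_albedo(albedo):.0f} km")
# A Pluto-like albedo of 0.7 gives roughly 2,700 km, matching the estimate above;
# a darker surface would imply a still larger body.
```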

And then there's the matter of naming the new planet, which is pretty much in the hands of the International Astronomical Union. Brown says the matter is in "committee limbo": while one IAU committee is taking its time deciding whether or not it is a planet, other committees have to wait until they know what it is before they can consider a name. So for the time being, the discoverers keep calling the new planet Xena, though the name will sooner or later change.

The second of the objects, currently nicknamed Santa because Brown and his colleagues found it on December 28, 2004, is one of the more bizarre objects in the solar system, according to Rabinowitz. His observations from a small telescope in Chile show that Santa is a fast-rotating, cigar-shaped body that is about the diameter of Pluto along its longer axis. No large body in the solar system comes even close to rotating as fast as Santa's four-hour period. Observations by Brown and his colleagues at the Keck Observatory have shown that Santa also has a tiny moon, nicknamed Rudolph, which circles it every 49 days.

The third new discovery is Easterbunny, so named because of its discovery earlier this year on March 1. Easterbunny lies at about 52 astronomical units and, like Santa, is probably about three-quarters the size of Pluto. Moreover, Easterbunny is now the third object in the Kuiper belt, after Pluto and Xena, known to have a surface covered in frozen methane. For decades, Pluto was the only known methane-covered object beyond Neptune, but "now we suddenly have three in a variety of sizes at a variety of distances and can finally try to understand Pluto and its cousins," says Kris Barkume, a PhD student working with Brown.

"With so many bright objects coming out at once it is hard to keep them all straight," says Brown, adding that the remote region beyond Neptune may present even more surprises in the future.

"We hope to discover a few more large objects in the outer solar system."

The research is funded by NASA. For more information see



Robert Tindol

Caltech, MIT Chemists Look for Better Ways to Use Chemical Bonds to Store Solar Energy

PASADENA, Calif.-With gasoline prices hovering at $3 per gallon, probably few Americans need convincing that another energy crisis is imminent. But what precisely is to be done about our future energy needs is still a puzzle. There's talk about a "hydrogen economy," but hydrogen itself poses some formidable challenges.

The key challenge is, of course, how to make the hydrogen in the first place. The best and cheapest methods currently available involve burning coal or natural gas, which means more greenhouse gases and more pollution. Adopting the cheapest method by using natural gas would merely result in replacing our dependence on foreign oil with a dependence on foreign gas.

"Clearly, one clean way to get hydrogen is by splitting water with sunlight," says Harry Gray, who is the Beckman Professor of Chemistry at the California Institute of Technology.

Gray is involved with several other Caltech and MIT chemists in a research program they call "Powering the Planet." The broadest goal of the project is to "pursue efficient, economical ways to store solar energy in the form of chemical bonds," according to the National Science Foundation (NSF). With a new seed grant from the NSF and the possibility for additional funding after the initial three-year period, the Caltech group says they now have the wherewithal to try out some novel ideas to produce energy cheaply and cleanly.

"Presently, this country spends more money in 10 minutes at the gas pump than it puts into a year of solar-energy research," says Nathan S. Lewis, the Argyros Professor and professor of chemistry. "But the sun provides more energy to the planet in an hour than all the fossil energy consumed worldwide in a year."

The reason that Gray and Lewis advocate the use of solar energy is that no other renewable resource has enough practical potential to provide the world with the energy that it needs. But the sun sets every night, and so use of solar energy on a large scale will necessarily require storing the energy for use upon society's demand, day or night, summer or winter, rain or shine.

As for non-renewable resources, nuclear power plants would do the job, but 10,000 new ones would have to be built. In other words, one new nuclear plant would have to come on-line every other day somewhere in the world for the next 50 years.
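The pace implied by that figure, spelled out as a quick check:

```python
plants = 10_000
days = 50 * 365                                # 50 years of construction
print(f"{plants / days:.2f} plants per day")   # about 0.55 -- one every other day
```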

The devices used in a simple experiment in the high school chemistry lab to make hydrogen by electrolysis are not currently the cheapest ones to use for mass production. In fact, the tabletop device that breaks water into hydrogen and oxygen is perfectly clean (in other words, no carbon emissions), but it requires a platinum catalyst. And platinum has been selling all year for more than $800 per ounce.
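The tabletop reaction in question is the electrolytic splitting of water, with platinum catalyzing the hydrogen-producing half-reaction at the cathode; the equations below are standard chemistry, not spelled out in the release:

$$2\,\mathrm{H_2O} \;\longrightarrow\; 2\,\mathrm{H_2} + \mathrm{O_2}, \qquad \text{at the platinum cathode: } 2\,\mathrm{H^+} + 2\,e^- \;\longrightarrow\; \mathrm{H_2}$$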

The solution? Find something cheaper than platinum to act as a catalyst. There are other problems, but this is one that the Caltech group is starting to address. In a research article now in press, Associate Professor of Chemistry Jonas Peters and his colleagues demonstrate a way that cobalt can be used for catalysis of hydrogen formation from water.

"This is a good first example for us," says Peters. "A key goal is to try to replace the current state-of-the-art platinum catalyst, which is extremely expensive, with something like cobalt, or even better, iron or nickel. We have to find a way to cheaply make solar-derived fuel if we are to ever really enable widespread use of solar energy as society's main power source."

"It's also a good example because it shows that the NSF grant will get us working together," adds Gray. "This and other research results will involve the joint use of students and postdocs, rather than individual groups going it alone."

In addition to the lab work, the Caltech chemists also have plans to involve other entities outside campus--both for practical and educational reasons. One proposal is to fit out a school so that it will run entirely on solar energy. The initial conversion would likely be done with existing solar panels, but the facility would also serve to provide the researchers with a fairly large-scale "lab" where they can test out new ideas.

"We'd build it so that we could troubleshoot solar converters we're working on," explains Gray.

The ultimate lab goal is to have a "dream machine with no wires in it," Gray says. "We visualize a solar machine with boundary layers, where water comes in, hydrogen goes out one side, and oxygen goes out the other."

Such a machine will require a lot of work and a number of innovations and breakthroughs, but Lewis says the future of the planet depends on moving away from fossil fuels.

"If somebody doesn't figure this out, and fast, we're toast, both literally and practically, due to a growing dependence on foreign oil combined with the increasing projections of global warming."

The NSF grant was formally announced August 11 as a means of funding a new group of chemical bonding centers that will allow research teams to pursue problems in a manner "that's flexible, tolerant of risk, and open to thinking far outside the box." The initial funding to the Caltech and MIT group for the "Powering the Planet" initiative is $1.5 million for three years, with the possibility of $2 to $3 million per year thereafter if the work of the center appears promising.

In addition to Gray, Lewis, and Peters, the other Caltech personnel include Jay Winkler and Bruce Brunschwig, both chemists at Caltech's Beckman Institute. The two faculty members from MIT involved in the initiative are Dan Nocera and Kit Cummins.

Jonas Peters's paper will appear in an upcoming issue of the journal Chemical Communications. In addition to Peters and Lewis, the other authors are Brunschwig, Xile Hu, a postdoctoral researcher in chemistry at Caltech, and Brandi Cossairt, a Caltech undergraduate.


Robert Tindol

Evolutionary Accident Probably Caused The Worst Snowball Earth Episode, Study Shows

PASADENA--For several years geologists have been gathering evidence indicating that Earth has gone into a deep freeze on several occasions, with ice covering even the equator and with potentially devastating consequences for life. The theory, known as "Snowball Earth," has been lacking a good explanation for what triggered the global glaciations.

Now, the California Institute of Technology research group that originated the Snowball Earth theory has proposed that the culprit for the earliest and most severe episode may have been lowly bacteria that, by releasing oxygen, destroyed a key gas keeping the planet warm.

In the current issue of the Proceedings of the National Academy of Sciences (PNAS), Caltech graduate student Robert Kopp and his supervising professor, Joe Kirschvink, along with alumnus Isaac Hilburn (now a graduate student at the Massachusetts Institute of Technology) and graduate student Cody Nash, argue that cyanobacteria (or blue-green algae) suddenly evolved the ability to split water and release oxygen about 2.3 billion years ago. Oxygen destroyed the greenhouse gas methane that was then abundant in the atmosphere, throwing the global climate completely out of kilter.

Though the younger sun was only about 85 percent as bright as it is now, average temperatures were comparable to those of today. This state of affairs, many researchers believe, was due to the abundance of methane, known commercially as natural gas. Just as they do in kitchen ranges, methane and oxygen in the atmosphere make an unstable combination; in nature they react in a matter of years to produce carbon dioxide and water. Though carbon dioxide is also a greenhouse gas, methane is dozens of times more so.
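The kitchen-range chemistry referred to is the complete oxidation of methane; the balanced equation is standard, though not written out in the release:

$$\mathrm{CH_4} + 2\,\mathrm{O_2} \;\longrightarrow\; \mathrm{CO_2} + 2\,\mathrm{H_2O}$$

Each methane molecule destroyed is thus traded for a single molecule of the much weaker greenhouse gas carbon dioxide, plus water.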

The problem began when cyanobacteria evolved into the first organisms able to use water in photosynthesis, releasing oxygen into the environment as a waste product. More primitive bacteria depend upon soluble iron or sulfides for use in photosynthesis; the switch to water allowed them to grow almost everywhere that had light and nutrients. Many experts think this happened early in Earth history, between 3.8 and 2.7 billion years ago, in which case some process must have kept the cyanobacteria from destroying the methane greenhouse for hundreds of millions of years. The Caltech researchers, however, find no hard evidence in the rocks to show that the switch to water for photosynthesis occurred prior to 2.3 billion years ago, which is about when the Paleoproterozoic Snowball Earth was triggered.

For cyanobacteria to trigger the rapid onset of a Snowball Earth, they must have had an ample supply of key nutrients like phosphorous and iron. Nutrient availability is why cyanobacterial blooms occur today in regions with heavy agricultural runoff.

Fortunately for the bacteria, Earth 2.3 billion years ago had already entered a moderately cold period, reflected in glacially formed rocks in Canada. Measurements of the magnetization of these Canadian rocks, which the Caltech group published earlier this year, indicate that the glaciers that formed them may have been at middle latitudes, just like the glaciers of the last ice age.

The action of the glaciers, grinding continental material into powder and carrying it into the oceans, would have made the oceans rich in nutrients. Once cyanobacteria evolved this new oxygen-releasing ability, they could feast on this cornucopia, turning an ordinary glaciation into a global one.

"Their greater range should have allowed the cyanobacteria to come to dominate life on Earth quickly and start releasing large amounts of oxygen," Kopp says.

This was bad for the climate because the oxygen destabilized the methane greenhouse. Kopp and Kirschvink's model shows that the greenhouse may have been destroyed in as little as 100,000 years, but almost certainly was eliminated within several million years of the cyanobacteria's evolution into an oxygen-generating organism. Without the methane greenhouse, global temperatures plummeted to -50 degrees Celsius.

The planet went into a glacial period so cold that even equatorial oceans were covered with a mile-thick layer of ice. The vast majority of living organisms died, and those that survived, either underground or at hydrothermal vents and springs, were probably forced into bare subsistence. If correct, the authors note, then an evolutionary accident triggered the world's worst climate disaster.

However, in evolving to cope with the new influx of oxygen, many survivors gained the ability to breathe it. This metabolic process releases far more energy than its oxygen-free alternatives, and it eventually allowed the evolution of all higher forms of life.

Kirschvink and his lab have earlier shown a mechanism by which Earth could have gotten out of Snowball Earth. After some tens of millions of years, volcanic carbon dioxide would build up to the point that a greenhouse effect took hold again. In fact, the global temperature probably bounced back to +50 degrees Celsius, and the deep-sea vents that provided a refuge for living organisms had also steadily released various trace metals and nutrients. So not only did life return after the ice layers melted, but it did so with a magnificent bloom.

"It was a close call to a planetary destruction," says Kirschvink. "If Earth had been a bit further from the sun, the temperature at the poles could have dropped enough to freeze the carbon dioxide into dry ice, robbing us of this greenhouse escape from Snowball Earth."

Of course, 2.3 billion years is a very long time ago. But the episode points to a grim reality for the human race if conditions ever resulted in another Snowball Earth. We who are living today will never see it, but Kirschvink says that an even worse Snowball Earth could occur if the conditions were again right.

"We could still go into Snowball if we goof up the environment badly enough," he says. "We haven't had a Snowball in the past 630 million years, and because the sun is warmer now it may be harder to get into the right condition. But if it ever happens, all life on Earth would likely be destroyed. We could probably get out only by becoming a runaway greenhouse planet like Venus."

Kirschvink is Caltech's Van Wingen Professor of Geobiology.

The PNAS paper is titled "The Paleoproterozoic Snowball Earth: A Climate Disaster Triggered by the Evolution of Oxygenic Photosynthesis."




Robert Tindol