Survey of Early Universe Uncovers Mature Galaxy Eight Times More Massive Than Milky Way

PASADENA, Calif.--A massive galaxy seen when the universe was only 800 million years old has been discovered by teams of astronomers using NASA's Spitzer and Hubble Space Telescopes.

The galaxy's large mass and maturity come as a surprise, because experts previously thought that galaxies in the young universe should be modest agglomerations of stars, not giant collections of hundreds of billions of stars as populous as, or more populous than, the Milky Way. The researchers are particularly intrigued by the fact that star formation in the galaxy seems to have already been completed, which implies that the bulk of the activity that built up the galaxy occurred even earlier.

"This is truly a significant object," says Richard Ellis, who is the Steele Family Professor of Astronomy at the California Institute of Technology and a member of the discovery team. "Although we are looking back to when the universe was only 6 percent of its present age, this galaxy has already built up a mass in stars eight times that of the Milky Way.

"If the distance measurement to this object holds up to further scrutiny, the fact such a galaxy has already completed its star formation implies a yet earlier period of intense activity," Ellis adds. "It's like crossing the ocean and meeting a lone seagull, a forerunner of land ahead. There is now every reason to search beyond this object for the cosmic dawn when the first such systems switched on!"

The galaxy was pinpointed among approximately 10,000 others in a small patch of sky called the Hubble Ultra Deep Field (UDF). It is believed to be about as far away as the most distant galaxies known.

Bahram Mobasher of the Space Telescope Science Institute, leader of the science team, explains, "We found this galaxy in Hubble's infrared images of the UDF and expected it to be young and small, like other known galaxies at similar distances. Instead, we found evidence that it is remarkably mature and much more massive. This is the surprising discovery."

The galaxy's great distance was deduced from the fact that Hubble does not see the galaxy in visible light (even though the UDF is the deepest image ever taken in optical light). This indicates that the galaxy's blue light has been absorbed by intervening hydrogen gas during its journey of billions of light-years. The galaxy was detected using Hubble's Near Infrared Camera and Multi-Object Spectrometer (NICMOS), and with an infrared camera on the Very Large Telescope (VLT) at the European Southern Observatory. At those near-infrared wavelengths it is very faint and red.

The big surprise is how much brighter the galaxy is in images at slightly longer infrared wavelengths from the Spitzer Space Telescope. Spitzer is sensitive to the light from older, redder stars, which should make up most of the mass in a galaxy. The infrared brightness of the galaxy suggests that it is very massive.

Two other Spitzer observations, one reported earlier by Ellis and his colleagues at the University of Exeter, UK, and the other by Haojing Yan of the Spitzer Science Center, had already revealed evidence for mature stars in more ordinary, less massive galaxies at similar distances, when the universe was less than one billion years old. However, the new observation extends this notion of surprisingly mature galaxies to an object that is perhaps ten times more massive, and that seems to have formed its stars even earlier in the history of the universe.

The team estimated the distance to this galaxy by combining the information provided by the Hubble, Spitzer, and VLT observations. The relative brightness of the galaxy at different wavelengths is influenced by the expanding universe, and allows astronomers to estimate its distance. At the same time, they can also get an idea of the make-up of the galaxy in terms of the mass and age of its stars.
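
In outline, the method amounts to comparing the measured fluxes in each band against a template spectrum shifted to a series of trial redshifts and keeping the best fit. Below is a minimal sketch of such a photometric-redshift fit; it is not the team's actual pipeline, and the band wavelengths, fluxes, and uncertainties are illustrative numbers, not the published measurements.

    import numpy as np

    # Illustrative effective wavelengths (angstroms) for five bands:
    # an optical band, two near-infrared bands, and two Spitzer bands.
    bands = np.array([8000.0, 11000.0, 16000.0, 36000.0, 45000.0])
    flux = np.array([0.00, 0.10, 0.12, 0.55, 0.60])  # toy fluxes (arbitrary units)
    err = np.array([0.02, 0.03, 0.03, 0.05, 0.05])   # toy 1-sigma uncertainties

    LYMAN_ALPHA = 1216.0  # rest-frame wavelength of the hydrogen break, angstroms

    def template(z):
        # Toy spectrum: zero flux blueward of the redshifted hydrogen break
        # (absorbed by intervening gas), flat redward of it.
        return np.where(bands > LYMAN_ALPHA * (1.0 + z), 1.0, 0.0)

    best_chi2, best_z = np.inf, None
    for z in np.arange(0.0, 10.0, 0.01):
        m = template(z)
        w = 1.0 / err**2
        # Weighted least-squares amplitude for this trial redshift.
        amp = np.sum(w * flux * m) / max(np.sum(w * m * m), 1e-30)
        chi2 = np.sum(((flux - amp * m) / err) ** 2)
        if chi2 < best_chi2:
            best_chi2, best_z = chi2, z

    print(f"best-fit photometric redshift: z = {best_z:.2f}")

A real analysis uses full galaxy spectral templates rather than a flat toy spectrum, which is how the same fit also constrains the mass and age of the stars; the dropout of the bluest band is what pushes the solution to high redshift.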

Efforts by Dan Stark, a graduate student at Caltech, using both the giant 10-meter Keck and 8-meter Gemini telescopes failed to pinpoint the galaxy's distance via spectroscopic methods, the astronomers' conventional tool for estimating cosmic distances. "We have to admit," says Stark, "that we have now reached the point where we are studying sources which lie beyond the spectroscopic capabilities of our current ground-based facilities. It may take the next generation of telescopes, such as the James Webb Space Telescope and Caltech's proposed Thirty Meter Telescope, to confirm the galaxy's distance."

While astronomers generally believe most galaxies were built up piecewise by mergers of smaller galaxies, the discovery of this object suggests that at least a few galaxies formed quickly and in their entirety long ago. For such a large galaxy, this would have been a tremendously explosive event of star birth.

The findings will be published in the December 20, 2005, issue of the Astrophysical Journal.

JPL manages the Spitzer Space Telescope mission for NASA. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. JPL is a division of Caltech. Spitzer's infrared array camera, which took the picture of the galaxy, was built by NASA Goddard Space Flight Center, Greenbelt, Md.

Electronic images and additional information are available at

http://hubblesite.org/news/2005/28
http://www.spitzer.caltech.edu/Media/releases/ssc2005-19/

Further information relating to the James Webb Space Telescope and the proposed Thirty Meter Telescope (a collaboration between the California Institute of Technology, the University of California, the Association of Universities for Research in Astronomy, and the Association of Canadian Universities for Research in Astronomy) can be found at:

http://www.jwst.nasa.gov/
http://www.tmt.org/

Contacts:

Professor Richard Ellis (cell) 626-676-5530 rse@astro.caltech.edu

Daniel Stark (cell) 626-315-2939 dps@astro.caltech.edu

Dr. Bahram Mobasher (cell) 443-812-8149 mobasher@stsci.edu

Robert Tindol (Media Relations, Caltech): (office) 626-395-3631 tindol@cal

Writer: 
RT

Scientists Uncover Rules that Govern the Rate of Protein Evolution

PASADENA, Calif.--Humans, insects, pond scum, and all other living things on Earth are constantly evolving. The tiny proteins these living things are built from are also evolving, accumulating mutations mostly one at a time over billions of years. But for reasons that have hitherto been a mystery, some proteins evolve quickly while others take their sweet time, even when they reside in the same organism.

Now, a team of researchers at the California Institute of Technology, applying novel data-mining methods to the now-completed sequence of the yeast genome, has uncovered a surprising reason why different proteins evolve at different rates.

Reporting in the September 19 edition of the journal Proceedings of the National Academy of Sciences (PNAS), lead author Allan Drummond and his coauthors from Caltech and the Keck Graduate Institute show that the rate at which proteins evolve is governed by their ability to tolerate mistakes during their production. This finding disputes the longstanding assumption that functionally important proteins evolve slowly, while less-important proteins evolve more quickly.

"The reason proteins evolve at different rates has been a mystery for decades in biology," Drummond explains. But with the recent flood of sequenced genomes and inventories of all the pieces and parts making up cells, the mystery deepened. Researchers discovered that the more of a protein that was produced, the slower it evolved, a trend that applies to all living things. But the reason for this trend remained obscure, despite many attempts to explain it.

Biologists have long known that the production machinery that translates the genetic code into proteins is sloppy. So much so, in fact, that on average about one in five proteins in yeast is mistranslated, the equivalent of translating the Spanish word "Adios" as "Goofbye." The more copies of a protein produced, the more potential errors. And mistakes can be costly: some translation errors turn proteins into useless junk that can even be harmful (like miscopying a digit in an important phone number), while other errors can be tolerated. So the more protein copies per cell, the more potential harm-unless those abundant proteins themselves can evolve to tolerate more errors.
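
The "one in five" figure is simple compounded arithmetic. Here is a minimal sketch, assuming a per-codon mistranslation rate of roughly five in ten thousand and a typical yeast protein length of about 450 residues; both are round illustrative values, not numbers taken from the paper.

    # Probability that a newly made protein contains at least one
    # mistranslated residue, given an error rate per codon.
    per_codon_error = 5e-4  # assumed per-codon mistranslation rate
    protein_length = 450    # assumed typical yeast protein length (residues)

    p_any_error = 1.0 - (1.0 - per_codon_error) ** protein_length
    print(f"P(at least one error) = {p_any_error:.2f}")  # about 0.20, one in five

Scaled up to hundreds of thousands of copies per cell, even a small per-copy error probability yields an enormous absolute number of flawed molecules.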

"That was the 'Aha!'" says Drummond. "We knew from our experiments with manipulating proteins in the lab that some had special properties that allowed them to tolerate more changes than other proteins. They were more robust." So, what if proteins could become robust to translation errors? That would mean fewer harmful errors, and thus a more fit organism.

To test predictions of this hypothesis, the team turned to the lowly baker's yeast, a simple one-celled organism that likes to suck up the nutrients in bread dough, and then expels gas to give baked bread its fluffy texture. Baker's yeast is not only a simple organism, it is also extraordinarily well understood. Just as biologists have now sequenced the human genome, they have also sequenced the yeast genome. Moreover, the numbers of every type of protein in the yeast cell have been painstakingly measured.

For example, there's a protein in the yeast cell called PMA1 that acts as a transformer, converting stored energy into more useful forms. Since nothing living can do without energy, this is a very fundamental and important component of the yeast cell. And every yeast cell churns out about 1.26 million individual PMA1 molecules, making it the second-most abundant cellular protein.

The old assumption was that PMA1 changed slowly because its energy-transforming function was so fundamental to survival. But the Caltech team's new evidence suggests that the sheer number of PMA1 molecules produced is the reason that the protein doesn't evolve very quickly.

"The key insight is that natural selection targets the junk proteins, not the functional proteins," says Drummond. "If translation errors turned 5 percent of the PMA1 proteins in a yeast cell into junk, those junk proteins would be more abundant than 97 percent of all the other proteins in the cell. That's a huge amount of toxic waste to dispose of."

So instead, Darwinian evolution favors yeast cells with a version of PMA1 that continues to function despite errors, producing less junk. That version of PMA1 evolves slowly because the slightest changes destroy its crucial ability to withstand errors.

Consider two competing computer factories. Both make the same number of mistakes on their assembly lines, but one company's computers are designed such that the inevitable mistakes result in computers that still work, while with the other company's design, one mistake and the computer must be tossed on the recycling heap. In the cutthroat marketplace, the former company, with lower costs and higher output, will quickly outcompete the latter.

Likewise, viewing yeast cells as miniature factories, the yeast whose most-abundant proteins are least likely to be destroyed by production mistakes will outcompete its less-efficient rivals. The more optimized those high-abundance proteins are (the more rigid the specifications that make them so error-resistant), the slower they evolve. Hence, high abundance means slow evolution.

The team is now exploring other predictions of this surprising hypothesis, such as what specific chemical changes allow proteins to resist translation errors. "It's the tip of the iceberg," Drummond says.

Drummond is a graduate student in Caltech's interdisciplinary Computation and Neural Systems program. The other authors of the paper include his two advisors: Frances Arnold, the Dickinson Professor of Chemical Engineering and Biochemistry at Caltech, and Chris Adami, an expert in population genetics who is now at the Keck Graduate Institute in Claremont, California. The other authors are Jesse D. Bloom, a graduate student in chemistry at Caltech; and Claus Wilke, a former postdoctoral researcher of Adami's who has recently joined the University of Texas at Austin as an assistant professor.

The title of the PNAS paper is "Why highly expressed proteins evolve slowly."

Writer: 
Robert Tindol

Most Distant Explosion in Universe Detected; Smashes Previous Record

WASHINGTON, D.C.--Scientists using the NASA Swift satellite and several ground-based telescopes, including Palomar Observatory's robotic 60-inch telescope, have detected the most distant explosion yet, a gamma-ray burst from the edge of the visible universe.

This powerful burst, likely marking the death of a massive star as it collapsed into a black hole, was detected on September 4. It comes from an era soon after stars and galaxies first formed, about 500 million to 1 billion years after the Big Bang. The science team cannot yet determine the nature of the exploded star; a detailed analysis is forthcoming.

The 60-inch telescope at Palomar Observatory, which is owned and operated by the California Institute of Technology, observed the burst at visible wavelengths. At about the same time, a team led by Daniel Reichart of the University of North Carolina undertook near-infrared observations with the SOAR (Southern Observatory for Astrophysical Research) telescope, located in Chile. A bright near-infrared source was detected in the SOAR observations but completely absent in the Palomar data.

Building upon these pieces of information, a team led by Nobuyuki Kawai of the Tokyo Institute of Technology used the Subaru Observatory on Mauna Kea, in Hawaii, to confirm the distance and fine-tune the redshift measurement to 6.29 via a technique called spectroscopy. A redshift of 6.29 translates to a distance of about 13 billion light-years from Earth. The universe is thought to be 13.7 billion years old.
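
The translation from redshift to light-travel distance depends on the assumed cosmology. Here is a quick sketch using the astropy library with round-number parameters (H0 = 70 km/s/Mpc, matter density 0.3); the team's exact values may have differed slightly.

    from astropy.cosmology import FlatLambdaCDM

    # Flat Lambda-CDM cosmology with round-number parameters (an assumption).
    cosmo = FlatLambdaCDM(H0=70, Om0=0.3)

    z = 6.29
    print(cosmo.lookback_time(z))  # ~12.6 Gyr: the light left ~12.6 billion years ago
    print(cosmo.age(0))            # ~13.5 Gyr: the age of the universe in this model

A lookback time of roughly 12.6 billion years, against a total age near 13.5 billion, is what the release rounds to "about 13 billion light-years."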

An ordinary afterglow would have been easily detected by both facilities. The fact that the afterglow was not seen by the Palomar 60-inch visible imager, yet was readily detected by the SOAR infrared imager, alerted Reichart to the possibility that it was located at the edge of the universe. For such distant objects, hydrogen in the intergalactic medium absorbs visible light but not infrared light. Hence, invisibility in the visible spectrum indicated that the object was extremely far away.

"This is uncharted territory," said Reichart, who spearheaded the distance measurement. "This burst smashes the old distance record by 500 million light-years. We are finally starting to see the remnants of some of the oldest objects in the universe."

The Caltech team, led by Derek Fox, until recently a postdoctoral fellow and now a professor at the Pennsylvania State University, and Caltech graduate student Brad Cenko, did the actual observations with the Palomar telescope.

"The key step in identifying these much sought-after gamma-ray bursts is the dual combination of detection in the near-infrared and non-detection in the visible," says Fox. "We sincerely hope that the 60-inch will continue not detecting more gamma-ray bursts!"

"I am elated that the Palomar 60-inch telescope contributed to the first vital step in inferring the great distance to this burst," added Cenko, who spent the last two years robotocizing the 60-inch telescope. "The existence of such distant GRBs has been postulated for quite some time, but the detection makes me feel secure that I have a sound thesis topic."

Shri Kulkarni, the principal investigator of the Palomar 60-inch robotic telescope and the MacArthur Professor of Astronomy and Planetary Science at Caltech, noted that "the discovery has highlighted the important niche for smaller telescopes, especially robotic telescopes, in undertaking cutting-edge research for objects literally at the edge of the universe. We have hit paydirt, thanks to Caltech's investment in the 60-inch telescope."

The only object ever discovered at a greater distance was a quasar at a redshift of 6.4. Whereas quasars are supermassive black holes containing the mass of billions of stars, this burst comes from a single star. Yet gamma-ray bursts at such distances might be plentiful, according to Donald Lamb of the University of Chicago.

Scientists measure cosmic distances via redshift, the extent to which light is "shifted" towards the red (lower energy) part of the electromagnetic spectrum during its long journey across the universe. The greater the distance, the higher the redshift.
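
In equation form, if lambda_emit is the wavelength at which the light left its source and lambda_obs the wavelength measured at Earth, the redshift z is defined (in LaTeX notation) by:

    z = \frac{\lambda_{\rm obs} - \lambda_{\rm emit}}{\lambda_{\rm emit}},
    \qquad 1 + z = \frac{\lambda_{\rm obs}}{\lambda_{\rm emit}}

At z = 6.29, every wavelength arrives stretched by a factor of 7.29, which is why the afterglow vanished from the visible-light images and appeared only in the infrared.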

The September 4 burst is named GRB 050904, for the date it was detected. The previous most distant gamma-ray burst had a redshift of 4.5.

"We designed Swift to look for faint bursts coming from the edge of the universe," said Neil Gehrels of NASA's Goddard Space Flight Center, who is the Swift principal investigator. "Now we've got one, and it's fascinating. For the first time we can learn about individual stars from near the beginning of time. There are surely many more out there."

Writer: 
Robert Tindol

Work Continues on the Solar System's Three Recently Discovered Objects

CAMBRIDGE, England--When planetary scientists announced on July 29 that they had discovered a new planet larger than Pluto, the news overshadowed the two other objects the group had also found. But all three objects are odd additions to the solar system, and as such could revolutionize our understanding of how our part of the celestial neighborhood evolved.

To the discoverers, the objects still go by the unofficial code-names "Santa," "Easterbunny," and "Xena," though they are officially known to the International Astronomical Union as 2003 EL61, 2005 FY9, and 2003 UB313. The three objects were all detected with the 48-inch Samuel Oschin Telescope at Palomar Observatory by a team composed of planetary scientists from the California Institute of Technology, the Gemini Observatory, and Yale University. Xena is the object the group describes as sufficiently large to be called the tenth planet.

"All three objects are nearly Pluto-sized or larger, and all are in elliptical orbits tilted out of the plane of the solar system," says Mike Brown, a professor of planetary astronomy at Caltech and leader of the effort.

"We think that these orbital characteristics may mean that they were all formed closer to the sun, and then were tossed around by the giant planets before they ended up with the odd orbits they currently have," Brown adds.

The other two members of the team are Chad Trujillo, a former postdoctoral researcher at Caltech and currently an astronomer at the Gemini Observatory in Hawaii, and David Rabinowitz of Yale University. Trujillo has led the spectrographic studies of the discoveries, while Rabinowitz is one of the builders of the instrument affixed to the Oschin Telescope for the study, and has led the effort to understand the color and spin of the objects.

Santa, Easterbunny, and Xena are all members of the Kuiper belt, a region beyond the orbit of Neptune that for decades was merely a hypothetical construct based on the behavior of comets, among other factors. But astronomers began detecting objects there in the mid-1990s, and the Kuiper belt was suddenly a reality rather than a hypothesis.

Xena, which is currently about 97 astronomical units from the sun (an astronomical unit being the 93-million-mile distance between the sun and Earth), is at least the size of Pluto and almost certainly significantly larger. The researchers can determine its smallest possible size because, thanks to the laws of orbital motion, they know very accurately the distance of the planet from the sun. And because they also know very precisely how much light the planet gives off, they can calculate how big it would have to be if it reflected sunlight like a uniformly white ball: even a perfectly reflective sphere at that distance would have to be at least the size of Pluto to appear as bright as Xena does.

However, the question remains how well the new planet reflects light. The less reflective its surface, the bigger it must be to put out enough light to be detected here on Earth.

At any rate, the researchers hope that infrared data returned by the Spitzer Space Telescope over the weekend of August 27-28, in addition to recently obtained data from the 30-meter IRAM telescope in Spain, will help nail down Xena's size. In much the same way that the detected visible light sets a lower limit on the diameter, the infrared radiation detected by Spitzer should set an upper limit. That's because Spitzer is capable of measuring the total amount of heat given off by the planet, and because the researchers know the likely surface temperature is about 405 degrees below zero Fahrenheit, they can infer the overall size of the body.

Brown predicts that Xena will likely be highly reflective, because the spectrographic data gathered by his colleague and codiscoverer Chad Trujillo at the Gemini Observatory show the surface to have a similar composition to that of the highly reflective Pluto. If indeed Xena reflects 70 percent of the sunlight reaching it, as does Pluto, then Xena is about 2700 kilometers in diameter.
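
The interplay of brightness and reflectivity is captured by the standard solar-system sizing relation D = (1329 km / sqrt(p)) * 10^(-H/5), where p is the geometric albedo and H the object's absolute magnitude. Here is a minimal sketch; the absolute magnitude below is an assumed illustrative value, not the team's measurement.

    import math

    def diameter_km(albedo, abs_mag):
        # Standard relation between diameter, geometric albedo, and
        # absolute magnitude H for sunlight-reflecting bodies.
        return 1329.0 / math.sqrt(albedo) * 10.0 ** (-abs_mag / 5.0)

    H = -1.2  # assumed absolute magnitude for Xena (illustrative)
    for p in (1.0, 0.7, 0.4):
        print(f"albedo {p:.1f}: diameter ~ {diameter_km(p, H):.0f} km")

With these assumptions, a perfect reflector (albedo 1.0) comes out near 2,300 km, about Pluto's diameter, which is the lower limit; a Pluto-like albedo of 0.7 gives roughly 2,760 km, consistent with the figure quoted above; and a darker surface forces a still larger body.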

And then there's the matter of naming the new planet, which is pretty much in the hands of the International Astronomical Union. Brown says the matter is in "committee limbo": while one IAU committee is taking its time deciding whether or not it is a planet, other committees have to wait until they know what it is before they can consider a name. So for the time being, the discoverers keep calling the new planet Xena, though the name will sooner or later change.

The second of the objects, currently nicknamed Santa because Brown and his colleagues found it on December 28, 2004, is one of the more bizarre objects in the solar system, according to Rabinowitz. His observations from a small telescope in Chile show that Santa is a fast-rotating, cigar-shaped body that is about the diameter of Pluto along its longer axis. No large body in the solar system comes even close to rotating as fast as Santa's four-hour period. Observations by Brown and his colleagues at the Keck Observatory have shown that Santa also has a tiny moon, nicknamed Rudolph, which circles it every 49 days.

The third new discovery is Easterbunny, so named because of its discovery earlier this year, on March 1. Easterbunny lies at about 52 astronomical units and, like Santa, is probably about three-quarters the size of Pluto. Moreover, Easterbunny is now the third known object in the Kuiper belt, after Pluto and Xena, with a surface covered in frozen methane. For decades, Pluto was the only known methane-covered object beyond Neptune, but "now we suddenly have three in a variety of sizes at a variety of distances and can finally try to understand Pluto and its cousins," says Kris Barkume, a PhD student working with Brown.

"With so many bright objects coming out at once it is hard to keep them all straight," says Brown, adding that the remote region beyond Neptune may present even more surprises in the future.

"We hope to discover a few more large objects in the outer solar system."

The research is funded by NASA. For more information see http://www.gps.caltech.edu/~mbrown

Writer: 
Robert Tindol

Caltech, MIT Chemists Look for Better Waysto Use Chemical Bonds to Store Solar Energy

PASADENA, Calif.--With gasoline prices hovering at $3 per gallon, probably few Americans need convincing that another energy crisis is imminent. But what precisely is to be done about our future energy needs is still a puzzle. There's talk about a "hydrogen economy," but hydrogen itself poses some formidable challenges.

The key challenge is, of course, how to make the hydrogen in the first place. The best and cheapest methods currently available involve burning coal or natural gas, which means more greenhouse gases and more pollution. Adopting the cheapest method by using natural gas would merely result in replacing our dependence on foreign oil with a dependence on foreign gas.

"Clearly, one clean way to get hydrogen is by splitting water with sunlight," says Harry Gray, who is the Beckman Professor of Chemistry at the California Institute of Technology.

Gray is involved with several other Caltech and MIT chemists in a research program they call "Powering the Planet." The broadest goal of the project is to "pursue efficient, economical ways to store solar energy in the form of chemical bonds," according to the National Science Foundation (NSF). With a new seed grant from the NSF and the possibility for additional funding after the initial three-year period, the Caltech group says they now have the wherewithal to try out some novel ideas to produce energy cheaply and cleanly.

"Presently, this country spends more money in 10 minutes at the gas pump than it puts into a year of solar-energy research," says Nathan S. Lewis, the Argyros Professor and professor of chemistry. "But the sun provides more energy to the planet in an hour than all the fossil energy consumed worldwide in a year."

The reason that Gray and Lewis advocate the use of solar energy is that no other renewable resource has enough practical potential to provide the world with the energy that it needs. But the sun sets every night, and so use of solar energy on a large scale will necessarily require storing the energy for use upon society's demand, day or night, summer or winter, rain or shine.

As for non-renewable resources, nuclear power plants would do the job, but 10,000 new ones would have to be built. In other words, one new nuclear plant would have to come on-line every other day somewhere in the world for the next 50 years.

Electrolysis, familiar from the high-school chemistry experiment in which a tabletop device breaks water into hydrogen and oxygen, is perfectly clean (in other words, no carbon emissions). But it is currently far from the cheapest route to mass production, because it requires a platinum catalyst, and platinum has been selling all year for more than $800 per ounce.
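
The thermodynamic floor for water-splitting is well established: the reaction costs a Gibbs free energy of about 237 kilojoules per mole of water, and with two electrons transferred per molecule this sets the minimum cell voltage (in LaTeX notation):

    E_{\min} = \frac{\Delta G}{nF}
             = \frac{237{,}000\ \mathrm{J/mol}}{2 \times 96{,}485\ \mathrm{C/mol}}
             \approx 1.23\ \mathrm{V}

No catalyst can beat this floor; what a catalyst determines is how close a practical cell gets to it, and at what price, which is why replacing platinum matters so much.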

The solution? Find something cheaper than platinum to act as a catalyst. There are other problems, but this is one that the Caltech group is starting to address. In a research article now in press, Associate Professor of Chemistry Jonas Peters and his colleagues demonstrate a way that cobalt can be used for catalysis of hydrogen formation from water.

"This is a good first example for us," says Peters. "A key goal is to try to replace the current state-of-the-art platinum catalyst, which is extremely expensive, with something like cobalt, or even better, iron or nickel. We have to find a way to cheaply make solar-derived fuel if we are to ever really enable widespread use of solar energy as society's main power source."

"It's also a good example because it shows that the NSF grant will get us working together," adds Gray. "This and other research results will involve the joint use of students and postdocs, rather than individual groups going it alone."

In addition to the lab work, the Caltech chemists also have plans to involve other entities outside campus--both for practical and educational reasons. One proposal is to fit out a school so that it will run entirely on solar energy. The initial conversion would likely be done with existing solar panels, but the facility would also serve to provide the researchers with a fairly large-scale "lab" where they can test out new ideas.

"We'd build it so that we could troubleshoot solar converters we're working on," explains Gray.

The ultimate lab goal is to have a "dream machine with no wires in it," Gray says. "We visualize a solar machine with boundary layers, where water comes in, hydrogen goes out one side, and oxygen goes out the other."

Such a machine will require a lot of work and a number of innovations and breakthroughs, but Lewis says the future of the planet depends on moving away from fossil fuels.

"If somebody doesn't figure this out, and fast, we're toast, both literally and practically, due to a growing dependence on foreign oil combined with the increasing projections of global warming."

The NSF grant was formally announced August 11 as a means of funding a new group of chemical bonding centers that will allow research teams to pursue problems in a manner "that's flexible, tolerant of risk, and open to thinking far outside the box." The initial funding to the Caltech and MIT group for the "Powering the Planet" initiative is $1.5 million for three years, with the possibility of $2 to $3 million per year thereafter if the work of the center appears promising.

In addition to Gray, Lewis, and Peters, the other Caltech personnel include Jay Winkler and Bruce Brunschwig, both chemists at Caltech's Beckman Institute. The two faculty members from MIT involved in the initiative are Dan Nocera and Kit Cummins.

Jonas Peters's paper will appear in an upcoming issue of the journal Chemical Communications. In addition to Peters and Lewis, the other authors are Brunschwig, Xile Hu, a postdoctoral researcher in chemistry at Caltech, and Brandi Cossairt, a Caltech undergraduate.

Writer: 
Robert Tindol

Evolutionary Accident Probably Caused The Worst Snowball Earth Episode, Study Shows

PASADENA--For several years geologists have been gathering evidence indicating that Earth has gone into a deep freeze on several occasions, with ice covering even the equator and with potentially devastating consequences for life. The theory, known as "Snowball Earth," has been lacking a good explanation for what triggered the global glaciations.

Now, the California Institute of Technology research group that originated the Snowball Earth theory has proposed that the culprit for the earliest and most severe episode may have been lowly bacteria that, by releasing oxygen, destroyed a key gas keeping the planet warm.

In the current issue of the Proceedings of the National Academy of Sciences (PNAS), Caltech graduate student Robert Kopp and his supervising professor, Joe Kirschvink, along with alumnus Isaac Hilburn (now a graduate student at the Massachusetts Institute of Technology) and graduate student Cody Nash, argue that cyanobacteria (or blue-green algae) suddenly evolved the ability to split water and release oxygen about 2.3 billion years ago. Oxygen destroyed the greenhouse gas methane that was then abundant in the atmosphere, throwing the global climate completely out of kilter.

Though the younger sun was only about 85 percent as bright as it is now, average temperatures were comparable to those of today. This state of affairs, many researchers believe, was due to the abundance of methane, known commercially as natural gas. Just as they do in kitchen ranges, methane and oxygen in the atmosphere make an unstable combination; in nature they react in a matter of years to produce carbon dioxide and water. Though carbon dioxide is also a greenhouse gas, methane is dozens of times more so.
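
The net reaction is the same one that burns in a kitchen range (written in LaTeX notation):

    \mathrm{CH_4} + 2\,\mathrm{O_2} \rightarrow \mathrm{CO_2} + 2\,\mathrm{H_2O}

Each methane molecule destroyed is replaced by one of carbon dioxide, a much weaker greenhouse gas molecule for molecule, so an oxygenated atmosphere steadily traded a strong greenhouse for a feeble one.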

The problem began when cyanobacteria evolved into the first organisms able to use water in photosynthesis, releasing oxygen into the environment as a waste product. More primitive bacteria depend upon soluble iron or sulfides for use in photosynthesis; the switch to water allowed them to grow almost everywhere that had light and nutrients. Many experts think this happened early in Earth history, between 3.8 and 2.7 billion years ago, in which case some process must have kept the cyanobacteria from destroying the methane greenhouse for hundreds of millions of years. The Caltech researchers, however, find no hard evidence in the rocks to show that the switch to water for photosynthesis occurred prior to 2.3 billion years ago, which is about when the Paleoproterozoic Snowball Earth was triggered.

For cyanobacteria to trigger the rapid onset of a Snowball Earth, they must have had an ample supply of key nutrients like phosphorous and iron. Nutrient availability is why cyanobacterial blooms occur today in regions with heavy agricultural runoff.

Fortunately for the bacteria, Earth 2.3 billion years ago had already entered a moderately cold period, reflected in glacially formed rocks in Canada. Measurements of the magnetization of these Canadian rocks, which the Caltech group published earlier this year, indicate that the glaciers that formed them may have been at middle latitudes, just like the glaciers of the last ice age.

The action of the glaciers, grinding continental material into powder and carrying it into the oceans, would have made the oceans rich in nutrients. Once cyanobacteria evolved this new oxygen-releasing ability, they could feast on this cornucopia, turning an ordinary glaciation into a global one.

"Their greater range should have allowed the cyanobacteria to come to dominate life on Earth quickly and start releasing large amounts of oxygen," Kopp says.

This was bad for the climate because the oxygen destabilized the methane greenhouse. Kopp and Kirschvink's model shows that the greenhouse may have been destroyed in as little as 100,000 years, but almost certainly was eliminated within several million years of the cyanobacteria's evolution into an oxygen-generating organism. Without the methane greenhouse, global temperatures plummeted to -50 degrees Celsius.

The planet went into a glacial period so cold that even equatorial oceans were covered with a mile-thick layer of ice. The vast majority of living organisms died, and those that survived, either underground or at hydrothermal vents and springs, were probably forced into bare subsistence. If correct, the authors note, then an evolutionary accident triggered the world's worst climate disaster.

However, in evolving to cope with the new influx of oxygen, many survivors gained the ability to breathe it. This metabolic process releases abundant energy, and it eventually allowed the evolution of all higher forms of life.

Kirschvink and his lab have earlier shown a mechanism by which Earth could have escaped from its Snowball state. After some tens of millions of years, carbon dioxide would build up to the point that a greenhouse effect took hold again. In fact, the global temperature probably bounced back to +50 degrees Celsius, and the deep-sea vents that provided a refuge for living organisms also had steadily released various trace metals and nutrients. So not only did life return after the ice layers melted, but it did so with a magnificent bloom.

"It was a close call to a planetary destruction," says Kirschvink. "If Earth had been a bit further from the sun, the temperature at the poles could have dropped enough to freeze the carbon dioxide into dry ice, robbing us of this greenhouse escape from Snowball Earth."

Of course, 2.3 billion years is a very long time ago. But the episode points to a grim reality for the human race if conditions ever resulted in another Snowball Earth. We who are living today will never see it, but Kirschvink says that an even worse Snowball Earth could occur if the conditions were again right.

"We could still go into Snowball if we goof up the environment badly enough," he says. "We haven't had a Snowball in the past 630 million years, and because the sun is warmer now it may be harder to get into the right condition. But if it ever happens, all life on Earth would likely be destroyed. We could probably get out only by becoming a runaway greenhouse planet like Venus."

Kirschvink is Caltech's Van Wingen Professor of Geobiology.

The PNAS paper is titled "The Paleoproterozoic Snowball Earth: A Climate Disaster Triggered by the Evolution of Oxygenic Photosynthesis."

Writer: 
Robert Tindol

Caltech Scientists Create Tiny Photon Clock

PASADENA--In a new development that could be useful for future electronic devices, applied physicists at the California Institute of Technology have created a tiny disk that vibrates steadily like a tuning fork while it is pumped with light. This is the first micro-mechanical device that has been operated at a steady frequency by the action of photons alone.

Reporting in recently published issues of the journals Optics Express (July 11) and Physical Review Letters (June 10 and July 11), Kerry Vahala and group members Hossein Rokhsari, Tal Carmon, and Tobias Kippenberg, explain how the tiny, disk-shaped resonator made of silica can be made to vibrate mechanically when hit by laser light. The disk, which is less than the width of a human hair, vibrates about 80 million times per second when its rim is pumped with light.

According to Vahala, who is the Jenkins Professor of Information Science and Technology and Professor of Applied Physics, the effect is due to properties of the disk that allow it to store light very efficiently, and also to the fact that light exerts "radiation pressure." In much the same way that NASA's solar sails will catch photons from the sun to power spaceships to other worlds, the disk builds up light energy so that the disk itself swells.

"The light makes hundreds of thousands of orbits around the rim of the disk," Vahala explains. "This causes the disk to literally stretch, owing to the radiation pressure of the photons."

Once the disk has inflated, its physical properties change so that the light energy is lost, and the disk then deflates. The cycle then repeats itself, and this repetition continues in a very orderly fashion as long as the light is pumped into the disk.

In effect, this repetitive process makes the disk a very efficient clock, somewhat similar to the quartz crystal that is made to vibrate from electrical current for the regulation of a battery-powered wristwatch. The differences between the optically driven clock and the traditional electrical one, however, create a design element that could provide new electro-optic functions within the context of integrated circuits.

The researchers also note that whereas the basic operation of the device can be understood at the classical level, such a device could be used to study interactions between radiation and macroscopic mechanical motion at the quantum level. Several groups have already proposed theoretically using radiation pressure as a mechanism to investigate such interactions.

Also, the device could be of help in designing the next-generation Laser Interferometer Gravitational-Wave Observatory (LIGO). A National Science Foundation-funded project operated by Caltech and MIT, LIGO was created to detect the phenomenon known as gravitational waves, predicted by Einstein decades ago.

LIGO is designed in such a way that laser light bounces between mirrors along a five-mile right-angle circuit. The light is allowed to build up in the two arms of the detector so as to increase the possibility that gravitational waves will eventually be detected from exotic astrophysical objects such as colliding black holes and supernovae.

But designers have been concerned to ensure that the same radiation-pressure-driven instability does not appear in the LIGO system as its sensitivity is boosted. The work by the Vahala group, though at a vastly smaller size scale, therefore could be of help in the current plans for improvement of the LIGO detectors in Hanford, Washington, and Livingston, Louisiana.

"This work demonstrates a mechanism that needs to be understood better," Vahala explains. "It has moved from theory to existence, and that is always exciting."

The paper, "Radiation-pressure-driven micro-mechanical oscillator," appearing in the July 11 issue of the journal Optics Express, is available on-line at http://www.opticsexpress.org/abstract.cfm?URI=OPEX-13-14-5293.

Writer: 
Robert Tindol

KamLAND Detector Provides New Way to Study Heat from Radioactive Materials Within Earth

PASADENA, Calif.--Much of the heat within our planet is caused by the radioactive decay of the elements uranium and thorium. Now, an international team of particle physicists using a special detector in Japan has demonstrated a novel method of measuring that radioactive heat.

In the July 28 issue of the journal Nature, the physicists report on measurements of electron antineutrinos they have detected from within Earth by using KamLAND, the Kamioka Liquid Scintillator Anti-Neutrino Detector. These data indicate that Earth itself generates about 20 billion kilowatts (20 terawatts) of power from underground radioactive decays.

According to Robert McKeown, a physicist at the California Institute of Technology and one of the authors of the paper, the results show that this novel approach to geophysical research is feasible. "Neutrinos and their corresponding antiparticles, antineutrinos, are remarkable for their ability to pass unhindered through large bodies of matter like the entire Earth, and so can give geophysicists a powerful method to access the composition of the planet's interior."

McKeown credits the discovery to the unique KamLAND experimental apparatus. The antineutrino detector was primarily built to study antineutrinos emitted by Japanese nuclear power plants. The KamLAND experiment has already resulted in several breakthroughs in experimental particle physics, including the 2002 discovery that antineutrinos emitted by the power plants do indeed change flavor as they travel through space. This result helped solve a longstanding mystery related to the fact that the number of neutrinos from the sun was apparently too small to be reconciled with our current understanding of nuclear fusion.

The new results turn from nuclear reactors and the sun to the Earth below. To detect geoneutrinos (antineutrinos arising from radioactive decays within the planet), the researchers carefully shielded the detector from background radiation and cosmic sources, and also compensated for the antineutrinos coming from Japan's 53 nuclear power reactors.
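
KamLAND registers these particles via inverse beta decay in its liquid scintillator: an electron antineutrino strikes a proton, producing a positron and a neutron (in LaTeX notation),

    \bar{\nu}_e + p \rightarrow e^{+} + n

The prompt flash from the positron, followed a fraction of a millisecond later by the flash from the neutron's capture, gives a distinctive delayed-coincidence signature that makes it possible to pick rare geoneutrino events out of the background.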

The decays of both uranium and thorium have been well understood for decades, with both decays eventually resulting in stable isotopes of lead. KamLAND is the first detector built with the capability to detect the antineutrinos from these radioactive decays.

The researchers plan to continue running the KamLAND experiments for several years. By reducing the trace residual radioactivity in the detector, they hope to increase the sensitivity of the experiment to geoneutrinos and neutrinos from the sun. The additional data will also allow them to better constrain the oscillation of neutrinos as they change their flavors, and perhaps to catch neutrinos from interstellar space if any supernovae occur in our galaxy.

Other members of McKeown's team at Caltech's Kellogg Radiation Lab are Christopher Mauger, a postdoctoral scholar in physics, and Petr Vogel, a senior research associate emeritus in physics. Other partners in the study include the Research Center for Neutrino Science at Tohoku University in Japan, the University of Alabama, the University of California at Berkeley and the Lawrence Berkeley National Laboratory, Drexel University, the University of Hawaii, the University of New Mexico, the University of North Carolina, Kansas State University, Louisiana State University, Stanford University, Duke University, North Carolina State University, the University of Tennessee, the Institute of High Energy Physics in Beijing, and the University of Bordeaux in France.

The project is supported in part by the U.S. Department of Energy.

Writer: 
Robert Tindol

Mars Has Been in the Deep Freeze for the Past Four Billion Years, Study Shows

PASADENA, Calif.--The current mean temperature on the equator of Mars is a blustery 69 degrees below zero Fahrenheit. Scientists have long thought that the Red Planet was once temperate enough for water to have existed on the surface, and for life to possibly have evolved. But a new study by Caltech and MIT scientists gives this idea the cold shoulder.

In the July 22 issue of the journal Science, Caltech graduate student David Shuster and MIT assistant professor Benjamin Weiss (formerly a Caltech student) report that their studies of Martian meteorites demonstrate that at least several rocks originally located near the surface of Mars have been freezing cold for four billion years. Their work is a novel approach to extracting information on the past climate of Mars through the study of Martian meteorites.

In fact, the evidence shows that during the last four billion years, Mars has likely never been sufficiently warm for liquid water to have flowed on the surface for extended periods of time. This implies that Mars has probably never had a hospitable environment for life to have evolved, unless life could have gotten started during the first half-billion years of its existence, when the planet was probably warmer.

The work involves two of the seven known "nakhlite" meteorites (named after El Nakhla, Egypt, where the first such meteorite was discovered), and the celebrated ALH84001 meteorite that some scientists believe shows evidence of microbial activity on Mars. Using geochemical techniques, Shuster and Weiss reconstructed a "thermal history" for each of the meteorites to estimate the maximum long-term average temperatures to which they were subjected.

"We looked at meteorites in two ways," says Weiss. "First, we evaluated what the meteorites could have experienced during ejection from Mars, 11 to 15 million years ago, in order to set an upper limit on the temperatures in a worst-case scenario for shock-heating."

Their conclusions were that ALH84001 could never have been heated to a temperature higher than 650 degrees Fahrenheit for even a brief period of time during the last 15 million years. The nakhlites, which show very little evidence of shock-damage, were unlikely to have been above the boiling point of water during ejection 11 million years ago.

Although these are still rather high temperatures, the other part of the research addressed the long-term thermal history of the rocks while they resided on Mars. Shuster and Weiss did this by estimating the total amount of argon still remaining in the samples, using data previously published by two teams at the University of Arizona and the NASA Johnson Space Center.

The gas argon is present in the meteorites as well as in many rocks on Earth as a natural consequence of the radioactive decay of potassium. As a noble gas, argon is not very chemically reactive, and because the decay rate is precisely known, geologists for years have measured argon as a means of dating rocks.

However, argon is also known to "leak" out of rocks at a temperature-dependent rate. This means that if the argon remaining in the rocks is measured, an inference can be made about the maximum heat to which the rock has been subjected since the argon was first made. The cooler the rock has been, the more argon will have been retained.

Shuster and Weiss's analysis found that only a tiny fraction of the argon that was originally produced in the meteorite samples has been lost through the eons. "The small amount of argon loss that has apparently taken place in these meteorites is remarkable. Any way we look at it, these rocks have been cold for a very long time," says Shuster. Their calculations suggest that the Martian surface has been in deep-freeze for most of the last four billion years.
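
The temperature dependence of that leak follows the familiar Arrhenius law of diffusion (in LaTeX notation):

    D(T) = D_0 \exp\!\left(-\frac{E_a}{RT}\right)

where D is the diffusivity of argon in the mineral, E_a an activation energy measured in the laboratory, R the gas constant, and T the absolute temperature. Because the dependence is exponential, even a modestly warmer long-term average temperature would have driven off far more argon than is observed, which is what lets a measured argon budget be converted into an upper bound on temperature.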

"The temperature histories of these two planets are truly different. On Earth, you couldn't find a single rock that has been below even room temperature for that long," says Shuster. The ALH84001 meteorite, in fact, couldn't have been above freezing for more than a million years during the last 3.5 billion years of history.

"Our research doesn't mean that there weren't pockets of isolated water in geothermal springs for long periods of time, but suggests instead that there haven't been large areas of free-standing water for four billion years.

"Our results seem to imply that surface features indicating the presence and flow of liquid water formed over relatively short time periods," says Shuster.

On a positive note for astrobiology, however, Weiss says that the new study does nothing to disprove the theory of "panspermia," which holds that life can jump from one planet to another by meteorites. Weiss and his supervising professor at Caltech, Joe Kirschvink (the Van Wingen Professor of Geobiology), several years ago showed that microbes could indeed have traveled from Mars to Earth in the hairline fractures of ALH84001 without having been destroyed by heat. In particular, the fact that the nakhlites have never been heated above about 200 degrees Fahrenheit means that they were not heat-sterilized during ejection from Mars and transfer to Earth.

The title of the new paper is "Martian Surface Paleotemperatures from Thermochronology of Meteorites."

Writer: 
Robert Tindol

Deep Impact: During and After Impact

PALOMAR MOUNTAIN, Calif.--Astronomers using the Palomar Observatory's 200-inch Hale Telescope have been amazed by comet Tempel 1's behavior during and after its collision with the Deep Impact space probe.

In the minutes just after the impact, the comet was seen to increase its near-infrared brightness nearly fivefold. As the event progressed, astronomers at Palomar were able to distinguish jets of material venting from the comet's nucleus that have persisted for days.

Early results from the data, in images taken just minutes after impact, showed a possible plume of dust and gas extending outward some 320 km (200 miles) from the comet's center, roughly coinciding with the site of the probe's final demise.

This apparent dust plume has persisted for several nights, allowing astronomers to watch the comet's slow rotation. The night after impact the plume was on the far side of the comet, but was visible again the next evening as the comet's rotation brought it back into view. Two days after impact, the plume was seen again, this time extending about 200 km (124 miles) from the comet's center. According to Bidushi Bhattacharya of the California Institute of Technology's (Caltech) Spitzer Science Center, "This could be indicative of an outburst of gas and dust still taking place near the region of the impact."

"We are very excited by these results. It is a fabulous time to be studying comets," says James Bauer of the Jet Propulsion Laboratory (JPL). "It will be interesting to see how long the effects of the impact persist," he adds.

The images of the comet, obtained by Bauer and Bhattacharya, were sharper than those from most ground-based telescopes because they used a technique known as adaptive optics. Adaptive optics allows astronomers to correct for the blurring of images caused by Earth's turbulent atmosphere, giving them a view that often surpasses those of smaller telescopes based in space.

Using the adaptive-optics technique to improve an astronomer's view is generally only possible when a bright star is located near the object under study. On the night of impact there was no bright star close enough to the comet to use. Mitchell Troy, the adaptive-optics group lead and Palomar adaptive-optics task manager at JPL, worked with his team to make adaptive-optics corrections anyway. "Through the dedicated efforts of the JPL and Caltech teams we were able to deploy a new sensor that was 25 times more sensitive than our normal sensor. This new sensor allowed us to correct for some of the atmosphere's distortions and significantly improve the view of the comet," says Troy. This improved view allowed astronomers to see the dust and ejected material moving out from the comet's surface immediately following the impact event and again days later.

Earth-based observations from telescopes like the 200-inch at Palomar give astronomers an important perspective on how the comet is reacting to the impact, a perspective that cannot be achieved from the front-row seat of a fly-by spacecraft. Astronomers on the ground have the luxury of long-term observations that may continue to show changes in the comet for weeks to come.

Collaborators on the observations include Paul Weissman (JPL) and the Palomar 200-inch crew. The Caltech adaptive-optics team is made up of Richard Dekany (team leader), Antonin Bouchez, Matthew Britton, Khanh Bui, Alan Morrissett, Hal Petrie, Viswa Velur, and Bob Weber. The JPL Palomar adaptive-optics team includes Mitchell Troy (team leader), John Angione, Sidd Bikkannavar, Gary Brack, Steve Guiwits, Dean Palmer, Ben Platt, Jennifer Roberts, Chris Shelton, Fang Shi, Thang Trinh, Tuan Truong, and Kent Wallace.

The Palomar adaptive-optics instrument was built and continues to be supported by the Jet Propulsion Laboratory as part of a Caltech-JPL collaboration.

Support for the adaptive-optics research at Caltech's Palomar Observatory comes from the Oschin Family Foundation, the Gordon and Betty Moore Foundation and the National Science Foundation Center for Adaptive Optics.

MEDIA CONTACT: Scott Kardel, Palomar Public Affairs Director (760) 742-2111 wsk@astro.caltech.edu

Visit the Caltech media relations web site: http://pr.caltech.edu/media

Images are available at: http://www.astro.caltech.edu/palomarnew/deepimpact.html

Writer: 
SK
