Old Caltech Telescope Yields New Science

PASADENA, Calif. - Meet Sarah Horst, throwback. The planetary science major, a senior at the California Institute of Technology, spent six months engaged in a bit of old-time telescope observing. The work led to some breakthrough research about Saturn's moon Titan, and indirectly led to funding for a new telescope at Caltech's Palomar Observatory.

Horst, 21, was looking for a part-time job in the summer of her sophomore year, and was hired by Mike Brown, an associate professor of planetary astronomy. Brown and graduate student Antonin Bouchez knew there had been previous evidence of "weather" on Titan in the form of clouds. But that evidence was elusive. "Someone would look one year and think they saw a cloud, then look the next year and not see a cloud," explains Brown. "What we were after was a way to look at Titan, night after night after night."

The problem, of course, is that all of the large telescopes like Keck are incredibly busy, booked by astronomers from around the world who use the precious time for their own line of research. So Brown and Bouchez knew that obtaining large amounts of time for a single project like this was not going to happen.

The solution: use an old teaching telescope--the hoary 14-inch Celestron atop Caltech's Robinson Lab--to do cutting-edge science that couldn't be done even at the world's largest telescopes in Hawaii.

The Robinson telescope is small and light pollution from Pasadena is strong, so the clouds themselves could not be imaged. But the light reflecting from the clouds could be measured (the more clouds, the more light is reflected). All that was needed was someone who could come night after night and take multiple images.

Enter Horst, the self-described "lowly undergraduate." For months, Horst spent her evenings in Robinson. "I did the setup, which involved a wheel that contained four light filters," she explains. Each filter captured a different wavelength of light. Software switched the filters; all she had to do, says Horst, was orient and focus the telescope.

Now, modern-day astronomers have it relatively easy when using their telescope time. Sure, they're up all night, but they sit in a comfortable chair in a warm room, hot coffee close at hand, and do their observing through a computer monitor connected to the telescope.

Not Horst. She did it the old way, in discomfort. "A lot of times in December or January I'd go in late at night, and it would be freezing," says Horst, who runs the 800-meter for the Caltech track team. "I'd wrap myself up in blankets." Horst spent hours in the dark, since the old dome itself had to be dark. "I couldn't even study," she says, "although sometimes I tried to read by the light of the moon."

A software program written by Bouchez plotted the light intensity from each image on a graph. When a particular image looked promising, Bouchez contacted Brown. As a frequent user of the Keck Observatory, which is powerful enough to take an image of the actual clouds, Brown was able to call colleagues who were using the Keck that night and quickly convince them that something exciting was going on. "It only took about ten minutes to get a quick image of Titan," says Brown. "The funny part was having to explain to them that we knew there were clouds because we had seen the evidence in our 14-inch telescope in the middle of the L.A. basin."
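
The article describes Bouchez's software only as plotting the light intensity from each image, so the following is a rough sketch of that kind of nightly monitoring, with hypothetical function names and made-up numbers rather than anything from the actual program. The idea is simply to track Titan's brightness relative to a steady reference source and flag nights that stand out.

```python
# Illustrative sketch only -- not Bouchez's actual program.
# Assumes each night yields one measured flux for Titan and one for a steady reference star.
import numpy as np

def relative_brightness(titan_flux, reference_flux):
    """Ratio of Titan's flux to the reference star's, canceling nightly sky conditions."""
    return np.asarray(titan_flux, dtype=float) / np.asarray(reference_flux, dtype=float)

def flag_bright_nights(ratios, n_sigma=3.0):
    """Flag nights where Titan is unusually bright compared with its typical behavior."""
    ratios = np.asarray(ratios, dtype=float)
    baseline = np.median(ratios)
    scatter = 1.4826 * np.median(np.abs(ratios - baseline))  # robust stand-in for sigma
    return ratios > baseline + n_sigma * scatter

# Made-up example: the fifth night shows a brightening worth a follow-up call.
titan = [1.00, 1.02, 0.99, 1.01, 1.25]
ref   = [1.00, 1.01, 1.00, 1.00, 1.00]
print(flag_bright_nights(relative_brightness(titan, ref)))
```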

The result was "Direct Detection of Variable Tropospheric Clouds Near Titan's South Pole," which appeared in the December 19 journal Nature. It included this acknowledgement: "We thank . . . S. Horst for many nights of monitoring Titan in the cold."

The paper has helped Brown obtain funding to build a new, custom-built 24-inch telescope. It will be placed in its own building atop Palomar Mountain, on the grounds of Caltech's existing observatory. It will also be robotic; Brown will control the scope from Pasadena via a computer program he has written.

He'll use it for further observation of Titan and for other imaging as well, such as of fast-moving comets. "Most astronomy is big," notes Brown. "Big scopes looking at big, unchanging things, like galaxies. I like to look at changing things, which led to this telescope."

What really made this project unique, though, according to Brown, is the Robinson scope. "Sarah was able to do something with this little telescope in Pasadena that no one in the world, on any of their larger professional telescopes on high, dark mountaintops, had been able to do," he says. "Sometimes a good idea and stubbornness are better than the largest telescope in town."

For Horst, while the work wasn't intellectually challenging--"a trained monkey could have done it," she says with a laugh--it was, nonetheless, "a cool project. Everything here is so theoretical and tedious, and so classroom oriented. So in that way it was a nice experience and reminded me what real science was about."

Writer: 
MW

Infrared space-based astronomy on front burner for Caltech employees at the SIRTF Science Center

The control room may lack the giant monitor one normally sees in movies about space exploration, but the new Space Infrared Telescope Facility (SIRTF) Science Center is an indication of how much has been achieved in the long quest to make space-based astrophysical observation an everyday affair.

The telescope is NASA's newest triumph in the ongoing quest to understand the universe by observing it from the pristine vantage point of outer space, above the atmosphere and its moisture. Following the Hubble Space Telescope, the Compton Gamma-Ray Observatory, and the Chandra X-Ray Observatory, SIRTF is the fourth and final mission of NASA's Great Observatories Program--though by no means the last time NASA intends to launch an observatory into space.

Caltech's part in the program is hosting the center where observations will be chosen, the satellite's observing schedule programmed, and the scientific data processed and sent on to the scientists involved. In all, about 100 people are employed by the center, which has close ties with the Jet Propulsion Laboratory and the Infrared Processing and Analysis Center (IPAC) next door.

The science center is located in an office building on the south side of the Caltech campus in Pasadena, and the third-floor control room affords the staff a rather nice view of a sunny Southern California boulevard, with palm trees and the San Gabriel Mountains.

Now trailing Earth in a solar orbit slightly larger than Earth's annual orbit around the sun--and thus slightly slower--the telescope will soon begin returning images of the infrared sky as it has never been seen before. There are much larger infrared telescopes on the ground, but unlike SIRTF, they are hampered by water vapor and atmospheric distortion. There have also been infrared instruments in space, but none as big as SIRTF.

With such an instrument, astrophysicists hope to gain new insights about star formation, the centers of galaxies, objects now so far away that they can only be imaged in the infrared, brown dwarfs so cool that they do not emit visible light, and perhaps even the extremely dim large bodies that may exist, hitherto undetected, at the fringes of our own solar system. The telescope will also be able to see light coming through many of the gas and dust clouds that are impenetrable to visible light.

Tom Soifer, the Caltech physics professor who has directed the center since its inception in 1997, admits he was nervous watching SIRTF being launched from Cape Canaveral in late August, but says he is pleased with the way things have gone so far.

In fact, the infrared telescope has already returned its first picture, an image of a star field. In SIRTF's case, the image was part of the "aliveness test" to demonstrate that the three scientific instruments on board all successfully survived the launch and subsequent deployment in solar orbit.

Soifer points out that the image (available at the Web site listed at the bottom of the page) was taken just a week after the telescope was launched. And even though the first image is a star field with no particular scientific interest, the bright, red stars give an inkling of the visual displays that will be made available to the public during SIRTF's five-year mission. In fact, subsequent images will be even better, because SIRTF's telescope wasn't even fully cooled or in focus when the "aliveness" image was snapped.

The imagery of SIRTF should be of ongoing interest to the public, because the remarkable images returned over the last 13 years by the Hubble Space Telescope have been quite popular. Also, the SIRTF operations will have more than a passing resemblance to those of Hubble, Soifer explains.

"SIRTF is very similar to the Hubble operation," he says. "Any astronomer in the world can get observing time if their proposal is selected, and you get grant money to fund your research if you work in the United States.

"Education and public outreach are important aspects of the science center, and we're working closely with JPL on public affairs activities."

Like the Hubble Space Telescope, SIRTF will look at a huge variety of objects, both in our own solar system and far beyond. In addition to the aforementioned items of interest, SIRTF will also look at giant molecular clouds, some of which contain organic molecules, and will look for signatures of planet formation around other stars.

Unlike the Hubble, however, SIRTF will have a more defined mission--or at least a more defined lifespan. The instruments must be cooled to an extremely low temperature with on-board cryogenic liquid, and this means that the bulk of the observations will end when the liquid coolant runs out. Minimizing heat and thereby conserving coolant is the reason for the unique orbit, which keeps the telescope well away from Earth, and conservation of the refrigerant is also the reason for the meticulous computer-aided planning that goes into all operations. The less the telescope has to move and switch between its three main instruments, the longer the coolant will last.
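
The release doesn't describe the planning software itself, but the principle it alludes to, batching observations so the telescope rarely has to switch instruments, can be sketched roughly as below. The request list and helper functions are hypothetical; the instrument names are those of SIRTF's actual instruments (IRAC, IRS, and MIPS).

```python
# Hypothetical sketch of coolant-aware planning: group observing requests by instrument
# so the observatory switches instruments (and spends coolant) as rarely as possible.
from itertools import groupby

def count_switches(schedule):
    """Number of instrument changes in an ordered list of (instrument, target) requests."""
    return sum(1 for _ in groupby(schedule, key=lambda request: request[0])) - 1

def batch_by_instrument(requests):
    """Reorder requests so observations using the same instrument run back to back."""
    return sorted(requests, key=lambda request: request[0])

requests = [("IRAC", "star-forming region"), ("MIPS", "brown dwarf candidate"),
            ("IRAC", "galactic center field"), ("IRS", "molecular cloud"),
            ("MIPS", "Kuiper Belt object")]

print(count_switches(requests))                       # 4 switches in the order submitted
print(count_switches(batch_by_instrument(requests)))  # 2 switches after batching
```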

Soifer expects the mission to last about five years. There's a chance that a few very limited observations could still be done for an additional four years, but SIRTF's mission has a definite beginning, middle, and end.

After that, SIRTF will be some 80 to 90 million miles away from Earth, and the distance will continue to increase. After about 60 years, Earth will catch up with SIRTF, but there would hardly be an advantage in rejuvenating the telescope at that point.

"It wouldn't be worth it," Soifer says. "The technologies are all advancing, so it would be better to go with that generation's technology. So there's a very clear end.

"In fact, the end will be approximately when I get to retirement age," he adds, smiling. "I'm not really sure how I feel about that."

Writer: 
RT

"Bubbloy" the latest invention from Caltech materials scientists

First there was liquid metal, that wondrous substance from Bill Johnson's materials science lab at Caltech that is now used for golf clubs and tennis rackets. Now a couple of Johnson's enterprising grad students have come up with a new invention--liquid metal foam.

According to Chris Veazey, who is working on his doctorate in materials science, the new stuff is a bulk metallic glass that has the stiffness of metal but the springiness of a trampoline. "You can squish it and the metal will spring back," says Veazey, who has given the stuff the tentative name "bubbloy," a combination of "bubble" and "alloy."

Greg Welsh, the co-inventor and also a doctoral student in materials science at Caltech, adds that bubbloy is made possible by a process that foams the alloy so that tiny bubbles form. Preliminary results show that if the bubbles nearly touch, the substance will be especially springy.

"We think it might be especially useful for the crumple zone of a car," says Veazey. "It should make a car safer than one where the structures in the crumple zone are made of conventional metals."

Bubbloy is made of palladium, nickel, copper, and phosphorus. This particular alloy was already known as one of the best bulk metallic glasses around, but Veazey and Welsh's contribution was figuring out how to get the stuff to foam. Other researchers have previously figured out how to foam metals like titanium and aluminum, but bubbloy should have a big advantage in strength-to-weight ratio.

How good is good? Veazey and Welsh's preliminary castings result in bubbloy that is light enough to float in water, yet quite strong and elastic.

"To make it really well is a challenge," Welsh says.

Bubbloy is one of several advances that will be showcased at a September 15 conference at Caltech. The conference, titled "Materials at the Fore," is the third annual meeting of the Center for the Science and Engineering of Materials at Caltech.

The day-long conference will begin with check-in and a continental breakfast at the Beckman Institute Courtyard on the west side of campus. Opening remarks and an overview of the conference will be presented at 8:30 a.m. by center director Julia Kornfield, a professor of chemical engineering at Caltech.

Presentations will include "Nano-scale Mechanical Properties," by Subra Suresh of MIT; "Synthesis and Assembly of Biological Macromolecules: DNA and Beyond," by Steve Quake of Caltech; "Thermoelectric Devices," by Sossina Haile of Caltech, and others.

Attendance is free but requires registration, and reporters are welcome to cover any or all of the presentations. Complete information is available at the Web site http://www.csem.caltech.edu/annrev/index.html.

Reporters who would like to attend are asked to contact Caltech Media Relations in advance.

Writer: 
Robert Tindol

Atmospheric researchers present new findings on the natural hydrogen cycle

Two months after a pivotal study on the potential impact of a future hydrogen economy on the environment, further evidence is emerging on what would happen to new quantities of hydrogen released into the atmosphere through human activity.

In an article appearing in the August 21 issue of the journal Nature, a group of researchers from the California Institute of Technology and other institutions reports results of a study of the atmospheric chemical reactions that produce and destroy molecular hydrogen in the stratosphere. Based on these results, the report concludes that most of the hydrogen eliminated from the atmosphere goes into the ground, and therefore that the scientific community will need to turn its focus toward soil destruction of hydrogen in order to accurately predict whether human emissions will accumulate in the air.

The researchers reached this conclusion through careful measurement of the abundance of a rare isotope of hydrogen known as deuterium. It has long been known that atmospheric molecular hydrogen is anomalously rich in deuterium, but it was unclear why. The only reasonable explanation seemed to be that atmospheric hydrogen is mostly destroyed by chemical reactions in the air, and that those reactions are relatively slow for deuterium-rich hydrogen, so it accumulates like salt in an evaporating pan of water.

If correct, this would mean that oxidizing atmospheric trace gases control the natural hydrogen cycle and that soils are relatively unimportant. The Caltech group discovered that one of the main natural sources of atmospheric hydrogen--the breakdown of methane--is actually responsible for the atmosphere's enrichment in deuterium. This result implies that reactions with atmospheric oxidants are relatively unimportant to the hydrogen cycle, and that uptake by soils is really in the driver's seat.

This issue is important because of the potential for a future hydrogen economy to leak hydrogen into the air--a scenario explored in the earlier study published in Science. Such leaks seem likely, and any leaked hydrogen must either be removed by natural processes that destroy hydrogen or it will accumulate in the atmosphere. In the latter case, this hydrogen would inevitably find its way into the stratosphere and participate in chemical reactions that damage the ozone layer. The key to predicting how this chain of events will unfold is knowing what natural processes destroy hydrogen, and to what extent they might counteract increases in human emissions.

Hydrogen is a highly reactive element, but when, where, and under what circumstances it reacts is difficult to know precisely. The question is simpler in the stratosphere, where it's easier to single out and understand specific reactions. According to John Eiler, an assistant professor of geochemistry at the California Institute of Technology and an author of both the new paper and the June paper in Science, the new data come from air samples collected in the stratosphere by one of the high-flying ER-2 planes operated by the NASA Dryden Flight Research Center in the Mojave Desert.

The ER-2, a reconfigured U-2 spy plane, is part of NASA's Airborne Research Program and is crucial to atmospheric chemists interested in directly collecting stratospheric samples for air-quality research. The air samples collected by the ER-2 in various locales show that there is an extreme enrichment of deuterium in stratospheric hydrogen.

"We wanted to look at hydrogen in the stratosphere because it's easy to study the production of hydrogen from methane separate from other influences," Eiler explains. "It may seem odd to go to the stratosphere to understand what's happening in the ground, but this was the best way to get a global perspective on the importance of soils to the hydrogen cycle."

With precise information on the deuterium content of hydrogen formed from methane, the researchers were able to calculate that the soil uptake of hydrogen is as high as 80 percent. It is suspected that this hydrogen is used by soil-living microbes to carry on their biological functions, although the details of this process are poorly understood and have been the subject of only a few previous studies.
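
The paper's actual calculation isn't reproduced in this release, but the style of reasoning behind such an estimate, a steady-state deuterium budget in which the soil and atmospheric sinks remove heavy hydrogen at different relative rates, can be illustrated with a toy mass balance. The fractionation factors and enrichment below are placeholders, not the published values.

```python
# Toy steady-state deuterium budget for atmospheric H2 (placeholder numbers only).
# At steady state the D/H ratio of the source equals the flux-weighted D/H removed by the sinks:
#   R_source = (f_soil * a_soil + (1 - f_soil) * a_atm) * R_atm
# where a_soil and a_atm are the relative rates at which each sink removes deuterated hydrogen.
# Given the observed enrichment R_atm / R_source, solve for the soil fraction f_soil.

def soil_fraction(enrichment, a_soil, a_atm):
    """Fraction of H2 removed by soils implied by a two-sink steady-state isotope balance."""
    return (1.0 / enrichment - a_atm) / (a_soil - a_atm)

# Placeholder inputs: soils fractionate weakly, atmospheric oxidation strongly.
print(soil_fraction(enrichment=1.10, a_soil=0.95, a_atm=0.60))  # ~0.88 with these made-up values
```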

It seems likely that the hydrogen taken up by soils is relatively free of environmental consequences, but the question still remains how much more hydrogen the soil can consume. If future use of hydrogen in transportation results in a significant amount of leakage, then soil uptake must increase dramatically or it will be inadequate to cleanse the released hydrogen from the atmosphere, Eiler says.

"An analogy would be the discovery that trees and other plants get rid of some of the carbon dioxide that cars emit, but by no means all of it," he says. "So the question as we look toward a future hydrogen economy is whether the microbes will be able to eat the hydrogen fast enough."

The research was funded in part by the National Science Foundation. Bruce Doddridge, program director in the NSF's division of atmospheric science, said, "This carefully conducted research investigating the natural chemistry of sources and sinks affecting the abundance of molecular hydrogen in the troposphere results in the most accurate information to date, and appears to account for the tropospheric deuterium excess previously observed.

"A more accurate molecular hydrogen budget may have important implications as global fuel technology shifts its focus from fossil fuels to other sources," Doddridge added.

The lead author of the paper is Thom Rahn, a former postdoctoral scholar of Eiler's who is now affiliated with Los Alamos National Laboratory. The other authors are Paul Wennberg, a professor of atmospheric chemistry and environmental engineering science at Caltech; Kristie A. Boering and Michael McCarthy, both of UC Berkeley; Stanley Tyler of UC Irvine; and Sue Schauffler of the National Center for Atmospheric Research in Boulder, Colorado.

In addition to the NSF, other supporters of the research were the Davidow Fund and General Motors Corp., the David and Lucile Packard Foundation, the NASA Upper Atmosphere Research Program, and the National Center for Atmospheric Research.

Writer: 
Robert Tindol

Gravity Variations Predict Earthquake Behavior

PASADENA, Calif. — In trying to predict where earthquakes will occur, few people would think to look at Earth's gravity field. What does the force that causes objects to fall to the ground and the moon to orbit around the earth have to do with the unpredictable ground trembling of an earthquake?

Now, researchers at the California Institute of Technology have found that within subduction zones, the regions where one of the earth's plates slips below another, areas where the attraction due to gravity is relatively high are less likely to experience large earthquakes than areas where the gravitational force is relatively low.

The study, by Caltech graduate student Teh-Ru Alex Song and Associate Professor of Geophysics Mark Simons, will appear in the August 1 issue of the journal Science.

Until now, says Simons, researchers studying earthquake behavior have generally taken one of four approaches: 1) analyzing seismograms generated by earthquakes, 2) studying frictional properties of various types of rock in the laboratory or in the field, 3) measuring the slow accumulation of strain between earthquakes with survey techniques, and 4) building large-scale dynamic models of earthquakes and tectonics.

Instead of using one of these approaches, Song and Simons considered variations in the gravity field as a predictor of seismic behavior.

A gravity anomaly occurs when gravity is stronger or weaker than the regional average. For example, a mountain or an especially dense rock would tend to increase the nearby gravity field, creating a positive anomaly. Likewise, a valley would tend to create a negative anomaly.

Song and Simons examined existing data from satellite-derived observations of the gravity field in subduction zones. Comparing variations in gravity along the trenches with earthquake data from two different catalogs going back 100 years, the team found that, within a given subduction zone, areas with negative gravity anomalies correlated with increased large earthquake activity. Areas with relatively high gravity anomalies experienced fewer large earthquakes.

In addition, most of the energy released in earthquakes was in areas of low gravity. The team looked at subduction zone earthquakes with magnitude greater than 7.5 since 1976. They found that of the total energy released in those earthquakes, 44 percent came from regions with the most strongly negative gravity anomalies, though these regions made up only 14 percent of the total area.
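
That 44-percent/14-percent comparison is essentially a flux-weighted tally over trench segments. A minimal sketch of the bookkeeping, with made-up inputs rather than the catalogs used in the study, might look like this.

```python
# Illustrative bookkeeping (made-up numbers): what fraction of seismic energy
# came from the most strongly negative gravity-anomaly segments of a trench?
import numpy as np

gravity_anomaly = np.array([-80., -60., -10., 20., 50., 70.])   # mGal, one value per segment
seismic_energy  = np.array([ 9.0,  5.0,  3.0, 1.5, 1.0, 0.5])   # arbitrary energy units

# "Lowest-gravity" segments: the most negative quarter of the anomaly distribution.
most_negative = gravity_anomaly <= np.percentile(gravity_anomaly, 25)
area_fraction   = most_negative.mean()                           # segments assumed equal in area
energy_fraction = seismic_energy[most_negative].sum() / seismic_energy.sum()

print(f"{energy_fraction:.0%} of the energy from {area_fraction:.0%} of the area")
```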

Song and Simons also compared the location of large earthquakes with the topography of the subduction zones, finding that areas of low topography (such as basins) also corresponded well to areas with low gravity and high seismic activity.

So why would gravity and topography be related to seismic activity?

One possible link is via the frictional behavior of the fault. When two plates rub up against each other, friction between the plates makes it harder for them to slide. If the friction is great enough, the plates will stick. Over long periods of time, as the stuck plates push against each other, they may deform, creating spatial variations in topography and gravity.

In addition to deforming the plates, friction causes stress to build up. When too much stress builds up, the plates will suddenly jump, releasing the strain in the sometimes violent shaking of an earthquake.

If there were no friction between the plates, they would just slide right by each other smoothly, without bending or building up the strain that eventually results in earthquakes.

So in subduction zones, areas under high stress are likely to have greater gravity and topography anomalies, and are also more likely to have earthquakes.

Though this account provides a basic explanation for a rather complicated and unintuitive phenomenon, it is a simplified view, and Song and Simons would like to do more work to refine the details of the relation between the gravity field and large earthquakes.

The gravity anomalies the team considered take a long time to build up, and change very little over timescales up to at least 1 million years. Short-term events such as earthquakes do change the gravity field as the earth's plates suddenly move, but those variations are small compared with the long-term anomalies, which are on the order of 4 x 10^-4 m/s^2.

Because topography and gravity variations persist over periods of time much longer than the typical time between earthquakes, 100 to 1,000 years, large earthquakes should be consistently absent from areas with large positive gravity anomalies, say Song and Simons.

"This study makes a strong connection between long-term tectonic behavior and short-term seismic activity," says Simons, "and thereby provides a class of new observations for understanding earthquake dynamics."

Though no one can tell when or where the next major earthquake will occur, Global Positioning System measurements can show where strain is accumulating. Simons hopes to use such measurements to test the prediction that areas with high gravity will have low strain, and vice versa. The team points out that although large earthquakes occur where gravity and topography are low, there are low-gravity areas in subduction zones with no seismic activity. Furthermore, the research concentrates on subduction zones, and so makes no predictions about other types of faults.

Nonetheless, within a subduction zone known to be earthquake-prone, Simons believes earthquakes are more likely to occur in low-gravity zones, while high-gravity areas tend to have few earthquakes. So while the research does not offer a way to predict where earthquakes will happen, it can predict where they won't happen, says Simons.

MEDIA CONTACT: Ernie Tretkoff (626) 395-8733 tretkoff@caltech.edu

Visit the Caltech media relations web site: http://pr.caltech.edu/media

Writer: 
ET

New Sky Survey Begins at Palomar Observatory

Photos available at http://www.astro.caltech.edu/palomar/oschin_telescope.htm

PALOMAR Mountain, Calif. — A major new sky survey has begun at the Palomar Observatory. The Palomar-QUEST survey, a collaborative venture between the California Institute of Technology, Yale University, the Jet Propulsion Laboratory, and Indiana University, will explore the universe from our solar system out to the most distant quasars, more than 10 billion light-years away.

The survey will be done using the newly refurbished 48-inch Oschin Telescope, originally used to produce major photographic sky atlases starting in the 1950s. At its new technological heart is a very special, fully digital camera. The camera contains 112 digital imaging detectors, known as charge-coupled devices (CCDs); the largest astronomical camera until now had 30 CCDs. CCDs are used for digital imaging in everything from common snapshot cameras to sophisticated scientific instruments. Designed and built by scientists at Yale and Indiana Universities, the QUEST (Quasar Equatorial Survey Team) camera was recently installed on the Oschin Telescope.

"We are excited by the new data we are starting to obtain from the Palomar Observatory with the new QUEST camera," says Charles Baltay, Higgins Professor of Physics and Astronomy at Yale University. Baltay's dream of building a large electronic camera that could capture the entire field of view of a wide-field telescope is now a reality.

The survey will generate astronomical data at an unprecedented rate, about one terabyte per month; a terabyte is a million megabytes, an amount of information approximately equivalent to that contained in two million books. In two years, the survey will generate an amount of information roughly equal to that in the entire Library of Congress.

A major new feature of the Palomar-QUEST survey will be many repeated observations of the same portions of the sky, enabling researchers to find not only objects that move (like asteroids or comets), but also objects that vary in brightness, such as supernova explosions, variable stars, quasars, or cosmic gamma-ray bursts--and to do this at an unprecedented scale.

"Previous sky surveys provided essentially digital snapshots of the sky", says S. George Djorgovski, professor of astronomy at Caltech. "Now we are starting to make digital movies of the universe." Djorgovski and his team, in collaboration with the Yale group, are also planning to use the survey to discover large numbers of very distant quasars--highly luminous objects believed to be powered by massive black holes in the centers of young galaxies--and to use them to probe the early stages of the universe.

Richard Ellis, Steele Professor of Astronomy and director of the Caltech Optical Observatories, will use QUEST in the search for exploding stars, known as supernovae. He and his team, in conjunction with the group from Yale, will use their observations of these exploding stars in an attempt to confirm or deny the recent finding that our universe is accelerating as it expands.

Shri Kulkarni, MacArthur Professor of Astronomy and Planetary Science at Caltech, studies gamma-ray bursts, the most energetic stellar explosions in the cosmos. They are short-lived and unpredictable, and when a gamma-ray burst is detected, its exact location in the sky is uncertain. The automated Oschin Telescope, armed with the QUEST camera's wide field of view, is poised and ready to pin down the exact location of these explosions, allowing astronomers to catch and study the fading glows of gamma-ray bursts as they occur.

Closer to home, Caltech associate professor of planetary astronomy Mike Brown is looking for objects at the edge of our solar system, in the icy swarm known as the Kuiper Belt. Brown is convinced that there are big objects out there, possibly as big as the planet Mars. He, in collaboration with astronomer David Rabinowitz of Yale, will use QUEST to look for them.

Steve Pravdo, project manager for the Jet Propulsion Laboratory's Near-Earth Asteroid Tracking (NEAT) Project, will use QUEST to continue the NEAT search, which began in 2001. The QUEST camera will extend the search for asteroids that might one day approach or even collide with our planet.

The Palomar-QUEST survey will undoubtedly enable many other kinds of scientific investigations in the years to come. The intent is to make all of the copious data publicly available on the Web in due time, as part of the nascent National Virtual Observatory. Roy Williams, a member of the professional staff of Caltech's Center for Advanced Computing Research, is working on the National Virtual Observatory project, which will greatly increase the scientific impact of the data and ease its use for public and educational outreach as well.

The QUEST team members from Indiana University are Jim Musser, Stu Mufson, Kent Honeycutt, Mark Gebhard, and Brice Adams. Yale University's team includes Charles Baltay, David Rabinowitz, Jeff Snyder, Nick Morgan, Nan Ellman, William Emmet, and Thomas Hurteau. The members from the California Institute of Technology are S. George Djorgovski, Richard Ellis, Ashish Mahabal, and Roy Williams. The Near-Earth Asteroid Tracking team from the Jet Propulsion Laboratory consists of Raymond Bambery, principal investigator, and coinvestigators Michael Hicks, Kenneth Lawrence, Daniel MacDonald, and Steven Pravdo.

Installation of the QUEST camera at the Palomar Observatory was overseen by Robert Brucato, Robert Thicksten, and Hal Petrie.

Related Link: Palomar Observatory http://www.astro.caltech.edu/palomar

MEDIA CONTACT: Scott Kardel, Palomar Public Affairs Director (760) 742-2111 wsk@astro.caltech.edu

Visit the Caltech media relations web site: http://pr.caltech.edu/media

Writer: 
SK

A Detailed Map of Dark Matter in a Galactic Cluster Reveals How Giant Cosmic Structures Formed

Astrophysicists have had an exceedingly difficult time charting the mysterious stuff called dark matter that permeates the universe because it's--well--dark. Now, a unique "mass map" of a cluster of galaxies shows in unprecedented detail how dark matter is distributed with respect to the shining galaxies. The new comparison gives a convincing indication of how dark matter figures into the grand scheme of the cosmos.

Using a technique based on Einstein's theory of general relativity, an international group of astronomers led by Jean-Paul Kneib, Richard Ellis, and Tommaso Treu of the California Institute of Technology mapped the mass distribution of a gigantic cluster of galaxies about 4.5 billion light-years from Earth. They did this by studying the way the cluster bends the light from other galaxies behind it. This technique, known as gravitational lensing, allowed the researchers to infer the mass contribution of the dark matter, even though it is otherwise invisible.

Clusters of galaxies are the largest stable systems in the universe and ideal "laboratories" for studying the relationship between the distributions of dark and visible matter. Caltech's Fritz Zwicky realized in 1937 from studies of the motions of galaxies in the nearby Coma cluster that the visible component of a cluster--the stars in galaxies--represents only a tiny fraction of the total mass. About 80 to 85 percent of the matter is invisible.

In a campaign of over 120 hours of observations using the Hubble Space Telescope, the researchers surveyed a patch of sky almost as large as the full moon, which contained the cluster and thousands of more distant galaxies behind it. The distorted shapes of these distant systems were used to map the dark matter in the foreground cluster. The study achieved a new level of precision, not only for the center of the cluster, as has been done before for many systems, but also for the previously uncharted outlying regions.

The result is the most comprehensive study to date of the distribution of dark matter and its relationship to the shining galaxies. Signals were traced as far out as 15 million light-years from the cluster center, a much larger range than in previous investigations.

Many researchers have tried to perform these types of measurements with ground-based telescopes, but the technique relies heavily on measuring the exact shapes of distant galaxies behind the cluster, and for this the "surgeon's eye" of the Hubble Space Telescope is far superior.

The study, to be published soon in the Astrophysical Journal, reveals that the density of dark matter falls fairly sharply with distance from the cluster center, defining a limit to its distribution and hence the total mass of the cluster. The falloff in density with radius confirms a picture that has emerged from detailed computer simulations in recent years.

Team member Richard Ellis said, "Although theorists have predicted the distribution of dark matter in clusters from numerical simulations based on the effects of gravity, this is the first time we have convincing observations on large scales to back them up.

"Some astronomers had speculated clusters might contain large reservoirs of dark matter in their outermost regions," Ellis added. "Assuming our cluster is representative, this is not the case."

In finer detail, the team noticed that some structure emerged from their map of the dark matter. For example, they found localized concentrations of dark matter associated with galaxies known to be slowly falling into the system. Overall, there is a striking correspondence between features in the dark matter map and the structure delineated by the cluster galaxies, an important result of the new study.

"The close association of dark matter with structure in the galaxy distribution is convincing evidence that clusters like the one studied built up from the merging of smaller groups of galaxies, which were prevented from flying away by the gravitational pull of their dark matter," says Jean-Paul Kneib, who is the lead author in the publication.

Future investigations will extend this work using Hubble's new camera, the Advanced Camera for Surveys (ACS), which will be trained on a second cluster later this year. ACS is 10 times more efficient than the Wide Field and Planetary Camera 2, which was used for this investigation. With the new instrument, it will be possible to study finer clumps of mass in galaxy clusters in order to investigate how the clusters originally were assembled.

By tracing the distribution of dark matter in the most massive structures in the universe using the powerful trick of gravitational lensing, astronomers are making great progress toward a better understanding of how such systems were assembled, as well as toward defining the key role of dark matter.

In addition to Kneib, Ellis, and Treu, the other team members are Patrick Hudelot of the Observatoire Midi-Pyrénées in France, Graham P. Smith of Caltech, Phil Marshall of the Mullard Radio Observatory in England, Oliver Czoske of the Institut für Astrophysik und Extraterrestrische Forschung in Germany, Ian Smail of the University of Durham in England, and Priya Natarajan of Yale University.

For more information, please contact:

Jean-Paul Kneib Caltech/Observatoire Midi-Pyrénées (currently in Hawaii) Phone: (808) 881-3865 E-mail: jean-paul.kneib@ast.obs-mip.fr

Richard Ellis Caltech Phone: (626) 395-4970 (secretary) (Australia: Cellular: 011-44-7768-923277) E-mail: rse@astro.caltech.edu

Writer: 
RT

International Teams Set New Long-range Speed Record with Next-generation Internet Protocol

Scientists at the California Institute of Technology (Caltech) and the European Organization for Nuclear Research (CERN) have set a new Internet2 land speed record using the next-generation Internet protocol IPv6. The team sustained a single-stream TCP rate of 983 megabits per second for more than one hour between the CERN facility in Geneva and Chicago, a distance of more than 7,000 kilometers. This is equivalent to transferring a full CD in 5.6 seconds.
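
As a quick sanity check on that comparison (assuming a roughly 680-megabyte CD, a detail the release doesn't specify), the arithmetic works out as follows.

```python
# Sanity check of the "full CD in 5.6 seconds" figure, assuming a ~680 MB CD.
rate_bits_per_second = 983e6
cd_size_bytes = 680e6
transfer_time = cd_size_bytes * 8 / rate_bits_per_second
print(f"{transfer_time:.1f} seconds")  # about 5.5 s, consistent with the quoted 5.6 s
```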

The performance is remarkable because it overcomes two important challenges:

· IPv6 forwarding at gigabit-per-second speeds
· High-speed TCP performance across high-bandwidth, high-latency networks.

This major step towards demonstrating how effectively IPv6 can be used should encourage scientists and engineers in many sectors of society to deploy the next-generation Internet protocol, the Caltech researchers say.

This latest record by Caltech and CERN is a further step in an ongoing research-and-development program to develop high-speed global networks as the foundation of next-generation data-intensive grids. Caltech and CERN also hold the current Internet2 land speed record in the IPv4 class, where IPv4 is the traditional Internet protocol that carries 90 percent of the world's network traffic today. In collaboration with the Stanford Linear Accelerator Center (SLAC), Los Alamos National Laboratory, and the companies Cisco Systems, Level 3, and Intel, the team transferred one terabyte of data across 10,037 kilometers in less than one hour, from Sunnyvale, California, to Geneva, Switzerland. This corresponds to a sustained TCP rate of 2.38 gigabits per second for more than one hour.

Multi-gigabit-per-second IPv4 and IPv6 end-to-end network performance will lead to new research and business models. People will be able to form "virtual organizations" of planetary scale, flexibly sharing their collective computing and data resources. In particular, this is vital for projects on the frontiers of science and engineering, such as particle physics, astronomy, bioinformatics, global climate modeling, and seismology.

Harvey Newman, professor of physics at Caltech, said, "This is a major milestone towards our dynamic vision of globally distributed analysis in data-intensive, next-generation high-energy physics (HEP) experiments. Terabyte-scale data transfers on demand, by hundreds of small groups and thousands of scientists and students spread around the world, is a basic element of this vision; one that our recent records show is realistic. IPv6, with its increased address space and security features, is vital for the future of global networks, and especially for organizations such as ours, where scientists from all world regions are building computing clusters on an increasing scale, and where we use computers, including wireless laptops and mobile devices, in all aspects of our daily work.

"In the future, the use of IPv6 will allow us to avoid network address translations (NAT) that tend to impede the use of video-advanced technologies for real-time collaboration," Newman added. "These developments also will empower the broader research community to use peer-to-peer and other advanced grid architectures in support of their computationally intensive scientific goals."

Olivier Martin, head of external networking at CERN and manager of the DataTAG project, said, "These new records clearly demonstrate the maturity of IPv6 protocols and the availability of suitable off-the-shelf commercial products. They also establish the feasibility of transferring very large amounts of data using a single TCP/IP stream rather than multiple streams as has been customarily done until now by most researchers as a quick fix to TCP/IP's congestion avoidance algorithms. I am optimistic that the various research groups working on this issue will now quickly release new TCP/IP stacks having much better resilience to packet losses on long-distance multi-gigabit-per-second paths, thus allowing similar or even better records to be established across shared Internet backbones."

The team used the optical networking capabilities of LHCnet, DataTAG, and StarLight, and gratefully acknowledges support from the DataTAG project sponsored by the European Commission (EU Grant IST-2001-32459), the DOE Office of Science, High Energy and Nuclear Physics Division (DOE Grants DE-FG03-92-ER40701 and DE-FC02-01ER25459), and the National Science Foundation (Grants ANI-9730202, ANI-0230967, and PHY-0122557).

About the California Institute of Technology (Caltech):

With an outstanding faculty, including four Nobel laureates, and such off-campus facilities as Palomar Observatory and the W. M. Keck Observatory, the California Institute of Technology is one of the world's major research centers. The Institute also conducts instruction in science and engineering for a student body of approximately 900 undergraduates and 1,000 graduate students who maintain a high level of scholarship and intellectual achievement. Caltech's 124-acre campus is situated in Pasadena, California, a city of 135,000 at the foot of the San Gabriel Mountains, approximately 30 miles inland from the Pacific Ocean and 10 miles northeast of the Los Angeles Civic Center. Caltech is an independent, privately supported university, and is not affiliated with either the University of California system or the California State Polytechnic universities. More information is available at http://www.caltech.edu.

About CERN:

CERN, the European Organization for Nuclear Research, has its headquarters in Geneva, Switzerland. At present, its member states are Austria, Belgium, Bulgaria, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, and the United Kingdom. Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission, and UNESCO have observer status. For more information, see http://www.cern.ch.

About the European Union DataTAG project:

DataTAG is a project co-funded by the European Union, the U.S. Department of Energy, and the National Science Foundation. It is led by CERN together with four other partners. The project brings together the following leading European research agencies: Italy's Istituto Nazionale di Fisica Nucleare (INFN), France's Institut National de Recherche en Informatique et en Automatique (INRIA), the UK's Particle Physics and Astronomy Research Council (PPARC), and the Netherlands' University of Amsterdam (UvA). The DataTAG project is closely associated with the European Union DataGrid project, the largest grid project in Europe, also led by CERN. For more information, see http://www.datatag.org.

 

Writer: 
Robert Tindol

Hydrogen economy might impact Earth's stratosphere, study shows

According to conventional wisdom, hydrogen-fueled cars are environmentally friendly because they emit only water vapor -- a naturally abundant atmospheric gas. But leakage of the hydrogen gas that can fuel such cars could cause problems for the upper atmosphere, new research shows.

In an article appearing this week in the journal Science, researchers from the California Institute of Technology report that the leaked hydrogen gas that would inevitably result from a hydrogen economy, if it accumulates, could indirectly cause as much as a 10-percent decrease in atmospheric ozone. The researchers are physics research scientist Tracey Tromp, assistant professor of geochemistry John Eiler, planetary science professor Yuk Yung, planetary science research scientist Run-Lie Shia, and Jet Propulsion Laboratory scientist Mark Allen.

If hydrogen were to replace fossil fuel entirely, the researchers estimate that 60 to 120 trillion grams of hydrogen would be released each year into the atmosphere, assuming a 10-to-20-percent loss rate due to leakage. This is four to eight times as much hydrogen as is currently released into the atmosphere by human activity, and would result in doubling or tripling of inputs to the atmosphere from all sources, natural or human.
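
Both ends of that range follow from the same implied total: roughly 600 trillion grams of hydrogen fuel handled per year, a back-calculation from the figures above rather than a number stated in the study.

```python
# Back-calculation: 60-120 trillion grams of leaked H2 per year corresponds to an implied
# throughput of ~600 trillion grams per year at the assumed 10-20 percent loss rates.
implied_throughput_grams_per_year = 600e12
for leak_rate in (0.10, 0.20):
    leaked = implied_throughput_grams_per_year * leak_rate
    print(f"{leak_rate:.0%} leakage -> {leaked / 1e12:.0f} trillion grams per year")
```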

Because molecular hydrogen freely moves up and mixes with stratospheric air, the result would be the creation of additional water at high altitudes and, consequently, a moistening of the stratosphere. This in turn would cool the lower stratosphere and disturb ozone chemistry, which depends on a chain of chemical reactions involving hydrochloric acid and chlorine nitrate on water ice.

The estimates of potential damage to stratospheric ozone levels are based on an atmospheric modeling program that tests the various scenarios that might result, depending on how much hydrogen ends up in the stratosphere from all sources, both natural and anthropogenic.

Ideally, a hydrogen fuel-cell vehicle has no environmental impact. Energy is produced by combining hydrogen with oxygen pulled from the atmosphere, and the tailpipe emission is water. The hydrogen fuel could come from a number of sources (Iceland recently started pulling it out of the ground). Nuclear power could be used to generate the electricity needed to split water, and in principle, the electricity could also be derived from renewable sources such as solar or wind power.

By comparison, the internal combustion engine uses fossil fuels and produces many pollutants, including soot, noxious nitrogen and sulfur gases, and the "greenhouse gas" carbon dioxide. While a hydrogen fuel-cell economy would almost certainly improve urban air quality, it has potential unintended consequences due to the inevitable leakage of hydrogen from cars, from hydrogen production facilities, and from the transportation of the fuel.

Uncertainty remains about the effects on the atmosphere because scientists still have a limited understanding of the hydrogen cycle. At present, it seems likely such emissions could accumulate in the air. Such a build-up would have several consequences, chief of which would be a moistening and cooling of the upper atmosphere and, indirectly, destruction of ozone.

In this respect, hydrogen would be similar to the chlorofluorocarbons (once the standard substance used for air conditioning and refrigeration), which were intended to be contained within their devices, but which in practice leaked into the atmosphere and attacked the stratospheric ozone layer.

The authors of the Science article say that the current situation is unique in that society has the opportunity to understand the potential environmental impact well ahead of the growth of a hydrogen economy. This contrasts with the cases of atmospheric carbon dioxide, methyl bromide, CFCs, and lead, all of which were released into the environment by humans long before their consequences were understood.

"We have an unprecedented opportunity this time to understand what we're getting into before we even switch to the new technology," says Tromp, the lead author. "It won't be like the case with the internal-combustion engine, when we started learning the effects of carbon dioxide decades later."

The question of whether or not hydrogen is bad for the environment hinges on whether the planet has the ability to consume excess anthropogenic hydrogen, explains Eiler. "This man-made hydrogen will either be absorbed in the soil -- a process that is still poorly understood but likely free of environmental consequences -- or react with other compounds in the atmosphere.

"The balance of these two processes will be key to the outcome," says Eiler. "If soils dominate, a hydrogen economy might have little effect on the environment. But if the atmosphere is the big player, the stratospheric cooling and destruction of ozone modeled in this Science paper are more likely to occur.

"Determining which of these two processes dominates should be a solvable problem," states Eiler, whose research group is currently exploring the natural budget of hydrogen using new isotopic techniques.

"Understanding the effects of hydrogen on the environment now should help direct the technologies that will be the basis of a hydrogen economy," Tromp adds. "If hydrogen emissions present an environmental hazard, then recognizing that hazard now can help guide investments in technologies to favor designs that minimize leakage.

"On the other hand, if hydrogen is shown to be environmentally friendly in every respect, then designers could pursue the most cost-effective technologies and potentially save billions in needless safeguards."

"Either way, it's good for society that we have an emission scenario at this stage," says Eiler. "In past cases -- with chlorofluorocarbons, nitrogen oxides, methane, methyl bromide, carbon dioxide, and carbon monoxide -- we always found out that there were problems long after they were in common use. But this time, we have a unique opportunity to study the anthropogenic implications of a new technology before it's even a problem."

If hydrogen indeed turns out to be bad for the ozone layer, should the transition to hydrogen-fueled cars be abandoned? Not necessarily, Tromp and Eiler claim.

"If it's the best way to provide a new energy source for our needs, then we can, and probably should, do it," Tromp says.

Eiler adds, "If we had had perfect foreknowledge of the effects of carbon dioxide a hundred years ago, would we have abandoned the internal combustion engine? Probably not. But we might have begun the process of controlling CO2 emissions earlier."

Contact: Robert Tindol (626) 395-3631

Writer: 
RT

Astronomers "weigh" pulsar's planets

For the first time, the planets orbiting a pulsar have been "weighed" by precisely measuring variations in the time it takes them to complete an orbit, according to a team of astronomers from the California Institute of Technology and Pennsylvania State University.

Reporting at the summer meeting of the American Astronomical Society, Caltech postdoctoral researcher Maciej Konacki and Penn State astronomy professor Alex Wolszczan announced today that the masses of two of the three known planets orbiting a rapidly spinning pulsar 1,500 light-years away in the constellation Virgo have been successfully measured. The planets are 4.3 and 3.0 times the mass of Earth, with an error of 5 percent.

The two measured planets are nearly in the same orbital plane. If the third planet is co-planar with the other two, it is about twice the mass of the moon. These results provide compelling evidence that the planets must have evolved from a disk of matter surrounding the pulsar, in a manner similar to that envisioned for planets around sun-like stars, the researchers say.

The three pulsar planets, with their orbits spaced in an almost exact proportion to the spacings between Mercury, Venus, and Earth, comprise a planetary system that is astonishingly similar in appearance to the inner solar system. They are clearly the precursors to any Earth-like planets that might be discovered around nearby sun-like stars by the future space interferometers such as the Space Interferometry Mission or the Terrestrial Planet Finder.

"Surprisingly, the planetary system around the pulsar 1257+12 resembles our own solar system more than any extrasolar planetary system discovered around a sun-like star," Konacki said. "This suggests that planet formation is more universal than anticipated."

The first planets orbiting a star other than the sun were discovered by Wolszczan and Dale Frail around an old, rapidly spinning neutron star, PSR B1257+12, during a large search for pulsars conducted in 1990 with the giant, 305-meter Arecibo radio telescope. Neutron stars are often observable as radio pulsars because they reveal themselves as sources of highly periodic, pulse-like bursts of radio emission. They are extremely compact and dense leftovers from supernova explosions that mark the deaths of massive, normal stars.

The exquisite precision of millisecond pulsars offers a unique opportunity to search for planets and even large asteroids orbiting a pulsar. This "pulsar timing" approach is analogous to the well-known Doppler method so successfully used by optical astronomers to identify planets around nearby stars. Essentially, the orbiting object induces a reflex motion in the pulsar, which perturbs the arrival times of its pulses. However, just like the Doppler method, which is sensitive only to stellar motions along the line of sight, pulsar timing can only detect pulse arrival-time variations caused by a pulsar wobble along that same line. The consequence of this limitation is that one can only measure a projection of the planetary motion onto the line of sight and cannot determine the true size of the orbit.
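
As a rough illustration of that projection effect (a back-of-the-envelope calculation, not the analysis in the paper), the timing amplitude for a planet on a circular orbit scales with the planet's mass and with the sine of the orbital inclination, so timing alone pins down only their product.

```python
# Back-of-the-envelope timing amplitude for a planet on a circular orbit around a pulsar.
# The pulsar's wobble projected onto the line of sight delays or advances pulses by up to
#   tau ~ (m_planet / M_pulsar) * a * sin(i) / c,
# so the measured amplitude constrains only m_planet * sin(i).
import math

G, c = 6.674e-11, 2.998e8                 # SI units
M_sun, M_earth = 1.989e30, 5.972e24

M_pulsar = 1.4 * M_sun                    # typical neutron-star mass (an assumption)
m_planet = 4.3 * M_earth                  # one of the measured planets; pairing with the
P = 66.5 * 86400.0                        # 66.5-day orbit is assumed here for illustration

a = (G * M_pulsar * P**2 / (4 * math.pi**2)) ** (1.0 / 3.0)  # Kepler's third law
for inclination_deg in (90, 60, 30):
    tau = (m_planet / M_pulsar) * a * math.sin(math.radians(inclination_deg)) / c
    print(f"i = {inclination_deg:2d} deg -> amplitude ~ {tau * 1e3:.2f} ms")
```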

Soon after the discovery of the planets around PSR B1257+12, astronomers realized that the heavier two must interact gravitationally in a measurable way, because of a near 3:2 commensurability of their 66.5- and 98.2-day orbital periods. Because the magnitude and exact pattern of the perturbations resulting from this near-resonance depend on the mutual orientation of the planetary orbits and on the planet masses, one can, in principle, extract this information from precise timing observations.

Wolszczan showed the feasibility of this approach in 1994 by demonstrating the presence of the predicted perturbation effect in the timing of the pulsar. In fact, it was the first observation of such an effect beyond the solar system, in which resonances between planets and planetary satellites are commonly observed. In recent years, astronomers have also detected examples of gravitational interactions between giant planets around normal stars.

Konacki and Wolszczan applied the resonance-interaction technique to the microsecond-precision timing observations of PSR B1257+12 made between 1990 and 2003 with the giant Arecibo radio telescope. In a paper to appear in the Astrophysical Journal Letters, they demonstrate that the planetary perturbation signature detectable in the timing data is large enough to obtain surprisingly accurate estimates of the masses of the two planets orbiting the pulsar.

The measurements accomplished by Konacki and Wolszczan remove the possibility that the pulsar planets are much more massive, which would be the case if their orbits were oriented more "face-on" with respect to the sky. In fact, these results represent the first unambiguous identification of Earth-sized planets created from a protoplanetary disk beyond the solar system.

Wolszczan said, "This finding and the striking similarity of the appearance of the pulsar system to the inner solar system provide an important guideline for planning the future searches for Earth-like planets around nearby stars."

Contact: Robert Tindol (626) 395-3631

Writer: 
RT
