Odor recognition is a patterned, time-dependent process, research shows

PASADENA, Calif.-When Hamlet told the courtiers they would eventually "nose out" the hidden corpse of Polonius, he was perhaps a better neurobiologist than he realized. According to research by neuroscientists at the California Institute of Technology, the brain creates and uses subtle temporal codes to identify odors.

This research shows that the signals carried by certain neuron populations change over the duration of a sniff: one first gets a general notion of the type of odor, and then the circuitry connecting these neurons refines the signal, leading to a more subtle discrimination and, ultimately, precise recognition of the smell.

In the February 2 issue of the journal Science, Caltech biology and computation and neural systems professor Gilles Laurent and his colleague, postdoctoral scholar Rainer W. Friedrich, now at the Max Planck Institute in Heidelberg, Germany, report that the neurons of the olfactory bulb respond to an odor through a complicated process that evolves over a brief period of time. These neurons, called mitral cells because they resemble miters, the pointed hats worn by bishops, are found by the thousands in the olfactory bulb of humans.

"We're interested in how ensembles of neurons encode sensory information," explains Laurent, lead author of the study. "So we're less interested in where the relevant neurons lie, as revealed by brain mapping studies, than in the patterns of firing these neurons produce and in figuring out from these patterns how recognition, or decoding, works."

The researchers chose to use zebrafish in the study because these animals have comparatively few mitral cells and because much is already known about the types of odors that are behaviorally relevant to them. The Science study likely applies to other animals, including humans, because the olfactory systems of most living creatures appear to follow the same basic principles.

After placing electrodes in the brain of individual fish, the researchers subjected them sequentially to 16 amino-acid odors. Amino acids, the components of proteins, are found in the foods these fish normally go after in their natural environments.

By analyzing the signals produced by a population of mitral cells in response to each one of these odors, the researchers found that the information they could extract about the stimulus became more precise as time went by. The finding was surprising because the signals extracted from the neurons located upstream of the mitral cells, the receptors, showed no such temporal evolution.

"It looks as if the brain actively transforms static patterns into dynamic ones and in so doing, manages to amplify the subtle differences that are hard to perceive between static patterns," Laurent says.

"Music may provide a useful analogy. Imagine that the olfactory system is a chain of choruses-a receptor chorus, feeding onto a mitral-cell chorus and so on-and that each odor causes the receptor chorus to produce a chord.

"Two similar odors evoke two very similar chords from this chorus, making discrimination difficult to a listener," Laurent says. "What the mitral-cell chorus does is to transform each chord it hears into a musical phrase, in such a way that the difference between these phrases becomes greater over time. In this way, odors that, in this analogy, sounded alike, can progressively become more unique and more easily identified."

Applied to our own experience, this result could be described as follows: When we detect a citrus smell in a garden, for example, the odor is first conveyed by the receptors and the mitral cells. The initial firing of the cells allows for little more than the generic detection of the citrus nature of the smell.

Within a few tenths of a second, however, this initial activity causes new mitral cells to be recruited, leading the pattern of activity to change rapidly and become more unique. This quickly allows us to determine whether the citrus smell is actually a lemon or an orange.

However, the individual mitral cells first stimulated by the citrus odor do not themselves become more specifically tuned. Instead, the manner in which the firing patterns unfold through the lateral circuitry of the olfactory bulb is ultimately responsible for the fine discrimination of the odor.

"Hence, as the system evolves, it loses information about the class of odors, but becomes able to convey information about precise identity," says Laurent. This study furthers progress toward understanding the logic of olfactory coding.

Writer: 
Robert Tindol

Caltech/MIT Issue Voting Technology Report to Florida Task Force

PASADENA, Calif.- The Caltech/MIT Voting Technology Project has submitted a preliminary report to the task force studying the election in Florida. Their nationwide study of voting machines offers further evidence supporting the task force's call to replace punch card voting in Florida. The statistical analysis also uncovered a more surprising finding: electronic voting, as currently implemented, has performed less well than was widely believed.

The report examines the effect of voting technologies on unmarked and/or spoiled ballots. Researchers from both universities are collaboratively studying five voting technologies: paper ballots with hand-marked votes, lever machines, punch cards, optical scanning devices, and direct-recording electronic devices (DREs), which are similar to automatic teller machines.

The study focuses on so-called "undervotes" and "overvotes," which are combined into a group of uncounted ballots called "residual votes." These include ballots with votes for more than one candidate, with no vote, or that are marked in a way that is uncountable.

Careful statistical analysis shows that there are systematic differences across these technologies, and that paper ballots, optical scanning devices, and lever machines have significantly lower residual voting rates than punch-card systems and DREs. Overall, the residual voting rate for the first three systems averages about 2 percent, and for the last two systems averages about 3 percent.

This study is the most extensive analysis ever of the effects of voting technology on under- and overvotes. It covers the entire country for all presidential elections since 1988 and examines variations at the county level. When the study is complete, it will encompass presidential elections going back to 1980, a finer breakdown of the different technologies, and a breakdown of residual votes into their two components, overvotes and undervotes. A final report will be released in June.

The Voting Technology Project was the brainchild of California Institute of Technology president David Baltimore in collaboration with Massachusetts Institute of Technology president Charles Vest. It was announced in December 2000, and faculty from both campuses immediately began collecting data and studying the range of voting methods across the nation in the hope of avoiding, in future elections, the vote-counting chaos that followed the 2000 presidential election.
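The residual-vote comparison above amounts to a simple rate calculation, sketched here with hypothetical ballot totals chosen to match the reported averages (the actual study uses nationwide county-level returns):

```python
# Toy illustration of the "residual vote" metric described above.
# Ballot counts are hypothetical, not figures from the study.
ballots = {
    # technology: (total ballots cast, uncounted "residual" ballots)
    "paper": (100_000, 1_900),
    "lever": (100_000, 2_000),
    "optical scan": (100_000, 2_100),
    "punch card": (100_000, 3_000),
    "DRE": (100_000, 3_100),
}

def residual_rate(cast, residual):
    """Residual vote rate: uncounted ballots as a share of ballots cast."""
    return residual / cast

rates = {tech: residual_rate(*counts) for tech, counts in ballots.items()}
for tech, rate in sorted(rates.items(), key=lambda kv: kv[1]):
    print(f"{tech:>12}: {rate:.1%}")
```

With these illustrative counts, the first three technologies cluster around 2 percent and the last two around 3 percent, mirroring the pattern the study reports.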

The analysis is complicated by the fact that voting systems vary from county to county and across time. When a voting system is switched, say from lever machines to DREs, the number of residual votes can go up due to voter unfamiliarity with the new technology.

"We don't want to give the impression that electronic systems are necessarily inaccurate, but there is much room for improvement," said Thomas Palfrey, Caltech professor of economics and political science.

"Electronic voting technology is in its infancy and seems the most likely one to benefit significantly from new innovations and increased voter familiarity," states the 11-page report.

Caltech Contact: Jill Perry Media Relations (626) 395-3226 jperry@caltech.edu

MIT Contact: Kenneth Campbell News Office (617) 253-2700 kdc@mit.edu

Writer: 
JEP

Caltech and MIT Join Forces to Create Reliable, Uniform Voting System

Caltech and MIT are joining forces to develop a United States voting system that will be easy to use, reliable, secure and modestly priced, the presidents of the two universities announced today.

"America needs a uniform balloting procedure," said Caltech President David Baltimore and MIT President Charles M. Vest in a letter to Carnegie Corporation of New York President Vartan Gregorian, who is recommending that the corporation fund the initial research. "This has become painfully obvious in the current national election, but the issue is deeper and broader than one series of events."

Vest and Baltimore said the new technology "should minimize the possibility of confusion about how to vote, and offer clear verification of what vote is to be recorded. It should decrease to near zero the probability of miscounting votes... The voting technology should be tamper resistant and should minimize the prospect of manipulation and fraud."

The two university presidents proposed that their prestigious institutions give the project high priority for two major reasons:

"First, the technologies in wide use today are unacceptably unreliable. This manifests itself in at least three forms: undercounts (failure to correctly record a choice of candidate), overcounts (voting for two candidates), and missed ballots (machine failure or feeding error). Punch cards and optically scanned ballots are two of the most widely used technologies, and both suffer unacceptably high error rates in all three categories. For example, in the recent Florida election, optical scanning technology had an undercount rate of approximately 3 out of 1,000, and the punch card undercount rate was approximately 15 out of 1,000. Including the other two sources of errors, the overall ballot failure rate with machine counting was about three times this.

"Second, some of the most common types of machinery date from the late nineteenth century and have become obsolete. Most notably, many models of lever machines are no longer manufactured, and although spare parts are difficult to obtain, they are still widely used (accounting for roughly 15 percent of all ballots cast).

"States and municipalities using lever machines will have to replace them in the near future, and the two most common alternatives are punch cards and optical scanning devices. Ironically, many localities in Massachusetts have recently opted for lever machines over punch card ballots because of problems with punch cards registering preferences. "
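The error rates quoted in the presidents' letter can be checked with simple arithmetic. The "about three times" multiplier for the overall failure rate is applied here exactly as stated; the letter does not give a precise breakdown of the other error sources:

```python
# Undercount rates quoted for the Florida election, per ballot.
optical_undercount = 3 / 1000    # optical scanning: ~3 per 1,000
punch_undercount = 15 / 1000     # punch cards: ~15 per 1,000

# The letter states that, including overcounts and missed ballots, the
# overall machine-count failure rate was about three times the undercount.
overall_optical = 3 * optical_undercount
overall_punch = 3 * punch_undercount

print(f"optical scan: {overall_optical:.1%} overall failure rate")
print(f"punch card:   {overall_punch:.1%} overall failure rate")
```

That works out to roughly 0.9 percent overall failure for optical scanning and 4.5 percent for punch cards under the stated assumptions.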

Gregorian, of the Carnegie Corporation of New York, will recommend a grant of $250,000 to fund the initial research.

"I want to congratulate the two presidents of our nation's most distinguished universities for their leadership in this welcome and timely initiative on behalf of our election system," said Gregorian. "Voting is the fundamental safeguard of our democracy and we have the technological power to ensure that every person's vote does count. MIT and Caltech have assembled a team of America's top technology and political science scholars to deal with an issue no voter wants ignored. This research is certain to ensure that America's voting process is strengthened."

The project will involve a team of two professors from each university who are experts in technology, design and political science. Initially, they will be charged with defining the problem, surveying experiences with existing voting devices, and making a preliminary analysis of a variety of technological approaches. They will also set goals and create a plan for full-scale research and development of the new system.

The four members of the team are Massachusetts Institute of Technology Professors Stephen Ansolabehere of political science and Nicholas Negroponte, chairman of the Media Lab; and Caltech Professors Thomas Palfrey of political science and economics and Jehoshua Bruck of computation and neural systems and electrical engineering. Vest and Baltimore noted their credentials:

"Professors Ansolabehere and Palfrey have deep knowledge of the U.S. electoral process, have facility in tapping into what is known about existing voting systems and equipment, and have expertise in performing the statistical analyses that will be an integral part of the study. Equally important, any system that is suggested must be politically acceptable. Professors Ansolabehere and Palfrey will undertake a consultative process with the various relevant levels of government to find a solution with a reasonable likelihood of acceptance.

"Professor Negroponte and his MIT Media Lab colleagues and Professor Bruck at Caltech combine capabilities in technology, cognitive science, and graphic design. They can assess the various voting schemes that are currently available and, if necessary, design a new system that fulfills the technological and political needs of a fair voting process."

###

Caltech Contact: Robert Tindol Media Relations (626) 395-3631

MIT Contact: Kenneth Campbell News Office (617) 253-2700

Writer: 
jep

New planets still being created in our stellar neighborhood, study shows

In a study that strengthens the likelihood that solar systems like our own are still being formed, an international team of scientists is reporting today that three young stars in the sun's neighborhood have the raw materials necessary for the formation of Jupiter-sized planets.

Data obtained from the European Space Agency's Infrared Space Observatory (ISO) indicate for the first time that molecular hydrogen is present in the debris disks around young nearby stars. The results are important because experts have long thought that primordial hydrogen—the central building block of gas giants such as Jupiter and Saturn—is no longer present in sufficient quantities in the sun's stellar vicinity to form new planets.

The paper appears in the January 4 issue of the journal Nature.

"We looked at only three stars, but the results could indicate that it's easier to make Jupiter-sized planets than previously thought," said Geoffrey Blake, professor of cosmochemistry at the California Institute of Technology and corresponding author of the study. "There are over 100 candidate debris disks within about 200 light-years of the sun, and our work suggests that many of these systems may still be capable of making planets."

The abundance of Jupiter-sized planets is good news, though indirectly, in the search for extraterrestrial life. A gas giant such as Jupiter may not be particularly hospitable for the formation of life, but experts think the mere presence of such huge bodies in the outer reaches of a solar system protects smaller rocky planets like Earth from catastrophic comet and meteor impacts. A Jupiter-sized planet possesses a gravitational field sufficient to kick primordial debris into the farthest reaches of the solar system, as Jupiter has presumably done by sending perhaps billions of comets into the Oort Cloud beyond the orbit of Pluto and safely away from Earth.

If comets and meteors were not ejected by gas giants, Blake said, life on Earth and any other Earth-like planets in the universe could periodically be "sterilized" by impacts.

"A comet the size of Hale-Bopp, for example, would vaporize much of Earth's oceans if it hit there," Blake said. "The impact from a 500 km object (about ten times the size of Hale-Bopp) could create nearly 100 atmospheres of rock vapor, the heat from which can evaporate all of the Earth's oceans."

The researchers did not directly detect any planets in the study, but nonetheless found that molecular hydrogen was abundant in all three disks they targeted. In the disk surrounding Beta Pictoris, a Southern Hemisphere star that formed about 20 million years ago, approximately 60 light-years from Earth, the team found evidence that hydrogen is present in a quantity at least one-fifth the mass of Jupiter, or about four Neptunes' worth of material.

The debris disk of the star 49 Ceti, which is visible near the celestial equator in the constellation Cetus, was found to contain hydrogen in a quantity at least 40 percent of the mass of Jupiter. Saturn's mass is just under a third that of Jupiter. 49 Ceti, which is about 10 million years old, is about 200 light-years from Earth.

Best of all was a 10-million-year-old Southern Hemisphere star about 260 light-years away that goes by the rather unpoetic name HD135344. That star's surrounding debris disk was found to contain the equivalent of at least six Jupiter masses of molecular hydrogen.

"There may not be enough material to form Jupiters around Beta Pictoris or 49 Ceti, but our figures establish a lower limit that is well within the gas-giant planet range, which means we definitely detected a fair amount of gas. And there could be more," Blake said. "Around HD135344, there's at least enough material to make six Jupiters."

The study reveals not only that there is still sufficient molecular hydrogen to make gas giants, but also that planetary formation is not limited to a narrow time frame in the early history of a star, as previously thought. Because molecular hydrogen is quite difficult to detect from ground-based observatories, experts have relied on measurements of the more easily detectable carbon monoxide (CO) to model the gas dynamics of developing solar systems.

But because results showed that CO tends to dissipate quite rapidly in the early history of debris disks, researchers assumed that molecular hydrogen was likewise absent. Further, the presumed lack of hydrogen limited the time that Jupiter-sized planets could form. However, the new study, coupled with recent theoretical models, shows that CO is not a particularly good tracer of the total gas mass surrounding a new star.

Blake said the study opens new doors to the understanding of planetary growth processes around sun-like stars. He and his colleagues anticipate further progress when the Space Infrared Telescope Facility (SIRTF) and the Stratospheric Observatory for Infrared Astronomy (SOFIA) are launched in 2002. SIRTF, which will have its science headquarters at Caltech, alone could detect literally hundreds of stars that still contain enough primordial hydrogen in their debris disks to form Jupiter-sized planets.

The other authors of the paper are professor of astronomy Ewine F. van Dishoeck and Wing-Fai Thi (the study's lead author), both of Leiden University in the Netherlands; Jochen Horn and professor Eric Becklin, both of the UCLA Department of Physics and Astronomy; Anneila Sargent, professor of astronomy at Caltech; Mario van den Ancker of the Harvard-Smithsonian Center for Astrophysics; and Antonella Natta of the Osservatorio Astrofisico di Arcetri in Firenze, Italy.

Writer: 
RT

Caltech, Agere Systems scientists develop technique to shrink memory chips

Researchers at the California Institute of Technology and Agere Systems, formerly known as the Microelectronics Group of Lucent Technologies, have developed a technique that could result in a new generation of reliable nanoscale memory chips. This research could lead to smaller, less expensive cellular phones and digital cameras.

The research development, announced December 13 at the International Electron Devices Meeting, applies to a type of memory called "flash" memory, which continues to store information even when the devices are turned off. This information could include personal phone directories in a cellular phone or the pictures captured by a digital camera. In a typical cellular phone, there are 16 to 32 million bits of data stored on a silicon flash memory chip. Each bit of data is stored in a part of the flash memory chip called a "cell." As the size of silicon memory chips decreases, the chips are more and more difficult to make leakproof, resulting in the loss of stored data.

Using an aerosol technique developed at Caltech, the researchers formed memory cells by spraying silicon nanocrystals through a bath of high-temperature oxygen gas. The end result was memory cells consisting of a silicon core with a silicon dioxide outer shell. The silicon nanocrystals store the electrical charge, whereas the insulating silicon dioxide shell makes the nanocrystal memory cells more leakproof.

"As compared to conventional flash memories, these silicon nanocrystal memories offer higher performance, simpler fabrication processes, and greater promise for carrying memory miniaturization to its ultimate limit," said Harry Atwater, professor of applied physics and materials science at Caltech and project director.

To overcome the potential leakage problem, Atwater and Richard Flagan, McCollum Professor of Chemical Engineering, and their students at Caltech, and colleagues Jan de Blauwe and Martin Green at Agere Systems developed a method to break up each memory cell into 20,000 to 40,000 smaller cells. Therefore, even if several of the smaller cells spring a leak, the vast majority of the charge will not be lost and the bit of data stored in the whole memory cell will be retained.
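The redundancy argument can be sketched numerically. The nanocrystal count below falls in the reported 20,000-40,000 range, but the per-crystal leak probability is an illustrative assumption, not a measured value:

```python
import random

random.seed(0)  # make the simulation repeatable

N_CRYSTALS = 30_000   # nanocrystals storing one bit (study: 20,000-40,000)
LEAK_PROB = 0.001     # hypothetical chance any single nanocrystal leaks

# Simulate which nanocrystals lose their charge; the stored bit survives
# as long as the bulk of the total charge remains in place.
leaked = sum(1 for _ in range(N_CRYSTALS) if random.random() < LEAK_PROB)
charge_retained = 1 - leaked / N_CRYSTALS

print(f"{leaked} of {N_CRYSTALS} nanocrystals leaked; "
      f"{charge_retained:.2%} of the charge retained")
```

Even with dozens of leaking nanocrystals, well over 99 percent of the charge remains, which is the sense in which a few leaks cannot erase the bit.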

The aerosol approach has several advantages over the conventional lithographic techniques used to make today's flash memory cells. Because it requires fewer steps, it is less expensive and takes less time to produce. In addition, the aerosol approach will allow researchers to continue making smaller and smaller devices.

So far, the researchers have created extremely robust flash memory cells. For instance, they have charged and discharged a single cell for one million cycles without significant degradation, whereas with traditional silicon chips, 10,000 cycles is considered satisfactory. While these research results are promising, it is premature to predict if or when the technology will be commercially implemented.

In addition to Atwater and Flagan, other members of the Caltech nanocrystal memory team are postdoctoral scholar Mark Brongersma, and graduate students Elizabeth Boer, Julie Casperson, and Michele Ostraat.

The research was supported by funding from the National Science Foundation and NASA.

Writer: 
RT

New research shows that the ears can sometimes trick the eyes

Though common sense suggests that vision is the dominant human sense, a new study by California Institute of Technology researchers shows that auditory signals can sometimes trick test subjects into misinterpreting what they have seen.

In a new study appearing in the Dec. 14 issue of the journal Nature, Caltech psychophysicists Ladan Shams, Yukiyasu Kamitani, and Shinsuke Shimojo report that auditory information can alter the perception of accompanying visual information, even when the visual input is otherwise unambiguous.

"We have discovered a visual illusion that is induced by sound," the authors write in the paper. Using a computer program that presents very brief blips of light accompanied by beeps, the researchers asked test subjects to determine whether there were one or two flashes.

However, unknown to the subjects, the number of flashes did not match the number of beeps in some trials. When the subjects were shown a single flash accompanied by one beep, everyone correctly stated that they had seen one flash. But when they were shown a single flash with two very quick beeps spaced about 50 milliseconds apart, the subjects all erroneously reported that they had seen two flashes.

What's more, test subjects who were told that there was actually only one flash still continued to perceive two flashes when they heard two beeps.

According to Shimojo, a professor of biology at Caltech, the effect works only if the beeps are very rapid. When they are, "there's no way within the time window for vision to tell whether there's a single or double flash," he says.
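The timing constraint Shimojo describes can be sketched as a trial schedule. Only the roughly 50-millisecond beep spacing comes from the study; the flash duration and the visual integration window are illustrative assumptions:

```python
# Schedule of one "illusory flash" trial, in milliseconds after trial start.
FLASH_DURATION = 10      # hypothetical duration of the single flash
BEEP_GAP = 50            # spacing between the two beeps (per the study)
VISUAL_WINDOW = 100      # hypothetical window of visual temporal integration

events = [
    (0, "flash on"),
    (0, "beep 1"),
    (FLASH_DURATION, "flash off"),
    (BEEP_GAP, "beep 2"),
]

# If both beeps arrive inside a single window of visual temporal
# integration, vision alone cannot arbitrate between one flash and two.
beep_times = [t for t, label in events if label.startswith("beep")]
beep_span = max(beep_times) - min(beep_times)
print(f"beep span: {beep_span} ms; "
      f"inside visual window: {beep_span <= VISUAL_WINDOW}")
```

The point of the sketch is that both beeps fit comfortably inside one visual time window, which is when audition, not vision, ends up determining the perceived number of flashes.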

According to Shams, a postdoctoral scholar working in Shimojo's lab and lead author of the paper, the results contribute to a shift in our view of visual processing from one "that is independent of other modalities, toward one that is more intertwined with other modalities, and can get as profoundly influenced by signals of other modalities as it influences them."

Contact: Robert Tindol (626) 395-3631

Writer: 
RT

Sequencing of Arabidopsis genome will have huge payoffs, Caltech plant geneticist says

Whether or not the man was right when he said a mustard seed can move mountains, a poorer cousin of mustard named Arabidopsis has just been certified one of the heavy lifters of 21st-century biology.

With today's announcement that the international effort to sequence the Arabidopsis genome has been completed, plant biologists now have a powerful tool that is a triumph for biology as well as world agriculture, says Caltech plant geneticist Elliot Meyerowitz.

"Anything you learn in Arabidopsis is easily applied to crop plants," says Meyerowitz, in whose Caltech lab the first cloning and sequencing of an Arabidopsis gene took place.

"With knowledge from the genome sequencing, you might be able to make crops more resistant to disease and other plant problems," he said. "Fifty percent of all pre- and postharvest losses are due to pests, so if you could solve these problems, you could double the efficiency of world agriculture."

Arabidopsis is a nondescript weed of the mustard family that has a thin 6-inch-long stem, small green leaves, and tiny white blooms when it flowers. With no commercial, medicinal, decorative, or other practical uses, the plant is hardly even worth grubbing out of the flower bed when it springs up in its various habitats around the world.

But for geneticists, Arabidopsis is the powerhouse of the plant world. It is easy to plant and grow, maturing in a couple of weeks; it is small and thus requires little precious lab space; it is easy to clone and sequence its genes; and it produces plenty of seeds very quickly so that future generations—mutants and otherwise—can be studied. And now, Arabidopsis is the only plant species whose genome has been totally sequenced.

"Arabidopsis took off in the 1980s after it was demonstrated it has a very small genome, which makes it easier to clone genes," said Meyerowitz, a longtime supporter of and adviser to the international Arabidopsis genome project.

"One reason the plant was chosen was because it doesn't have that much DNA," he said. "Arabidopsis has about 125 million base pairs in the entire genome—and that's 20 times smaller than the human genome, and thus about 20 times less expensive to sequence. It's been a bargain."

The sequencing of the plant genome was originally proposed in 1994 for a 2004 completion, but experts later realized the project could be completed four years early—and under budget.

"Everybody shared the cost, and everybody will share the benefits—all the information is in the public domain," Meyerowitz says. "Taxpayers got a big bargain."

Sequencing Arabidopsis has benefits for the understanding of basic biological mechanisms, in much the same way that sequencing the roundworm or fruit fly has benefits. As a consequence of evolution, all organisms on Earth share a huge number of genes.

Thus, the information obtained from sequencing Arabidopsis as well as fruit flies and roundworms will contribute to advances in understanding how the genes of all living organisms are related. These underlying genetic interactions, in turn, will eventually lead to new treatments of human disease as well as the genetic engineering of agricultural products.

In addition to making crops more disease- and pest-resistant, genetic engineering could also change the time of flowering so that crops could be fitted to new environments; make plants more resistant to temperature changes; and possibly lengthen the roots so that plants could make more efficient use of nutrients.

Also, approximately one-fourth of all medicines were originally derived from plants, Meyerowitz says. So better understanding of the enzymes that create these pharmaceutical products could be used for creating new drugs as well as making existing drugs better and more efficient.

Contact: Robert Tindol (626) 395-3631

Writer: 
RT

Human brain employs the same neurons in seeing an object and later imagining it

In a study of nine epilepsy patients awaiting brain surgery, researchers have discovered that humans use the same neurons to conjure up mental images that they use when they see the real object with their eyes.

In the November 16 issue of the journal Nature, UCLA neurosurgeon and neuroscientist Itzhak Fried and Caltech neuroscientists Christof Koch and Gabriel Kreiman report on results obtained by questioning nine patients who had been fitted with brain sensors. The patients, all suffering from severe epilepsy uncontrolled with drugs, were being observed for a period of 1-2 weeks so that the regions of their brains responsible for their seizures could be identified and later surgically removed.

During their extended hospital stay, the patients were asked to look at photos of famous people such as President Clinton, pictures of animals, abstract drawings, and other images. While they were looking at the images, the researchers noted the precise neurons that were active.

Then, the subjects were instructed to close their eyes and vividly imagine the images. Again, the researchers took note of the neurons active at the time of visual imagery.

Analysis of the data showed that a subset of neurons in the hippocampus, amygdala, entorhinal cortex, and parahippocampal gyrus would fire both when the patient looked at the image, as well as when he or she imagined the image.

The results build upon previous work by Fried's group showing that single neurons in the human brain are involved in memory and can respond selectively to a wide variety of visual stimuli and stimulus features such as facial expression and gender.

According to Koch, a professor of computation and neural systems at Caltech, the study helps settle long-standing questions about the nature of human imagery. Particularly, the research sheds light on the process at work when humans see things with the "mind's eye."

"If you try to recall how many sunflowers there are in the Van Gogh painting, there is something that goes on in your head that gives rise to this visual image," Koch says. "There has been an ongoing debate about whether the brain areas involved in perception during 'vision with your eyes' are the same ones used during visual imagery."

The problem has been difficult to address because the techniques that yield very precise results in animals are generally not suitable for humans, and because the brain imaging techniques suitable for humans are not very precise, Koch says. Such techniques can image only large portions of the brain, each containing on the order of one million very diverse nerve cells.

"Recording the activity of single cells allows us to investigate the neuronal correlates of visual awareness at a detailed level of temporal and spatial resolution," says Kreiman.

The work was supported by the National Institutes of Health, the National Science Foundation, and the Center for Consciousness Studies at the University of Arizona.

Contact: Robert Tindol (626) 395-3631

Writer: 
RT

New results on Martian meteorite support hypothesis that life can jump between planets

According to one version of the "panspermia" theory, life on Earth could originally have arrived here by way of meteorites from Mars, where conditions early in the history of the solar system are thought to have been more favorable for the creation of life from nonliving ingredients. The only problem has been how a meteorite could get blasted off of Mars without frying any microbial life hitching a ride.

But new research on the celebrated Martian meteorite ALH84001 shows that the rock never got hotter than 105 degrees Fahrenheit during its journey from the Red Planet to Earth, even during the impact that ejected it from Mars or during its plunge through Earth's atmosphere before landing on Antarctic ice thousands of years ago.

In the October 27 issue of the journal Science, Caltech graduate student Benjamin Weiss, undergraduate student Francis Macdonald, geobiology professor Joseph Kirschvink, and their collaborators at Vanderbilt and McGill universities explain results they obtained when testing several thin slices of the meteorite with a new state-of-the-art device known as an Ultra-High Resolution Scanning Superconducting Quantum Interference Device Microscope (UHRSSM). The machine, developed by Franz Baudenbacher and other researchers at Vanderbilt, is designed to detect microscopic differences in the orientation of magnetic lines in rock samples, with a sensitivity up to 10,000 times greater than that of existing machines.

"What's exciting about this study is that it shows the Martian meteorite made it from the surface of Mars to the surface of Earth without ever getting hot enough to destroy bacteria, or even plant seeds or fungi," says Weiss, the lead author of the Science paper. "Other studies have suggested that rocks can make it from Mars to Earth in a year, and that some living organisms can live in space for several years. So the transfer of life is quite feasible."

The meteorite ALH84001 has been the focus of numerous scientific studies in the last few years because some scientists think there is tantalizing evidence of fossilized life within the rock. The issue has never been conclusively resolved, but Weiss says the matter is not important to the present result.

"In fact, we don't think that this particular meteorite brought life here," says Weiss. "But computer simulations of ejected Martian meteorites demonstrate that about one billion tons of rocks have been brought to Earth from Mars since the two planets formed." Many of these rocks make the transit in less than one year, although ALH84001 took about 15 million years.

"The fact that at least one out of the 16 known Martian rocks made it here without heating tells us that this must be a fairly common process," says Kirschvink.

The sample the Kirschvink team worked with is about 1 mm thick and 2 cm in length and somewhat resembles the African continent, with one side containing a portion of the original surface of the meteorite. Using the UHRSSM, the team found that the sample has a highly aligned and intense magnetic field near the surface, which is to be expected because the surface reached a high temperature when it entered Earth's atmosphere.

This matters because any weakly magnetized rock, once heated above a critical temperature and then cooled, reorients its magnetization to align with the local field direction. That critical temperature, characteristic of each magnetic material, is known as the blocking temperature. The outer surface layer of meteorite ALH84001 therefore reached temperatures well above the blocking temperatures of its magnetic materials, which caused the materials at the surface to realign with Earth's magnetic field.

However, the interior portions of the slice were found to have randomly oriented magnetization, which means that some of the materials inside the meteorite have never reached their blocking temperatures at any time since before the rock left the Martian surface. Further, when the researchers gently heated another slice taken from the interior of the meteorite, they discovered that the interior of the rock started to demagnetize at temperatures as low as 40 degrees Celsius—or 104 degrees Fahrenheit—thus demonstrating that it had never been heated even to that level.
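The temperature bound rests on simple unit arithmetic. As a quick check of the conversion behind that figure (an illustrative sketch, not part of the original study):

```python
# Unit check for the temperature bound quoted above: the interior of
# ALH84001 began to demagnetize at about 40 degrees Celsius.
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9.0 / 5.0 + 32.0

demag_onset_c = 40.0  # demagnetization onset reported in the article
print(c_to_f(demag_onset_c))  # 104.0
```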

Thus, a radiation-resistant organism able to survive without energy and water for a year could have made the journey from Mars to Earth. Examples of such hardy organisms, like the bacteria Bacillus subtilis and Deinococcus radiodurans, are already well known.

"Realistically, we don't think any life forms more complicated than single-celled bacterial spores, very tough fungal spores, or well-protected seeds could have made it," Kirschvink says. "They would also have had to go into some kind of dormant stage."

Though the study does not directly address the issue of life in meteorites, the authors say the results eliminate a major objection to the panspermia theory—that any life form reaching Earth by meteorite would have been heat-sterilized during the violent ejection of the rock from its parent planet or during entry into Earth's atmosphere. Prior studies have already shown that a meteorite can enter Earth's atmosphere without its inner material becoming hot.

"ALH 84001 has stimulated a remarkable amount of research to test the hypothesis that life exists elsewhere than on Earth. The present study indicates that the temperature inside the meteorite could have allowed life to persist and possibly travel to Earth from Mars," says Nobel Prize-winning biologist Baruch Blumberg, who is director of NASA's Astrobiology Institute.

The results also demonstrate that critical information could be lost if rocks brought back from Mars by a sample return mission were heat-sterilized, as has been proposed. Thermal sterilization would destroy valuable magnetic, biological, and petrological information contained in the samples.

If life ever evolved on Mars, it is likely to have jumped repeatedly to Earth over geological time. Because the reverse process—the transfer of Earth life to Mars—is dynamically much more difficult, it may be more important to instead protect any Martian biosphere from Earthly microbes.

According to Kirschvink, "The Martian biosphere, if it ever evolved, would most likely have been brought to Earth billions of years ago, and could have participated in the evolution and diversification of bacterial life here.

"So there is at least a chance that we are in part descended from Martian microbes," Kirschvink says.

The ALH84001 research was funded in part by NASA's Astrobiology Institute, an international research consortium involving academic, non-profit and NASA field centers, whose central administrative office is located at NASA's Ames Research Center in California's Silicon Valley. A group from the Jet Propulsion Laboratory in Pasadena, CA, which sponsored the Caltech research, is one of the 11 lead teams of the institute.

Contact: Robert Tindol (626) 395-3631

Writer: RT

LIGO establishes "first lock"

HANFORD, Wash.—Scientists from the Laser Interferometer Gravitational-wave Observatory (LIGO) announced today that they established "first lock" at the detector near Hanford, Washington.

The operational milestone marked the first time that the LIGO detector at Hanford, Washington, has simultaneously sent laser light back and forth along its two arms—each one and a quarter miles long—thereby achieving the delicate optical interference that will make the detection of gravitational waves possible. This feat, dubbed "first lock" or "locking the interferometer," is similar to the "first light" of a newly commissioned telescope.

LIGO, a joint project of Caltech and MIT, is being built as a national research facility for detecting gravitational waves in the universe. LIGO comprises three detectors in the United States—two at Hanford and one near Livingston, Louisiana. The detectors will work in concert to detect gravitational waves, which are distortions of space-time caused by accelerating masses, such as exploding stars or vibrating black holes.

The LIGO detectors are set up in such a way that the very slight distortions of space-time in the vicinity of the detector's arms will cause perpendicular laser beams to go out of phase. Two observatory sites hundreds of miles apart are necessary to get a direction for the event causing the gravitational waves, and also to ascertain that the signals come from space and are not some local phenomenon.
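The direction-finding works by timing: a wave crossing the two sites arrives at one slightly before the other, and the delay constrains the source's bearing. A minimal sketch of the scale involved, assuming a site separation of roughly 3,000 kilometers (the baseline figure is an assumption for illustration, not taken from the release):

```python
# Maximum arrival-time difference between two detector sites, for a
# gravitational wave traveling along the line joining them.
speed_of_light_m_s = 3.0e8     # gravitational waves travel at light speed
site_separation_m = 3.0e6      # assumed inter-site baseline, ~3,000 km
max_delay_s = site_separation_m / speed_of_light_m_s
print(max_delay_s)  # 0.01, i.e. about 10 milliseconds
```

Shorter measured delays correspond to sources closer to the plane perpendicular to the baseline, which is how two widely separated sites yield directional information.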

To reach maximum sensitivity, LIGO employs a sophisticated computer-based control system to hold mirrors at the ends of the two arms in their proper locations with subatomic precision, while bouncing a laser beam back and forth between them. This "locking" of the interferometer will be the first full test of the first of three similar detectors being constructed by LIGO.

"This achievement brings us an important step closer to our real goal—LIGO's first gravitational-wave observations," said Barry Barish, who is Linde Professor of Physics at Caltech and LIGO director.

"First lock is a step toward bringing the apparatus to its full operating potential, but still some distance from the beginning of the scientific investigations that will ultimately come about," says Gary Sanders, deputy director of LIGO.

A complex scientific instrument

The LIGO detector comprises mirrors suspended in vacuum on fine wires at the corner and end of a long L. A highly stable laser beam is split, the two halves are sent back and forth about 100 times between the mirrors on the two arms, and then the beams are recombined. A passing gravitational wave will cause very small motions of the mirrors at the ends of the L, which scientists will observe by the changes they cause in the amplitude of the recombined light.

"The challenge is that the predicted motions of the mirrors due to even the strongest gravitational waves are incredibly small—about ten billionths of the diameter of an atom," explains Rainer Weiss, a physics professor at MIT. Weiss initially proposed building such a detector in 1973, and has worked to that end ever since.
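The scale of that challenge can be checked with back-of-the-envelope arithmetic. The sketch below assumes a typical atomic diameter of about 1e-10 meters and the roughly two-kilometer (one and a quarter mile) arm length; it is illustrative only, not a figure from the LIGO team:

```python
# Rough strain estimate from the numbers quoted above.
atom_diameter_m = 1e-10                    # assumed typical atomic diameter
mirror_motion_m = 1e-8 * atom_diameter_m   # "ten billionths" of an atom: 1e-18 m
arm_length_m = 2000.0                      # one and a quarter miles, about 2 km
strain = mirror_motion_m / arm_length_m    # fractional change in arm length
print(strain)  # 5e-22
```

A fractional length change of order 1e-22 is why the mirrors must be isolated from every other conceivable source of motion.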

"The detectors require extreme sensitivity to measure such motions, as well as to eliminate all other possible sources of disturbance to the mirrors," Weiss says.

A new tool for astrophysics

Gravitational waves are an important prediction of Einstein's theory of general relativity. They travel at the speed of light, but are very different from the electromagnetic waves—light, radio waves, X-rays—that we are familiar with.

Gravitational waves are a periodic distortion of space itself, expanding distances along one arm while shrinking them along the other, then half a cycle later, shrinking the first arm while expanding the second. The waves that LIGO seeks are created by rapid accelerations of very massive astrophysical objects.

"Many of the sources of gravitational waves that LIGO may detect are difficult or even impossible to study using the familiar electromagnetic spectrum," says Kip Thorne, who is Feynman Professor of Theoretical Physics at Caltech and the leading theorist studying gravitational-wave sources. "These include black holes, neutron stars, and possibly even the Big Bang."

The search for gravitational waves has generated worldwide interest. Detectors are being built by the Japanese (the TAMA detector), by Italian and French collaborators (the Virgo project), and by German and British collaborators (the GEO600 project). Also, an Australian consortium has proposed a Southern Hemisphere detector.

Much remaining work

"As important as this milestone is, there is still a great deal more to do," emphasized Stan Whitcomb, director of commissioning for LIGO. "The detector control systems must be carefully characterized and tuned to achieve maximum sensitivity and reliable operation. And, of course, this is just the first of three interferometers that we have to commission."

Commissioning of the LIGO detectors will continue through the remainder of 2000 and 2001. Short test periods ("engineering runs"), each exercising different aspects of the detectors, will alternate with installation and commissioning. Full operation to detect gravitational waves will commence at the beginning of 2002.

Contact: Robert Tindol (626) 395-3631

Writer: RT
