Distant Massive Explosion Reveals a Hidden Stellar Factory

A gamma-ray burst detected in February has led astronomers to a galaxy where the equivalent of 500 new suns is being formed each year.

The discovery of a new "starburst galaxy," made by researchers from the National Radio Astronomy Observatory and the California Institute of Technology, provides support for the theory that gamma-ray bursts are caused by exploding young massive stars. Details of the discovery are being presented today at the Gamma 2001 conference.

"This is a tremendously exciting discovery, since gamma-rays can penetrate dusty veils, and thus gamma-ray bursts can be used to locate the hitherto difficult-to-study dust galaxies at high redshifts," says Fiona Harrison, assistant professor of physics and astronomy at Caltech. "Gamma-ray bursts may offer us a new way to study how stars are formed in the early universe."

Radiation from this gamma-ray burst was first detected in the constellation Bootes by the Italian satellite observatory BeppoSAX on Feb. 21. Within hours, astronomers worldwide received the news of the burst and began looking for a visible light counterpart. The burst was one of the brightest recorded in the four years BeppoSAX has been standing watch.

Gamma-ray bursts were first detected in the 1970s by satellites monitoring compliance with the Nuclear Test Ban Treaty, and for many years they were thought to represent relatively modest outbursts on nearby neutron stars. The events have now been shown to originate in the farthest reaches of the universe. They produce, in a matter of seconds, more energy than the sun will generate in its entire 10-billion-year lifetime, and represent the most luminous explosions in the cosmos.
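
As a rough check of that energy comparison, the short calculation below totals the sun's output over 10 billion years using a standard value for the solar luminosity; the example burst energy is an illustrative order-of-magnitude figure, not a number quoted in this article.

    # Back-of-the-envelope check of the energy comparison above.
    # The solar luminosity and the example burst energy are standard
    # textbook-scale figures, not numbers taken from this article.
    SOLAR_LUMINOSITY_W = 3.8e26            # watts
    LIFETIME_S = 10e9 * 3.15e7             # 10 billion years, in seconds

    sun_lifetime_energy_j = SOLAR_LUMINOSITY_W * LIFETIME_S   # ~1.2e44 joules

    example_burst_energy_j = 1e45          # illustrative isotropic-equivalent burst energy

    print(f"Sun over 10 billion years: {sun_lifetime_energy_j:.1e} J")
    print(f"Example bright burst:      {example_burst_energy_j:.1e} J")
    print(f"Burst / Sun-lifetime ratio: {example_burst_energy_j / sun_lifetime_energy_j:.0f}")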

After the February event, astronomers at the U.S. Naval Observatory discovered the visible light counterpart, pinpointing the location of the event. An international collaboration, led by Dale Frail of the National Radio Astronomy Observatory and Harrison and Shri Kulkarni of Caltech, conducted a variety of observations using the Hubble Space Telescope, the Very Large Array radio telescope, the Chandra X-ray Observatory, Institut de RadioAstronomie Millimétrique (IRAM), and the James Clerk Maxwell telescope (JCMT).

Pivotal to the detection of the starburst galaxy was the latter telescope, which sits high atop Mauna Kea in Hawaii and is designed to make measurements at the shortest radio wavelengths capable of penetrating Earth's atmosphere, called the "submillimeter" portion of the spectrum. Only five and a half hours after the first sighting, a submillimeter source was found at the burst location.

Astronomers had expected to see a rapidly brightening signal with JCMT, a sign that the shock generated by the burst was moving through the dense gas surrounding the burst. Instead, much to everyone's surprise, the signal stayed constant, never varying during this time.

Furthermore, observations conducted at a slightly lower frequency by observers on the IRAM telescope in southern Spain showed a much fainter source, strongly suggesting that the submillimeter observation was not simply detecting the afterglow of the explosion.

"The simplest explanation is that we have detected the light from the host galaxy of the burst," says Frail, explaining that it is rare to detect galaxies at submillimeter wavelengths. Only about one in every thousand that are visible with optical telescopes are observed by short-wavelength radio telescopes.

Astronomers in Arizona found the gamma-ray burst to lie roughly 8 billion light-years from Earth, a distance confirmed almost simultaneously by the Caltech group using one of the 10-meter Keck telescopes on Mauna Kea. The light we see from the galaxy left it when the universe was less than half its present age. At this distance, the observed submillimeter brightness implies a prodigious rate of star formation—roughly 500 solar masses of material must be turning into stars each year, meaning that one or two new stars shine forth each day. The galaxy in which the burst occurred, then, may provide a glimpse of what the Milky Way looked like in its youth.
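
The arithmetic behind "one or two new stars each day" is simple, as the sketch below shows; the average stellar mass assumed is a rough illustrative value, not a figure from the article.

    # Rough arithmetic behind "one or two new stars shine forth each day."
    star_formation_rate = 500.0    # solar masses per year (from the article)
    avg_star_mass = 1.0            # solar masses; a rough illustrative average

    stars_per_day = (star_formation_rate / avg_star_mass) / 365.25
    print(f"Roughly {stars_per_day:.1f} new stars per day")   # about 1.4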

Previous searches for starburst galaxies in the distant universe have been hampered by the imprecise positions current submillimeter telescopes provide, and by the obscuring dust and gas that largely hides such galaxies from the view of optical telescopes. Observers led by Kulkarni used the Hubble Space Telescope to observe the fading embers of the explosion, but the underlying galaxy seems ordinary as seen in visible light, since most of this light is likely absorbed by dust and converted into submillimeter radiation. Had the galaxy been observed only in the optical wavelengths, astronomers would not have guessed that so many stars were being formed in it.

The discovery of a bright gamma-ray burst in a starburst system is exciting for two reasons: it strongly supports one model for the bursts themselves—the explosive destruction of a young, massive star—and it suggests a new way to locate such galaxies. With their enormous penetrating power, the energetic gamma rays punch right through the dusty veil, pinpointing the location of vigorous star-forming activity.

By following up on the hundreds of bursts that will be detected in the next few years, astronomers will be able to collect an unbiased sample of starburst galaxies at different distances in time and space, enabling them to explain the star-formation history of the universe.

The members of the Caltech team also include: Prof. S. George Djorgovski; postdoctoral fellows Derek Fox, Titus Galama, Daniel Reichart, Re'em Sari and Fabian Walter; and graduate students Edo Berger, Joshua Bloom, Paul Price, and Sarah Yost. Astronomers from several other institutions are also involved in the collaboration.

Writer: RT

Life rebounded quickly after collision 65 million years ago that wiped out dinosaurs

Though the dinosaurs fared poorly in the comet or meteor impact that destroyed two-thirds of all living species 65 million years ago, new evidence shows that various other forms of life rebounded from the catastrophe in a remarkably short period of time.

In the March 9 issue of the journal Science, a team of geochemists reports that life was indeed virtually wiped out for a period of time, but then reappeared just as abruptly only 10,000 years after the initial collision. Further, the evidence shows that the extinctions 65 million years ago, which mark the point in geologic time known as the Cretaceous-Tertiary (K-T) boundary, were most likely caused by a single catastrophic impact.

"There's been a longstanding debate whether the mass extinctions at the K-T boundary were caused by a single impact or maybe a swarm of millions of comets," says lead author Sujoy Mukhopadhyay, a graduate student at Caltech. "In addition, figuring out the duration of the extinction event and how long it took life to recover has been a difficult problem."

To address both questions, Mukhopadhyay and his colleagues measured the amount of cosmic dust in the sediments of an ancient seabed that is now exposed on land about 100 miles north of Rome. In particular, they focused on a two-centimeter-thick clay deposit that previously had been dated to about 65 million years ago. The base of this clay deposit corresponds to the date of the extinction event.

The clay deposit lies above a layer of limestone sediments, which are essentially the skeletons of microscopic sea life that settled at the bottom of the ancient sea. The limestone deposit also contains a certain percentage of clay particles, which result from erosion on the continents. Finally, mixed in the sediments is extraterrestrial dust that landed in Earth's oceans and then settled out. This dust carries a high concentration of helium-3 (3He), a rare isotope of helium that is depleted on Earth but highly enriched in cosmic matter.

The lower limestone layer abruptly ends at roughly 65 million years, since the organisms in the ocean were suddenly wiped out by the impact event. Thus, the layer immediately above the limestone contains nothing but the clay deposits and extraterrestrial dust that continued to settle at the bottom of the ancient sea. Immediately above the two-centimeter clay deposit is another layer of limestone deposits from microorganisms of the sea that eventually rebounded after the catastrophe.

In this study, the researchers measured the amount of 3He in the sediments to learn about the K-T extinction. They reasoned that a gigantic impact would not change the amount of 3He in the clay deposit. This is because large impacting bodies are mostly vaporized upon impact and release all their helium into the atmosphere. Because helium is a light element, it is not bound to Earth and tends to drift away into space. Therefore, even if a huge amount were brought to Earth by a large impact, the 3He would soon disappear and not show up in the sedimentary layers.

In contrast, 3He brought to Earth by extraterrestrial dust tends to stay trapped in the dust and not be lost to space, says Kenneth Farley, professor of geochemistry at Caltech and coauthor of the paper. So 3He found in the limestone and the clay deposits came from space in the form of dust.

Based on the 3He record obtained from the limestones, the researchers eliminated the possibility that a string of comets had caused the K-T extinctions. Comets are inherently dusty, so a string of them hitting Earth would have brought along a huge amount of new dust, thereby increasing the amount of 3He in the lower limestone deposit.

But the Italian sediment showed a steady concentration of 3He until the time of the impact, eliminating the possibility of a comet swarm. In fact, the researchers found no evidence for periodic comet showers, which have been suggested as the cause of mass extinction events on Earth.

Mukhopadhyay and his colleagues reason that because the "rain-rate" of the extraterrestrial dust from space did not change across the K-T boundary, the 3He concentration in the clay is proportional to the total depositional time of the clay. "It's been difficult to measure the time it took for this two-centimeter clay layer to be deposited," says Farley.
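
In other words, with a constant 3He "rain-rate," the time a layer took to accumulate is simply its total 3He inventory divided by that flux. The sketch below illustrates the arithmetic; the flux and inventory values are placeholders for illustration, not measurements from the paper.

    # Minimal sketch of the dating logic: at a constant extraterrestrial
    # 3He flux, a layer's depositional time is its 3He inventory divided
    # by that flux.  The numbers below are placeholders, not data from
    # the Science paper.
    def deposition_time_years(he3_inventory, he3_flux):
        """he3_inventory: total 3He per unit area in the layer (atoms/cm^2)
        he3_flux: extraterrestrial 3He delivery rate (atoms/cm^2 per year)"""
        return he3_inventory / he3_flux

    flux = 1.0e6              # placeholder flux, calibrated from the limestone below the boundary
    clay_inventory = 1.0e10   # placeholder inventory measured in the 2-cm clay layer

    print(f"Clay deposition time: {deposition_time_years(clay_inventory, flux):,.0f} years")
    # -> 10,000 years with these illustrative inputs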

The researchers conclude that the two-centimeter clay layer was deposited in approximately 10,000 years. Then, very quickly, the tiny creatures that create limestone deposits reemerged and again began leaving their corpses on the ocean bed. The implication is that life can get started again very quickly, Farley says.

Thus the study answers two major questions about the event that led to the extinction of the dinosaurs, says Mukhopadhyay. In addition to Mukhopadhyay and Farley, the paper is also authored by Alessandro Montanari of the Geological Observatory in Apiro, Italy.

Writer: Robert Tindol

Researchers progress toward mutating a mouse for studying Parkinson's disease

Some inventors hope to build a better mousetrap, but California Institute of Technology professor of biology Henry Lester's grand goal is to build a better mouse.

Not that the everyday laboratory mouse is inappropriate for a vast variety of biological and biomedical research. But for Parkinson's disease research, it has become clear that a strain of mutant mice with "slight" alterations would be a benefit in future medical studies. And not only would the mutant mice be useful for Parkinson's, but also for studies of anxiety and nicotine addiction.

Though Lester and his colleagues Johannes Schwarz and Cesar Labarca have not yet produced the mouse they envision, they have already achieved encouraging results by altering the molecules that form the receptors for nicotine in the mouse's brain. If they can just make these receptors overly sensitive in the right amount, they reason, the mice will develop Parkinson's disease after a few months of life.

Two earlier strains of mice were not ideal, but nonetheless convinced the Lester team members they were on the right track. One strain of mice suffered from nerve-cell degeneration too quickly, developing ion channels that opened literally before birth. These overly sensitive receptors essentially short-circuited some nerve cells. These mice usually do not survive birth, and never live long enough to reproduce.

Another strain developed modest nerve-cell degeneration in about a year, which is a long time in a mouse's life as well as a long time for a research project to wait for its test subjects. Lester wants the "Goldilocks mouse," with neurons that die "not before birth—that's too fast. Not at a year—that's too slow and incomplete. With a mouse strain that degenerates in three months, we could generate and test hypotheses several times per year."

Though they haven't achieved the "Goldilocks mouse" yet, the strain of mice developing modest degeneration after a year is particularly interesting. Tests showed that they were quite anxious, but tended to be calmed down by minuscule doses of nicotine. For reasons not entirely understood, humans who smoke are less likely to develop Parkinson's disease later in life, pointing to the likelihood that a mouse with hypersensitive nicotine receptors will be a good model for studying the disease.

In fact, the Lester team originally set out to build the strain of mice in order to study nicotine addiction and certain psychiatric diseases that might involve acetylcholine, a natural brain neurotransmitter that is mimicked by nicotine. The work in the past has been funded by the California Tobacco-Related Disease Research Program, the National Institute of Mental Health, and the National Institute of Neurological Disorders and Stroke (NINDS).

Once they had some altered mice, Schwarz (a neurologist who works with many Parkinson's patients) realized that the dopamine-containing nerve cells were dying fastest. The death of these cells is also a cause of Parkinson's disease. Because present mouse models for Parkinson's research are unsatisfactory, the researchers applied for and soon received funding from the National Parkinson Foundation, Inc. (NPF). Not only did the researchers receive the funding from the NPF, but they also were named recipients of the Richard E. Heikkila Research Scholar Award, which is presented for new directions in Parkinson's research.

"The Heikkila award is gratifying recognition for our new attempts to develop research at the intersection of clinical neuroscience and molecular neuroscience here at Caltech," says Lester.

Dr. Yuan Liu, program director at NINDS, says the Lester team's research is important not only because it is the first genetic manipulation of an ion channel that might lead to a mammalian model for Parkinson's disease, but also because the research is a pioneering effort in an emerging field called "channelopathy."

"Channelopathy addresses defects in ion channel function that causes diseases," Liu says. "Dr. Lester is one of the pioneers working in this field.

"We're excited about this development," she says, "because Parkinson's is a disease that affects such a large number of people—500,000 in the US. The research on Parkinson's is one of the research highlights that the NINDS is addressing."

The first results of the Lester team's research are reported in the current issue of the journal Proceedings of the National Academy of Sciences (PNAS).

In addition to Labarca, a member of the professional staff in the Caltech Department of Biology, and Schwarz, a visiting associate, the collaborators include groups led by professors James Boulter of UCLA and Jeanne Wehner of the University of Colorado.

Writer: RT

Odor recognition is a patterned, time-dependent process, research shows

PASADENA, Calif.-When Hamlet told the courtiers they would eventually "nose out" the hidden corpse of Polonius, he was perhaps a better neurobiologist than he realized. According to research by neuroscientists at the California Institute of Technology, the brain creates and uses subtle temporal codes to identify odors.

This research shows that the signals carried by certain neuron populations change over the duration of a sniff such that one first gets a general notion of the type of odor. Then, the wiring between these neurons performs work that leads to a more subtle discrimination, and thus, a precise recognition of the smell.

In the February 2 issue of the journal Science, Caltech biology and computation and neural systems professor Gilles Laurent and his colleague, postdoctoral scholar Rainer W. Friedrich, now at the Max Planck Institute in Heidelberg, Germany, report that the neurons of the olfactory bulb respond to an odor through a complicated process that evolves over a brief period of time. These neurons, called mitral cells because they resemble miters, the pointed hats worn by bishops, are found by the thousands in the olfactory bulb of humans.

"We're interested in how ensembles of neurons encode sensory information," explains Laurent, lead author of the study. "So we're less interested in where the relevant neurons lie, as revealed by brain mapping studies, than in the patterns of firing these neurons produce and in figuring out from these patterns how recognition, or decoding, works."

The researchers chose to use zebrafish in the study because these animals have comparatively few mitral cells and because much is already known about the types of odors that are behaviorally relevant to them. The Science study likely applies to other animals, including humans, because the olfactory systems of most living creatures appear to follow the same basic principles.

After placing electrodes in the brain of individual fish, the researchers subjected them sequentially to 16 amino-acid odors. Amino acids, the components of proteins, are found in the foods these fish normally go after in their natural environments.

By analyzing the signals produced by a population of mitral cells in response to each one of these odors, the researchers found that the information they could extract about the stimulus became more precise as time went by. The finding was surprising because the signals extracted from the neurons located upstream of the mitral cells, the receptors, showed no such temporal evolution.

"It looks as if the brain actively transforms static patterns into dynamic ones and in so doing, manages to amplify the subtle differences that are hard to perceive between static patterns," Laurent says.

"Music may provide a useful analogy. Imagine that the olfactory system is a chain of choruses-a receptor chorus, feeding onto a mitral-cell chorus and so on-and that each odor causes the receptor chorus to produce a chord.

"Two similar odors evoke two very similar chords from this chorus, making discrimination difficult to a listener," Laurent says. "What the mitral-cell chorus does is to transform each chord it hears into a musical phrase, in such a way that the difference between these phrases becomes greater over time. In this way, odors that, in this analogy, sounded alike, can progressively become more unique and more easily identified."

Applied to our own experience, this result could be described as follows: When we detect a citrus smell in a garden, for example, the odor is first conveyed by the receptors and the mitral cells. The initial firing of the cells allows for little more than the generic detection of the citrus nature of the smell.

Within a few tenths of a second, however, this initial activity causes new mitral cells to be recruited, leading the pattern of activity to change rapidly and become more unique. This quickly allows us to determine whether the citrus smell is actually a lemon or an orange.

However, the mitral cells first stimulated by the citrus odor do not themselves become more specifically tuned. Instead, the manner in which the firing patterns unfold through the lateral circuitry of the olfactory bulb is ultimately responsible for the fine discrimination of the odor.

"Hence, as the system evolves, it loses information about the class of odors, but becomes able to convey information about precise identity," says Laurent. This study furthers progress toward understanding the logic of the olfactory coding.

Writer: Robert Tindol

Caltech/MIT Issue Voting Technology Report to Florida Task Force

PASADENA, Calif.- The Caltech/MIT Voting Technology Project has submitted a preliminary report to the task force studying the election in Florida. The project's nationwide study of voting machines offers further evidence supporting the task force's call to replace punch card voting in Florida. The statistical analysis also uncovered a more surprising finding: electronic voting, as currently implemented, has performed less well than was widely believed.

The report examines the effect of voting technologies on unmarked and/or spoiled ballots. Researchers from both universities are collaboratively studying five voting technologies: paper ballots with hand-marked votes, lever machines, punch cards, optical scanning devices, and direct-recording electronic devices (DREs), which are similar to automatic teller machines.

The study focuses on so-called "undervotes" and "overvotes," which are combined into a group of uncounted ballots called "residual votes." These include ballots with votes for more than one candidate, ballots with no vote, and ballots marked in a way that cannot be counted.

Careful statistical analysis shows that there are systematic differences across these technologies, and that paper ballots, optical scanning devices, and lever machines have significantly lower residual voting rates than punch-card systems and DREs. Overall, the residual voting rate for the first three systems averages about 2 percent, and for the last two systems averages about 3 percent.

This study is the most extensive analysis ever of the effects of voting technology on under- and overvotes. It covers the entire country for all presidential elections since 1988, and examines variations at the county level. When the study is complete, it will encompass presidential elections going back to 1980, will examine a finer breakdown of the different technologies, and will break residual votes down into their two components: overvotes and undervotes. A final report will be released in June.

The Voting Technology Project was the brainchild of California Institute of Technology president David Baltimore in collaboration with Massachusetts Institute of Technology president Charles Vest. It was announced in December 2000, and faculty from both campuses immediately began collecting data and studying the range of voting methods across the nation in the hope of avoiding in the future the vote-counting chaos that followed the 2000 presidential election.
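
The residual-vote bookkeeping itself is straightforward, as the sketch below shows; the county figures used here are invented for illustration and are not from the project's data.

    # Sketch of the residual-vote calculation described above.  The county
    # figures are invented for illustration; they are not the project's data.
    counties = [
        # (technology, ballots cast, presidential votes counted)
        ("optical scan",  120_000, 117_800),
        ("punch card",     95_000,  92_100),
        ("lever machine",  60_000,  58_900),
    ]

    for technology, cast, counted in counties:
        residual = cast - counted          # undervotes + overvotes + spoiled ballots
        print(f"{technology:13s} residual-vote rate: {residual / cast:.1%}")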

The analysis is complicated by the fact that voting systems vary from county to county and across time. When a voting system is switched, say from lever machines to DREs, the number of residual votes can go up due to voter unfamiliarity with the new technology.

"We don't want to give the impression that electronic systems are necessarily inaccurate, but there is much room for improvement," said Thomas Palfrey, Caltech professor of economics and political science.

"Electronic voting technology is in its infancy and seems the most likely one to benefit significantly from new innovations and increased voter familiarity," states the 11-page report.

Caltech Contact: Jill Perry Media Relations (626) 395-3226 jperry@caltech.edu

MIT Contact: Kenneth Campbell News Office (617) 253-2700 kdc@mit.edu

Writer: JEP

Caltech and MIT Join Forces to Create Reliable, Uniform Voting System

Caltech and MIT are joining forces to develop a United States voting system that will be easy to use, reliable, secure and modestly priced, the presidents of the two universities announced today.

"America needs a uniform balloting procedure," said Caltech President David Baltimore and MIT President Charles M. Vest in a letter to Carnegie Corporation of New York President Vartan Gregorian who is recommending the corporation fund the initial research. "This has become painfully obvious in the current national election, but the issue is deeper and broader than one series of events."

Vest and Baltimore said the new technology "should minimize the possibility of confusion about how to vote, and offer clear verification of what vote is to be recorded. It should decrease to near zero the probability of miscounting votes... The voting technology should be tamper resistant and should minimize the prospect of manipulation and fraud."

The two university presidents proposed that their prestigious institutions give the project high priority for two major reasons:

"First, the technologies in wide use today are unacceptably unreliable. This manifests itself in at least three forms: undercounts (failure to correctly record a choice of candidate), overcounts (voting for two candidates), and missed ballots (machine failure or feeding error). Punch cards and optically scanned ballots are two of the most widely used technologies, and both suffer unacceptably high error rates in all three categories. For example, in the recent Florida election, optical scanning technology had an undercount rate of approximately 3 out of 1,000, and the punch card undercount rate was approximately 15 out of 1,000. Including the other two sources of errors, the overall ballot failure rate with machine counting was about three times this.

"Second, some of the most common types of machinery date from the late nineteenth century and have become obsolete. Most notably, many models of lever machines are no longer manufactured, and although spare parts are difficult to obtain, they are still widely used (accounting for roughly 15 percent of all ballots cast).

"States and municipalities using lever machines will have to replace them in the near future, and the two most common alternatives are punch cards and optical scanning devices. Ironically, many localities in Massachusetts have recently opted for lever machines over punch card ballots because of problems with punch cards registering preferences. "

Gregorian, of the Carnegie Corporation of New York, will recommend a grant of $250,000 to fund the initial research.

"I want to congratulate the two presidents of our nation's most distinguished universities for their leadership in this welcome and timely initiative on behalf of our election system," said Gregorian. "Voting is the fundamental safeguard of our democracy and we have the technological power to ensure that every person's vote does count. MIT and Caltech have assembled a team of America's top technology and political science scholars to deal with an issue no voter wants ignored. This research is certain to ensure that America's voting process is strengthened."

The project will involve a team of two professors from each university who are experts in technology, design and political science. Initially, they will be charged with defining the problem, surveying experiences with existing voting devices, and making a preliminary analysis of a variety of technological approaches. They will also set goals and create a plan for full-scale research and development of the new system.

The four members of the team are Massachusetts Institute of Technology Professors Stephen Ansolabehere of political science and Nicholas Negroponte, chairman of the Media Lab; and Caltech Professors Thomas Palfrey of political science and economics and Jehoshua Bruck of computation and neural systems and electrical engineering. Vest and Baltimore noted their credentials:

"Professors Ansolabehere and Palfrey have deep knowledge of the U.S. electoral process, have facility in tapping into what is known about existing voting systems and equipment, and have expertise in performing the statistical analyses that will be an integral part of the study. Equally important, any system that is suggested must be politically acceptable. Professors Ansolabehere and Palfrey will undertake a consultative process with the various relevant levels of government to find a solution with a reasonable likelihood of acceptance.

"Professor Negroponte and his MIT Media Lab colleagues and Professor Bruck at Caltech combine capabilities in technology, cognitive science, and graphic design. They can assess the various voting schemes that are currently available and, if necessary, design a new system that fulfills the technological and political needs of a fair voting process."

###

Caltech Contact: Robert Tindol Media Relations (626) 395-3631

MIT Contact: Kenneth Campbell News Office (617) 253-2700

Writer: JEP

New planets still being created in our stellar neighborhood, study shows

In a study that strengthens the likelihood that solar systems like our own are still being formed, an international team of scientists is reporting today that three young stars in the sun's neighborhood have the raw materials necessary for the formation of Jupiter-sized planets.

Data obtained from the European Space Agency's Infrared Space Observatory (ISO) indicate for the first time that molecular hydrogen is present in the debris disks around young nearby stars. The results are important because experts have long thought that primordial hydrogen—the central building block of gas giants such as Jupiter and Saturn—is no longer present in sufficient quantities in the sun's stellar vicinity to form new planets.

The paper appears in the January 4 issue of the journal Nature.

"We looked at only three stars, but the results could indicate that it's easier to make Jupiter-sized planets than previously thought," said Geoffrey Blake, professor of cosmochemistry at the California Institute of Technology and corresponding author of the study. "There are over 100 candidate debris disks within about 200 light-years of the sun, and our work suggests that many of these systems may still be capable of making planets."

The abundance of Jupiter-sized planets is good news, though indirectly, in the search for extraterrestrial life. A gas giant such as Jupiter may not be particularly hospitable for the formation of life, but experts think the mere presence of such huge bodies in the outer reaches of a solar system protects smaller rocky planets like Earth from catastrophic comet and meteor impacts. A Jupiter-sized planet possesses a gravitational field sufficient to kick primordial debris into the farthest reaches of the solar system, as Jupiter has presumably done by sending perhaps billions of comets into the Oort Cloud beyond the orbit of Pluto and safely away from Earth.

If comets and meteors were not ejected by gas giants, Blake said, life on Earth and any other Earth-like planets in the universe could periodically be "sterilized" by impacts.

"A comet the size of Hale-Bopp, for example, would vaporize much of Earth's oceans if it hit there," Blake said. "The impact from a 500 km object (about ten times the size of Hale-Bopp) could create nearly 100 atmospheres of rock vapor, the heat from which can evaporate all of the Earth's oceans."

The researchers did not directly detect any planets in the study, but nonetheless found that molecular hydrogen was abundant in all three disks they targeted. In the disk surrounding Beta Pictoris, a Southern Hemisphere star that formed about 20 million years ago approximately 60 light-years from Earth, the team found evidence that hydrogen is present in a quantity at least one-fifth the mass of Jupiter, or about four Neptunes' worth of material.

The debris disk of the star 49 Ceti, which is visible near the celestial equator in the constellation Cetus, was found to contain hydrogen in a quantity at least 40 percent of the mass of Jupiter. Saturn's mass is just under a third that of Jupiter. 49 Ceti, which is about 10 million years old, is about 200 light-years from Earth.
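
These comparisons follow directly from standard planetary masses; the quick check below uses textbook values (in Earth masses) that are not taken from the paper itself.

    # Quick check of the mass comparisons above, using standard planetary
    # masses in Earth masses (textbook values, not figures from the paper).
    M_JUPITER = 317.8
    M_SATURN  = 95.2
    M_NEPTUNE = 17.1

    beta_pic_gas = 0.2 * M_JUPITER      # "at least one-fifth the mass of Jupiter"
    print(f"Beta Pictoris lower limit: ~{beta_pic_gas / M_NEPTUNE:.1f} Neptunes")   # ~3.7

    print(f"Saturn-to-Jupiter mass ratio: {M_SATURN / M_JUPITER:.2f}")              # ~0.30

    ceti_gas = 0.4 * M_JUPITER          # 49 Ceti: at least 40 percent of Jupiter
    print(f"49 Ceti lower limit: ~{ceti_gas / M_SATURN:.1f} Saturns")               # ~1.3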

Best of all was a 10-million-year-old Southern Hemisphere star about 260 light-years away that goes by the rather unpoetic name HD135344. That star's surrounding debris disk was found to contain the equivalent of at least six Jupiter masses of molecular hydrogen.

"There may not be enough material to form Jupiters around Beta Pictoris or 49 Ceti, but our figures establish a lower limit that is well within the gas-giant planet range, which means we definitely detected a fair amount of gas. And there could be more," Blake said. "Around HD135344, there's at least enough material to make six Jupiters."

The study reveals not only that there is still sufficient molecular hydrogen to make gas giants, but also that planetary formation is not limited to a narrow time frame in the early history of a star, as previously thought. Because molecular hydrogen is quite difficult to detect from ground-based observatories, experts have relied on measurements of the more easily detectable carbon monoxide (CO) to model the gas dynamics of developing solar systems.

But because results showed that CO tends to dissipate quite rapidly in the early history of debris disks, researchers assumed that molecular hydrogen was likewise absent. Further, the presumed lack of hydrogen limited the time that Jupiter-sized planets could form. However, the new study, coupled with recent theoretical models, shows that CO is not a particularly good tracer of the total gas mass surrounding a new star.

Blake said the study opens new doors to the understanding of planetary growth processes around sun-like stars. He and his colleagues anticipate further progress when the Space Infrared Telescope Facility (SIRTF) and the Stratospheric Observatory for Infrared Astronomy (SOFIA) are launched in 2002. SIRTF, which will have its science headquarters at Caltech, alone could detect literally hundreds of stars that still contain enough primordial hydrogen in their debris disks to form Jupiter-sized planets.

The other authors of the paper are professor of astronomy Ewine F. van Dishoeck and Wing-Fai Thi (the study's lead author), both of the Leiden University in the Netherlands; Jochen Horn and professor Eric Becklin, both of the UCLA Department of Physics and Astronomy; Anneila Sargent, professor of astronomy at Caltech; Mario van den Ancker of the Harvard-Smithsonian Center for Astrophysics; and Antonella Natta of the Osservatorio Astrofisico di Arcetri in Firenze, Italy.

Writer: RT

Caltech, Agere Systems scientists develop technique to shrink memory chips

Researchers at the California Institute of Technology and Agere Systems, formerly known as the Microelectronics Group of Lucent Technologies, have developed a technique that could result in a new generation of reliable nanoscale memory chips. This research could lead to smaller, less expensive cellular phones and digital cameras.

The research development, announced December 13 at the International Electron Devices Meeting, applies to a type of memory called "flash" memory, which continues to store information even when the devices are turned off. This information could include personal phone directories in a cellular phone or the pictures captured by a digital camera. In a typical cellular phone, there are 16 to 32 million bits of data stored on a silicon flash memory chip. Each bit of data is stored in a part of the flash memory chip called a "cell." As the size of silicon memory chips decreases, the chips become more and more difficult to make leakproof, resulting in the loss of stored data.

Using an aerosol technique developed at Caltech, the researchers formed memory cells by spraying silicon nanocrystals through a bath of high-temperature oxygen gas. The end result was memory cells consisting of a silicon core with a silicon dioxide outer shell. The silicon nanocrystals store the electrical charge, while the insulating silicon dioxide shell makes the nanocrystal memory cells more leakproof.

"As compared to conventional flash memories, these silicon nanocrystal memories offer higher performance, simpler fabrication processes, and greater promise for carrying memory miniaturization to its ultimate limit," said Harry Atwater, professor of applied physics and materials science at Caltech and project director.

To overcome the potential leakage problem, Atwater and Richard Flagan, McCollum Professor of Chemical Engineering, and their students at Caltech, and colleagues Jan de Blauwe and Martin Green at Agere Systems developed a method to break up each memory cell into 20,000 to 40,000 smaller cells. Therefore, even if several of the smaller cells spring a leak, the vast majority of the charge will not be lost and the bit of data stored in the whole memory cell will be retained.
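
The robustness argument is easy to quantify, as the rough sketch below shows; the number of leaky nanocrystals assumed is purely hypothetical.

    # Sketch of why spreading a bit's charge across tens of thousands of
    # nanocrystals makes it robust.  The leak count below is hypothetical,
    # not a figure reported by the researchers.
    nanocrystals_per_cell = 30_000     # within the 20,000-40,000 range cited above
    leaky_nanocrystals = 50            # suppose a few dozen develop leakage paths

    retained = 1 - leaky_nanocrystals / nanocrystals_per_cell
    print(f"Fraction of stored charge retained: {retained:.2%}")   # about 99.8%

    # By contrast, a conventional cell stores its whole bit on one floating
    # gate, so a single leakage path can drain the entire charge.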

The aerosol approach has several advantages over the conventional lithographic techniques used to make today's flash memory cells. Because it requires fewer steps, it is less expensive and takes less time to produce. In addition, the aerosol approach will allow researchers to continue making smaller and smaller devices.

So far, the researchers have created extremely robust flash memory cells. For instance, they have charged and discharged a single cell for one million cycles without significant degradation, whereas with traditional silicon chips, 10,000 cycles is considered satisfactory. While these research results are promising, it is premature to predict if or when the technology will be commercially implemented.

In addition to Atwater and Flagan, other members of the Caltech nanocrystal memory team are postdoctoral scholar Mark Brongersma, and graduate students Elizabeth Boer, Julie Casperson, and Michele Ostraat.

The research was supported by funding from the National Science Foundation and NASA.

Writer: RT

New research shows that the ears can sometimes trick the eyes

Though common sense suggests that vision is the dominant human sense, a new study by California Institute of Technology researchers shows that auditory signals can sometimes trick test subjects into misinterpreting what they have seen.

In a new study appearing in the Dec. 14 issue of the journal Nature, Caltech psychophysicists Ladan Shams, Yukiyasu Kamitani, and Shinsuke Shimojo report that auditory information can alter the perception of accompanying visual information, even when the visual input is otherwise unambiguous.

"We have discovered a visual illusion that is induced by sound," the authors write in the paper. Using a computer program that runs very short blips of light accompanied by beeps, the researchers asked test subjects to determine whether there was one or two flashes.

However, unknown to the subjects, the number of flashes did not match the number of beeps in some trials. When the subjects were shown a single flash accompanied by one beep, everyone correctly stated that they had seen one flash. But when they were shown a single flash with two very quick beeps spaced about 50 milliseconds apart, the subjects all erroneously reported that they had seen two flashes.
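
A minimal sketch of the trial timing helps make the setup concrete; only the roughly 50-millisecond beep spacing comes from the article, and the flash timing below is an illustrative placeholder.

    # Minimal sketch of one trial's event schedule.  Only the ~50 ms beep
    # spacing comes from the article; the rest is an illustrative placeholder.
    def trial_schedule(n_flashes, n_beeps, beep_gap_ms=50, flash_gap_ms=50):
        """Return a sorted list of (time in ms, event) pairs for one trial."""
        events = [(i * flash_gap_ms, "flash") for i in range(n_flashes)]
        events += [(i * beep_gap_ms, "beep") for i in range(n_beeps)]
        return sorted(events)

    # The illusory condition: one flash paired with two beeps 50 ms apart.
    for t, event in trial_schedule(n_flashes=1, n_beeps=2):
        print(f"{t:3d} ms  {event}")
    # Subjects reliably report seeing two flashes in this condition.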

What's more, test subjects who were told that there was actually only one flash still continued to perceive two flashes when they heard two beeps.

According to Shimojo, a professor of biology at Caltech, the effect works only if the beeps are very rapid. When they are, "there's no way within the time window for vision to tell whether there's a single or double flash," he says.

According to Shams, a postdoctoral scholar working in Shimojo's lab and lead author of the paper, the results contribute to a shift in our view of visual processing from one "that is independent of other modalities, toward one that is more intertwined with other modalities, and can get as profoundly influenced by signals of other modalities as it influences them."

Contact: Robert Tindol (626) 395-3631

Writer: RT

Sequencing of Arabidopsis genome will have huge payoffs, Caltech plant geneticist says

Whether or not the man was right when he said a mustard seed can move mountains, a poorer cousin of mustard named Arabidopsis has just been certified one of the heavy lifters of 21st-century biology.

With today's announcement that the international effort to sequence the Arabidopsis genome has been completed, plant biologists now have a powerful tool that is a triumph for biology as well as world agriculture, says Caltech plant geneticist Elliot Meyerowitz.

"Anything you learn in Arabidopsis is easily applied to crop plants," says Meyerowitz, in whose Caltech lab the first cloning and sequencing of an Arabidopsis gene took place.

"With knowledge from the genome sequencing, you might be able to make crops more resistant to disease and other plant problems," he said. "Fifty percent of all pre- and postharvest losses are due to pests, so if you could solve these problems, you could double the efficiency of world agriculture."

Arabidopsis is a nondescript weed of the mustard family that has a thin 6-inch-long stem, small green leaves, and tiny white blooms when it flowers. With no commercial, medicinal, decorative, or other practical uses, the plant is hardly even worth grubbing out of the flower bed when it springs up in its various habitats around the world.

But for geneticists, Arabidopsis is the powerhouse of the plant world. It is easy to plant and grow, maturing in a couple of weeks; it is small and thus requires little precious lab space; it is easy to clone and sequence its genes; and it produces plenty of seeds very quickly so that future generations—mutants and otherwise—can be studied. And now, Arabidopsis is the only plant species whose genome has been totally sequenced.

"Arabidopsis took off in the 1980s after it was demonstrated it has a very small genome, which makes it easier to clone genes," said Meyerowitz, a longtime supporter of and adviser to the international Arabidopsis genome project.

"One reason the plant was chosen was because it doesn't have that much DNA," he said. "Arabidopsis has about 125 million base pairs in the entire genome—and that's 20 times smaller than the human genome, and thus about 20 times less expensive to sequence. It's been a bargain."

The sequencing of the plant genome was originally proposed in 1994 for a 2004 completion, but experts later realized the project could be completed four years early—and under budget.

"Everybody shared the cost, and everybody will share the benefits—all the information is in the public domain," Meyerowitz says. "Taxpayers got a big bargain."

Sequencing Arabidopsis has benefits for the understanding of basic biological mechanisms, in much the same way that sequencing the roundworm or fruit fly has benefits. As a consequence of evolution, all organisms on Earth share a huge number of genes.

Thus, the information obtained from sequencing Arabidopsis as well as fruit flies and roundworms will contribute to advances in understanding how the genes of all living organisms are related. These underlying genetic interactions, in turn, will eventually lead to new treatments of human disease as well as the genetic engineering of agricultural products.

In addition to making crops more disease- and pest-resistant, genetic engineering could also change the time of flowering so that crops could be fitted to new environments; make plants more resistant to temperature changes; and possibly lengthen the roots so that plants could make more efficient use of nutrients.

Also, approximately one-fourth of all medicines were originally derived from plants, Meyerowitz says. So a better understanding of the enzymes that produce these pharmaceutical compounds could help in creating new drugs as well as in making existing drugs better and more efficient.

Contact: Robert Tindol (626) 395-3631

Writer: RT
