Caltech to Receive $10 Million Grant from BP to Study Methane Conversion

Pasadena—BP, one of the world's largest oil, gas, and petrochemical companies, has awarded the California Institute of Technology a 10-year, $10 million research grant to study the catalytic conversion of methane, the principal component of natural gas, into useful liquid fuels and chemicals. An equal amount was awarded to the University of California, Berkeley.

Details about the grant and the implications this work could have on fuel usage internationally will be announced at a press conference at 4 p.m. Friday, September 22, at the Los Angeles Athletic Club, 431 West Seventh St., President's Room, fourth floor, Olive Street parking entrance.

Speakers will include Sir John Browne, BP's chief executive; David Baltimore, president of Caltech; and UC Berkeley Vice Chancellor Joseph Cerny.

Browne also will field questions about other energy-related issues, including gas prices, crude oil supply, BP's takeover of ARCO, and cleaner burning fuels.

The grants have been agreed upon in principle by the universities, clearing the way for BP to establish a university research program at the two institutions, similar to one it previously announced at England's Cambridge University. The research will be directed by the respective faculty members and will involve undergraduates, graduate students, and postdoctoral researchers. Under the pending BP-funded proposal, each of the universities will work closely with BP and will receive $1 million per year for 10 years for the study of methane conversion.

The discovery of large reserves of natural gas in many parts of the world, some very remote, has stimulated efforts by BP to catalytically convert methane to useful end products, such as much cleaner fuels and chemicals that are more economical to transport and market.

"We believe the next breakthrough in the conversion of natural gas to liquids, which will help bring us the next generation of cleaner burning fuels, will come from catalysis combined with process engineering. These two universities have some of the world's finest scientific and engineering minds to help us accomplish this," said Browne.

"By undertaking this progressive collaboration with Caltech and UC Berkeley - which have taken vastly different approaches to solving this difficult methane conversion problem - we have formed a very innovative and substantial team."

The U.C. Berkeley group will be headed by Professor Alex Bell and will focus on heterogeneous catalytic approaches for producing liquid fuels and chemicals. Building on its strength in understanding catalyst structure-performance relationships, this group will seek major breakthroughs in catalyst and process design for both direct and indirect conversion of methane. By contrast, the Caltech team, led by Caltech's Beckman Institute Administrator Jay Labinger and Professor John Bercaw, will develop novel, homogeneous catalytic approaches, building on work their group has pursued for 10 years.

This new grant allows Caltech to take its research to the next level. The new, expanded research team will encompass work by some of the Institute's most respected researchers, including Robert Grubbs, Mark Davis, Harry Gray, and Nate Lewis.

The funding will partially support as many as eight faculty members and 30 to 35 research staff, graduate students, and postdoctoral fellows at the two universities. As part of the grant, there will be frequently scheduled meetings and collaboration between the two groups.

Information in a number of supporting research areas - such as theoretical modeling, catalyst preparation, and process design - will also be shared, according to BP.

The two schools were selected based on their history of progressive research into catalytic conversion and on the reputation of their combined schools of chemistry and chemical engineering.

In accepting the grant, Caltech's president, David Baltimore, a Nobel prize-winning biologist, said, "The work performed will contribute to the education of a large number of young researchers, as it concurrently advances our ability to develop and exploit emerging technological and scientific concepts. It also enables us to broaden our base of funding for important scientific research that might otherwise go unexplored.

"We applaud BP for its sincere efforts to bridge the gap between academia and the private sector in seeking ways to prevent the waste of natural resources and minimize environmental impact through research on converting natural gas to more economical and environmentally sensitive end uses," he added.

"From BP's perspective, partnering with leading educational and research institutions enables us to demonstrate responsible leadership while remaining on the cutting edge of scientific development, through funding projects that will not only benefit the company, but society as well, while offering opportunities that can result in a cleaner environment and a stronger economy. That is very much the way we expect to pursue aspects of fundamental scientific research," said Browne.

He also said that BP feels that because liquefaction and shipping of natural gas are expensive, the conversion of its principal component, methane, into useful end products is very attractive. The economics of methane conversion are strongly related to the capital investment required. To some extent, high capital costs are a consequence of the small scale that has been envisioned for most processes.

An overall aim is a major reduction in the energy required for conversion, potentially leading to substantially lower emissions of greenhouse gases.

London-based BP is a leader in solar power generation. It recently announced its "Clean Fuels 40 Cities 2000 Program," dedicated to bringing cleaner fuels to cities worldwide.

###

For further information, contact:

BP: Cheryl Burnett (714) 670-5161
Berkeley: Jane Scheiber (510) 642-8782
Caltech: Jill Perry (626) 395-3226

Writer: JP

New visual test devised by Caltech, USC researchers

A new five-minute vision test using a desktop computer and touch-sensitive screen is showing promise as a diagnostic tool for a variety of eye diseases and even certain brain tumors.

Invented by California Institute of Technology physicist Wolfgang Fink and University of Southern California professor of ophthalmology and neurosurgery Alfredo A. Sadun, the 3-D Computer-Based Threshold Amsler Grid Test offers a novel method for medical personnel to evaluate the central visual field. The test is sensitive and specific enough to allow an ophthalmologist to diagnose visual disorders such as macular degeneration, and to discriminate between visual disorders with subtly different symptoms, such as macular edema and optic neuritis.

In order to take the test, the patient sits in front of a touch-sensitive computer screen displaying a grid pattern and a central bright spot. Staring at the central spot with one eye closed, the patient traces a finger around the portions of the grid he or she can see, and the computer records the information.

After the computer records the patient's tracings, the operator changes the contrast of the grid slightly and the patient again traces the visible portions of the grid. This process is repeated and information is gathered for the computer to process a three-dimensional profile of the patient's visual field for that eye. Then, the process is repeated for the other eye.
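The way repeated tracings at different contrasts stack into a three-dimensional profile can be pictured with a short, self-contained Python sketch. This is an illustration only; the grid size, contrast levels, and the simulated "patient" with a central blind spot are assumptions for the example, not the actual test software or clinical data.

```python
# A self-contained sketch (not the actual test software) of how tracings at
# several grid contrasts stack into a 3-D threshold map of the visual field.
# The simulated "patient" has a central blind spot, as in macular degeneration;
# grid size and contrast levels are illustrative assumptions.

import numpy as np

GRID = 32                                   # grid resolution (illustrative)
contrasts = [0.12, 0.25, 0.5, 1.0]          # faintest to brightest grid

def visible_region(contrast):
    """Boolean mask of the grid cells the simulated patient can see at this contrast."""
    y, x = np.mgrid[-1:1:GRID * 1j, -1:1:GRID * 1j]
    r = np.hypot(x, y)                      # distance from the central fixation spot
    scotoma_radius = 0.2 + 0.3 * (1.0 - contrast)   # blind spot shrinks as contrast rises
    return r > scotoma_radius

# Threshold map: for each grid cell, the faintest contrast at which it was traced.
threshold = np.full((GRID, GRID), np.nan)
for c in contrasts:                         # faintest grid first
    newly_seen = visible_region(c) & np.isnan(threshold)
    threshold[newly_seen] = c

print("fraction of the field never seen:", round(np.isnan(threshold).mean(), 3))
```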

Patients suffering from macular degeneration, for example, experience a loss of vision at the central focus and thus will have trouble seeing the grid pattern near the center. Since macular degeneration sufferers have peripheral vision, they would likely trace a central hole on the screen, and if they also had a relative field defect, they might trace an ever-smaller circle as the brightness of the grid pattern intensified. Once the information was processed, the 3-D graph would provide doctors with a complete description of what the patient sees under various conditions.

The test will also be useful for diseases and conditions such as optic neuritis, detached retina, glaucoma, anterior ischemic optic neuropathy (AION), macular edema, central or branch retinal artery occlusions, and several genetic impairments. Also, the test can be used to detect, characterize, and even locate several types of brain tumors.

Thus, the new test is not only more revealing than standard visual field tests, but it is also much quicker and simpler than existing methods of characterizing the visual field, says Sadun. Likening the test to a recreational video game, Sadun says the new technology will be cheap and easily marketable, and also will be a powerful means of processing patient data.

"The patient is playing the game while the machine is digesting the information," Sadun says.

Fink created the program that allows a computer to administer and analyze the psychophysical test developed by Sadun, processing the patient's responses into a profile of the visual field.

Fink says the test is "a completely noninvasive way to understand and diagnose certain eye diseases."

"We can gain more information from this test than any other visual field test," Sadun says. "The test creates a greater sensitivity for detecting problems, it provides quantitative measures for monitoring, and it characterizes the 3-D visual field, which makes a big contribution to diagnosis."

The test has already been used since April on about 40 patients suffering from macular degeneration, AION, and optic neuritis. In the coming months, the researchers will begin testing the program on patients with glaucoma.

The 3-D Computer-Based Threshold Amsler Grid Test has been approved by USC's institutional review board. Fink and Sadun have applied for a U.S. patent.

Fink was supported by a grant from the National Science Foundation during the course of his work.

Writer: RT

NSF funds new Institute for Quantum Information at Caltech

The National Science Foundation has awarded a five-year, $5 million grant to the California Institute of Technology to create an institute devoted to quantum information science—a new field that could ultimately lead to devices such as quantum computers.

The announcement was part of a $90 million information technology research initiative the NSF announced today in Washington. The awards are aimed at seeding fundamental research in innovative applications of information technology.

Caltech's new Institute for Quantum Information will draw on several fields, including quantum physics, theoretical computer science, mathematics, and control and dynamical systems engineering, says founding director John Preskill, a professor of theoretical physics at Caltech.

"The goal of the institute will be to understand ways in which the principles of quantum physics can be exploited to enhance the performance of tasks involving the transmission, processing, and acquisition of information," says Preskill, who has worked on quantum computation algorithms for the last five years.

"The most potentially exciting aspect of the field is the promise of a quantum computer," he says. "If you could process quantum states instead of classical information, there are problems you could solve that could never be solved with classical technology."

Quantum computers would be more efficient than conventional computers because they would greatly reduce the number of steps the computer would have to go through to solve many problems. For example, the encryption used to protect credit cards relies on the fact that it would take huge amounts of time for a conventional computer to break a large number down into its factors (the numbers that, multiplied together, equal that number).

It now takes the best computers several months to find the factors of a 130-digit number, and it would take 10 billion years to factor a 400-digit number—nearly the entire age of the universe. But a quantum computer with the same clock speed could factor the 400-digit number in about a minute, according to the figures Preskill has worked out.
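A deliberately naive Python sketch illustrates why classical factoring becomes impractical as the numbers grow; it is a toy demonstration of scaling, not the method cryptographers or quantum computers would actually use.

```python
# Trial division checks divisors up to sqrt(n), so the work grows exponentially
# with the number of digits. A toy illustration of scaling only.

def trial_division(n):
    """Return the prime factors of n."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(trial_division(2021))   # [43, 47] -- instant for a 4-digit number
# A 130-digit number would need on the order of 10**65 divisor checks this way;
# a quantum computer running Shor's algorithm needs only polynomially many steps.
```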

At the same time, quantum information would provide a new means to thoroughly protect information from any intruder, Preskill says.

"By using quantum information, it's possible to make unbreakable codes, and this security is founded on fundamental physical laws," he says.

Also, the work of the new institute will advance research in the further miniaturization of classical electronic components. Quantum effects are becoming increasingly important for microelectronics as devices continue to shrink toward atomic dimensions.

In addition to Preskill, the Institute for Quantum Information will be led by two co-principal investigators who, in consultation with other Caltech researchers, will guide and supervise scientific activities. The initial co-principal investigators will be Jeff Kimble, an experimental physicist who has done groundbreaking work in the transmission of quantum information, and John Doyle, a professor of electrical engineering who is interested in control issues of quantum systems.

Other investigators at the institute will include Michelle Effros, Hideo Mabuchi, Michael Roukes, Axel Scherer, and Leonard Schulman, all Caltech faculty members. The institute will develop a substantial visitors' program and will aim at hiring postdoctoral researchers and graduate students who wish to enter the field of quantum information systems.

Contact: Robert Tindol (626) 395-3631

Writer: RT

Caltech researchers breed new genes to make natural products in bacteria

PASADENA—Using a new process of "sex in the test tube," a California Institute of Technology research group has been able to mate genes from different organisms and breed new genetic pathways in bacteria. These bacteria make an array of natural products that are normally found in much more complex organisms.

The natural products, which are carotenoids similar to the pigment that gives carrots their color, are made by many different plants and microbes, but are totally foreign to the E. coli bacteria the researchers used. The new results, reported in the July issue of the journal Nature Biotechnology, show that the carotenoid-producing genes from different parent organisms can be shuffled together to create many-colored E. coli. Many of the carotenoids made in the bacteria are not even made by the organisms from which the parent genes came.

One of the reddish products, torulene, is not produced by any known bacteria, although it is found in certain red yeasts. "With molecular breeding, the experimenter can train the molecules and organisms to make new things that may not even be found in nature, but are valuable to us," says Frances Arnold, professor of chemical engineering and biochemistry at Caltech and coauthor of the new study.

Conceptually similar to dog breeding, the process generates progeny that are selected by researchers on the basis of attractive features. In this study, former Caltech researcher Claudia Schmidt-Dannert (now on the faculty at the University of Minnesota) and Caltech postdoctoral researcher Daisuke Umeno selected the new bacteria by their color.

This process of directed evolution, which Arnold has been instrumental in developing, is capable of creating new biological molecules and even new organisms with new or vastly improved characteristics. Unlike evolution in nature, where mutations are selected by "survival of the fittest," directed evolution, like breeding, allows scientists to dictate the characteristics of the molecules selected in each generation.
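The select-and-breed logic of directed evolution can be sketched in a few lines of Python. The "gene," "mutation," and scoring function below are toy stand-ins for the laboratory DNA shuffling and color screening described above, not the actual protocol.

```python
# A toy select-and-breed loop. A list of numbers stands in for a gene, random
# perturbation stands in for DNA shuffling and mutation, and a simple score
# stands in for screening colonies by color. Illustrative only.

import random

def mutate(gene, rate=0.2):
    return [g + random.gauss(0, 0.5) if random.random() < rate else g for g in gene]

def color_score(gene):
    """Higher is 'more colorful'; in the lab, the researcher's eye does this."""
    return -sum((g - 5.0) ** 2 for g in gene)

population = [[random.uniform(0, 10) for _ in range(8)] for _ in range(20)]
for generation in range(50):
    population.sort(key=color_score, reverse=True)
    parents = population[:5]                          # keep the most promising
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

print("best score after breeding:", round(color_score(population[0]), 2))
```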

"We are now able to create natural products that usually have to come at great cost from esoteric sources simply by breeding ordinary genes in ordinary laboratory organisms," says Schmidt-Dannert.

The researchers believe that this method will be widely useful for making complex and expensive natural molecules such as antibiotics, dyes, and flavors. "Imagine being able to produce in simple bacteria many of the compounds that come from all over nature," says Arnold.

And, according to the authors, an even more irresistible target of directed evolution is finding bacteria that make biological molecules not yet found in nature.

Writer: Robert Tindol

Caltech and the Human Genome Project

PASADENA—Two of the key inventions that made possible the monumental task of sequencing the human genome came from the California Institute of Technology. Both proved especially important in sequencing the 3 billion DNA base pairs that compose the human genome because they greatly sped up the work.

The first landmark invention was a method for the automated sequencing of DNA by Leroy Hood, then a professor of biology at Caltech, and his colleagues, Mike Hunkapiller, Tim Hunkapiller, Charles Connell, and Lloyd Smith. Before this invention, figuring out the sequence of a segment of DNA had been exceedingly difficult and laborious. Because the process was so slow and required the work of highly skilled technicians, it was clear to most scientists in the mid-1980s that it would not be possible to sequence entire genomes by manual methods.

The method devised by Hood and his colleagues changed that. They developed a novel chemistry that permitted a machine to detect DNA molecules, using fluorescent light. This method revolutionized DNA sequencing, ultimately making it possible to launch the Human Genome Project. Coupled with some recent advances, the method remained the core for the just-completed phase of sequencing the human genome.

A second key invention for the genome project was developed at Caltech by Professor Melvin Simon, chair of Caltech's biology division, and his coworker Hiroaki Shizuya. They recognized that a critical part of sequencing would be preparing large DNA segments for the process. To accomplish this, they invented "bacterial artificial chromosomes" (BACs), which permit scientists to use bacteria as micromachines to accurately replicate pieces of human DNA that are over 100,000 base pairs in length. These BACs provided the major input DNA for both the public genome project and Celera.

The Simon research group was also a major contributor to the mapping and sequencing of chromosome 22, a substantial segment of the human genome whose sequence was completed in 1999. These researchers are presently using genomic information to create an "onco-chip," which will give researchers convenient experimental access to a miniature array containing hundreds of BACs, each carrying a gene whose mutation can cause human cancer.

Caltech researchers, both current and past, have also been important in promoting the Human Genome Project itself, a project that originally met with scientific skepticism when it was born 12 years ago, particularly when the goal of a fully sequenced human genome by the year 2003 was announced.

That skepticism has long since been replaced by wholesale enthusiasm from the scientific community. David Baltimore, president of Caltech and a Nobel laureate for his work on the genes of viruses, was a highly influential supporter of the Human Genome Project at its inception. Baltimore, then a professor of biology at MIT, was one of an international cadre of farsighted biologists that also included Hood and Simon. They shared a vision of the future in which knowledge of every gene that composes the human genome would be available to any scientist in the world at the click of a computer key.

To shape this unprecedented and complex project, Caltech professors Norman Davidson, Barbara Wold, and Steve Koonin have served in national scientific advisory roles to the genome project in the intervening years. Also, Baltimore chaired the National Institutes of Health (NIH) meeting where the human genome project was launched.

Koonin, who is Caltech's provost, was chair of the JASON study of 1997, which noted to the scientific community that quality standards could be relaxed so that a "rough draft" of the human genome could be made years earlier and still be of great utility. This, in fact, was the approach that prevailed.

The Human Genome Project is unique among scientific projects for having set aside, from the beginning, research support for studies of the ethical, legal, and social implications of the new knowledge of human genes that would result. In Caltech's Division of the Humanities and Social Sciences, Professor Daniel Kevles has examined these ethical issues in his book The Code of Codes: Scientific and Social Issues in the Human Genome Project, which he coedited in 1992 with Leroy Hood.

Caltech scientists are also actively engaged in the future of genomics, which is the use of the newly obtained DNA sequences to discover and understand the function of genes in normal biology and in disease and disease susceptibility. This includes devising new ways to extract and manipulate information from the human genome sequence and from recently completed genome sequences of important experimental organisms used by scientists in the laboratory, such as the fruit fly, mustard weed, and yeast.

In one new project, Caltech recently became the home site for the international genome database for a key experimental organism called C. elegans, under the direction of Caltech Professor Paul Sternberg. This tiny worm has about 19,000 different genes, many of which correspond to related genes in humans. The shared origin and functional relationships between the genes of worm and man (and fruit fly and all other animals) let scientists learn much about how human genes work, by studying these small creatures in the laboratory.

The Worm Genome Database, called Wormbase, is undertaking the major task of collecting and making computer-accessible key information about every worm gene, its DNA sequence, and what its function is in the animal. This will require that new methods in automated data-mining and computing be brought together and fused with expert knowledge in biology, and then made accessible by computer to anyone interested.

Because of the relatedness of many genes and their functions among all animals, this information about the worm and its genome will be important for understanding human genes, and vice versa.

Another major genomics effort at Caltech is aimed at understanding how groups of genes work to direct development from a fertilized egg to an adult organism, and how these groups of genes change their action or fail in aging, cancer, or degenerative disease. The genomics approach to these problems involves the application of new computational methods and automated experimental technologies.

To do this, Barbara Wold, together with Mel Simon, Professor Stephen Quake from Caltech's Division of Engineering and Applied Science, and Dr. Eric Mjolsness of NASA's Jet Propulsion Laboratory, has established the L. K. Whittier/Caltech Gene Expression Center, funded by the Whittier Foundation. The new work in genomics is also fueling new interdisciplinary programs at Caltech in the computational modeling of cells and organisms.

Writer: Robert Tindol

Physicists observe the quantum of heat flow

Physicists at the California Institute of Technology have announced the first observation of the quantum of thermal conductance. This discovery reveals a fundamental limit to the heat that can be conducted by objects of atomic dimensions.

The findings, reported in the April 27 issue of the journal Nature, could have profound implications for the future design of microscopic electronic devices and for the transmission of information, according to the research team leader, Caltech physics professor Michael Roukes.

The quantum of thermal conductance is best understood by beginning with a simple explanation of heat flow. In the everyday world, the amount of heat carried by an object can vary in a smooth and continuous way. Heat actually flows by means of collective, wavelike vibrations of the atoms that make up a solid material. Usually immense numbers of such waves, each inducing a unique type of synchronous motion of the atoms, act simultaneously to carry heat along a material.

Physicists know that waves sometimes act like particles and vice versa, so they've given these vibrations the particle-like name phonon (reminiscent of "electron" but named after the Greek root phon, for sound). For heat flow in the macroworld, since each phonon is just one among a sea of many others, an individual phonon's contribution alters the total only imperceptibly.

But in the nanoworld, this "phonon sea" is actually rather finite, quantum effects rule, and the heat conduction can become radically different. When an object becomes extremely small, only a limited number of phonons remain active and play a significant role in heat flow within it. In fact, in small devices at temperatures close to absolute zero, most types of motion become almost completely "frozen out," and heat must then be carried by only the several remaining types of wavelike motions that persist.

It has recently become apparent that, in this regime, a strict limit exists to the amount of heat that can be conducted in a small structure or device. Although never before observed, this maximum value is actually a fundamental law of nature, independent of composition or material. It stipulates that the only way thermal conductance can be increased in a very small device is simply to make the conductor larger.

The Nature paper reports that this fundamental limiting value, called the quantum of thermal conductance, can be observed by using tiny devices with specially patterned features only 100 billionths of a meter across (about 300 atoms wide). To carry out this work, Keith Schwab, a postdoctoral fellow in Roukes's group, developed special devices from silicon nitride with assistance from research staff member Erik Henriksen. The work was carried out in the group's nanofabrication and ultralow-temperature laboratories in Pasadena, in collaboration with University of Utah research professor John Worlock, a visiting associate at Caltech.

The Roukes team has demonstrated that the maximum possible value of energy transported per wavelike motion (phonon mode) is a number composed of only fundamental physical constants and absolute temperature itself. (The relation is given by the product of pi squared, Boltzmann's constant squared, and absolute temperature, over three times Planck's constant.)

Numerically, at an ambient temperature of one kelvin, this quantized conductance roughly translates into a temperature rise of one kelvin upon the application of only a thousandth of a billionth of a watt of power (its precise value is 9.4 x 10^-13 W/K).
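For readers who want to check the arithmetic, a few lines of Python (not part of the published work) plug the fundamental constants into the relation quoted above and reproduce the stated value at one kelvin.

```python
# Numerical check of the relation quoted above: g0 = pi^2 * kB^2 * T / (3 * h).

import math

kB = 1.380649e-23     # Boltzmann's constant, J/K
h = 6.62607015e-34    # Planck's constant, J*s
T = 1.0               # absolute temperature, K

g0 = math.pi ** 2 * kB ** 2 * T / (3 * h)
print(f"quantum of thermal conductance at {T} K: {g0:.2e} W/K")
# prints about 9.5e-13 W/K, consistent with the 9.4e-13 W/K figure quoted above
```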

Their new result has important implications for nanotechnology as well as for the transmission of information. Moore's Law, a popularized rule-of-thumb, can be used to loosely describe the continuous decrease in size of the individual building blocks (the transistors) that populate, now in the tens of millions, the integrated circuits forming today's powerful computer chips.

In the unrelenting technological drive toward increased function and decreased size, these individual transistor components have been scaled downward in size to a realm where the underlying physics of their operation can change. In the most extreme cases at the smallest scales, conventional operation may completely break down.

One example is the so-called "power dissipation problem" stemming from the fact that when each individual transistor on a microchip is turned on, each gives off a little heat. This accumulates to become a very significant problem when millions of such transistors, each in effect a microscopic heat generator, are placed in close proximity.

"This will become especially serious for future molecular-scale devices," says Roukes. "No matter how small it is, you always have to put a finite amount of power into a device to turn it on. In this quantum regime, when only a limited number of modes are capable of transferring heat, it will be crucial to take this fundamental limitation into account."

Separate theoretical studies carried out elsewhere indicate that this quantum of thermal conductance is universal, and independent of whether the heat is carried by electrons, phonons, or any other mechanism. "It would seem there is no way of escaping this fundamental law of nature," says Roukes.

These other studies indicate that the maximum thermal conductance, observed in this work, is linked to the maximum rate that information can flow into a device having a single quantum "channel." This surprising connection between information theory and thermodynamics is a manifestation of a deep connection between information and entropy.

"As we engineer smaller and higher speed computational elements, we will also encounter this fundamental quantum limitation in the rate of information flow," Schwab says.

The group's three-year effort followed upon the work of Thomas Tighe, a former postdoctoral fellow in the group, and culminated in new techniques for creating the miniature devices studied. At the heart of each device is an isolated heat reservoir, which the researchers term a "phonon cavity." It resembles a miniature plate freely suspended by four narrow beams. Each beam acts as a quasi one-dimensional "phonon waveguide" for heat flow, and it is precisely this reduced-dimensional flow that is the focus of the researchers' measurements.

On top of the cavity, Schwab and Henriksen patterned two small, separate patches of thin-film gold, described by Roukes as "puddles of electrons." In the course of a measurement, one of these is heated by passing a very small electrical current through it. Electrical connections allowing this current to flow were made using superconducting leads (patterned on top of the phonon waveguides).

This ensures that heat is deposited only within the resistive gold film and, therefore, transferred only to the phonon cavity. To escape from the suspended device, the heat must eventually flow through the phonon waveguides. Since the waveguides' thermal conductance is weak, the phonon cavity temperature ultimately rises to a new, and hotter, steady-state level that directly reflects the thermal conductance of the phonon waveguides.

Measurement of the current-induced temperature rise within the small devices is a significant challenge in its own right, and required both ingenuity and the investment of a significant portion of the researchers' efforts. Most available thermometry techniques applicable at the nanoscale are electrical, and thus involve power levels that greatly exceed that used by the researchers in their measurements.

"The power level we used to carry out these experiments, about a femtowatt, is equivalent to the power your eye would receive from a 100-watt light bulb at a distance of about 60 miles," says Schwab. Instead of the standard electrical methods, the researchers coupled the second "electron puddle" to extremely sensitive dc SQUID (superconducting quantum interference device) circuitry.

This allowed them to observe the feeble current fluctuations that have a magnitude directly proportional to the absolute temperature of the nanoscale device. This so-called Johnson/Nyquist noise, which is also the origin of the electrical noise causing background hiss in audio systems, here plays a pivotal role by allowing the local temperature of the phonon cavity to be measured without perturbing the ultraminiature device.

In the end, because the researchers know the precise amount of heat deposited, and can directly measure the absolute temperature reached by the phonon cavity in response to it, they can directly measure the thermal conductance of the narrow beams acting as phonon waveguides. Simply stated, the ratio of the heat flowing through the waveguides to the rise in cavity temperature is the phonon thermal conductance of the quasi one-dimensional waveguides.
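In code, the measurement principle reduces to a single division. The power and temperature values below are illustrative placeholders chosen for the example, not the published data.

```python
# The measurement principle in one line: conductance is the deposited power
# divided by the steady-state temperature rise. Numbers are illustrative only.

P_heater = 1.0e-15            # heat deposited in the gold film, W (about a femtowatt)
delta_T = 0.5e-3              # resulting temperature rise of the phonon cavity, K

G = P_heater / delta_T        # thermal conductance of the phonon waveguides, W/K
g0 = 9.4e-13                  # quantum of thermal conductance near 1 K, W/K
print(f"measured conductance: {G:.1e} W/K, i.e. {G / g0:.1f} quanta")
```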

This work was carried out over the past three years within the research laboratories of Caltech professor of physics Michael Roukes. Schwab, formerly a Sherman Fairchild Distinguished Postdoctoral Scholar in Roukes's group, is the principal author of the paper.

Schwab's life as a young postdoctoral scientist, and his role in the efforts to observe the quantum of thermal conductance, are the subjects of an upcoming documentary film by independent filmmaker Toni Sherwood. The title of the film is The Uncertainty Principle: Making of an American Scientist.

Coauthors of the paper are John Worlock, visiting associate at Caltech and research professor of physics at the University of Utah, a longtime collaborator of Professor Roukes; and former research staff member Erik Henriksen.

Writer: RT

Cosmologists reveal first detailed images of early universe

PASADENA—Caltech cosmologists and other scientists involved in an international collaboration have released the first detailed images of the universe in its infancy. The images reveal the structure that existed in the universe when it was 50,000 times younger and 1,000 times smaller and hotter than it is today.

Detailed analysis of the images is already shedding light on some of cosmology's outstanding mysteries, including the nature of the dark matter and energy that dominate intergalactic space, and whether space is "curved" or "flat." The team's results are being published in the April 27 issue of the scientific journal Nature.

Cosmologists believe that the universe was created approximately 12–15 billion years ago in an enormous explosion called the Big Bang. The intense heat that filled the embryonic universe is still detectable today as a faint glow of microwave radiation that is visible in all directions. This radiation is known as the cosmic microwave background (CMB).

Since the CMB was first discovered by a ground-based radio telescope in 1965, scientists have eagerly sought to obtain high-resolution images of this radiation. NASA's COBE (Cosmic Background Explorer) satellite discovered the first evidence for structures, or spatial variations, in the CMB in 1991.

The new experiment, dubbed BOOMERANG (Balloon Observations of Millimetric Extragalactic Radiation and Geophysics), obtained the images using a telescope suspended from a balloon that circumnavigated the Antarctic in late 1998.

The balloon carried the telescope at an altitude of almost 37 kilometers (120,000 feet) for 10 1/2 days. As it flew, an extremely sensitive detector system developed at Caltech recorded the faint signals from the early universe.

The BOOMERANG images are the first to bring the CMB into sharp focus. The images reveal hundreds of complex regions that are visible as tiny variations—typically only one ten-thousandth of a degree (0.0001 °C)—in the temperature of the CMB. The complex patterns visible in the images confirm predictions of the patterns that would result from sound waves racing through the early universe, creating the structures that by now have evolved into giant clusters and super-clusters of galaxies.

"These images represent the ultimate limit of our vision," said U.S. team leader Andrew Lange, physics professor at Caltech.

"The enormous structures that they reveal predate the first star or galaxy in the universe."

Lange and Italian team leader Paolo deBernardis of the University of Rome, La Sapienza, together led the international team that developed the sophisticated experiment. The entire payload was integrated at Caltech for months of extensive testing before it was taken to Antarctica.

Already, analysis of the size of the structures has produced the most precise measurements to date of the geometry of space-time, which strongly indicate that the geometry of the universe is flat, not curved.

"It is really exciting to obtain such strong evidence for a flat universe. This result is in agreement with a fundamental prediction of the 'inflationary' theory of the universe," said Caltech Postdoctoral Scholar Eric Hivon.

The theory hypothesizes that the entire universe grew from a tiny subatomic region during a period of violent expansion that occurred a split second after the Big Bang. The enormous expansion stretched the geometry of space until it was precisely flat.

"These measurements represent a watershed event in cosmology" commented Mark Kamionkowski, professor of theoretical astrophysics at Caltech. "The results suggest that we are on the right track with inflation—a hitherto speculative theory for the origin of the universe—and thus open up a path toward scientifically addressing what happened in the very first micro-micro-second after the Big Bang."

"The key to BOOMERANG's ability to obtain these powerful new images," explained Lange, "is the marriage of a powerful new detector technology developed at Caltech and the Jet Propulsion Lab with the superb microwave telescope and cryogenic systems developed in Italy."

The telescope optics focus the radiation from the early universe onto button-size "bolometric" detectors cooled to a fraction of a degree above absolute zero. Extremely sensitive thermometers embedded in each detector record tiny changes in temperature as the telescope scans across the sky.

"These detectors can 'see' tiny differences in the temperature of the early universe in much the same way as the back of your hand responds to the heat from the sun," explained Caltech graduate student Brendan Crill.

"What really sets this detector system apart," continued Viktor Hristov, a senior electronics engineer at Caltech, "is the stability of the detectors and the electronics used to record the faint signals."

Caltech and JPL are responsible for fabricating a similar detector system for the Planck Surveyor, a satellite that will someday image the CMB over the entire sky from a vantage point 1 million miles from Earth.

In a complementary effort, another Caltech team led by Professor Anthony Readhead is now obtaining images of the CMB at even sharper resolution, using a specially built radio telescope, the Cosmic Background Imager (CBI), from a remote site in the Chilean Andes. BOOMERANG and CBI herald a new era of precision cosmological measurement that promises to provide new insights into fundamental physics.

The 36 BOOMERANG team members come from 16 universities and organizations in Canada, Italy, the United Kingdom, and the United States. Primary support for the BOOMERANG project comes from the Italian Space Agency, the Italian Antarctic Research Programme, and the University of Rome, La Sapienza, in Italy; from the Particle Physics and Astronomy Research Council in the United Kingdom; and from the National Science Foundation and NASA in the United States.

Writer: Robert Tindol

Caltech scientists develop first microscopic system of pumps and valves made from soft materials

PASADENA—Researchers at the California Institute of Technology have developed a pump that is less than one-half the width of a human hair. The device is a breakthrough in the 3-D microfabrication of soft materials and could be applied to revolutionize and simplify many technologies, including drug discovery and delivery, according to Caltech applied physics professor Stephen R. Quake and his colleagues, who report their findings in the April 7 issue of Science.

Unlike the silicon-based micromachining techniques used for computer chips, this team has developed a technique called multilayer soft lithography, which is essentially an intricate casting of soft rubber. The work is an extension of soft lithography casting, originally developed by George Whitesides at Harvard University.

"Basically, it's plumbing on a very small scale," says Quake. "We are trying to show that it is useful to make microdevices out of soft rubber for certain applications, rather than the hard materials like glass or silicon used in traditional micromachining. In order to make a valve, one needs to figure out how to make it seal, which is usually done with a rubber washer. We made the entire valve out of the sealing material."

The pump is made possible because of the material's softness and pliability. Embedded in a small clear rubber chip the size of a postage stamp, the pump is actually a series of tiny, multilayer channels that each measure 50 by 30 by 10 microns. By contrast, a human hair is about 100 microns wide.

Operation of the pump is similar to the peristaltic motions that make human digestion possible. By applying pressure in one of the channels, another channel above it or below it in the 3-D matrix can be closed off, thereby allowing the channel to act either as a pump or as a valve.
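The peristaltic action can be sketched as a repeating valve-actuation pattern. The three-channel sequence and timing below are one common illustrative choice and are an assumption for the example, not taken from the Science paper.

```python
# A minimal sketch of peristaltic actuation: three control channels crossing one
# flow channel are pressurized in a repeating pattern, squeezing fluid along.
# The six-step pattern and timing here are illustrative assumptions.

import itertools
import time

PATTERN = ["101", "100", "110", "010", "011", "001"]   # 1 = pressurized (channel pinched shut)

def set_valves(state):
    """Placeholder for whatever pressure controller actually drives the chip."""
    print("control channels:", state)

for state in itertools.islice(itertools.cycle(PATTERN), 12):   # two full pump cycles
    set_valves(state)
    time.sleep(0.01)          # faster cycling pumps faster, up to a limit
```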

While the research is basic and mainly aimed at demonstrating the feasibility of the technique, Quake says the pump could have a number of practical applications, including drug delivery. One day it might enable doctors to implant a biocompatible device about the size of a postage stamp into a patient's body to deliver drugs for chronic disorders such as allergies, pain, diabetes, and cancer.

The device may allow the drug to be delivered in a time-released manner customized for each patient. In addition to delivering the drug, the device could also contain a microsized component that would enable regular monitoring of the patient's condition.

Quake's own lab intends to use the microfabricated valves and pumps in two devices: a DNA sizer, which is a replacement for the current technique known as gel electrophoresis; and a cell sorter, a machine that physically separates microscopic materials such as bacteria or viruses. Both devices originated from research in Quake's lab. Caltech has licensed this technology to Mycometrix Corporation of South San Francisco, which will apply it to develop a variety of commercial products.

In addition to Quake, the others involved in the research are Axel Scherer, a professor of electrical engineering, applied physics, and physics at Caltech; Marc Unger, a postdoctoral scholar in applied physics; Hou-Pu Chou, a graduate student in electrical engineering; and Todd Thorsen, a graduate student in biochemistry.

Writer: Robert Tindol

Using quantum atomistic computer simulations to solve industrial problems

PASADENA—In the world of engineering and applied science, ideas that look good on the drawing board often turn out to have annoying real-world problems, even though the finished products still look pretty good. An example is the aluminum car engine, which has the advantage of being lightweight, but tends to wear out more quickly than its heavier steel counterpart.

To solve such bedeviling problems, experts often find it necessary to go back to "first principles," which in the case of the aluminum engine may include a computer simulation of how the individual atoms slide around under wear and tear.

California Institute of Technology chemistry professor Bill Goddard had this type of problem in mind when he established a special center a decade ago within the campus's Beckman Institute. Christened the Materials and Process Simulation Center (MSC), the center set as its goals the development of the computer simulation tools needed to deal with materials and process issues, and the transfer of solutions to government and industry for the creation and improvement of products.

"We started the center to follow the dream of being able to predict chemical, biological, and materials processes with a computer," says Goddard. "The idea was to get a simulation that was close enough so that you wouldn't have to do the experiment."

Now that the MSC is celebrating its 10th anniversary, Goddard says the group has made some genuine progress on a number of real industrial problems—much to the satisfaction of corporate collaborators and sponsors, which at present are underwriting about 10 new projects each year.

Since technology transfer and real-world results are a high priority, Goddard and his colleagues sponsor an annual meeting in which the collaborators showcase all their activities. This year's meeting, to be held March 23–24 at the Beckman Institute on campus, is also the 10th-anniversary celebration of the center itself.

In addition, the meeting celebrates the 100th birthday of Arnold Beckman, the founder of Beckman Instruments and the benefactor of the Beckman Institute.

"There are several new accomplishments we'll discuss at this year's meeting," Goddard says. "We've had the first prediction of the structure of a membrane-bound protein, we've shown how to grow a new class of semiconductors to make real-world devices, and with our local collaborator Avery Dennison we've had success in predicting gas diffusion polymers.

"The bottom line is that it has worked out," he says. "In this center we have probably the most complete group of theorists in the world—about 40 people—and we've continued to have a flow of excellent grad students and postdocs who have gone on to be leaders in their fields."

A unique feature of the MSC is its emphasis on starting from first principles, using quantum mechanics (the Schrödinger equation) to describe what is happening between atoms. For example, if the real-world problem is how best to lubricate a certain type of moving part (which is an actual industrially funded project the center has worked on), then the researchers would use the Schrödinger wave equation to build a simulation to show precisely how the electrons of a certain lubricant would interact with other electrons, how variable factors such as temperature and pressure would enter into the picture, and how a host of other interactions at the atomic level would play out.

But the quantum level is only the first in a hierarchy of regimes the center researchers might use in investigating complex problems. The quantum level with its Schrödinger equation is good for a system of about 100 atoms, but currently no computer can use quantum mechanics to predict the structure of hemoglobin, the protein that carries oxygen to our muscles.

Rather, for systems with up to about a million atoms, the center uses molecular dynamics techniques, essentially solving Newtonian equations.

For the billion or so atoms or particles that compose a "segment" of material, the MSC investigators employ the techniques of coarse-grain meso-scale modeling and tools such as phase diagrams. Beyond this point, for process simulation, materials applications, and engineering design involving the entire object, the center has developed yet another set of techniques.

This hierarchy of materials modeling is not describable merely by the number or size scale of particles. Time scales are also involved, with quantum mechanics operating at the femtosecond scale (a millionth of a billionth of a second), molecular dynamics at the nanosecond scale (a billionth of a second), coarse-grain meso-scale modeling at the millisecond scale, process simulation at the scale of minutes, and engineering design over periods ranging up to years.
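Collected as data, the hierarchy described above looks roughly like the short Python sketch below. The particle counts and time scales are the rough figures quoted in this article, not precise limits.

```python
# The MSC modeling hierarchy as described in this article; figures are rough.

hierarchy = [
    ("quantum mechanics (Schrödinger equation)", "~100 atoms",           "femtoseconds"),
    ("molecular dynamics (Newtonian equations)", "~1 million atoms",     "nanoseconds"),
    ("coarse-grain meso-scale modeling",         "~1 billion particles", "milliseconds"),
    ("process simulation",                       "entire object",        "minutes"),
    ("engineering design",                       "entire object",        "up to years"),
]

for level, size, timescale in hierarchy:
    print(f"{level:<45} {size:<22} {timescale}")
```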

Finally, the hierarchy has many crossover points, which allow the center's research to be particularly innovative and interdisciplinary.

"So you start with fundamentals of quantum mechanics, and imbed this in the next steps at all length scales and time scales," Goddard says. "The idea is to figure out why these things happen, and how looking at first principles can solve industrial problems."

Writer: Robert Tindol

Caltech grad student's team first to detect radio emission from a brown dwarf

A graduate student in astronomy from the California Institute of Technology recently led a team of researchers in finding the first radio emission ever detected from a brown dwarf, an enigmatic object that is neither star nor planet, but something in between.

The discovery, reported in the March 15 issue of the journal Nature by lead author Edo Berger and his colleagues, demonstrates that brown dwarfs can flare 10,000 times more intensely than theory predicted. The results will likely force experts to rethink their theories about magnetism in brown dwarfs and gas giants, says Berger's supervisor, Shri Kulkarni, who is John E. and Catherine T. MacArthur Professor of Astronomy and Planetary Science at Caltech.

Berger was leader of a student team that made the discovery during a special National Science Foundation student summer program at the NSF's Very Large Array (VLA) near Socorro, New Mexico. The brown dwarf they observed is named LP944-20.

Berger and his colleagues decided to take a long-shot gamble, attempting to observe a brown dwarf from which X-ray flares had recently been detected with NASA's Chandra X-ray satellite.

"We did some background reading and realized that, based on predictions, the brown dwarf would be undetectable with the VLA," said Berger. "But we decided to try it anyway."

After consulting with Dale Frail, an astronomer at the National Radio Astronomy Observatory (NRAO), Berger and his colleagues decided to use a block of observing time traditionally dedicated to the summer students.

The day after they collected their data, the students gathered at the NRAO array operations center in Socorro to process the data and make the images. Berger, who had prior experience processing VLA data, worked alone in the same room as the other students, who were working together on another computer. Berger finished first and was shocked at his image.

"I saw a bright object at the exact position of the brown dwarf, and was pretty sure I had made a mistake," Berger said.

He waited for the others, who were working under the guidance of another NRAO astronomer. Ten minutes later, the others also produced an image on the screen in which the same bright object showed up at the brown dwarf's location.

Berger then began breaking up the approximately 90 minutes' worth of data into smaller segments. His results showed that the brown dwarf's radio emission had risen to a strong peak, then weakened. This demonstrated that the brown dwarf had flared.
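The time-binning step can be sketched with synthetic numbers. The light curve below is simulated for illustration; it is not the actual VLA data reported in the Nature paper.

```python
# Synthetic illustration of the binning step: split the observation into shorter
# segments and compute the mean flux in each. Simulated data, illustrative only.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 90, 5400)                            # 90 minutes, one sample per second
flux = 0.1 + 2.0 * np.exp(-0.5 * ((t - 45) / 8) ** 2)   # quiescent level plus a flare
flux += rng.normal(0, 0.3, t.size)                      # receiver noise

binned = flux.reshape(9, -1).mean(axis=1)               # mean flux in nine 10-minute bins
print(np.round(binned, 2))                              # rises to a peak, then weakens
```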

"Then we got real excited," Berger said, adding that the students immediately sought and received additional observing time. Soon they had captured two more flares.

"The radio emission these students discovered coming from this brown dwarf is 10,000 times stronger than anyone expected," Frail said. "This is going to open up a whole new area of research for the VLA."

The existence of brown dwarfs—objects with masses intermediate between stars and planets—had long been suspected but never confirmed until 1995, when Kulkarni made the first observation at Caltech's Palomar Observatory. Since then, a large number of brown dwarfs have been identified in systematic surveys of the sky. Astronomers now believe that there are as many brown dwarfs as stars in our galaxy.

Flaring and quiescent radio emissions have been seen previously from stars and from the giant planets of our solar system, but never before from a brown dwarf. Moreover, the strength of the magnetic field near the brown dwarf—as inferred from the radio observations—is well below that of Jupiter and orders of magnitude below that of low-mass stars, said Kulkarni.

Conventional wisdom would require large magnetic fields to accelerate the energetic particles responsible for the radio emissions. The same conventional wisdom says that brown dwarfs are expected to generate only short-lived magnetic fields.

However, the persistence of the radio emission of LP944-20 shows that the picture is not complete, Kulkarni said.

"I am very pleased that a first-year Caltech graduate student was able to spearhead such an undertaking, which led to this big discovery," said Kulkarni. "This discovery will spur theorists into obtaining a better understanding of magnetism in stars and planets."

In addition to Berger and Frail, the other authors of the paper are Steven Ball of New Mexico Institute of Mining and Technology, Kate Becker of Oberlin College, Melanie Clark of Carleton College, Therese Fukuda of the University of Denver, Ian Hoffman of the University of New Mexico, Richard Mellon of Penn State, Emmanuel Momjian of the University of Kentucky, Michael Murphy of Amherst College, Stacy Teng of the University of Maryland, Timothy Woodruff of Southwestern University, Ashley Zauderer of Agnes Scott College, and Bob Zavala of New Mexico State University.

[Editors: Additional information on this discovery is available at the NRAO Web site at http://www.nrao.edu/pr/browndwarf.html]

Writer: RT
