NSF funds new Institute for Quantum Information at Caltech

The National Science Foundation has awarded a five-year, $5 million grant to the California Institute of Technology to create an institute devoted to quantum information science—a new field that could ultimately lead to devices such as quantum computers.

The announcement was part of a $90 million information technology research initiative the NSF announced today in Washington. The awards are aimed at seeding fundamental research in innovative applications of information technology.

Caltech's new Institute for Quantum Information will draw on several fields, including quantum physics, theoretical computer science, mathematics, and control and dynamical systems engineering, says founding director John Preskill, a professor of theoretical physics at Caltech.

"The goal of the institute will be to understand ways in which the principles of quantum physics can be exploited to enhance the performance of tasks involving the transmission, processing, and acquisition of information," says Preskill, who has worked on quantum computation algorithms for the last five years.

"The most potentially exciting aspect of the field is the promise of a quantum computer," he says. "If you could process quantum states instead of classical information, there are problems you could solve that could never be solved with classical technology."

Quantum computers would be more efficient than conventional computers because they would greatly reduce the number of steps needed to solve many problems. For example, the encryption used to protect credit cards relies on the fact that a conventional computer would need an enormous amount of time to break a large number down into its factors (the numbers that multiply together to give that number).

It now takes the best computers several months to find the factors of a 130-digit number, and it would take 10 billion years to factor a 400-digit number—nearly the entire age of the universe. But a quantum computer with the same clock speed could factor the 400-digit number in about a minute, according to the figures Preskill has worked out.
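The scale of that gap can be sketched with a back-of-envelope calculation (not from the article) using the heuristic running time of the general number field sieve, the best known classical factoring algorithm:

```python
import math

def gnfs_cost(digits):
    """Heuristic operation count for the general number field sieve,
    L(n) = exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3)),
    the best known classical algorithm for factoring an n-digit number.
    A quantum computer running Shor's algorithm would instead scale only
    polynomially in the number of digits."""
    ln_n = digits * math.log(10)  # natural log of a `digits`-digit number
    return math.exp((64 / 9) ** (1 / 3)
                    * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))

# Relative classical cost of a 400-digit number vs. a 130-digit one:
ratio = gnfs_cost(400) / gnfs_cost(130)
print(f"{ratio:.1e}")  # on the order of 10^11: months become tens of billions of years
```

A jump of roughly eleven orders of magnitude is consistent with the article's "several months" stretching to about 10 billion years.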

At the same time, quantum information would provide a new means to thoroughly protect information from any intruder, Preskill says.

"By using quantum information, it's possible to make unbreakable codes, and this security is founded on fundamental physical laws," he says.

Also, the work of the new institute will advance research in the further miniaturization of classical electronic components. Quantum effects are becoming increasingly important for microelectronics as devices continue to shrink toward atomic dimensions.

In addition to Preskill, the Institute for Quantum Information will be led by two co-principal investigators who, in consultation with other Caltech researchers, will guide and supervise scientific activities. The initial co-principal investigators will be Jeff Kimble, an experimental physicist who has done groundbreaking work in the transmission of quantum information, and John Doyle, a professor of electrical engineering who is interested in control issues of quantum systems.

Other investigators at the institute will include Michelle Effros, Hideo Mabuchi, Michael Roukes, Axel Scherer, and Leonard Schulman, all Caltech faculty members. The institute will develop a substantial visitors' program and will aim at hiring postdoctoral researchers and graduate students who wish to enter the field of quantum information systems.

Contact: Robert Tindol (626) 395-3631


Caltech researchers breed new genes to make natural products in bacteria

PASADENA—Using a new process of "sex in the test tube," a California Institute of Technology research group has been able to mate genes from different organisms and breed new genetic pathways in bacteria. The bacteria then make an array of natural products normally found only in much more complex organisms.

The natural products, which are carotenoids similar to the pigment that gives carrots their color, are made by many different plants and microbes, but are totally foreign to the E. coli bacteria the researchers used. The new results, reported in the July issue of the journal Nature Biotechnology, show that the carotenoid-producing genes from different parent organisms can be shuffled together to create many-colored E. coli. Many of the carotenoids made in the bacteria are not even made by the organisms from which the parent genes came.

One of the reddish products, torulene, is not produced by any known bacteria, although it is found in certain red yeasts. "With molecular breeding, the experimenter can train the molecules and organisms to make new things that may not even be found in nature, but are valuable to us," says Frances Arnold, professor of chemical engineering and biochemistry at Caltech and coauthor of the new study.

Conceptually similar to dog breeding, the process generates progeny that are selected by researchers on the basis of attractive features. In this study, former Caltech researcher Claudia Schmidt-Dannert (now on the faculty at the University of Minnesota) and Caltech postdoctoral researcher Daisuke Umeno selected the new bacteria by their color.

This process of directed evolution, which Arnold has been instrumental in developing, is capable of creating new biological molecules and even new organisms with new or vastly improved characteristics. Unlike evolution in nature, where mutations are selected by "survival of the fittest," directed evolution, like breeding, allows scientists to dictate the characteristics of the molecules selected in each generation.

"We are now able to create natural products that usually have to come at great cost from esoteric sources simply by breeding ordinary genes in ordinary laboratory organisms," says Schmidt-Dannert.

The researchers believe that this method will be widely useful for making complex and expensive natural molecules such as antibiotics, dyes, and flavors. "Imagine being able to produce in simple bacteria many of the compounds that come from all over nature," says Arnold.

And, according to the authors, an even more irresistible target of directed evolution is finding bacteria that make biological molecules not yet found in nature.

Robert Tindol

Caltech and the Human Genome Project

PASADENA—Two of the key inventions that made possible the monumental task of sequencing the human genome came from the California Institute of Technology. Both were especially important in sequencing the 3 billion DNA base pairs that compose the human genome because they dramatically sped up the work.

The first landmark invention was a method for the automated sequencing of DNA by Leroy Hood, then a professor of biology at Caltech, and his colleagues, Mike Hunkapiller, Tim Hunkapiller, Charles Connell, and Lloyd Smith. Before their discovery, figuring out the sequence of a segment of DNA had been exceedingly difficult and laborious. Because the process was so slow and required the work of highly skilled technicians, it was clear to most scientists in the mid '80s that it would not be possible to sequence entire genomes by manual methods.

The method devised by Hood and his colleagues changed that. They developed a novel chemistry that permitted a machine to detect DNA molecules, using fluorescent light. This method revolutionized DNA sequencing, ultimately making it possible to launch the Human Genome Project. Coupled with some recent advances, the method remained the core for the just-completed phase of sequencing the human genome.

A second key invention for the genome project was developed at Caltech by Professor Melvin Simon, chair of Caltech's biology division, and his coworker Hiroaki Shizuya. They recognized that a critical part of sequencing would be preparing large DNA segments for the process. To accomplish this, they invented "bacterial artificial chromosomes" (BACs), which permit scientists to use bacteria as micromachines to accurately replicate pieces of human DNA that are over 100,000 base pairs in length. These BACs provided the major input DNA for both the public genome project and Celera.

The Simon research group was also a major contributor to the mapping and sequencing of chromosome 22—a substantial segment of the human genome—which was completed in 1999. These researchers are presently using genomic information to create an "onco-chip," which will give researchers convenient experimental access to a miniature array containing hundreds of BACs, each carrying a gene whose mutation can cause human cancer.

Caltech researchers, both current and past, have also been important in promoting the Human Genome Project itself—a project that originally met with scientific skepticism when it was born 12 years ago, particularly when the goal of a fully sequenced human genome by the year 2003 was announced.

That skepticism has long since been replaced by wholesale enthusiasm from the scientific community. David Baltimore, president of Caltech and a Nobel laureate for his work on the genes of viruses, was a highly influential supporter of the Human Genome Project at its inception. Baltimore, then a professor of biology at MIT, was one of an international cadre of farsighted biologists that also included Hood and Simon. They shared a vision of the future in which knowledge of every gene that composes the human genome would be available to any scientist in the world at the click of a computer key.

To shape this unprecedented and complex project, Caltech professors Norman Davidson, Barbara Wold, and Steve Koonin have served in national scientific advisory roles to the genome project in the intervening years. Also, Baltimore chaired the National Institutes of Health (NIH) meeting where the human genome project was launched.

Koonin, who is Caltech's provost, chaired the 1997 JASON study, which advised the scientific community that quality standards could be relaxed so that a "rough draft" of the human genome could be produced years earlier and still be of great utility. This, in fact, was the approach that prevailed.

The Human Genome Project is unique among scientific projects for having set aside, from the beginning, research support for studies of the ethical, legal, and social implications of the new knowledge of human genes that would result. In Caltech's Division of the Humanities and Social Sciences, Professor Daniel Kevles has examined these ethical issues in his book The Code of Codes: Scientific and Social Issues in the Human Genome Project, which he coedited in 1992 with Leroy Hood.

Caltech scientists are also actively engaged in the future of genomics, which is the use of the newly obtained DNA sequences to discover and understand the function of genes in normal biology and in disease and disease susceptibility. This includes devising new ways to extract and manipulate information from the human genome sequence and from recently completed genome sequences of important experimental organisms used by scientists in the laboratory, such as the fruit fly, mustard weed, and yeast.

In one new project, Caltech recently became the home site for the international genome database for a key experimental organism called C. elegans, under the direction of Caltech Professor Paul Sternberg. This tiny worm has about 19,000 different genes, many of which correspond to related genes in humans. The shared origin and functional relationships between the genes of worm and man (and fruit fly and all other animals) let scientists learn much about how human genes work, by studying these small creatures in the laboratory.

The Worm Genome Database, called Wormbase, is undertaking the major task of collecting and making computer-accessible key information about every worm gene, its DNA sequence, and what its function is in the animal. This will require that new methods in automated data-mining and computing be brought together and fused with expert knowledge in biology, and then made accessible by computer to anyone interested.

Because of the relatedness of many genes and their functions among all animals, this information about the worm and its genome will be important for understanding human genes, and vice versa.

Another major genomics effort at Caltech is aimed at understanding how groups of genes work to direct development from a fertilized egg to an adult organism, and how these groups of genes change their action or fail in aging, cancer, or degenerative disease. The genomics approach to these problems involves the application of new computational methods and automated experimental technologies.

To do this, Barbara Wold, together with Mel Simon, Professor Stephen Quake from Caltech's Division of Engineering and Applied Science, and Dr. Eric Mjolsness of the NASA's Jet Propulsion Laboratory, have established the L. K. Whittier/Caltech Gene Expression Center, funded by the Whittier Foundation. The new work in genomics is also fueling new interdisciplinary programs at Caltech in the computational modeling of cells and organisms.

Robert Tindol

Physicists observe the quantum of heat flow

Physicists at the California Institute of Technology have announced the first observation of the quantum of thermal conductance. This discovery reveals a fundamental limit to the heat that can be conducted by objects of atomic dimensions.

The findings, reported in the April 27 issue of the journal Nature, could have profound implications for the future design of microscopic electronic devices and for the transmission of information, according to the research team leader, Caltech physics professor Michael Roukes.

The quantum of thermal conductance is best understood by beginning with a simple explanation of heat flow. In the everyday world, the amount of heat carried by an object can vary in a smooth and continuous way. Heat actually flows by means of collective, wavelike vibrations of the atoms that make up a solid material. Usually immense numbers of such waves, each inducing a unique type of synchronous motion of the atoms, act simultaneously to carry heat along a material.

Physicists know that waves sometimes act like particles and vice versa, so they've given these vibrations the particle-like name phonon (reminiscent of "electron," but derived from the Greek root phon, for sound). For heat flow in the macroworld, since each phonon is just one among a sea of many others, an individual phonon's contribution alters the total only imperceptibly.

But in the nanoworld, this "phonon sea" is actually rather finite, quantum effects rule, and the heat conduction can become radically different. When an object becomes extremely small, only a limited number of phonons remain active and play a significant role in heat flow within it. In fact, in small devices at temperatures close to absolute zero, most types of motion become almost completely "frozen out," and heat must then be carried by only the several remaining types of wavelike motions that persist.

It has recently become apparent that, in this regime, a strict limit exists to the amount of heat that can be conducted in a small structure or device. Although never before observed, this maximum value is actually a fundamental law of nature, independent of composition or material. It stipulates that the only way thermal conductance can be increased in a very small device is simply to make the conductor larger.

The Nature paper reports that this fundamental limiting value, called the quantum of thermal conductance, can be observed by using tiny devices with specially patterned features only 100 billionths of a meter across (about 300 atoms wide). To carry out this work, Keith Schwab, a postdoctoral fellow in Roukes's group, developed special devices from silicon nitride with assistance from research staff member Erik Henriksen. The work was carried out in the group's nanofabrication and ultralow-temperature laboratories in Pasadena, in collaboration with John Worlock, a research professor at the University of Utah and a visiting associate at Caltech.

The Roukes team has demonstrated that the maximum possible value of energy transported per wavelike motion (phonon mode) is a number composed of only fundamental physical constants and absolute temperature itself. (The relation is given by the product of pi squared, Boltzmann's constant squared, and absolute temperature, over three times Planck's constant.)

Numerically, at an ambient temperature of one kelvin, this quantized conductance translates roughly into a temperature rise of one kelvin upon the application of only a thousandth of a billionth of a watt of power (its precise value is 9.4 x 10^-13 W/K).
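That quoted value is easy to check from the relation stated in words above (pi squared times Boltzmann's constant squared times absolute temperature, divided by three times Planck's constant):

```python
import math

# Fundamental constants (exact in SI units since the 2019 redefinition)
k_B = 1.380649e-23   # Boltzmann's constant, J/K
h = 6.62607015e-34   # Planck's constant, J*s

def g0(T):
    """Quantum of thermal conductance, pi^2 * k_B^2 * T / (3h), in W/K."""
    return math.pi ** 2 * k_B ** 2 * T / (3 * h)

print(f"{g0(1.0):.2e} W/K")  # ≈ 9.46e-13 W/K at 1 K, matching the quoted figure
```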

Their new result has important implications for nanotechnology as well as for the transmission of information. Moore's law, a popularized rule of thumb, loosely describes the continuous decrease in size of the individual building blocks (the transistors) that populate, now in the tens of millions, the integrated circuits forming today's powerful computer chips.

In the unrelenting technological drive toward increased function and decreased size, these individual transistor components have been scaled downward in size to a realm where the underlying physics of their operation can change. In the most extreme cases at the smallest scales, conventional operation may completely break down.

One example is the so-called "power dissipation problem" stemming from the fact that when each individual transistor on a microchip is turned on, each gives off a little heat. This accumulates to become a very significant problem when millions of such transistors, each in effect a microscopic heat generator, are placed in close proximity.

"This will become especially serious for future molecular-scale devices," says Roukes. "No matter how small it is, you always have to put a finite amount of power into a device to turn it on. In this quantum regime, when only a limited number of modes are capable of transferring heat, it will be crucial to take this fundamental limitation into account."

Separate theoretical studies carried out elsewhere indicate that this quantum of thermal conductance is universal, and independent of whether the heat is carried by electrons, phonons, or any other mechanism. "It would seem there is no way of escaping this fundamental law of nature," says Roukes.

These other studies indicate that the maximum thermal conductance, observed in this work, is linked to the maximum rate that information can flow into a device having a single quantum "channel." This surprising connection between information theory and thermodynamics is a manifestation of a deep connection between information and entropy.

"As we engineer smaller and higher speed computational elements, we will also encounter this fundamental quantum limitation in the rate of information flow," Schwab says.

The group's three-year effort followed upon work of Thomas Tighe, a previous postdoctoral fellow in the group, and culminated in new techniques for creating the miniature devices studied. At the heart of each device is an isolated heat reservoir, which the researchers term a "phonon cavity." It resembles a miniature plate freely suspended by four narrow beams. Each beam acts as a quasi one-dimensional "phonon waveguide" for heat flow, and it is precisely this reduced-dimensional flow that is the focus of the researchers' measurements.

On top of the cavity, Schwab and Henriksen patterned two small, separate patches of thin-film gold, described by Roukes as "puddles of electrons." In the course of a measurement, one of these is heated by passing a very small electrical current through it. Electrical connections allowing this current to flow were made using superconducting leads (patterned on top of the phonon waveguides).

This ensures that heat is deposited only within the resistive gold film and, therefore, transferred only to the phonon cavity. To escape from the suspended device, the heat must eventually flow through the phonon waveguides. Since the waveguides' thermal conductance is weak, the phonon cavity temperature ultimately rises to a new, and hotter, steady-state level that directly reflects the thermal conductance of the phonon waveguides.

Measurement of the current-induced temperature rise within the small devices is a significant challenge in its own right, and required both ingenuity and the investment of a significant portion of the researchers' efforts. Most available thermometry techniques applicable at the nanoscale are electrical, and thus involve power levels that greatly exceed that used by the researchers in their measurements.

"The power level we used to carry out these experiments, about a femtowatt, is equivalent to the power your eye would receive from a 100-watt light bulb at a distance of about 60 miles," says Schwab. Instead of the standard electrical methods, the researchers coupled the second "electron puddle" to extremely sensitive dc SQUID (superconducting quantum interference device) circuitry.
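Schwab's light-bulb comparison can be checked with a rough estimate; the assumed pupil size below is an illustrative choice, not a figure from the article:

```python
import math

P_bulb = 100.0            # watts, assumed emitted isotropically (an idealization)
r = 60 * 1609.34          # 60 miles, converted to meters
pupil_radius = 1e-3       # assumed 2-mm-diameter pupil (illustrative)

# Intensity falls off as the inverse square of distance over a sphere
intensity = P_bulb / (4 * math.pi * r ** 2)       # W/m^2 at 60 miles
P_eye = intensity * math.pi * pupil_radius ** 2   # power collected by the pupil
print(f"{P_eye:.1e} W")   # a few femtowatts, the order quoted by Schwab
```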

This allowed them to observe the feeble current fluctuations that have a magnitude directly proportional to the absolute temperature of the nanoscale device. This so-called Johnson/Nyquist noise, which is also the origin of the electrical noise causing background hiss in audio systems, here plays a pivotal role by allowing the local temperature of the phonon cavity to be measured without perturbing the ultraminiature device.

In the end, because the researchers know the precise amount of heat deposited, and can directly measure the absolute temperature reached by the phonon cavity in response to it, they can directly measure the thermal conductance of the narrow beams acting as phonon waveguides. Simply stated, the ratio of the heat flowing through the waveguides to the rise in cavity temperature is the phonon thermal conductance of the quasi one-dimensional waveguides.

This work was carried out over the past three years in the research laboratories of Michael Roukes, Caltech professor of physics. Schwab, formerly a Sherman Fairchild Distinguished Postdoctoral Scholar in Roukes's group, is the principal author of the paper.

Schwab's life as a young postdoctoral scientist, and his role in the efforts to observe the quantum of thermal conductance, are the subjects of an upcoming documentary film by independent filmmaker Toni Sherwood. The title of the film is The Uncertainty Principle: Making of an American Scientist.

Coauthors of the paper are John Worlock, visiting associate at Caltech, research professor of physics at the University of Utah, and longtime collaborator of Roukes; and former research staff member Erik Henriksen.


Cosmologists reveal first detailed images of early universe

PASADENA—Caltech cosmologists and other scientists involved in an international collaboration have released the first detailed images of the universe in its infancy. The images reveal the structure that existed in the universe when it was 50,000 times younger and 1,000 times smaller and hotter than it is today.

Detailed analysis of the images is already shedding light on some of cosmology's outstanding mysteries, including the nature of the dark matter and energy that dominate intergalactic space, and whether space is "curved" or "flat." The team's results are being published in the April 27 issue of the scientific journal Nature.

Cosmologists believe that the universe was created approximately 12–15 billion years ago in an enormous explosion called the Big Bang. The intense heat that filled the embryonic universe is still detectable today as a faint glow of microwave radiation that is visible in all directions. This radiation is known as the cosmic microwave background (CMB).

Since the CMB was first discovered by a ground-based radio telescope in 1965, scientists have eagerly sought to obtain high-resolution images of this radiation. NASA's COBE (Cosmic Background Explorer) satellite discovered the first evidence for structures, or spatial variations, in the CMB in 1991.

The new experiment, dubbed BOOMERANG (Balloon Observations of Millimetric Extragalactic Radiation and Geophysics), obtained the images using a telescope suspended from a balloon that circumnavigated the Antarctic in late 1998.

The balloon carried the telescope at an altitude of almost 37 kilometers (120,000 feet) for 10 1/2 days. As it flew, an extremely sensitive detector system developed at Caltech recorded the faint signals from the early universe.

The BOOMERANG images are the first to bring the CMB into sharp focus. The images reveal hundreds of complex regions that are visible as tiny variations—typically only one ten-thousandth of a degree (0.0001 °C)—in the temperature of the CMB. The complex patterns visible in the images confirm predictions of the patterns that would result from sound waves racing through the early universe, creating the structures that by now have evolved into giant clusters and superclusters of galaxies.

"These images represent the ultimate limit of our vision," said U.S. team leader Andrew Lange, physics professor at Caltech.

"The enormous structures that they reveal predate the first star or galaxy in the universe."

Lange and Italian team leader Paolo deBernardis of the University of Rome, La Sapienza, together led the international team that developed the sophisticated experiment. The entire payload was integrated at Caltech for months of extensive testing before it was taken to Antarctica.

Already, analysis of the size of the structures has produced the most precise measurements to date of the geometry of space-time, which strongly indicate that the geometry of the universe is flat, not curved.

"It is really exciting to obtain such strong evidence for a flat universe. This result is in agreement with a fundamental prediction of the 'inflationary' theory of the universe," said Caltech Postdoctoral Scholar Eric Hivon.

The theory hypothesizes that the entire universe grew from a tiny subatomic region during a period of violent expansion that occurred a split second after the Big Bang. The enormous expansion stretched the geometry of space until it was precisely flat.

"These measurements represent a watershed event in cosmology" commented Mark Kamionkowski, professor of theoretical astrophysics at Caltech. "The results suggest that we are on the right track with inflation—a hitherto speculative theory for the origin of the universe—and thus open up a path toward scientifically addressing what happened in the very first micro-micro-second after the Big Bang."

"The key to BOOMERANG's ability to obtain these powerful new images," explained Lange, "is the marriage of a powerful new detector technology developed at Caltech and the Jet Propulsion Lab with the superb microwave telescope and cryogenic systems developed in Italy."

The telescope optics focus the radiation from the early universe onto button-size "bolometric" detectors cooled to a fraction of a degree above absolute zero. Extremely sensitive thermometers embedded in each detector record tiny changes in temperature as the telescope scans across the sky.

"These detectors can 'see' tiny differences in the temperature of the early universe in much the same way as the back of your hand responds to the heat from the sun," explained Caltech graduate student Brendan Crill.

"What really sets this detector system apart," continued Viktor Hristov, a senior electronics engineer at Caltech, "is the stability of the detectors and the electronics used to record the faint signals."

Caltech and JPL are responsible for fabricating a similar detector system for the Planck Surveyor, a satellite that will someday image the CMB over the entire sky from a vantage point 1 million miles from Earth.

In a complementary effort, another Caltech team led by Professor Anthony Readhead is now obtaining images of the CMB at even sharper resolution, using a specially built radio telescope, the Cosmic Background Imager (CBI), from a remote site in the Chilean Andes. BOOMERANG and CBI herald a new era of precision cosmological measurement that promises to provide new insights into fundamental physics.

The 36 BOOMERANG team members come from 16 universities and organizations in Canada, Italy, the United Kingdom, and the United States. Primary support for the BOOMERANG project comes from the Italian Space Agency, the Italian Antarctic Research Programme, and the University of Rome, La Sapienza; from the Particle Physics and Astronomy Research Council in the United Kingdom; and from the National Science Foundation and NASA in the United States.

Robert Tindol

Caltech scientists develop first microscopic system of pumps and valves made from soft materials

PASADENA—Researchers at the California Institute of Technology have developed a pump that is less than one-half the width of a human hair. The device is a breakthrough in the 3-D microfabrication of soft materials and could be applied to revolutionize and simplify many technologies, including drug discovery and delivery, according to Caltech applied physics professor Stephen R. Quake and his colleagues, who report their findings in the April 7 issue of Science.

Unlike the silicon-based micromachining techniques used for computer chips, this team has developed a technique called multilayer soft lithography, which is essentially an intricate casting of soft rubber. The work is an extension of soft lithography casting, originally developed by George Whitesides at Harvard University.

"Basically, it's plumbing on a very small scale," says Quake. "We are trying to show that it is useful to make microdevices out of soft rubber for certain applications, rather than the hard materials like glass or silicon used in traditional micromachining. In order to make a valve, one needs to figure out how to make it seal, which is usually done with a rubber washer. We made the entire valve out of the sealing material."

The pump is made possible by the material's softness and pliability. Embedded in a small clear rubber chip the size of a postage stamp, the pump is actually a series of tiny, multilayer channels that each measure 50 by 30 by 10 microns. By contrast, a human hair is about 100 microns wide.

Operation of the pump is similar to the peristaltic motions that make human digestion possible. By applying pressure in one of the channels, another channel above it or below it in the 3-D matrix can be closed off, thereby allowing the channel to act either as a pump or as a valve.
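The peristaltic idea can be illustrated as a repeating sequence of valve states; the specific three-valve pattern below is illustrative, not taken from the article:

```python
from itertools import cycle

# Illustrative three-valve peristaltic sequence: each string gives the state of
# the three control valves along the flow channel, where 1 = closed (channel
# pinched shut by pressure in the control channel above it) and 0 = open.
# Stepping through the pattern squeezes fluid along the channel, much like
# peristalsis moving food through the gut.
PATTERN = ["101", "100", "110", "010", "011", "001"]

def run(steps):
    """Return the valve states for the first `steps` actuation steps."""
    states = cycle(PATTERN)
    return [next(states) for _ in range(steps)]

print(run(8))  # the six-step cycle simply repeats
```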

While the research is basic and mainly aimed at demonstrating the feasibility of the technique, Quake says the pump could have a number of practical applications, including drug delivery. It might one day enable doctors to implant a biocompatible device about the size of a postage stamp into a patient's body to deliver drugs for chronic disorders such as allergies, pain, diabetes, and cancer.

The device may allow the drug to be delivered in a time-released manner customized for each patient. In addition to delivering the drug, the device could also contain a microsized component that would enable regular monitoring of the patient's condition.

Quake's own lab intends to use the microfabricated valves and pumps in two devices: a DNA sizer, which is a replacement for the current technique known as gel electrophoresis; and a cell sorter, a machine that physically separates microscopic materials such as bacteria or viruses. Both devices originated from research in Quake's lab. Caltech has licensed this technology to Mycometrix Corporation of South San Francisco, which will apply it to develop a variety of commercial products.

In addition to Quake, the others involved in the research are Axel Scherer, a professor of electrical engineering, applied physics, and physics at Caltech; Marc Unger, a postdoctoral scholar in applied physics; Hou-Pu Chou, a graduate student in electrical engineering; and Todd Thorsen, a graduate student in biochemistry.

Robert Tindol

Using quantum atomistic computer simulations to solve industrial problems

PASADENA—In the world of engineering and applied science, ideas that look good on the drawing board often turn out to have annoying real-world problems, even though the finished products still look pretty good. An example is the aluminum car engine, which has the advantage of being lightweight, but tends to wear out more quickly than its heavier steel counterpart.

To solve such bedeviling problems, experts often find it necessary to go back to "first principles," which in the case of the aluminum engine may include a computer simulation of how the individual atoms slide around under wear and tear.

California Institute of Technology chemistry professor Bill Goddard had this type of problem in mind when he established a special center a decade ago within the campus's Beckman Institute. Christened the Materials and Process Simulation Center (MSC), the center set as its goals the development of the computer simulation tools needed to deal with materials and process issues, and the transfer of solutions to government and industry for the creation and improvement of products.

"We started the center to follow the dream of being able to predict chemical, biological, and materials processes with a computer," says Goddard. "The idea was to get a simulation that was close enough so that you wouldn't have to do the experiment."

Now that the MSC is celebrating its 10th anniversary, Goddard says the group has made some genuine progress on a number of real industrial problems—much to the satisfaction of corporate collaborators and sponsors, which at present are underwriting about 10 new projects each year.

Since technology transfer and real-world results are a high priority, Goddard and his colleagues sponsor an annual meeting at which the collaborators showcase all their activities. This year's meeting, to be held March 23–24 at the Beckman Institute on campus, is also the 10th anniversary celebration of the center itself.

In addition, the conference celebrates the 100th birthday of Arnold Beckman, the founder of Beckman Instruments and the benefactor of the Beckman Institute.

"There are several new accomplishments we'll discuss at this year's meeting," Goddard says. "We've had the first prediction of the structure of a membrane-bound protein, we've shown how to grow a new class of semiconductors to make real-world devices, and with our local collaborator Avery Dennison we've had success in predicting gas diffusion in polymers.

"The bottom line is that it has worked out," he says. "In this center we have probably the most complete group of theorists in the world—about 40 people—and we've continued to have a flow of excellent grad students and postdocs who have gone on to be leaders in their fields."

A unique feature of the MSC is its emphasis on starting from first principles, using quantum mechanics (the Schrödinger equation) to describe what is happening between atoms. For example, if the real-world problem is how best to lubricate a certain type of moving part (an actual industrially funded project the center has worked on), the researchers would use the Schrödinger wave equation to build a simulation showing precisely how the electrons of a given lubricant would interact with other electrons, how variables such as temperature and pressure would enter the picture, and how a host of other interactions at the atomic level would play out.

But the quantum level is only the first in a hierarchy of regimes the center researchers might use in investigating complex problems. The quantum level with its Schrödinger equation is good for a system of about 100 atoms, but currently no computer can use quantum mechanics to predict the structure of hemoglobin, the protein that carries oxygen to our muscles.

Rather, for systems with up to about a million atoms, the center uses molecular dynamics techniques, essentially solving Newtonian equations.
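To give a flavor of what "solving Newtonian equations" means in practice, here is a minimal sketch (illustrative only, not one of the MSC's production codes): two argon-like atoms interacting through a Lennard-Jones potential, integrated with the standard velocity-Verlet scheme. The reduced units and starting conditions are assumptions made for simplicity.

```python
# Minimal molecular-dynamics sketch: a Lennard-Jones pair integrated
# with velocity Verlet, in reduced units (epsilon = sigma = mass = 1).

def lj_force(r):
    """Lennard-Jones force along the separation axis: -dU/dr."""
    return 24.0 * (2.0 * r**-13 - r**-7)

def lj_energy(r):
    """Lennard-Jones pair potential U(r) = 4(r^-12 - r^-6)."""
    return 4.0 * (r**-12 - r**-6)

def simulate(r0=1.5, v0=0.0, dt=0.001, steps=5000):
    """Integrate the one-dimensional separation with velocity Verlet."""
    r, v = r0, v0
    f = lj_force(r)
    for _ in range(steps):
        r += v * dt + 0.5 * f * dt * dt   # position update
        f_new = lj_force(r)
        v += 0.5 * (f + f_new) * dt       # velocity update
        f = f_new
    return r, v

r, v = simulate()
total = lj_energy(r) + 0.5 * v * v        # should stay near lj_energy(1.5)
print(f"separation = {r:.3f}, total energy = {total:.4f}")
```

Velocity Verlet is a workhorse integrator for molecular dynamics because it conserves energy well over long runs; here the total energy stays within a small tolerance of its starting value while the atoms oscillate in the potential well.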

For the billion or so atoms or particles that compose a "segment" of material, the MSC investigators employ the techniques of coarse-grain meso-scale modeling and tools such as phase diagrams. Beyond this point, for process simulation, materials applications, and engineering design involving the entire object, the center has developed yet another set of techniques.

This hierarchy of materials modeling is not describable merely by the number or size scale of particles. Time scales are also involved, with quantum mechanics operating at the femtosecond scale (a millionth of a billionth of a second), molecular dynamics at the nanosecond scale (a billionth of a second), coarse-grain meso-scale modeling at the millisecond scale, process simulation at the scale of minutes, and engineering design over periods ranging up to years.
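The hierarchy described above can be gathered into a single table; the system sizes and time scales below are simply those quoted in this article.

```python
# The MSC's multiscale modeling hierarchy, as described in the text.

HIERARCHY = [
    ("quantum mechanics (Schrödinger equation)", "~100 atoms",           "femtoseconds"),
    ("molecular dynamics (Newtonian equations)", "~1 million atoms",     "nanoseconds"),
    ("coarse-grain meso-scale modeling",         "~1 billion particles", "milliseconds"),
    ("process simulation",                       "entire object",        "minutes"),
    ("engineering design",                       "entire object",        "up to years"),
]

for method, system_size, time_scale in HIERARCHY:
    print(f"{method:<42} {system_size:<22} {time_scale}")
```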

Finally, the hierarchy has many crossover points, and it is these crossovers that allow the center's research to be especially innovative and interdisciplinary.

"So you start with fundamentals of quantum mechanics, and imbed this in the next steps at all length scales and time scales," Goddard says. "The idea is to figure out why these things happen, and how looking at first principles can solve industrial problems."

Robert Tindol

Caltech grad student's team first to detect radio emission from a brown dwarf

A graduate student in astronomy from the California Institute of Technology recently led a team of researchers in finding the first radio emission ever detected from a brown dwarf, an enigmatic object that is neither star nor planet, but something in between.

The discovery, reported in the March 15 issue of the journal Nature by lead author Edo Berger and his colleagues, demonstrates that brown dwarfs can flare 10,000 times more intensely than theory predicted. The results will likely force experts to rethink their theories about magnetism in brown dwarfs and gas giants, says Berger's supervisor, Shri Kulkarni, who is John E. and Catherine T. MacArthur Professor of Astronomy and Planetary Science at Caltech.

Berger was leader of a student team that made the discovery during a special National Science Foundation student summer program at the NSF's Very Large Array (VLA) near Socorro, New Mexico. The brown dwarf they observed is named LP944-20.

Berger and his colleagues decided to take a long-shot gamble: attempting to observe a brown dwarf from which X-ray flares had recently been detected with NASA's Chandra X-ray satellite.

"We did some background reading and realized that, based on predictions, the brown dwarf would be undetectable with the VLA," said Berger. "But we decided to try it anyway."

After consulting with Dale Frail, an astronomer at the National Radio Astronomy Observatory (NRAO), Berger and his colleagues decided to utilize a block of observing time traditionally dedicated to the summer students.

The day after they collected their data, the students gathered at the NRAO array operations center in Socorro to process the data and make the images. Berger, who had prior experience processing VLA data, worked alone in the same room as the other students, who were working together on another computer. Berger finished first and was shocked at his image.

"I saw a bright object at the exact position of the brown dwarf, and was pretty sure I had made a mistake," Berger said.

He waited for the others, who were working under the guidance of another NRAO astronomer. Ten minutes later, the others also produced an image on the screen in which the same bright object showed up at the brown dwarf's location.

Berger then began breaking up the approximately 90 minutes' worth of data into smaller segments. His results showed that the brown dwarf's radio emission had risen to a strong peak, then weakened. This demonstrated that the brown dwarf had flared.
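The segmentation step can be illustrated with a short sketch: a synthetic 90-minute light curve, with entirely made-up numbers, is split into time bins whose averages reveal a rise-and-fall flare that a single 90-minute average would wash out.

```python
import math
import random

# Synthetic "light curve" (all numbers invented for illustration):
# quiescent noise plus a Gaussian-shaped flare centered at t = 30 minutes.
random.seed(42)
DURATION_S = 90 * 60      # 90 minutes of data, in seconds
BIN_S = 5 * 60            # 5-minute bins (an arbitrary choice)

def flux(t):
    flare = 12.0 * math.exp(-((t - 1800.0) / 300.0) ** 2)
    return 1.0 + flare + random.gauss(0.0, 0.3)

samples = [(t, flux(t)) for t in range(0, DURATION_S, 10)]  # one sample / 10 s

def bin_lightcurve(samples, bin_s):
    """Average the samples within each time bin, in time order."""
    bins = {}
    for t, f in samples:
        bins.setdefault(t // bin_s, []).append(f)
    return [sum(v) / len(v) for _, v in sorted(bins.items())]

binned = bin_lightcurve(samples, BIN_S)
peak_bin = max(range(len(binned)), key=lambda i: binned[i])
print(f"peak flux in bin {peak_bin}, near t = {peak_bin * BIN_S // 60} min")
```

Binning the data trades time resolution for signal-to-noise; with bins much shorter than the flare, the rise to a peak and subsequent decay stand out clearly above the quiescent level.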

"Then we got real excited," Berger said, adding that the students immediately sought and received additional observing time. Soon they had captured two more flares.

"The radio emission these students discovered coming from this brown dwarf is 10,000 times stronger than anyone expected," Frail said. "This is going to open up a whole new area of research for the VLA."

The existence of brown dwarfs—objects with masses intermediate between stars and planets—had long been suspected but never confirmed until 1995, when Kulkarni made the first observation at Caltech's Palomar Observatory. Since then, a large number of brown dwarfs have been identified in systematic surveys of the sky. Astronomers now believe that there are as many brown dwarfs as stars in our galaxy.

Flaring and quiescent radio emissions have been seen previously from stars and from the giant planets of our solar system, but never before from a brown dwarf. Moreover, the strength of the magnetic field near the brown dwarf—as inferred from the radio observations—is well below that of Jupiter and orders of magnitude below that of low-mass stars, said Kulkarni.

Conventional wisdom would require large magnetic fields to accelerate the energetic particles responsible for the radio emissions. The same conventional wisdom says that brown dwarfs are expected to generate only short-lived magnetic fields.

However, the persistence of the radio emission of LP944-20 shows that the picture is not complete, Kulkarni said.

"I am very pleased that a first-year Caltech graduate student was able to spearhead such an undertaking, which led to this big discovery," said Kulkarni. "This discovery will spur theorists into obtaining a better understanding of magnetism in stars and planets."

In addition to Berger and Frail, the other authors of the paper are Steven Ball of New Mexico Institute of Mining and Technology, Kate Becker of Oberlin College, Melanie Clark of Carleton College, Therese Fukuda of the University of Denver, Ian Hoffman of the University of New Mexico, Richard Mellon of Penn State, Emmanuel Momjian of the University of Kentucky, Michael Murphy of Amherst College, Stacy Teng of the University of Maryland, Timothy Woodruff of Southwestern University, Ashley Zauderer of Agnes Scott College, and Bob Zavala of New Mexico State University.

[Editors: Additional information on this discovery is available at the NRAO Web site at http://www.nrao.edu/pr/browndwarf.html]


East and West Antarctica once began separating but then stopped, new research shows

PASADENA—Earth was once well on its way to having two Antarcticas, but the tectonic separation between the eastern and western portions of the continent suddenly stopped after 17 million years of spreading, researchers say.

In the March 9 issue of Nature, lead author Steve Cande of the Scripps Institution of Oceanography, Joann Stock of Caltech, and their colleagues in Australia and Japan report that the rift between East and West Antarctica began about 43 million years ago, then ended 17 million years later, after the seafloor had spread about 180 kilometers. The researchers discovered the motion after making several cruises over a period of years in the waters off the Antarctic coast and after gathering data on the seafloor itself.

"The two pieces of Antarctica pulled apart and then stopped," says Stock, a professor of geology and geophysics at Caltech. "If it had kept on going, there would eventually have been two Antarcticas."

The primary scientific value of the study is that it answers some nagging questions about the "missing" motion in the Antarctic region. For a variety of reasons, geophysicists have had a hard time getting a handle on the precise directions and amounts of motion there, and how the motion fits into the grand scheme of global plate tectonics.

"It's like a jigsaw puzzle," Stock says. "You have to know how one piece moved relative to the other pieces to understand how it all fits together.

"A lot of the tectonic plate history for western North America, for example, depends on what happened in Antarctica. You wouldn't think so, but that's the way plate tectonic movements work."

The key to the new results was the researchers' discovery of an underwater feature off Cape Adare that they have named the Adare Trough. This trough is about 230 kilometers long and runs roughly northwest-southeast near the 170th meridian. The sharp break in the direction of the magnetic lines on either side of the trough allows the researchers to infer the ancient relative motions of the plates, and the age and shape of the trough and seafloor around it indicates the period when the spreading occurred.

Seafloor spreading in the area accounts for the "missing" motion in the plate circuit linking the Australian, Antarctic, and Pacific plates, the researchers also found. In addition, the 180-kilometer-wide zone of extension is most likely related to the uplift that has occurred in the Transantarctic Mountains to the west, and it explains other geological features that have hitherto been puzzling.

Finally, the new results could shed light on global issues such as the motion between hotspots in the Pacific and Indo-Atlantic oceans.

In addition to Cande and Stock, the other authors are Dietmar Müller of the University of Sydney and Takemi Ishihara of the Geological Survey of Japan.

Robert Tindol

Physicists create atom-cavity microscope, track single atoms bound in orbit with single photons

PASADENA—In a promising development with applications to science at the single-atom level, physicists have constructed an "atom-cavity microscope" that tracks the motion of individual atoms.

California Institute of Technology physics professor H. Jeff Kimble, his Caltech colleagues, and collaborators from New Zealand report in the February 25 issue of Science that they have succeeded in monitoring the motion of individual cesium atoms bound in orbit by single photons inside a high-quality optical resonator. The atom is trapped in orbit by a weak light field, and the same light field can be used to observe the atom's motion within the cavity.

This advance is an important development toward the eventual realization of quantum technologies, which would enable quantum computation and communication.

The stage for this microscopic dance is the optical cavity, a pair of highly reflective mirrors that face each other only 10 microns (0.0004 inches) apart. The mirrors are so reflective that a photon, the fundamental quantum of light, enters the cavity and bounces back and forth between the mirrors hundreds of thousands of times before it escapes again through one of the mirrors.

In this way a single photon confined in the cavity builds up an electric field strong enough to influence the motion of an atom and even to bind the atom in orbit within the cavity.
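A back-of-the-envelope estimate shows why so many bounces matter. The reflectivity figure below is an illustrative assumption, not a published specification of the group's mirrors; only the 10-micron spacing comes from the article.

```python
# Rough estimate of photon storage in a high-finesse optical cavity.

C = 3.0e8          # speed of light, m/s
LENGTH = 10e-6     # mirror spacing: 10 microns, as described above
R = 0.999998       # assumed intensity reflectivity of each mirror

bounces = 1.0 / (1.0 - R)        # mean number of reflections before escape
transit_s = LENGTH / C           # time between successive reflections
storage_s = bounces * transit_s  # mean photon lifetime inside the cavity

print(f"~{bounces:,.0f} bounces, photon lifetime ~ {storage_s * 1e9:.0f} ns")
```

With these assumed numbers the photon survives roughly half a million reflections, consistent with the "hundreds of thousands" of bounces described above, and remains in the cavity for tens of nanoseconds.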

Collaborating theorists A. Scott Parkins and Andrew Doherty in New Zealand first recognized the potential of this trapping technique, in which the atom and the cavity share a quantum of excitation.

"I like to think of it as an atom-cavity molecule," says Christina Hood, a Caltech graduate student and primary author of the paper. "In a molecule, two atoms give up their own electron orbits, their separate identities, to share electrons and form something qualitatively different. In the same sense, in our experiment the atom and the cavity field are bound together strongly by sharing a series of single photons."

How do the scientists actually "see" what is going on inside the tiny optical system? The Caltech group and others had already used similar cavities to sense single atoms whizzing through the cavity. To do this, they illuminate one mirror of the cavity and measure the light escaping from the opposite mirror. "The cavity is a resonator for light, like a half-filled soda bottle is for sound," says Theresa Lynn, a Caltech graduate student and coauthor of the paper. "What we do is similar to holding a tuning fork up to the bottle and listening to hear it resonate. You'll only hear a ring if the right amount of water is in the bottle."

In this case, amazingly, it's a single atom that plays the role of the water in the bottle, dramatically altering the resonance properties of the cavity by its presence or absence. By measuring the amount of light emerging from the cavity, the researchers can tell whether an atom is in the cavity or not.

The major step of the current work is that now they can determine precisely where the atom is located within the light field, creating "movies" of atomic motion in the space between the cavity mirrors. Examples of these movies can be viewed at the group's web site:


The movies show atoms trapped in the cavity as they orbit in a plane parallel to the cavity mirrors. The atoms have orbital periods of about 150 microseconds and are typically confined to within about 20 microns of the cavity's center axis.

The Kimble team was able to measure the atomic position to within about 2 microns in measurement times of about 10 microseconds. Continuous position measurements at this level of accuracy and speed allowed them to capture the orbital motions of the atoms.

"The interaction of the atom with the cavity field gives us advantages in two distinct ways," says Kimble. "On the one hand, it provides forces sufficient to trap the atom within the cavity at the level of single photons. On the other hand and more importantly, the strong interaction enables us to sense atomic motion in a fashion that has not been possible before," he says.

Both aspects are important to physicists who probe the limits of our ability to observe and to control the microscopic world, in which the rules and regulations are set by quantum mechanics. According to one basic rule of quantum mechanics, the Heisenberg uncertainty principle, any measurement performed on a system inherently disturbs the future evolution of that system. The principle presents a challenge to physicists who strive to control or "servo" individual quantum systems for use in quantum computation and other quantum technologies.

In collaboration with Caltech assistant professor Hideo Mabuchi, the Caltech team is pursuing extensions of the current research to implement real-time quantum feedback to control atomic motion within the cavity. The operating principles for such "quantum servos" are a topic of contemporary theoretical investigation at Caltech being pursued by Mabuchi, Doherty, and their colleagues.

The cavity as a powerful sensing tool by itself also presents possibilities outside the quantum realm. The same techniques that produce movies of orbiting atoms could be adapted to more general settings, such as to "watch" the dynamics of molecules engaged in chemical and biochemical reactions. Mabuchi is pursuing an independent effort along these lines to monitor single molecules engaged in important biological processes such as conformationally gated electron transfer.

Robert Tindol