Human brain employs the same neurons in seeing an object and later imagining it

In a study of nine epilepsy patients awaiting brain surgery, researchers have discovered that humans use the same neurons to conjure up mental images that they use when they see the real object with their eyes.

In the November 16 issue of the journal Nature, UCLA neurosurgeon and neuroscientist Itzhak Fried and Caltech neuroscientists Christof Koch and Gabriel Kreiman report on results obtained by questioning nine patients who had been fitted with brain sensors. The patients, all suffering from severe epilepsy that could not be controlled with drugs, were being observed for a period of one to two weeks so that the regions of their brains responsible for their seizures could be identified and later surgically removed.

During their extended hospital stay, the patients were asked to look at photos of famous people such as President Clinton, pictures of animals, abstract drawings, and other images. While they were looking at the images, the researchers noted the precise neurons that were active.

Then, the subjects were instructed to close their eyes and vividly imagine the images. Again, the researchers took note of the neurons active at the time of visual imagery.

Analysis of the data showed that a subset of neurons in the hippocampus, amygdala, entorhinal cortex, and parahippocampal gyrus fired both when the patient looked at an image and when he or she imagined it.

The results build upon previous work by Fried's group showing that single neurons in the human brain are involved in memory and can respond selectively to a wide variety of visual stimuli and stimulus features such as facial expression and gender.

According to Koch, a professor of computation and neural systems at Caltech, the study helps settle long-standing questions about the nature of human imagery. Particularly, the research sheds light on the process at work when humans see things with the "mind's eye."

"If you try to recall how many sunflowers there are in the Van Gogh painting, there is something that goes on in your head that gives rise to this visual image," Koch says. "There has been an ongoing debate about whether the brain areas involved in perception during 'vision with your eyes' are the same ones used during visual imagery."

The problem has been difficult to address because the techniques that yield very precise results in animals are generally not suitable for humans, and because the brain imaging techniques suitable for humans are not very precise, Koch says. Such techniques can image only large portions of the brain, each containing on the order of one million very diverse nerve cells.

"Recording the activity of single cells allows us to investigate the neuronal correlates of visual awareness at a detailed level of temporal and spatial resolution," says Kreiman.

The work was supported by the National Institutes of Health, the National Science Foundation, and the Center for Consciousness Studies at the University of Arizona.

Contact: Robert Tindol (626) 395-3631


New results on Martian meteorite support hypothesis that life can jump between planets

According to one version of the "panspermia" theory, life on Earth could originally have arrived here by way of meteorites from Mars, where conditions early in the history of the solar system are thought to have been more favorable for the creation of life from nonliving ingredients. The only problem has been how a meteorite could get blasted off of Mars without frying any microbial life hitching a ride.

But new research on the celebrated Martian meteorite ALH84001 shows that the rock never got hotter than 105 degrees Fahrenheit during its journey from the Red Planet to Earth, not even during the impact that ejected it from Mars or during its plunge through Earth's atmosphere before landing on Antarctic ice thousands of years ago.

In the October 27 issue of the journal Science, Caltech graduate student Benjamin Weiss, undergraduate student Francis Macdonald, geobiology professor Joseph Kirschvink, and their collaborators at Vanderbilt and McGill universities explain results they obtained when testing several thin slices of the meteorite with a new state-of-the-art device known as an Ultra-High Resolution Scanning Superconducting Quantum Interference Device Microscope (UHRSSM). The machine, developed by Franz Baudenbacher and other researchers at Vanderbilt, is designed to detect microscopic differences in the orientation of magnetic lines in rock samples, with a sensitivity up to 10,000 times greater than existing machines.

"What's exciting about this study is that it shows the Martian meteorite made it from the surface of Mars to the surface of Earth without ever getting hot enough to destroy bacteria, or even plant seeds or fungi," says Weiss, the lead author of the Science paper. "Other studies have suggested that rocks can make it from Mars to Earth in a year, and that some living organisms can live in space for several years. So the transfer of life is quite feasible."

The meteorite ALH84001 has been the focus of numerous scientific studies in the last few years because some scientists think there is tantalizing evidence of fossilized life within the rock. The issue has never been conclusively resolved, but Weiss says the matter is not important to the present result.

"In fact, we don't think that this particular meteorite brought life here," says Weiss. "But computer simulations of ejected Martian meteorites demonstrate that about one billion tons of rocks have been brought to Earth from Mars since the two planets formed." Many of these rocks make the transit in less than one year, although ALH84001 took about 15 million years.

"The fact that at least one out of the 16 known Martian rocks made it here without heating tells us that this must be a fairly common process," says Kirschvink.

The sample the Kirschvink team worked with is about 1 mm thick and 2 cm in length and somewhat resembles the African continent, with one side containing a portion of the original surface of the meteorite. Using the UHRSSM, the team found that the sample has a highly aligned and intense magnetic field near the surface, which is to be expected because the surface reached a high temperature when it entered Earth's atmosphere.

This matters because any weakly magnetized rock that is heated above a critical temperature and then cooled will reorient its magnetization to align with the local field direction. That critical temperature, specific to each magnetic material, is known as the blocking temperature. The outer surface layer of meteorite ALH84001 therefore must have reached a temperature well above the blocking temperatures of its magnetic materials, causing the materials at the surface to realign with Earth's magnetic field.

However, the interior portions of the slice were found to have randomly oriented magnetization, which means that some of the materials inside the meteorite have not reached their blocking temperatures at any time since before they left the Martian surface. Further, when the researchers gently heated another slice taken from the interior of the meteorite, they discovered that the rock began to demagnetize at temperatures as low as 40 degrees Celsius (105 degrees Fahrenheit), demonstrating that the interior had never been heated even to that level.

Thus, a radiation-resistant organism able to survive without energy and water for a year could have made the journey from Mars to Earth. Examples of such hardy organisms, like the bacteria Bacillus subtilis and Deinococcus radiodurans, are already well known.

"Realistically, we don't think any life forms more complicated than single-celled bacterial spores, very tough fungal spores, or well-protected seeds could have made it," Kirschvink says. "They would also have had to go into some kind of dormant stage."

Though the study does not directly address the issue of life in meteorites, the authors say the results eliminate a major objection to the panspermia theory—that any life form reaching Earth by meteorite would have been heat-sterilized during the violent ejection of the rock from its parent planet or during entry into the atmosphere. Prior studies have already shown that a meteorite can enter Earth's atmosphere without its inner material becoming hot.

"ALH 84001 has stimulated a remarkable amount of research to test the hypothesis that life exists elsewhere than on Earth. The present study indicates that the temperature inside the meteorite could have allowed life to persist and possibly travel to Earth from Mars," says Nobel Prize-winning biologist Baruch Blumberg, who is director of NASA's Astrobiology Institute.

The results also demonstrate that critical information could be lost if rocks brought back from Mars by a sample return mission were heat-sterilized, as has been proposed. Thermal sterilization would destroy valuable magnetic, biological, and petrological information contained in the samples.

If life ever evolved on Mars, it is likely to have jumped repeatedly to Earth over geological time. Because the reverse process—the transfer of Earth life to Mars—is dynamically much more difficult, it may be more important to protect any Martian biosphere from Earthly microbes.

According to Kirschvink, "The Martian biosphere, if it ever evolved, would most likely have been brought to Earth billions of years ago, and could have participated in the evolution and diversification of bacterial life here.

"So there is at least a chance that we are in part descended from Martian microbes," Kirschvink says.

The ALH84001 research was funded in part by NASA's Astrobiology Institute, an international research consortium involving academic, non-profit and NASA field centers, whose central administrative office is located at NASA's Ames Research Center in California's Silicon Valley. A group from the Jet Propulsion Laboratory in Pasadena, CA, which sponsored the Caltech research, is one of the 11 lead teams of the institute.

Contact: Robert Tindol (626) 395-3631


LIGO establishes "first lock"

HANFORD, Wash.—Scientists from the Laser Interferometer Gravitational-wave Observatory (LIGO) announced today that they established "first lock" at the detector near Hanford, Washington.

The operational milestone marked the first time that the LIGO detector at Hanford, Washington, has simultaneously sent laser light back and forth along its two arms—each is one and a quarter miles long—thereby achieving the delicate optical interference that will make the detection of gravitational waves possible. This feat, dubbed "first lock" or "locking the interferometer," is similar to the "first light" of a newly commissioned telescope.

LIGO, a joint project of Caltech and MIT, is being built as a national research facility for detecting gravitational waves in the universe. LIGO comprises three detectors in the United States—two at Hanford and one near Livingston, Louisiana. The detectors will work in concert to detect gravitational waves, which are distortions of space-time caused by accelerating masses, such as exploding stars or vibrating black holes.

The LIGO detectors are set up in such a way that the very slight distortions of space-time in the vicinity of the detector's arms will cause perpendicular laser beams to go out of phase. Two observatory sites hundreds of miles apart are necessary to get a direction for the event causing the gravitational waves, and also to ascertain that the signals come from space and are not some local phenomenon.

To reach maximum sensitivity, LIGO employs a sophisticated computer-based control system to hold mirrors at the ends of the two arms in their proper locations with subatomic precision, while bouncing a laser beam back and forth between them. This "locking" of the interferometer will be the first full test of the first of three similar detectors being constructed by LIGO.

"This achievement brings us an important step closer to our real goal—LIGO's first gravitational-wave observations," said Barry Barish, who is Linde Professor of Physics at Caltech and LIGO director.

"First lock is a step toward bringing the apparatus to its full operating potential, but still some distance from the beginning of the scientific investigations that will ultimately come about," says Gary Sanders, deputy director of LIGO.

A complex scientific instrument

The LIGO detector comprises mirrors suspended in vacuum on fine wires at the corner and end of a long L. A highly stable laser beam is split, the two halves are sent back and forth about 100 times between the mirrors on the two arms, and then the beams are recombined. A passing gravitational wave will cause very small motions of the mirrors at the ends of the L, which scientists will observe by the changes they cause in the amplitude of the recombined light.

"The challenge is that the predicted motions of the mirrors due to even the strongest gravitational waves are incredibly small—about ten billionths of the diameter of an atom," explains Rainer Weiss, a physics professor at MIT. Weiss initially proposed building such a detector in 1973, and has worked to that end ever since.

"The detectors require extreme sensitivity to measure such motions, as well as to eliminate all other possible sources of disturbance to the mirrors," Weiss says.

A new tool for astrophysics

Gravitational waves are an important prediction of Einstein's theory of general relativity. They travel at the speed of light, but are very different from the electromagnetic waves—light, radio waves, X-rays—that we are familiar with.

Gravitational waves are a periodic distortion of space itself, expanding distances along one arm while shrinking them along the other, then half a cycle later, shrinking the first arm while expanding the second. The waves that LIGO seeks are created by rapid accelerations of very massive astrophysical objects.

"Many of the sources of gravitational waves that LIGO may detect are difficult or even impossible to study using the familiar electromagnetic spectrum," says Kip Thorne, who is Feynman Professor of Theoretical Physics at Caltech and the leading theorist studying gravitational-wave sources. "These include black holes, neutron stars, and possibly even the Big Bang."

The search for gravitational waves has generated worldwide interest. Detectors are being built by the Japanese (the TAMA detector), by Italian and French collaborators (the Virgo project), and by German and British collaborators (the GEO600 project). Also, an Australian consortium has proposed a Southern Hemisphere detector.

Much remaining work

"As important as this milestone is, there is still a great deal more to do," emphasized Stan Whitcomb, director of commissioning for LIGO. "The detector control systems must be carefully characterized and tuned to achieve maximum sensitivity and reliable operation. And, of course, this is just the first of three interferometers that we have to commission."

Commissioning of the LIGO detectors will continue through the remainder of 2000 and 2001. Short periods of operation to test different aspects of the detectors ("engineering runs") will be alternated with installation and commissioning. Full operation to detect gravitational waves will commence at the beginning of 2002.

Contact: Robert Tindol (626) 395-3631


NSF awards $9.6 million for materials research center at Caltech

The National Science Foundation today awarded $9.6 million in start-up funding for the Center for the Science and Engineering of Materials (CSEM) at the California Institute of Technology. The new center will pioneer a number of exotic and futuristic materials and applications, such as "liquid" metals, responsive gels, and tiny medical sensors.

The CSEM will be one of four new National Science Foundation-funded centers in the Materials Research Science and Engineering Center (MRSEC) program. Each of the four new centers will be in keeping with the MRSEC's mission "to undertake materials research of scope and complexity that would not be feasible under traditional funding of individual research projects."

The Caltech center will focus on four areas of research that are already emerging as unique, high-impact activities on campus, says Julia Kornfield, associate professor of chemical engineering at Caltech and director of the center.

"We've chosen four major areas of scientific interest that will help solve critical societal needs of the twenty-first century," says Kornfield. "We'll have strong ties to industrial and other laboratories, and we'll provide a substantial educational program for public schools and other institutions."

The four major areas will be biological synthesis and assembly of macromolecular materials, bulk metallic glasses and composites, mesophotonic materials, and ferroelectric thin films.

The biosynthesis initiative of the center will be led by David Tirrell, chair of Caltech's Division of Chemistry and Chemical Engineering. Research efforts will include the use of artificial proteins to make polymers with exquisite control of properties, and responsive polymers and gels for biomedical and industrial applications, including materials for entrapment of cells in tissue engineering or biosensors.

"This new technology takes advantage of traditional materials properties and biological functions such as signaling and information transfer," Tirrell says.

The team investigating glassy metallic alloys will be led by Bill Johnson, a professor of materials science at Caltech. This group will pursue basic science and new engineering strategies that will lead to custom-designed materials with desirable characteristics such as ultrahigh strength, exceptional elasticity, and ease of fabrication into complex parts.

The effort toward mesophotonics will be led by Harry Atwater of the Caltech applied physics faculty. Mesophotonic devices are optical components and devices at or below the wavelength of light. Future applications include engineered optical probes for biology and medicine, and photonic devices that could replace certain electrical devices in telecommunications and computing.

Kaushik Bhattacharya, professor of applied mechanics and mechanical engineering, will lead research to enable microactuators based on high strain ferroelectrics. The team's integrated simulation and experimental approach promises to reveal the microscopic basis of large strain behavior in this class of materials.

The new center will be interdisciplinary, not only because researchers from varied backgrounds will work in each area, but also because the areas themselves overlap, Kornfield says. The self-assembly of nanostructured materials is applicable to the mesophotonics area as well as to the biological synthesis area, she says.

"Caltech provides an excellent setting for this ambitious center due to the extensive investments that the Institute has made over the past decade, building the campus-wide activities and facilities in materials research. And strong institutional commitments to technology transfer and educational outreach will ensure that the impact of this center is felt far beyond the Caltech campus," says Kornfield.

Outreach programs will enrich educational opportunities in science and engineering for underrepresented minority students at the undergraduate, middle school, and high school levels. The center will establish a materials partnership between Caltech and nearby California State University, Los Angeles, that will involve CSULA students and faculty in CSEM research, and foster materials research and curriculum development on the CSULA campus.

The center will also establish an extensive network of research collaborations in the private sector with companies such as Lucent, Exxon, Dow, Procter and Gamble, 3M, and General Motors, and with government laboratories such as JPL and Brookhaven.

Robert Tindol

Astronomers improve "cosmic yardstick" by measuring distance to star in Gemini with Palomar Testbed Interferometer

Researchers using the testbed interferometer at Palomar Observatory have achieved the best-ever distance measurement to a type of star known as a Cepheid variable. The new results improve the "cosmic yardstick" used to infer the size and age of the universe.

In the September 28 issue of the British journal Nature, a group of astronomers from the California Institute of Technology, the Jet Propulsion Laboratory, and the Infrared Processing and Analysis Center announce that the distance to the star Zeta Geminorum in the Gemini constellation is 1,100 light-years. The measurement is accurate to about 13 percent, meaning that the star could be as close as 960 or as far away as 1,240 light-years. This represents a threefold improvement over previous measurements.

The improvement is due to the use of the Palomar Testbed Interferometer, of which JPL engineer Mark Colavita is the principal investigator and codesigner. "This has been a bit of a Holy Grail in the field," says Benjamin Lane, a graduate student in Caltech's planetary science program and the lead author of the study. "The measurement of accurate distances to Cepheids is widely considered to be a principal limitation in determining the Hubble constant."

Cepheid variables for several decades have been an important link in the chain of measurements that allow astronomers to estimate the distances to the farthest objects in the universe—and ultimately, the overall size and expansion rate of the universe itself.

Cepheid variables are stars with a very predictable relationship between their absolute brightness and the period of their pulsations. A Cepheid is useful for measuring distances because, once it is known how bright the star really is, it is a simple task to measure how bright it appears from Earth and then calculate the distance.

A good analogy is a light bulb shining at an unknown distance. If we are certain that only 100-watt light bulbs brighten once a day, and we observe that the light indeed brightens once daily, then we can calculate its distance by measuring the brightness of the light reaching us and comparing it to the known absolute brightness of a 100-watt light bulb.
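The light-bulb analogy is just the inverse-square law. A minimal sketch, with an illustrative function name and numbers that are not from the study:

```python
import math

def distance_from_flux(luminosity_w: float, flux_w_m2: float) -> float:
    """Distance (meters) to a source of known luminosity, from its measured
    flux, via the inverse-square law: flux = L / (4 * pi * d**2)."""
    return math.sqrt(luminosity_w / (4 * math.pi * flux_w_m2))

# A 100 W bulb whose light reaches us at 100 / (4 * pi * 10**2) W/m^2
# must be 10 m away.
d = distance_from_flux(100.0, 100.0 / (4 * math.pi * 10.0**2))
print(d)  # 10.0
```

The same logic, with a Cepheid's calibrated luminosity in place of the bulb's wattage, underlies the whole "standard candle" distance ladder.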

"Zeta Geminorum is known to grow larger and smaller," says Lane. "We already knew this because we can see the Doppler effect." In other words, astronomers can measure a slight difference in light coming from the star because the surface of the star moves toward us and away from us as the star expands and contracts.

In the Nature study, the researchers couple this information with new data collected with the Palomar Testbed Interferometer. The interferometer combines the light from two 16-inch telescope mirrors in such a way that the resulting images are as sharp as they would be from a single telescope mirror 360 feet in diameter.

Data from the interferometer showed that Zeta Geminorum went through a change in angular size of about five hundred-millionths of a degree during its 10-day cycle. "That's roughly the size of a basketball on the moon, as seen from Earth," says Colavita.

From previous Doppler measurements, the researchers already knew that the change in the star's diameter was about 4.2 million kilometers. By combining that information with the newly measured change in angular size, they were able to deduce the distance to the Cepheid.
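That deduction is the small-angle relation distance = (change in diameter) / (change in angular size). A sketch under stated assumptions: the function name is ours, and because the article's two inputs are heavily rounded, the result agrees with the published 1,100 light-years only to within a factor of a few.

```python
import math

KM_PER_LIGHT_YEAR = 9.4607e12

def distance_from_size_change(delta_d_km: float, delta_theta_deg: float) -> float:
    """Distance (light-years) from a linear diameter change and the matching
    angular diameter change, using the small-angle approximation
    d = delta_D / delta_theta (with delta_theta in radians)."""
    delta_theta_rad = math.radians(delta_theta_deg)
    return delta_d_km / delta_theta_rad / KM_PER_LIGHT_YEAR

# Article's rounded figures: a 4.2-million-km diameter change seen as an
# angular change of five hundred-millionths (5e-8) of a degree.
# Rounded inputs give only an order-of-magnitude check on 1,100 light-years.
print(distance_from_size_change(4.2e6, 5e-8))
```

The actual analysis fits the full angular-diameter curve over the pulsation cycle rather than two rounded endpoints, which is how the 13 percent precision is reached.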

The direct measurement of distance to Zeta Geminorum shows that the basic technique works, Lane says. "As a graduate student, it has been exciting to be at the leading edge of this field."

The Palomar Testbed Interferometer was designed and built by a team of researchers from the Jet Propulsion Laboratory in Pasadena led by Colavita and Michael Shao. Funded by NASA, the interferometer is located at the Palomar Observatory near the historic 200-inch Hale Telescope.

The device is intended as an engineering testbed for the interferometer that will soon link the 10-meter Keck Telescopes atop Mauna Kea in Hawaii.

The Keck Interferometer has been funded to find and study extrasolar planets. The Navy and the NSF are also funding the development of interferometers for astrometry and stellar astronomy.

"The current precision is a significant improvement over the previous determinations, but we expect to achieve distance measurements at the level of a few percent in the near future," says Shri Kulkarni, a professor of astronomy and planetary science at Caltech and a coauthor of the paper.

In addition to Lane and Kulkarni, the other authors are Marc Kuchner, a Caltech graduate student in astronomy; Andrew Boden of the Infrared Processing and Analysis Center (IPAC), and Michelle Creech-Eakman, a postdoctoral scholar at JPL.

Robert Tindol

Caltech to Receive $10 Million Grant from BP to Study Methane Conversion

Pasadena—BP, one of the world's largest oil, gas, and petrochemical companies, has awarded the California Institute of Technology a 10-year, $10 million research grant to study the catalytic conversion of methane, the principal component of natural gas, into useful liquid fuels and chemicals. An equal amount was awarded to the University of California, Berkeley.

Details about the grant and the implications this work could have on fuel usage internationally will be announced at a press conference at 4 p.m. Friday, September 22, at the Los Angeles Athletic Club, 431 West Seventh St., President's Room, fourth floor, Olive Street parking entrance.

Speakers will include Sir John Browne, BP's chief executive; David Baltimore, president of Caltech; and UC Berkeley vice chancellor Joseph Cerny.

Browne also will field questions about other energy-related issues, including gas prices, crude oil supply, BP's takeover of ARCO, and cleaner-burning fuels.

The grants have been agreed upon in principle by the universities, clearing the way for BP to establish a university research program at the two institutions, similar to one it previously announced at England's Cambridge University. The research will be directed by the respective faculty members and will involve undergraduate, graduate, and postdoctoral level students. Under the pending BP-funded proposal, each of the universities will work closely with BP and will receive $1 million per year for 10 years for the study of methane conversion.

The discovery of large reserves of natural gas in many parts of the world, some very remote, has stimulated efforts by BP to catalytically convert methane to useful end products, such as much cleaner fuels and chemicals that are more economical to transport and market.

"We believe the next breakthrough in the conversion of natural gas to liquids, which will help bring us the next generation of cleaner burning fuels, will come from catalysis combined with process engineering. These two universities have some of the world's finest scientific and engineering minds to help us accomplish this," said Browne.

"By undertaking this progressive collaboration with Caltech and UC Berkeley - which have taken vastly different approaches to solving this difficult methane conversion problem - we have formed a very innovative and substantial team."

The UC Berkeley group will be headed by Professor Alex Bell and will focus on heterogeneous catalytic approaches for producing liquid fuels and chemicals. Building on its strength in understanding catalyst structure-performance relationships, this group will seek major breakthroughs in catalyst and process design for both direct and indirect conversion of methane. By contrast, the Caltech team, led by Caltech's Beckman Institute administrator Jay Labinger and Professor John Bercaw, will develop novel, homogeneous catalytic approaches, building on work their group has pursued for 10 years.

This new grant allows Caltech to take its research to the next level. The new, expanded research team will encompass work by some of the Institute's most respected researchers, Robert Grubbs, Mark Davis, Harry Gray, and Nate Lewis.

The funding will partially support as many as eight faculty members and 30 to 35 research staff, graduate students, and postdoctoral fellows at the two universities. As part of the grant, there will be frequently scheduled meetings and collaboration between the two groups.

Information in a number of supporting research areas - such as theoretical modeling, catalyst preparation, and process design - will also be shared, according to BP.

The two schools were selected based on their history of progressive research into catalytic conversion and on the reputation of their combined schools of chemistry and chemical engineering.

In accepting the grant, Caltech's president, David Baltimore, a Nobel prize-winning biologist, said, "The work performed will contribute to the education of a large number of young researchers, as it concurrently advances our ability to develop and exploit emerging technological and scientific concepts. It also enables us to broaden our base of funding for important scientific research that might otherwise go unexplored.

"We applaud BP for its sincere efforts to bridge the gap between academia and the private sector in seeking ways to prevent the waste of natural resources and minimize environmental impact through research on converting natural gas to more economical and environmentally sensitive end uses," he added.

"From BP's perspective, partnering with leading educational and research institutions enables us to demonstrate responsible leadership while remaining on the cutting edge of scientific development, through funding projects that will not only benefit the company, but society as well, while offering opportunities that can result in a cleaner environment and a stronger economy. That is very much the way we expect to pursue aspects of fundamental scientific research," said Browne.

He also said that BP feels that because liquefaction and shipping of natural gas are expensive, the conversion of its principal component, methane, into useful end products is very attractive. The economics of methane conversion are strongly related to the capital investment required. To some extent, high capital costs are a consequence of the small scale that has been envisioned for most processes.

An overall aim is a major reduction in the energy required for conversion, potentially leading to substantially lower emissions of greenhouse gases.

London-based BP is a leader in solar power generation. It recently announced its "Clean Fuels 40 Cities' 2000 Program," dedicated to bringing cleaner fuels to cities worldwide.


For further information, contact:

BP: Cheryl Burnett (714) 670-5161
Berkeley: Jane Scheiber (510) 642-8782
Caltech: Jill Perry (626) 395-3226


New visual test devised by Caltech, USC researchers

A new five-minute vision test using a desktop computer and touch-sensitive screen is showing promise as a diagnostic tool for a variety of eye diseases and even certain brain tumors.

Invented by California Institute of Technology physicist Wolfgang Fink and University of Southern California professor of ophthalmology and neurosurgery Alfredo A. Sadun, the 3-D Computer-Based Threshold Amsler Grid Test offers a novel method for medical personnel to evaluate the central visual field. The test is sensitive and specific enough to allow an ophthalmologist to diagnose visual disorders such as macular degeneration, and to discriminate between visual disorders with subtly different symptoms, such as macular edema and optic neuritis.

In order to take the test, the patient sits in front of a touch-sensitive computer screen displaying a grid pattern and a central bright spot. Staring at the central spot with one eye closed, the patient traces a finger around the portions of the grid he or she can see, and the computer records the information.

After the computer records the patient's tracings, the operator changes the contrast of the grid slightly and the patient again traces the visible portions of the grid. This process is repeated and information is gathered for the computer to process a three-dimensional profile of the patient's visual field for that eye. Then, the process is repeated for the other eye.
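The way the repeated tracings accumulate into a 3-D profile can be sketched as follows. This is an assumed data representation for illustration, not the researchers' actual software: each grid cell's "height" is the lowest contrast at which the patient reported seeing it.

```python
def threshold_surface(tracings: dict) -> list:
    """Combine per-contrast visibility grids into a threshold map.

    `tracings` maps a contrast level (0-1) to a grid of booleans marking
    which cells the patient traced as visible at that contrast. Each cell
    of the result is the lowest contrast at which it was seen (None if
    never seen); plotted as a height field, this is the 3-D profile of
    the visual field for that eye.
    """
    levels = sorted(tracings)
    rows = len(tracings[levels[0]])
    cols = len(tracings[levels[0]][0])
    surface = [[None] * cols for _ in range(rows)]
    for level in reversed(levels):  # overwrite with ever-lower thresholds
        grid = tracings[level]
        for r in range(rows):
            for c in range(cols):
                if grid[r][c]:
                    surface[r][c] = level
    return surface

# A central scotoma: the middle cell is visible only at high contrast.
tracings = {
    0.2: [[True, True, True], [True, False, True], [True, True, True]],
    0.8: [[True, True, True], [True, True, True], [True, True, True]],
}
print(threshold_surface(tracings)[1][1])  # 0.8
print(threshold_surface(tracings)[0][0])  # 0.2
```

A cell that needs high contrast to be seen shows up as a dip in the surface, which is the kind of relative field defect described below.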

Patients suffering from macular degeneration, for example, lose vision at the central focus and thus have trouble seeing the grid pattern near the center. Since they retain peripheral vision, they would likely trace a central hole on the screen, and if they also had a relative field defect, they might trace an ever-smaller circle as the brightness of the grid pattern intensified. Once the information is processed, the 3-D graph provides doctors with a complete description of what the patient sees under various conditions.

The test will also be useful for diseases and conditions such as optic neuritis, detached retina, glaucoma, anterior ischemic optic neuropathy (AION), macular edema, central or branch retinal artery occlusions, and several genetic impairments. Also, the test can be used to detect, characterize, and even locate several types of brain tumors.

Thus, the new test is not only more revealing than standard visual field tests, but it is also much quicker and simpler than existing methods of characterizing the visual field, says Sadun. Likening the test to a recreational video game, Sadun says the new technology will be cheap and easily marketable, and also will be a powerful means of processing patient data.

"The patient is playing the game while the machine is digesting the information," Sadun says.

Fink created a program that automates the acquisition and analysis of data from the psychophysical tests developed by Sadun, processing the patient's responses into a visual-field profile.

Fink says the test is "a completely noninvasive way to understand and diagnose certain eye diseases."

"We can gain more information from this test than any other visual field test," Sadun says. "The test creates a greater sensitivity for detecting problems, it provides quantitative measures for monitoring, and it characterizes the 3-D visual field, which makes a big contribution to diagnosis."

Since April, the test has been used on about 40 patients suffering from macular degeneration, AION, and optic neuritis. In the coming months, the researchers will begin testing the program on patients with glaucoma.

The 3-D Computer-Based Threshold Amsler Grid Test has been approved by USC's institutional review board. Fink and Sadun have applied for a U.S. patent.

Fink was supported by a grant from the National Science Foundation during the course of his work.


NSF funds new Institute for Quantum Information at Caltech

The National Science Foundation has awarded a five-year, $5 million grant to the California Institute of Technology to create an institute devoted to quantum information science—a new field that could ultimately lead to devices such as quantum computers.

The announcement was part of a $90 million information technology research initiative the NSF announced today in Washington. The awards are aimed at seeding fundamental research in innovative applications of information technology.

Caltech's new Institute for Quantum Information will draw on several fields, including quantum physics, theoretical computer science, mathematics, and control and dynamical systems engineering, says founding director John Preskill, a professor of theoretical physics at Caltech.

"The goal of the institute will be to understand ways in which the principles of quantum physics can be exploited to enhance the performance of tasks involving the transmission, processing, and acquisition of information," says Preskill, who has worked on quantum computation algorithms for the last five years.

"The most potentially exciting aspect of the field is the promise of a quantum computer," he says. "If you could process quantum states instead of classical information, there are problems you could solve that could never be solved with classical technology."

Quantum computers would be more efficient than conventional computers because they would greatly reduce the number of steps needed to solve many problems. For example, the encryption used to protect credit cards relies on the fact that a conventional computer would take an enormous amount of time to break a large number down into its factors (the numbers that multiply together to equal it).

It now takes the best computers several months to find the factors of a 130-digit number, and it would take 10 billion years to factor a 400-digit number—nearly the entire age of the universe. But a quantum computer with the same clock speed could factor the 400-digit number in about a minute, according to the figures Preskill has worked out.
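The scale of that gap can be illustrated with a rough back-of-the-envelope comparison (this is not Preskill's actual calculation). The standard heuristic cost of the best classical factoring method, the general number field sieve, grows sub-exponentially in the number's size, while Shor's quantum factoring algorithm scales roughly as the cube of the number of digits:

```python
import math

def classical_steps(digits):
    """Heuristic cost of the general number field sieve:
    exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3))."""
    ln_n = digits * math.log(10)  # ln(n) for a d-digit number
    return math.exp((64 / 9) ** (1 / 3)
                    * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))

def quantum_steps(digits):
    """Heuristic cost of Shor's algorithm, roughly (ln n)^3."""
    ln_n = digits * math.log(10)
    return ln_n ** 3

# Going from 130 to 400 digits barely changes the quantum cost...
q_ratio = quantum_steps(400) / quantum_steps(130)
# ...but multiplies the classical cost astronomically.
c_ratio = classical_steps(400) / classical_steps(130)
print(f"quantum cost grows {q_ratio:.0f}x, classical cost grows {c_ratio:.2e}x")
```

Under these heuristics the quantum cost grows by a factor of about 30, while the classical cost grows by a factor of more than 10^11, which is consistent with the jump from "several months" to "10 billion years" quoted above.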

At the same time, quantum information would provide a new means to thoroughly protect information from any intruder, Preskill says.

"By using quantum information, it's possible to make unbreakable codes, and this security is founded on fundamental physical laws," he says.

Also, the work of the new institute will advance research in the further miniaturization of classical electronic components. Quantum effects are becoming increasingly important for microelectronics as devices continue to shrink toward atomic dimensions.

In addition to Preskill, the Institute for Quantum Information will be led by two co-principal investigators who, in consultation with other Caltech researchers, will guide and supervise scientific activities. The initial co-principal investigators will be Jeff Kimble, an experimental physicist who has done groundbreaking work in the transmission of quantum information, and John Doyle, a professor of electrical engineering who is interested in control issues of quantum systems.

Other investigators at the institute will include Michelle Effros, Hideo Mabuchi, Michael Roukes, Axel Scherer, and Leonard Schulman, all Caltech faculty members. The institute will develop a substantial visitors' program and will aim at hiring postdoctoral researchers and graduate students who wish to enter the field of quantum information systems.

Contact: Robert Tindol (626) 395-3631


Caltech researchers breed new genes to make natural products in bacteria

PASADENA—Using a new process of "sex in the test tube," a California Institute of Technology research group has been able to mate genes from different organisms and breed new genetic pathways in bacteria. These bacteria make an array of natural products normally found in much more complex organisms.

The natural products, which are carotenoids similar to the pigment that gives carrots their color, are made by many different plants and microbes, but are totally foreign to the E. coli bacteria the researchers used. The new results, reported in the July issue of the journal Nature Biotechnology, show that the carotenoid-producing genes from different parent organisms can be shuffled together to create many-colored E. coli. Many of the carotenoids made in the bacteria are not even made by the organisms from which the parent genes came.

One of the reddish products, torulene, is not produced by any known bacteria, although it is found in certain red yeasts. "With molecular breeding, the experimenter can train the molecules and organisms to make new things that may not even be found in nature, but are valuable to us," says Frances Arnold, professor of chemical engineering and biochemistry at Caltech and coauthor of the new study.

Conceptually similar to dog breeding, the process generates progeny that are selected by researchers on the basis of attractive features. In this study, former Caltech researcher Claudia Schmidt-Dannert (now on the faculty at the University of Minnesota) and Caltech postdoctoral researcher Daisuke Umeno selected the new bacteria by their color.

This process of directed evolution, which Arnold has been instrumental in developing, is capable of creating new biological molecules and even new organisms with new or vastly improved characteristics. Unlike evolution in nature, where mutations are selected by "survival of the fittest," directed evolution, like breeding, allows scientists to dictate the characteristics of the molecules selected in each generation.
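Computationally, directed evolution resembles a genetic algorithm: recombine and mutate a population, then let the experimenter's selection criterion (here, color) decide which variants parent the next generation. The following toy sketch uses an invented string "phenotype" and fitness function purely for illustration; it is an analogy, not the wet-lab procedure.

```python
import random

random.seed(1)
TARGET = "torulene"  # stand-in "color" we select for; purely illustrative

def mutate(s):
    """Point mutation: replace one random character."""
    i = random.randrange(len(s))
    return s[:i] + random.choice("abcdefghijklmnopqrstuvwxyz") + s[i + 1:]

def shuffle(a, b):
    """Crude analogue of gene shuffling: splice two parents at a random point."""
    i = random.randrange(len(a))
    return a[:i] + b[i:]

def fitness(s):
    """'Color' score: positions matching the target phenotype."""
    return sum(x == y for x, y in zip(s, TARGET))

# Start from a random population with none of the target's rare characters.
population = ["".join(random.choice("abcdefgh") for _ in range(len(TARGET)))
              for _ in range(20)]
for generation in range(200):
    # Breed: shuffle random pairs of parents, then mutate the offspring.
    offspring = [mutate(shuffle(random.choice(population), random.choice(population)))
                 for _ in range(40)]
    # Select: keep the best-"colored" variants, as the experimenter would.
    population = sorted(population + offspring, key=fitness, reverse=True)[:20]

best = max(population, key=fitness)
```

As in the laboratory version, the selection step is where the experimenter "dictates the characteristics" of each generation; everything else is random variation.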

"We are now able to create natural products that usually have to come at great cost from esoteric sources simply by breeding ordinary genes in ordinary laboratory organisms," says Schmidt-Dannert.

The researchers believe that this method will be widely useful for making complex and expensive natural molecules such as antibiotics, dyes, and flavors. "Imagine being able to produce in simple bacteria many of the compounds that come from all over nature," says Arnold.

And, according to the authors, an even more irresistible target of directed evolution is finding bacteria that make biological molecules not yet found in nature.

Robert Tindol

Caltech and the Human Genome Project

PASADENA—Two of the key inventions that made possible the monumental task of sequencing the human genome came from the California Institute of Technology. Both dramatically sped up the work of sequencing the 3 billion DNA base pairs that compose the human genome.

The first landmark invention was a method for the automated sequencing of DNA, developed by Leroy Hood, then a professor of biology at Caltech, and his colleagues Mike Hunkapiller, Tim Hunkapiller, Charles Connell, and Lloyd Smith. Before their invention, determining the sequence of a segment of DNA had been exceedingly difficult and laborious. Because the process was so slow and required the work of highly skilled technicians, it was clear to most scientists in the mid-1980s that it would not be possible to sequence entire genomes by manual methods.

The method devised by Hood and his colleagues changed that. They developed a novel chemistry that permitted a machine to detect DNA molecules, using fluorescent light. This method revolutionized DNA sequencing, ultimately making it possible to launch the Human Genome Project. Coupled with some recent advances, the method remained the core for the just-completed phase of sequencing the human genome.

A second key invention for the genome project was developed at Caltech by Professor Melvin Simon, chair of Caltech's biology division, and his coworker Hiroaki Shizuya. They recognized that a critical part of sequencing would be preparing large DNA segments for the process. To accomplish this, they invented "bacterial artificial chromosomes" (BACs), which permit scientists to use bacteria as micromachines to accurately replicate pieces of human DNA that are over 100,000 base pairs in length. These BACs provided the major input DNA for both the public genome project and Celera.

The Simon research group was also a major contributor to the mapping and sequencing of chromosome 22, a substantial segment of the human genome, which was completed in 1999. These researchers are presently using genomic information to create an "onco-chip," which will give researchers convenient experimental access to a miniature array containing hundreds of BACs, each carrying a gene whose mutation can cause human cancer.

Caltech researchers, both current and past, have also been important in promoting the Human Genome Project itself, a project that originally met with scientific skepticism when it was born 12 years ago, particularly when the goal of a fully sequenced human genome by the year 2003 was announced.

That skepticism has long since been replaced by wholesale enthusiasm from the scientific community. David Baltimore, president of Caltech and a Nobel laureate for his work on the genes of viruses, was a highly influential supporter of the Human Genome Project at its inception. Baltimore, then a professor of biology at MIT, was one of an international cadre of farsighted biologists that also included Hood and Simon. They shared a vision of the future in which knowledge of every gene that composes the human genome would be available to any scientist in the world at the click of a computer key.

To shape this unprecedented and complex project, Caltech professors Norman Davidson, Barbara Wold, and Steve Koonin have served in national scientific advisory roles to the genome project in the intervening years. Also, Baltimore chaired the National Institutes of Health (NIH) meeting where the human genome project was launched.

Koonin, who is Caltech's provost, chaired the 1997 JASON study, which advised the scientific community that quality standards could be relaxed so that a "rough draft" of the human genome could be produced years earlier and still be of great utility. This, in fact, was the approach that prevailed.

The Human Genome Project is unique among scientific projects for having set aside, from the beginning, research support for studies of the ethical, legal, and social implications of the new knowledge of human genes that would result. In Caltech's Division of the Humanities and Social Sciences, Professor Daniel Kevles has examined these ethical issues in his book The Code of Codes: Scientific and Social Issues in the Human Genome Project, which he coedited in 1992 with Leroy Hood.

Caltech scientists are also actively engaged in the future of genomics, which is the use of the newly obtained DNA sequences to discover and understand the function of genes in normal biology and in disease and disease susceptibility. This includes devising new ways to extract and manipulate information from the human genome sequence and from recently completed genome sequences of important experimental organisms used by scientists in the laboratory, such as the fruit fly, mustard weed, and yeast.

In one new project, Caltech recently became the home site for the international genome database for a key experimental organism called C. elegans, under the direction of Caltech Professor Paul Sternberg. This tiny worm has about 19,000 different genes, many of which correspond to related genes in humans. The shared origin and functional relationships between the genes of worm and man (and fruit fly and all other animals) let scientists learn much about how human genes work, by studying these small creatures in the laboratory.

The Worm Genome Database, called Wormbase, is undertaking the major task of collecting and making computer-accessible key information about every worm gene, its DNA sequence, and what its function is in the animal. This will require that new methods in automated data-mining and computing be brought together and fused with expert knowledge in biology, and then made accessible by computer to anyone interested.

Because of the relatedness of many genes and their functions among all animals, this information about the worm and its genome will be important for understanding human genes, and vice versa.

Another major genomics effort at Caltech is aimed at understanding how groups of genes work to direct development from a fertilized egg to an adult organism, and how these groups of genes change their action or fail in aging, cancer, or degenerative disease. The genomics approach to these problems involves the application of new computational methods and automated experimental technologies.

To do this, Barbara Wold, together with Mel Simon, Professor Stephen Quake of Caltech's Division of Engineering and Applied Science, and Dr. Eric Mjolsness of NASA's Jet Propulsion Laboratory, has established the L. K. Whittier/Caltech Gene Expression Center, funded by the Whittier Foundation. The new work in genomics is also fueling new interdisciplinary programs at Caltech in the computational modeling of cells and organisms.

Robert Tindol

