More Stormy Weather on Titan

PASADENA, Calif.— Titan, it turns out, may be a very stormy place. In 2001, a group of astronomers led by Henry Roe, now a postdoctoral scholar at the California Institute of Technology, discovered methane clouds near the south pole of Saturn's largest moon, resolving a debate about whether such clouds exist amid the haze of its atmosphere.

Now Roe and his colleagues have found similar atmospheric disturbances at Titan's temperate mid-latitudes, about halfway between the equator and the poles. In a bit of ironic timing, the team made its discovery using two ground-based observatories, the Gemini North and Keck 2 telescopes on Mauna Kea, in Hawaii, in the months before the Cassini spacecraft arrived at Saturn and Titan. The work will appear in the January 1, 2005, issue of the Astrophysical Journal.

"We were fortunate to catch these new mid-latitude clouds when they first appeared in late 2003 and early 2004," says Roe, who is a National Science Foundation Astronomy and Astrophysics Postdoctoral Scholar at Caltech. Much of the credit goes to the resolution and sensitivity of the two ground-based telescopes and their use of adaptive optics, in which a flexible mirror rapidly compensates for the distortions caused by turbulence in the Earth's atmosphere. These distortions are what cause the well-known twinkling of the stars. Using adaptive optics, details as small as 300 kilometers across can be distinguished despite the enormous distance of Titan (1.3 billion kilometers). That's equivalent to reading an automobile license plate from 100 kilometers away.

Still to be determined, though, is the cause of the clouds. According to Chad Trujillo, a former Caltech postdoctoral scholar and now a scientist at the Gemini Observatory, Titan's weather patterns can be stable for many months, with only occasional bursts of unusual activity like these recently discovered atmospheric features.

Like Earth's, Titan's atmosphere is mostly nitrogen. Unlike Earth, however, Titan is inhospitable to life because of its lack of atmospheric oxygen and its extremely cold surface temperature (-297 degrees Fahrenheit). Along with nitrogen, Titan's atmosphere also contains a significant amount of methane, which may be the source of the mid-latitude clouds.

Conditions on Earth allow water to exist in liquid, solid, or vapor states, depending on localized temperatures and pressures. The phase changes of water between these states are an important factor in the formation of weather in our atmosphere. But on Titan, methane rules. The moon's atmosphere is so cold that any water is frozen solid, but methane can move between liquid, solid, and gaseous states. This leads to a methane meteorological cycle on Titan that is similar to the water-based weather cycle on Earth.

While the previously discovered south polar clouds are thought to be a result of solar surface heating, the new mid-latitude clouds cannot be formed by the same mechanism. One possible explanation for the new clouds is a seasonal shift in the global winds. More likely, says Roe, surface activity might be disturbing the atmosphere at the mid-latitude location. Geysers of methane slush may be brewing up from below, or a warm spot on Titan's surface may be heating the atmosphere. Cryovolcanism--volcanic activity that spews an icy mix of chemicals--is another mechanism that could cause disturbances. Hints about what is happening on this frigid world could be obtained as the Huygens probe, which will be released from Cassini on Christmas Day, drops through Titan's atmosphere in January 2005.

If the clouds are being caused by these geological conditions, says Roe, they should stay at the observed 40-degree latitude and repeatedly occur above the same surface feature or features. If, however, a seasonal shift in the winds is forming the clouds, then their locations should move northward as Titan's season progresses into southern summer. "Continued observations with the Gemini and Keck telescopes will easily distinguish between these two scenarios," says Roe.

The Gemini Observatory is operated by the Association of Universities for Research in Astronomy (AURA) under a cooperative agreement with the National Science Foundation; the U.S. partnership involves the National Optical Astronomy Observatory, AURA, and the NSF. The W.M. Keck Observatory is operated by the California Association for Research in Astronomy, a scientific partnership among the California Institute of Technology, the University of California, and the National Aeronautics and Space Administration.

Writer: JP

Physicists at Caltech, UT Austin Report Bose-Einstein Condensation of Cold Excitons

PASADENA, Calif.-Bose-Einstein condensates are enigmatic states of matter in which huge numbers of particles occupy the same quantum state and, for all intents and purposes, lose their individual identity. Predicted long ago by Albert Einstein and Satyendranath Bose, these bizarre condensates have recently become one of the hottest topics in physics research worldwide.

Now, physicists at the California Institute of Technology and the University of Texas at Austin have created a sustained Bose-Einstein condensate of excitons, unusual particles that inhabit solid semiconductor materials. By contrast, most recent work on the phenomenon has focused on supercooled dilute gases, in which the freely circulating atoms of the gas are reduced to a temperature where they all fall into the lowest-energy quantum state. The new Caltech-UT Austin results are being published this week in the journal Nature.

According to Jim Eisenstein, who is the Roshek Professor of Physics at Caltech and co-lead author of the paper, exciton condensation was first predicted over 40 years ago but has remained undiscovered until now because the excitons usually decay in about a billionth of a second. In this new work, the researchers created stable excitons, which consist of an electron in one layer of a sandwich-like semiconductor structure bound to a positively charged "hole" in an adjacent layer. A hole is the vacancy created when an electron is removed from a material.

Bound together, the electron and hole form a "boson," a type of particle that does not mind crowding together with other similar bosons into the same quantum state. The other type of particle in the universe, the "fermion," includes individual protons, electrons, and neutrons. Only one fermion is allowed to occupy a given quantum state.

The picture is complex, but if one imagines two layers of material, one containing some electrons, the other completely empty, the results are somewhat easier to visualize. Begin by transferring half of the electrons from the full layer to the empty one. The resulting situation is equivalent to a layer of electrons in parallel with a layer of holes. And because the electron has a negative charge, removing an electron means that the hole it leaves behind has a positive charge.

The difficult thing about the procedure is that the layers have to be positioned just right and a large magnetic field has to be applied just right in order to avoid swamping the subtle binding of the electron and hole by other forces in the system. The magnetic field is also essential for stabilizing the excitons and preventing their decay.

Eisenstein says that the simplest experiment consists of sending electrical currents through the two layers in opposite directions. The "smoking gun" for exciton condensation is the absence of the ubiquitous sideways force experienced by charged particles moving in magnetic fields. Excitons, which have no net charge, should not feel such a force.

One mystery that remains is the tendency of the excitons to dump a small amount of energy when they move. "We find that, as we go toward lower temperatures, energy dissipation does become smaller and smaller," Eisenstein says. "But we expected no energy dissipation at all.

"Therefore, this is not really an ideal superfluid--so far it is at best a bad one."

The other author of the paper is Allan MacDonald, who holds the Sid W. Richardson Foundation Regents Chair in physics at UT Austin and is a specialist in theoretical condensed matter physics.

Writer: Robert Tindol

Caltech computer scientists embed computation in a DNA crystal to create microscopic patterns

PASADENA, Calif.--In a demonstration that holds promise for future advances in nanotechnology, California Institute of Technology computer scientists have succeeded in building a DNA crystal that computes as it grows. As the computation proceeds, it creates a triangular fractal pattern in the DNA crystal.

This is the first time that a computation has been embedded in the growth of any crystal, and the first time that computation has been used to create a complex microscopic pattern. And, the researchers say, it is one step toward nanoscientists' dream of mastering construction techniques at the molecular level.

Reporting in the December issue of the journal Public Library of Science (PLoS) Biology, Caltech assistant professor Erik Winfree and his colleagues show that DNA "tiles" can be programmed to assemble themselves into a crystal bearing a pattern of progressively smaller "triangles within triangles," known as a Sierpinski triangle. This fractal pattern is more complex than patterns found in natural crystals because it never repeats. Natural crystals, by contrast, all bear repeating patterns like those commonly found in the tiling of a bathroom floor. And, because each DNA tile is a tiny knot of DNA with just 150 base pairs (an entire human genome has some 3 billion), the resulting Sierpinski triangles are microscopic. The Winfree team reports growing micron-size DNA crystals (about a hundredth the width of a human hair) that contain numerous Sierpinski triangles.

A key feature of the Caltech team's approach is that the DNA tiles assemble into a crystal spontaneously. Comprising a knot of four DNA strands, each DNA tile has four loose ends known as "sticky ends." These sticky ends are what binds one DNA tile to another. A sticky end with a particular DNA sequence can be thought of as a special type of glue, one that only binds to a sticky end with a complementary DNA sequence, a special "anti-glue." For their experiments, the authors simply mixed the DNA tiles into salt water and let the sticky ends do the work, self-assembling the tiles into a Sierpinski triangle. In nanotechnology, this "hands-off" approach to manufacturing is a desirable property, and a common theme.

The novel aspect of the research is the translation of an algorithm--the basic method underlying a computer program--into the process of crystal growth. A well-known algorithm for drawing a Sierpinski triangle starts with a sequence of 0s and 1s. It redraws the sequence over and over again, filling up successive rows on a piece of paper, each time performing binary addition on adjacent digits.

The result is a Sierpinski triangle built out of 0s and 1s. To embed this algorithm in crystal growth, the scientists represented written rows of binary "0s" and "1s" as rows of DNA tiles in the crystal--some tiles stood for 0, and others for 1. To emulate addition, the sticky ends were designed to ensure that whenever a free tile stuck to tiles already in the crystal, it represented the sum of the tiles it was sticking to.
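To make the rule concrete, here is a minimal sketch in Python (an illustration only, not the researchers' software) of the row-by-row procedure described above, in which each new cell is the sum modulo 2 of the two adjacent cells above it:

```python
def sierpinski_rows(width=16, rows=8):
    """Yield rows of 0s and 1s whose 1s trace out a Sierpinski triangle.

    Each cell of a new row is the binary sum (XOR) of the two adjacent
    cells in the row above -- the local rule the DNA tiles emulate with
    their sticky ends.
    """
    row = [0] * width
    row[0] = 1  # seed the first row with a single 1
    for _ in range(rows):
        yield row
        # cell i of the new row is the sum mod 2 of cells i-1 and i above it
        row = [(row[i - 1] if i > 0 else 0) ^ row[i] for i in range(width)]

for r in sierpinski_rows():
    print("".join("#" if bit else "." for bit in r))
```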

The process was not without error, however. Sometimes DNA tiles stuck in the wrong place, computing the wrong sum and destroying the pattern. The largest perfect Sierpinski triangle that grew contained only about 200 DNA tiles. But it is the first time any such thing has been done, and the researchers believe they can reduce errors in the future.

In fact the work is the first experimental demonstration of a theoretical concept that Winfree has been developing since 1995--his proposal that any algorithm can be embedded in the growth of a crystal. This concept, according to Winfree's coauthor and Caltech research fellow Paul W. K. Rothemund, has inspired an entirely new research field, "algorithmic self-assembly," in which scientists study the implications of embedding computation into crystal growth.

"A growing group of researchers has proposed a series of ever more complicated computations and patterns for these crystals, but until now it was unclear that even the most basic of computations and patterns could be achieved experimentally," Rothemund says.

Whether larger, more complicated computations and patterns can be created depends on whether Winfree's team can reduce the errors. Whether the crystals will be useful in nanotechnology may depend on whether the patterns can be turned into electronic devices and circuits, a possibility being explored at other universities including Duke and Purdue.

Nanotechnology applications aside, the authors contend that the most important implication of their work may be a better understanding of how computation shapes the physical world around us. "If algorithmic concepts can be successfully adapted to the molecular context," the authors write, "the algorithm would join energy and entropy as essential concepts for understanding how physical processes create order."

Winfree is an assistant professor of computation and neural systems and computer science; Rothemund is a senior research fellow in computer science and computation and neural systems. The third author is Nick Papadakis, a former staff member in computer science.

 

Writer: Robert Tindol

Internet Speed Quadrupled by International Team During 2004 Bandwidth Challenge

PITTSBURGH, Pa.--For the second consecutive year, the "High Energy Physics" team of physicists, computer scientists, and network engineers has won the Supercomputing Bandwidth Challenge with a sustained data transfer of 101 gigabits per second (Gbps) between Pittsburgh and Los Angeles. This is more than four times faster than last year's record of 23.2 gigabits per second, which was set by the same team.

The team hopes this new demonstration will encourage scientists and engineers in many sectors of society to develop and deploy a new generation of revolutionary Internet applications.

The international team is led by the California Institute of Technology and includes as partners the Stanford Linear Accelerator Center (SLAC), Fermilab, CERN, the University of Florida, the University of Manchester, University College London (UCL) and the organization UKLight, Rio de Janeiro State University (UERJ), the state universities of São Paulo (USP and UNESP), Kyungpook National University, and the Korea Institute of Science and Technology Information (KISTI). The group's record data transfer speed, achieved in its "High-Speed TeraByte Transfers for Physics" demonstration, is equivalent to downloading three full DVD movies per second, or transmitting all of the content of the Library of Congress in 15 minutes, and it corresponds to approximately 5% of the rate at which all forms of digital content were produced on Earth during the test.
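Those comparisons can be checked with rough arithmetic (a sketch only; the 4.7-gigabyte DVD capacity and the roughly 10-terabyte estimate for the Library of Congress's digitized content are assumed figures, not numbers from the release):

```python
# Rough arithmetic behind the 101 Gbps comparisons.
# Assumed figures: a single-layer DVD holds about 4.7 GB, and the
# Library of Congress's digitized content is often estimated at ~10 TB.

rate_gbps = 101.0                         # sustained rate, gigabits per second
rate_gb_per_s = rate_gbps / 8             # ~12.6 gigabytes per second

dvd_gb = 4.7
dvds_per_second = rate_gb_per_s / dvd_gb  # ~2.7 DVDs every second

library_tb = 10.0
minutes_for_library = library_tb * 1000 / rate_gb_per_s / 60   # ~13 minutes

print(f"{dvds_per_second:.1f} DVDs per second; "
      f"Library of Congress in about {minutes_for_library:.0f} minutes")
```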

The new mark, according to Bandwidth Challenge (BWC) sponsor Wesley Kaplow, vice president of engineering and operations for Qwest Government Services, exceeded the sum of all the throughput marks submitted by other BWC entrants in the present and previous years. The extraordinary bandwidth was made possible in part through the use of the FAST TCP protocol developed by Professor Steven Low and his Caltech Netlab team, and was achieved over seven 10 Gbps links to Cisco 7600 and 6500 series switch-routers provided by Cisco Systems at the Caltech Center for Advanced Computing Research (CACR) booth, and three 10 Gbps links to the SLAC/Fermilab booth. The external network connections included four dedicated wavelengths of National LambdaRail between the SC2004 show floor in Pittsburgh and Los Angeles (two waves), Chicago, and Jacksonville, as well as three 10 Gbps connections across the SCinet network infrastructure at SC2004, with Qwest-provided wavelengths to the Internet2 Abilene Network (two 10 Gbps links), the TeraGrid (three 10 Gbps links), and ESnet. Ten-gigabit Ethernet (10 GbE) interfaces provided by S2io were used on servers running FAST at the Caltech/CACR booth, and Chelsio interfaces equipped with TCP offload engines (TOE) running standard TCP were used at the SLAC/FNAL booth. During the test, the network links over both the Abilene and National LambdaRail networks were shown to operate successfully at up to 99 percent of full capacity.

The Bandwidth Challenge allowed the scientists and engineers involved to preview the globally distributed grid system that is now being developed in the US and Europe in preparation for the next generation of high-energy physics experiments at CERN's Large Hadron Collider (LHC), scheduled to begin operation in 2007. Physicists at the LHC will search for the Higgs particles thought to be responsible for mass in the universe and for supersymmetry and other fundamentally new phenomena bearing on the nature of matter and spacetime, in an energy range made accessible by the LHC for the first time.

The largest physics collaborations at the LHC, the Compact Muon Solenoid (CMS), and the Toroidal Large Hadron Collider Apparatus (ATLAS), each encompass more than 2000 physicists and engineers from 160 universities and laboratories spread around the globe. In order to fully exploit the potential for scientific discoveries, many petabytes of data will have to be processed, distributed, and analyzed. The key to discovery is the analysis phase, where individual physicists and small groups repeatedly access, and sometimes extract and transport, terabyte-scale data samples on demand, in order to optimally select the rare "signals" of new physics from potentially overwhelming "backgrounds" from already-understood particle interactions. This data will be drawn from major facilities at CERN in Switzerland, at Fermilab and the Brookhaven lab in the U.S., and at other laboratories and computing centers around the world, where the accumulated stored data will amount to many tens of petabytes in the early years of LHC operation, rising to the exabyte range within the coming decade.

Future optical networks, incorporating multiple 10 Gbps links, are the foundation of the grid system that will drive the scientific discoveries. A "hybrid" network integrating both traditional switching and routing of packets, and dynamically constructed optical paths to support the largest data flows, is a central part of the near-term future vision that the scientific community has adopted to meet the challenges of data intensive science in many fields. By demonstrating that many 10 Gbps wavelengths can be used efficiently over continental and transoceanic distances (often in both directions simultaneously), the high-energy physics team showed that this vision of a worldwide dynamic grid supporting many-terabyte and larger data transactions is practical.

While the SC2004 100+ Gbps demonstration required a major effort by the teams involved and their sponsors, in partnership with major research and education network organizations in the United States, Europe, Latin America, and the Asia-Pacific region, it is expected that networking on this scale in support of the largest science projects (such as the LHC) will be commonplace within the next three to five years.

The network has been deployed through exceptional support by Cisco Systems, Hewlett Packard, Newisys, S2io, Chelsio, Sun Microsystems, and Boston Ltd., as well as the staffs of National LambdaRail, Qwest, the Internet2 Abilene Network, the Consortium for Education Network Initiatives in California (CENIC), ESnet, the TeraGrid, the AmericasPATH network (AMPATH), the National Education and Research Network of Brazil (RNP) and the GIGA project, as well as ANSP/FAPESP in Brazil, KAIST in Korea, UKERNA in the UK, and the Starlight international peering point in Chicago. The international connections included the LHCNet OC-192 link between Chicago and CERN at Geneva, the CHEPREO OC-48 link between Abilene (Atlanta), Florida International University in Miami, and São Paulo, as well as an OC-12 link between Rio de Janeiro, Madrid, Géant, and Abilene (New York). The APII-TransPAC links to Korea also were used with good occupancy. The throughputs to and from Latin America and Korea represented a significant step up in scale, which the team members hope will be the beginning of a trend toward the widespread use of 10 Gbps-scale network links on DWDM optical networks interlinking different world regions in support of science by the time the LHC begins operation in 2007. The demonstration and the developments leading up to it were made possible through the strong support of the U.S. Department of Energy and the National Science Foundation, in cooperation with the agencies of the international partners.

As part of the demonstration, a distributed analysis of simulated LHC physics data was done using the Grid-enabled Analysis Environment (GAE), developed at Caltech for the LHC and many other major particle physics experiments, as part of the Particle Physics Data Grid, the Grid Physics Network and the International Virtual Data Grid Laboratory (GriPhyN/iVDGL), and Open Science Grid projects. This involved transferring data to CERN, Florida, Fermilab, Caltech, UC San Diego, and Brazil for processing by clusters of computers, and then aggregating the results back at the show floor to create a dynamic visual display of quantities of interest to the physicists. In another part of the demonstration, file servers at the SLAC/FNAL booth and in London and Manchester were also used for disk-to-disk transfers from Pittsburgh to England. This gave physicists valuable experience in the use of large, distributed datasets and of the computational resources connected by fast networks, on the scale required at the start of the LHC physics program.

The team used the MonALISA (MONitoring Agents using a Large Integrated Services Architecture) system developed at Caltech to monitor and display the real-time data for all the network links used in the demonstration. MonALISA (http://monalisa.caltech.edu) is a highly scalable set of autonomous, self-describing, agent-based subsystems which are able to collaborate and cooperate in performing a wide range of monitoring tasks for networks and grid systems as well as the scientific applications themselves. Detailed results for the network traffic on all the links used are available at http://boson.cacr.caltech.edu:8888/.

Multi-gigabit-per-second end-to-end network performance will lead to new models for how research and business are performed. Scientists will be empowered to form virtual organizations on a planetary scale, sharing their collective computing and data resources in a flexible way. In particular, this is vital for projects on the frontiers of science and engineering, in "data intensive" fields such as particle physics, astronomy, bioinformatics, global climate modeling, geosciences, fusion, and neutron science.

Harvey Newman, professor of physics at Caltech and head of the team, said, "This is a breakthrough for the development of global networks and grids, as well as inter-regional cooperation in science projects at the high-energy frontier. We demonstrated that multiple links of various bandwidths, up to the 10 gigabit-per-second range, can be used effectively over long distances.

"This is a common theme that will drive many fields of data-intensive science, where the network needs are foreseen to rise from tens of gigabits per second to the terabit-per-second range within the next five to 10 years," Newman continued. "In a broader sense, this demonstration paves the way for more flexible, efficient sharing of data and collaborative work by scientists in many countries, which could be a key factor enabling the next round of physics discoveries at the high energy frontier. There are also profound implications for how we could integrate information sharing and on-demand audiovisual collaboration in our daily lives, with a scale and quality previously unimaginable."

Les Cottrell, assistant director of SLAC's computer services, said: "The smooth interworking of 10GE interfaces from multiple vendors, the ability to successfully fill 10 gigabit-per-second paths both on local area networks (LANs), cross-country and intercontinentally, the ability to transmit greater than 10Gbits/second from a single host, and the ability of TCP offload engines (TOE) to reduce CPU utilization, all illustrate the emerging maturity of the 10Gigabit/second Ethernet market. The current limitations are not in the network but rather in the servers at the ends of the links, and their buses."

Further technical information about the demonstration may be found at http://ultralight.caltech.edu/sc2004 and http://www-iepm.slac.stanford.edu/monitoring/bulk/sc2004/hiperf.html. A longer version of the release, including information on the participating organizations, may be found at http://ultralight.caltech.edu/sc2004/BandwidthRecord.

 

Laser Points to the Future at Palomar

PALOMAR MOUNTAIN, Calif. — The Hale Telescope on Palomar Mountain has been gathering light from the depths of the universe for 55 years. It finally sent some back early last week as a team of astronomers from the California Institute of Technology, the Jet Propulsion Laboratory and the University of Chicago created an artificial star by propagating a 4-watt laser beam out from the Hale Telescope and up into the night sky.

The laser was propagated as the first step in a program to expand the fraction of the sky available to the technique known as adaptive optics. Adaptive optics allows astronomers to correct for the fuzzy images produced by earth's moving atmosphere, giving them views that often surpass those of smaller telescopes based in space.

"We have been steadily improving adaptive optics using bright natural guide stars at Palomar. As a result, the system routinely corrects for atmospheric distortions. Now we will be able to go to the next step," says Richard Dekany, associate director for development at Caltech Optical Observatories. Currently astronomers at Palomar can use the adaptive-optics technique only if a moderately bright star is sufficiently close to their object of interest. The adaptive-optics system uses the star as a source by which astronomers monitor and correct for the distortions produced by earth's atmosphere.

Employing the laser will allow astronomers to place an artificial corrective guide star wherever they see fit. To do so, they shine a narrow sodium laser beam up through the atmosphere. At an altitude of about 60 miles, the beam makes a small amount of sodium gas glow. The glow from this gas serves as the artificial guide star for the adaptive-optics system. The laser beam is too faint to be seen except by observers very close to the telescope, and the guide star it creates is even fainter. It can't be seen with the unaided eye, yet it is bright enough to allow astronomers to make their adaptive-optics corrections.

The Palomar Observatory currently employs the world's fastest astronomical adaptive optics system on its 200-inch Hale Telescope. It is able to correct for changes in the atmosphere 2,000 times per second. Astronomers from Caltech, JPL, and Cornell University have exploited this system to discover brown dwarf companions to stars, study the weather on a moon of Saturn, and see the shapes of asteroids.

"This is an important achievement that brings us one step closer to our goal," says Mitchell Troy, the adaptive optics group lead and Palomar adaptive optics task manager at the Jet Propulsion Laboratory. The goal, achieving adaptive-optics correction using the laser guide star, is expected next year. This will place Palomar in elite company as just the third observatory worldwide to deploy a laser guide system. This laser will greatly expand the science performed at Palomar and pave the way for future projects on telescopes that have not yet been built.

"This is a terrific technical achievement which not only opens up a bold and exciting scientific future for the venerable 200-inch telescope, but also demonstrates the next step on a path toward future large telescopes such as the Thirty Meter Telescope, " says Richard Ellis, Steele Family Professor of Astronomy and director of the Caltech Optical Observatories. "The next generation of large telescopes requires sodium laser guide-star adaptive-optics of the type being demonstrated at Palomar Observatory," he adds.

Currently in the design phase, the Thirty Meter Telescope (TMT) will eventually deliver images at visible and infrared wavelengths 12 times sharper than those of the Hubble Space Telescope. The TMT project is a collaboration among Caltech, the Association of Universities for Research in Astronomy, the Association of Canadian Universities for Research in Astronomy, and the University of California.

The Caltech adaptive optics team is made up of Richard Dekany (team leader) and Viswa Velur, Rich Goeden, Bob Weber, and Khanh Bui. Professor Edward Kibblewhite, University of Chicago, built the Chicago sum-frequency laser used in this project. The JPL Palomar adaptive optics team includes Mitchell Troy (team leader), Gary Brack, Steve Guiwits, Dean Palmer, Jennifer Roberts, Fang Shi, Thang Trinh, Tuan Truong and Kent Wallace. Installation of the laser at the Hale Telescope was overseen by Andrew Pickles, Robert Thicksten, and Hal Petrie of Palomar Observatory, and supported by Merle Sweet, John Henning, and Steve Einer.

The Palomar adaptive optics instrument was built and continues to be supported by the Jet Propulsion Laboratory as part of a Caltech-JPL collaboration.

Support for the adaptive-optics research at Caltech's Palomar Observatory comes from the Gordon and Betty Moore Foundation, the Oschin Family Foundation, and the National Science Foundation Center for Adaptive Optics.

Writer: Jill Perry

A "Smoking Gun" For Nicotine Addiction

Embargoed for Release at 11 a.m. PST, Thursday, November 4, 2004

PASADENA, Calif. - Nicotine is responsible for more than four million smoking-related deaths each year. Yet people still smoke. Why? One reason is the stranglehold of addiction, which begins when nicotine enhances the release of a neurotransmitter called dopamine, a chemical messenger that induces a feeling of pleasure. That's what smoking, presumably, is all about.

Knowing specifically which receptor molecules are activated by nicotine in the dopamine-releasing cells would be a promising first step in developing a therapeutic drug to help people kick the habit. But that's a challenging goal, since there are many cell receptor proteins, each in turn comprising a set of "subunit proteins" that may respond to nicotine, or instead, to a completely different chemical signal. Reporting in the November 5 issue of the journal Science, California Institute of Technology postdoctoral scholar Andrew Tapper, Professor Allan Collins of the University of Colorado, seven other colleagues, and Henry A. Lester, the Bren Professor of Biology at Caltech, have determined that when receptors with a specific subunit known as alpha4 are activated by nicotine, it's sufficient for some addiction-related events, such as pleasure response, sensitization, and tolerance to repeated doses of nicotine. This research suggests that alpha4, and the molecules that are triggered in turn by alpha4, may prove to be useful targets for addiction therapies.

When cells communicate in the brain, nerve impulses jump chemically across the synapse, a gap between two nerve cells, using a neurotransmitter such as acetylcholine. Acetylcholine activates specific receptors on the post-synaptic nerve cell. This starts the firing of electrical impulses and, in cells that manufacture dopamine, that pleasure-inducing messenger is released as well. Having completed its task, acetylcholine is then rapidly broken down by an enzyme called acetylcholinesterase. It's a clever and wondrous biological machine, says Lester. But, he says, "nicotine is clever too, because it mimics acetylcholine." Worse, nicotine is not broken down by acetylcholinesterase. "So it persists at the synapse for minutes rather than milliseconds, and excites the post-synaptic neurons to fire rapidly for long periods, releasing large amounts of dopamine. Most scientists believe that's a key reason why nicotine is so addictive."

Previous work in several laboratories in the 1990s had suggested that, of the many so-called nicotinic acetylcholine receptors, one consisting of subunits called alpha4 and beta2 was important for nicotine addiction. This had been determined by the use of so-called "knockout" mice. In this method, scientists identify a gene that may affect a behavior, then interrupt or knock it out in a specially bred strain of mice. In this case, the effects of nicotine on the knockout mice and normal mice were compared, and the mice without the beta2 subunit lacked some responses to nicotine.

In work to identify receptor subunits that were sufficient to cause nicotine dependence, Lester's group examined the "partner" subunit, alpha4. But instead of experimenting with a knockout mouse, they developed "hypersensitive knock-in" mice. Lester's group replaced a naturally occurring bit of DNA with a mutation that changed a single amino acid in a single gene among the mouse's 30,000 genes. That change was enough to make the alpha4 subunit highly sensitive to nicotine.

Lester's group first selected the proper mutation with tests in frog eggs and cultures, then bred the strain of mice. As they hypothesized, the mice with the re-engineered alpha4 receptor proved to be highly sensitive even to very small doses of nicotine that didn't activate other nicotinic receptor types. This finding shows the alpha4 subunit is a prime candidate to be studied at the molecular and cellular level, says Lester, "and it may be a possible target for developing a medication that would reduce the release of dopamine caused by nicotine, and hopefully reduce nicotine's addictive grip."

Knockout mice leave nothing to measure, explains Lester. Knock-in mice have the advantage, he says, of isolating and amplifying the "downstream" chain of molecular signals within nerve cells that occur after nicotine activates its receptors. These signals, when repeatedly stimulated by nicotine, eventually cause the as-yet-unknown, long-term changes in nerve cells that are presumably the biological basis of addiction. Lester and his colleagues are now tracking down those signals, molecules, and newly activated genes, using their hypersensitive mice. The eventual hope: one of those signals might provide a molecular target that can be specifically blocked, providing a therapy against addiction. This strategy would resemble modern cancer drugs, such as Gleevec, which block only specific signaling molecules needed by proliferating cancer cells. Hypersensitive knock-in mice may also prove useful in gaining further insight into diseases such as epilepsy and Parkinson's disease.

Lester is optimistic about ultimately defeating nicotine's addictive power. "It's a complicated pathway that still must be broken down into individual steps before we can understand it fully," he says, "but I personally believe that nicotine addiction will be among the first addictions to be solved, because we already have so many tools to study it."

This research was supported by the California Tobacco-Related Disease Research Program, the W. M. Keck Foundation, the Plum Foundation, the National Institute of Mental Health, the National Institute of Neurological Disorders and Stroke, and the National Institute on Drug Abuse.

MEDIA CONTACT: Mark Wheeler (626) 395-8733 wheel@caltech.edu

Writer: MW

Size Does Matter - When It Comes to Reducing Environmental Pollution

PASADENA, Calif.-When it comes to mitigating the harmful impacts of environmental pollution--size does matter . . . or, at least, that's the hypothesis that California Institute of Technology professors Janet Hering and Richard Flagan will be testing.

Hering is professor of environmental science and engineering and executive officer for Keck Laboratories. Flagan is the Irma and Ross McCollum Professor of Chemical Engineering, professor of environmental science and engineering, and executive officer of chemical engineering.

In a study funded by the Camille and Henry Dreyfus Foundation, Hering and Flagan will examine whether the effectiveness of iron nanoparticles in pollution remediation is influenced by their size. The $120,000 grant, under the Dreyfus Foundation's 2004 Postdoctoral Program in Environmental Chemistry, will be used to recruit a postdoctoral scientist to conduct research in environmental chemistry.

Specifically, the researchers will utilize this grant to examine effective strategies for reduction and mitigation of environmental pollutants in aquatic ecosystems. Ultimately, the study seeks to help provide viable, cost-effective commercial technologies for the remediation of certain contaminants, including groundwater contaminants, chlorinated solvents, nitrates, pesticides, various chemical by-products, residue created in manufacturing, and other industrial or inorganic contaminants.

The study, "Use of Vapor-Phase Synthesized Iron Nanoparticles to Examine Nanoscale Reactivity," will investigate whether reactivity and effectiveness of iron nanoparticles, in pollution mitigation, are influenced by their size. The study will compare particles in different size classes to determine whether nanoparticles exhibit enhanced reactivity in the reduction of organic substrates based on their size when surface area effects are accounted for.

Elemental iron [Fe(0)], or zero-valent iron, has been demonstrated to be an effective reductant for a wide range of environmental contaminants, including both organic and inorganic contaminants. Upon reaction with Fe(0), some contaminants can be transformed to products that are non-toxic or immobile. Fe(0) can be delivered to the subsurface environment by injection of Fe(0) nanoparticles.

If the results show that the size of Fe(0) nanoparticles does make a difference in their reactivity or effectiveness, this finding will have a significant effect on the application of Fe(0) materials in environmental remediation and will provide insight into the fundamental chemical properties and behavior of nanoparticles in these applications.

Created in 1946, the Camille and Henry Dreyfus Foundation bears the names of modern chemistry pioneers Drs. Camille Dreyfus and his brother Henry. The foundation's mandate is "to advance the science of chemistry, chemical engineering, and related sciences as a means of improving human relations and circumstances throughout the world." The foundation directs much of its resources to the support of excellence in teaching and research by outstanding chemistry faculty at universities and colleges.

Writer: DWH

Caltech Launches New Information Initiative

PASADENA, Calif. — Information is everywhere. Most of us think about it in terms of mere facts--facts gleaned from a teacher or a colleague, from the media, or from a textbook or the Internet.

But there are other, near-infinite types of information--the instructions encoded in our genome that tell our cells when to divide and when to die, or the daily flow of data into the stock market that somehow motivates people to buy and sell.

Information constantly streams to scientists from around the world, and from other "worlds" as well, thanks to sensors and actuators in the sea or out in space.

What's needed is a way to harness and understand all of this data so that scientists and engineers can continue to unravel the secrets of nature and the human institutions in which we operate. In an unprecedented effort, the California Institute of Technology has launched a university-wide initiative called Information Science and Technology (IST)--drawing back the curtain on the nature of information itself and redefining the way we approach, understand, and use science and engineering. IST will cut across disciplines, eventually involving over 25 percent of all faculty and nearly 35 percent of students on campus, likely altering the Institute's intellectual and organizational landscape.

Caltech has committed to raising $100 million for IST as part of the Institute's five-year, $1.4 billion capital campaign. Nearly $50 million has been raised in the form of separate grants of $25 million from the Annenberg Foundation and $22.2 million from the Gordon and Betty Moore Foundation. The Annenberg Foundation gift will be used to construct the Walter and Leonore Annenberg Center for Information Science and Technology--a new building that will be the physical center of IST. The building will join the existing Watson and Moore laboratories in forming a core of buildings linking together IST researchers.

Funding from the Moore Foundation will provide seed money to establish four new interdisciplinary research centers within IST. These new centers will join two that already exist at Caltech, and together the six groups will anchor and organize Caltech's effort to lead the way in this new field.

IST evolved over the last 50 years from an activity that focused on enabling more efficient calculations to a major intellectual theme that spans disciplines in engineering and the sciences. While other universities created schools of computer science (or computer and information science), these are generally related to computer science and software--a limited view of information science and technology. At Caltech, IST serves as a new intellectual framework on which to build information-based research and instructional programs across the academic spectrum.

"To maintain preeminence in science, the U.S. needs new and unified ways of looking at, approaching, and exploiting information in and across the physical, biological, and social sciences, and engineering," says Jehoshua (Shuki) Bruck, the Gordon and Betty Moore Professor of Computation and Neural Systems and Electrical Engineering and the first director of IST. "Caltech is taking a leadership role by creating an Institute-wide initiative in the science and engineering of information. IST will transform the research and educational environment at Caltech and other universities around the world."

In the same way that the printing press heralded the start of the Renaissance, and the study of physics helped to foster the Industrial Revolution, technological advances in computation and communication in the 20th century have set the stage for the Age of Information. Yet, scientific and technological changes are accelerating so fast they are outpacing existing institutions such as schools, media, industry, and government--structures originally designed for the needs of the Industrial Age. "So we need a new intellectual framework to harness these new advances," says Bruck, "in order to provide for a stable and well-educated society that's prepared to meet the challenges of tomorrow."

"Some say biology is the science of the 21st century, but information science will provide the unity to all of the sciences," says Caltech president and Nobel Prize-winning biologist David Baltimore. "It will be like the physics of the 20th century in which Einstein went beyond the teachings of Newton--which were enough to put people on the moon--and allowed people's minds to reach into the atom or out into the cosmos. Information science, the understanding of what constitutes information, how it is transmitted, encoded, and retrieved, is in the throes of a revolution whose societal repercussions will be enormous. The new Albert Einstein has yet to emerge, but the time is ripe."

Annenberg Foundation Gift

The Annenberg gift is the first portion of a $100 million institutional commitment to IST, and is part of the Institute's capital campaign. Now in the design stage, the Annenberg Center is expected to be completed when the campaign ends in 2007.

"I am delighted that the Annenberg Foundation will be a part of this visionary enterprise," said Leonore Annenberg, foundation president and chairman. "As a publisher, broadcaster, diplomat, and philanthropist, Walter Annenberg was known for breaking new ground. Support for this important new initiative surely would have pleased him as much as it honors the work of the foundation."

Founded in 1989 by Walter H. Annenberg, the Annenberg Foundation exists to advance the public well-being through improved communication. As the principal means of achieving its goal, the foundation encourages the development of more effective ways to share ideas and knowledge.

Gordon and Betty Moore Foundation Gift

The Moore Foundation gift is part of a $300 million commitment the foundation made to Caltech in 2001.

The four centers funded by the Moore grant are the following: the Center for Biological Circuit Design, which will address how living things store, process, and share information; the Social and Information Sciences Center, which will investigate how social systems, such as markets, political processes, and organizations, efficiently process immense amounts of information and how this understanding can help to improve society; the Center for the Physics of Information, which will examine the physical qualities of information and will design the computers and materials for the next generation of information technology; and the Center for the Mathematics of Information, which will formulate a common understanding and language of information that unifies researchers from different fields.

The Moore Foundation seeks to develop outcome-based projects that will improve the quality of life for future generations. It organizes the majority of its grant-making around large-scale initiatives that concentrate on environmental conservation, science, higher education, and the San Francisco Bay Area.

Writer: JP

Observing the Roiling Earth

PASADENA, Calif. - In the 1960s, the theory of plate tectonics rocked geology's world by establishing that the outermost 60 miles or so of our planet--the lithosphere--is divided into about a dozen rigid plates that crawl along by centimeters each year. Most manifestations of the earth's dynamics, earthquakes and volcanoes for example, occur along the boundaries of these plates.

As a model, the theory of plate tectonics continues to serve us well, says Jean-Philippe Avouac, a professor of geology at the California Institute of Technology. But while plate tectonics provides a powerful description of the large-scale deformation of the earth's lithosphere over millions of years, it doesn't explain the physical forces that drive the movements of the plates. Also, contrary to the theory, it's now known that plates are not perfectly rigid and that plate boundaries sometimes form broad fault zones with diffuse seismicity.

Now, thanks to a $13,254,000 grant from the Gordon and Betty Moore Foundation, Caltech has established the Tectonics Observatory, under the direction of Avouac, with the ultimate goal, he says, of "providing a new view of how and why the earth's crust is deforming over timescales ranging from a few tens of seconds, the typical duration of an earthquake, to several tens of millions of years."

But it's not the only goal. "Most of the outstanding questions in earth science concern processes that take place at the boundaries of the earth's tectonic plates," says Avouac, so the observatory's scientific efforts will be centered around major field studies at a few key plate boundaries in western North America, Sumatra, Central America, and Taiwan, with the goal of answering a number of questions, including:

--Tectonic plates move gradually when viewed on large timescales, but then sometimes undergo sharp "jerks" in speed and direction. What's the cause?

--Because earthquakes can be damaging events to humans, it's important to know: what physical parameters control their timing, location, and size?

--Subduction zones, where oceanic plates sink into the earth's mantle, are needed to accommodate and perhaps drive plate motion. How do these subduction zones originate and grow?

"We plan to take advantage of a number of new technologies that will allow us to measure deformation of the earth's crust and image the earth's interior with unprecedented accuracy," says Avouac. The bulk of the grant will be spent on these new technologies, along with acquiring data that will be used to observe and model the boundary zones. In addition to seismometers, other equipment and data that's needed will include space-based GPS, which will allow geologists to measure the relative velocity of two points on the earth's surface to within a few millimeters each year; satellite images to map displacements of broad areas of the ground's surface over time; geochemical fingerprinting methods to analyze and date rocks that have been brought to the surface by volcanic eruptions or erosion, thus helping to characterize the composition of the earth far below; and of course, massive computation to analyze all the data, along with advanced computational techniques, "to allow us to develop models at the scale of the global earth," says Avouac.

"The breakthroughs we will achieve will probably result from the interactions among the various disciplines that will contribute to the project," he says. "We've already begun our effort, for example, by imaging and monitoring seismic activity and crustal deformation along a major subduction zone in Mexico. As I speak, geologists are in the field and continuing to install what will be a total of 50 seismometers."

Few institutions are capable of mounting this kind of sustained, diverse effort on a single plate boundary, he says, or of mining data from multiple disciplines to create dynamic models. "That's what Caltech is capable of doing," says Avouac. "We hope to breed a new generation of earth scientists. The Tectonics Observatory will offer students an exceptional environment with access to all of the modern techniques and analytical tools in our field, along with the possibility of interacting with a group of faculty with incredibly diversified expertise."

The Gordon and Betty Moore Foundation was established in September 2000 by Intel cofounder Gordon Moore and his wife, Betty. The foundation funds projects that will measurably improve the quality of life by creating positive outcomes for future generations. Grantmaking is concentrated in initiatives that support the Foundation's principal areas of concern: environmental conservation, science, higher education, and the San Francisco Bay Area.

MEDIA CONTACT: Mark Wheeler (626) 395-8733 wheel@caltech.edu

Visit the Caltech media relations web site: http://pr.caltech.edu/media

Writer: MW

Systems Biology Could Augur New Age for Predictive and Preventive Medicine

PASADENA, Calif./SEATTLE--The dream of monitoring a patient's physical condition through blood testing has long been realized. But how about detecting diseases in their very early stages, or evaluating how they are responding to treatment, with no more to work with than a drop of blood?

That dream is closer to realization than many of us think, according to several leading experts advocating a new approach known as systems biology. Writing in the current issue of the journal Science, Institute for Systems Biology immunologist and technologist Leroy Hood and California Institute of Technology chemist Jim Heath and their colleagues explain how a new approach to the way that biological information is gathered and processed could soon lead to breakthroughs in the prevention and early treatment of a number of diseases.

The lead author of the Science article is Leroy Hood, a former Caltech professor and now the founding director of the Institute for Systems Biology in Seattle. According to Hood, the focus of medicine in the next few years will shift from treating disease--often after it has already seriously compromised the patient's health--to preventing it before it even sets in.

Hood explains that systems biology essentially analyzes a living organism as if it were an electronic circuit. This approach requires a gigantic amount of information to be collected and processed, including the sequence of the organism's genome, and the mRNAs and proteins that it generates. The object is to understand how all of these molecular components of the system are interrelated, and then predict how the mRNAs or proteins, for example, are affected by disturbances such as genetic mutations, infectious agents, or chemical carcinogens. Therefore, systems biology should be useful for diseases resulting from genetics as well as from the environment.

"Patients' individual genome sequences, or at least sections of them, may be part of their medical files, and routine blood tests will involve thousands of measurements to test for various diseases and genetic predispositions to other conditions," Hood says. "I'll guarantee you we'll see this predictive medicine in 10 years or so."

"In this paper, we first describe a predictive model of how a single-cell yeast organism works," Heath explains, adding that the model covers a metabolic process that utilizes copious amounts from data such as messenger RNA concentrations from all the yeast's 6,000 genes, protein-DNA interactions, and the like.

"The yeast model taught us many lessons for human disease," Heath says. "For example, when yeast is perturbed either genetically or through exposure to some molecule, the mRNAs and proteins that are generated by the yeast provide a fingerprint of the perturbation. In addition, many of those proteins are secreted. The lesson is that a disease, such as a very early-stage cancer, also triggers specific biological responses in people. Many of those responses lead to secreted proteins, and so the blood provides a powerful window for measuring the fingerprint of the early-stage disease."

Heath and his colleagues write in the Science article that, with a sufficient number of measurements, "one can presumably identify distinct patterns for each of the distinct types of a particular cancer, the various stages in the progression of each disease type, the partition of the disease into categories defined by critical therapeutic targets, and the measurement of how drugs alter the disease patterns. The key is that the more questions you want answered, the more measurements you need to make. It is the systems biology approach that defines what needs to be measured to answer the questions."

In other words, the systems biology approach should allow therapists to catch diseases much earlier and treat them much more effectively. "This allows you to imagine the pathway toward predictive medicine rather than reactive medicine, which is what we have now," Heath says.

About 100,000 measurements on yeast were required to construct a predictive network hypothesis. The authors write that even 100,000,000 measurements do not yet enable such a hypothesis to be formulated for a human disease. In the conclusion of the Science article, the authors address the technologies that will be needed to fully realize the systems approach to medicine. Heath emphasizes that most of these technologies, ranging from microfluidics to nanotechnologies to molecular-imaging methods, have already been demonstrated, and some are already having a clinical impact. "It's not just a dream that we'll be diagnosing multiple diseases, including early stage detection, from a fingerprick of blood," Heath says.

"Early-stage versions of these technologies will be demonstrated very soon."

The other authors of the paper are Michael E. Phelps of the David Geffen School of Medicine at UCLA, and Biaoyang Lin of the Institute for Systems Biology.

Writer: Robert Tindol
