Internet Speed Quadrupled by International Team During 2004 Bandwidth Challenge

PITTSBURGH, Pa.--For the second consecutive year, the "High Energy Physics" team of physicists, computer scientists, and network engineers has won the Supercomputing Bandwidth Challenge, this time with a sustained data transfer of 101 gigabits per second (Gbps) between Pittsburgh and Los Angeles. That is more than four times last year's record of 23.2 Gbps, which was set by the same team.

The team hopes this new demonstration will encourage scientists and engineers in many sectors of society to develop and deploy a new generation of revolutionary Internet applications.

The international team is led by the California Institute of Technology and includes as partners the Stanford Linear Accelerator Center (SLAC), Fermilab, CERN, the University of Florida, the University of Manchester, University College London (UCL) and the organization UKLight, Rio de Janeiro State University (UERJ), the state universities of São Paulo (USP and UNESP), Kyungpook National University, and the Korea Institute of Science and Technology Information (KISTI). The group's "High-Speed TeraByte Transfers for Physics" record data transfer speed is equivalent to downloading three full DVD movies per second, or transmitting all of the content of the Library of Congress in 15 minutes, and it corresponds to roughly 5 percent of the rate at which all forms of digital content were being produced on Earth during the test.
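
(For readers who want to check the DVD comparison, the arithmetic is straightforward. The short Python sketch below assumes a 4.7-gigabyte single-layer DVD and decimal gigabytes; those figures are illustrative assumptions, not numbers from the announcement.)

    # Back-of-the-envelope check of the "three DVDs per second" comparison.
    # Assumes a 4.7 GB single-layer DVD and treats a gigabyte as 10^9 bytes.
    rate_gbps = 101                          # sustained transfer rate, gigabits per second
    bytes_per_second = rate_gbps * 1e9 / 8   # about 1.26e10 bytes per second
    dvd_bytes = 4.7e9                        # single-layer DVD capacity
    print(bytes_per_second / dvd_bytes)      # about 2.7 DVDs per second, i.e. roughly three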

The new mark, according to Bandwidth Challenge (BWC) sponsor Wesley Kaplow, vice president of engineering and operations for Qwest Government Services, exceeded the sum of all the throughput marks submitted in the present and previous years by other BWC entrants. The extraordinary bandwidth achieved was made possible in part through the use of the FAST TCP protocol developed by Professor Steven Low and his Caltech Netlab team. It was achieved through the use of seven 10 Gbps links to Cisco 7600 and 6500 series switch-routers provided by Cisco Systems at the Caltech Center for Advanced Computing Research (CACR) booth, and three 10 Gbps links to the SLAC/Fermilab booth. The external network connections included four dedicated wavelengths of National LambdaRail between the SC2004 show floor in Pittsburgh and Los Angeles (two waves), Chicago, and Jacksonville, as well as three 10 Gbps connections across the SCinet network infrastructure at SC2004, with Qwest-provided wavelengths to the Internet2 Abilene Network (two 10 Gbps links), the TeraGrid (three 10 Gbps links), and ESnet. 10 Gigabit Ethernet (10 GbE) interfaces provided by S2io were used on servers running FAST at the Caltech/CACR booth, and interfaces from Chelsio equipped with TCP offload engines (TOE) running standard TCP were used at the SLAC/FNAL booth. During the test, the network links over both the Abilene and National LambdaRail networks were shown to operate successfully at up to 99 percent of full capacity.
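
(FAST TCP differs from standard loss-based TCP in that it adjusts its congestion window from measured round-trip delay rather than from packet loss. The sketch below is a simplified rendering of the published FAST window-update rule; the parameter values and round-trip times are illustrative assumptions, not the configuration used at SC2004.)

    # Simplified sketch of the FAST TCP window-update rule (delay-based congestion control).
    # At equilibrium each flow keeps roughly `alpha` of its own packets queued in the network.
    # gamma and alpha below are illustrative, not the SC2004 settings.
    def fast_tcp_update(w, base_rtt, rtt, alpha=200, gamma=0.5):
        """Return the next congestion window, in packets."""
        target = (base_rtt / rtt) * w + alpha
        return min(2 * w, (1 - gamma) * w + gamma * target)

    # Example: the window grows while the path shows no queueing delay (rtt == base_rtt) ...
    w = 1000.0
    w = fast_tcp_update(w, base_rtt=0.080, rtt=0.080)   # -> 1100.0
    # ... and backs off once queueing delay builds up.
    w = fast_tcp_update(w, base_rtt=0.080, rtt=0.120)   # -> about 1016.7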

The Bandwidth Challenge allowed the scientists and engineers involved to preview the globally distributed grid system that is now being developed in the US and Europe in preparation for the next generation of high-energy physics experiments at CERN's Large Hadron Collider (LHC), scheduled to begin operation in 2007. Physicists at the LHC will search for the Higgs particles thought to be responsible for mass in the universe and for supersymmetry and other fundamentally new phenomena bearing on the nature of matter and spacetime, in an energy range made accessible by the LHC for the first time.

The largest physics collaborations at the LHC, the Compact Muon Solenoid (CMS) and the Toroidal Large Hadron Collider Apparatus (ATLAS), each encompass more than 2,000 physicists and engineers from 160 universities and laboratories spread around the globe. In order to fully exploit the potential for scientific discoveries, many petabytes of data will have to be processed, distributed, and analyzed. The key to discovery is the analysis phase, in which individual physicists and small groups repeatedly access, and sometimes extract and transport, terabyte-scale data samples on demand, in order to optimally select the rare "signals" of new physics from potentially overwhelming "backgrounds" of already-understood particle interactions. These data will be drawn from major facilities at CERN in Switzerland, at Fermilab and the Brookhaven lab in the U.S., and at other laboratories and computing centers around the world, where the accumulated stored data will amount to many tens of petabytes in the early years of LHC operation, rising to the exabyte range within the coming decade.

Future optical networks, incorporating multiple 10 Gbps links, are the foundation of the grid system that will drive the scientific discoveries. A "hybrid" network integrating both traditional switching and routing of packets, and dynamically constructed optical paths to support the largest data flows, is a central part of the near-term future vision that the scientific community has adopted to meet the challenges of data intensive science in many fields. By demonstrating that many 10 Gbps wavelengths can be used efficiently over continental and transoceanic distances (often in both directions simultaneously), the high-energy physics team showed that this vision of a worldwide dynamic grid supporting many-terabyte and larger data transactions is practical.

While the SC2004 100+ Gbps demonstration required a major effort by the teams involved and their sponsors, in partnership with major research and education network organizations in the United States, Europe, Latin America, and the Asia Pacific region, it is expected that networking on this scale in support of the largest science projects (such as the LHC) will be commonplace within the next three to five years.

The network was deployed through exceptional support by Cisco Systems, Hewlett Packard, Newisys, S2io, Chelsio, Sun Microsystems, and Boston Ltd., as well as by the staffs of National LambdaRail, Qwest, the Internet2 Abilene Network, the Consortium for Education Network Initiatives in California (CENIC), ESnet, the TeraGrid, the AmericasPATH network (AMPATH), the National Education and Research Network of Brazil (RNP) and the GIGA project, ANSP/FAPESP in Brazil, KAIST in Korea, UKERNA in the UK, and the Starlight international peering point in Chicago. The international connections included the LHCNet OC-192 link between Chicago and CERN in Geneva; the CHEPREO OC-48 link between Abilene (Atlanta), Florida International University in Miami, and São Paulo; and an OC-12 link between Rio de Janeiro, Madrid, Géant, and Abilene (New York). The APII-TransPAC links to Korea were also used, with good occupancy. The throughputs to and from Latin America and Korea represented a significant step up in scale, which the team members hope will be the beginning of a trend toward the widespread use of 10 Gbps-scale network links on DWDM optical networks interlinking different world regions in support of science by the time the LHC begins operation in 2007. The demonstration and the developments leading up to it were made possible through the strong support of the U.S. Department of Energy and the National Science Foundation, in cooperation with the agencies of the international partners.

As part of the demonstration, a distributed analysis of simulated LHC physics data was performed using the Grid-enabled Analysis Environment (GAE), developed at Caltech for the LHC and many other major particle physics experiments as part of the Particle Physics Data Grid, the Grid Physics Network and the International Virtual Data Grid Laboratory (GriPhyN/iVDGL), and Open Science Grid projects. This involved the transfer of data to CERN, Florida, Fermilab, Caltech, UC San Diego, and Brazil for processing by clusters of computers, with the results finally aggregated back on the show floor to create a dynamic visual display of quantities of interest to the physicists. In another part of the demonstration, file servers in London and Manchester were used for disk-to-disk transfers from the SLAC/FNAL booth in Pittsburgh to England. This gave physicists valuable experience in the use of large, distributed datasets and of the computational resources connected by fast networks, on the scale required at the start of the LHC physics program.

The team used the MonALISA (MONitoring Agents using a Large Integrated Services Architecture) system developed at Caltech to monitor and display the real-time data for all the network links used in the demonstration. MonALISA (http://monalisa.caltech.edu) is a highly scalable set of autonomous, self-describing, agent-based subsystems which are able to collaborate and cooperate in performing a wide range of monitoring tasks for networks and grid systems as well as the scientific applications themselves. Detailed results for the network traffic on all the links used are available at http://boson.cacr.caltech.edu:8888/.

Multi-gigabit/second end-to-end network performance will lead to new models for how research and business is performed. Scientists will be empowered to form virtual organizations on a planetary scale, sharing in a flexible way their collective computing and data resources. In particular, this is vital for projects on the frontiers of science and engineering, in "data intensive" fields such as particle physics, astronomy, bioinformatics, global climate modeling, geosciences, fusion, and neutron science.

Harvey Newman, professor of physics at Caltech and head of the team, said, "This is a breakthrough for the development of global networks and grids, as well as inter-regional cooperation in science projects at the high-energy frontier. We demonstrated that multiple links of various bandwidths, up to the 10 gigabit-per-second range, can be used effectively over long distances.

"This is a common theme that will drive many fields of data-intensive science, where the network needs are foreseen to rise from tens of gigabits per second to the terabit-per-second range within the next five to 10 years," Newman continued. "In a broader sense, this demonstration paves the way for more flexible, efficient sharing of data and collaborative work by scientists in many countries, which could be a key factor enabling the next round of physics discoveries at the high energy frontier. There are also profound implications for how we could integrate information sharing and on-demand audiovisual collaboration in our daily lives, with a scale and quality previously unimaginable."

Les Cottrell, assistant director of SLAC's computer services, said: "The smooth interworking of 10 GbE interfaces from multiple vendors, the ability to fill 10 gigabit-per-second paths both on local area networks (LANs) and over cross-country and intercontinental links, the ability to transmit more than 10 Gbits/second from a single host, and the ability of TCP offload engines (TOE) to reduce CPU utilization all illustrate the emerging maturity of the 10 Gigabit Ethernet market. The current limitations are not in the network but rather in the servers at the ends of the links, and their buses."

Further technical information about the demonstration may be found at http://ultralight.caltech.edu/sc2004 and http://www-iepm.slac.stanford.edu/monitoring/bulk/sc2004/hiperf.html. A longer version of the release, including information on the participating organizations, may be found at http://ultralight.caltech.edu/sc2004/BandwidthRecord

 

Laser Points to the Future at Palomar

PALOMAR MOUNTAIN, Calif. — The Hale Telescope on Palomar Mountain has been gathering light from the depths of the universe for 55 years. It finally sent some back early last week as a team of astronomers from the California Institute of Technology, the Jet Propulsion Laboratory and the University of Chicago created an artificial star by propagating a 4-watt laser beam out from the Hale Telescope and up into the night sky.

The laser was propagated as the first step in a program to expand the fraction of sky available to the technique known as adaptive optics. Adaptive optics allows astronomers to correct for the fuzzy images produced by earth's moving atmosphere, giving them a view that often surpasses those of smaller telescopes based in space.

"We have been steadily improving adaptive optics using bright natural guide stars at Palomar. As a result, the system routinely corrects for atmospheric distortions. Now we will be able to go to the next step," says Richard Dekany, associate director for development at Caltech Optical Observatories. Currently astronomers at Palomar can use the adaptive-optics technique only if a moderately bright star is sufficiently close to their object of interest. The adaptive-optics system uses the star as a source by which astronomers monitor and correct for the distortions produced by earth's atmosphere.

Employing the laser will allow astronomers to place an artificial corrective guide star wherever they see fit. To do so, they shine a narrow sodium laser beam up through the atmosphere. At an altitude of about 60 miles, the laser beam makes a small amount of sodium gas glow. The glow from this gas serves as the artificial guide star for the adaptive-optics system. The laser beam is too faint to be seen except by observers very close to the telescope, and the guide star it creates is even fainter. It can't be seen with the unaided eye, yet it is bright enough to allow astronomers to make their adaptive-optics corrections.

The Palomar Observatory currently employs the world's fastest astronomical adaptive optics system on its 200-inch Hale Telescope. It is able to correct for changes in the atmosphere 2,000 times per second. Astronomers from Caltech, JPL, and Cornell University have exploited this system to discover brown dwarf companions to stars, study the weather on a moon of Saturn, and see the shapes of asteroids.
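
(For readers unfamiliar with how such a correction loop works, the toy sketch below shows the basic idea of an integrator-type adaptive-optics loop running at 2,000 updates per second. The actuator count, loop gain, and noise levels are invented for illustration; this is not the actual Palomar control software.)

    import numpy as np

    # Toy sketch of a generic adaptive-optics integrator loop at 2 kHz.
    # Actuator count, gain, and noise level are assumptions made for illustration only.
    n_actuators = 241
    gain = 0.4
    rng = np.random.default_rng(0)

    aberration = rng.normal(scale=1.0, size=n_actuators)   # atmospheric wavefront error
    dm_commands = np.zeros(n_actuators)                     # deformable-mirror correction

    for step in range(2000):                                # one second of closed loop
        # Wavefront sensor measures the residual error on the guide star, plus noise.
        residual = aberration + dm_commands + rng.normal(scale=0.05, size=n_actuators)
        dm_commands -= gain * residual                      # integrator control law

    print(np.std(aberration + dm_commands))                 # residual error after correction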

"This is an important achievement that brings us one step closer to our goal," says Mitchell Troy, the adaptive optics group lead and Palomar adaptive optics task manager at the Jet Propulsion Laboratory. The goal, achieving adaptive-optics correction using the laser guide star, is expected next year. This will place Palomar in elite company as just the third observatory worldwide to deploy a laser guide system. This laser will greatly expand the science performed at Palomar and pave the way for future projects on telescopes that have not yet been built.

"This is a terrific technical achievement which not only opens up a bold and exciting scientific future for the venerable 200-inch telescope, but also demonstrates the next step on a path toward future large telescopes such as the Thirty Meter Telescope, " says Richard Ellis, Steele Family Professor of Astronomy and director of the Caltech Optical Observatories. "The next generation of large telescopes requires sodium laser guide-star adaptive-optics of the type being demonstrated at Palomar Observatory," he adds.

Currently in the design phase, the Thirty Meter Telescope (TMT) will eventually deliver images at visible and infrared wavelengths 12 times sharper than those of the Hubble Space Telescope. The TMT project is a collaboration among Caltech, the Association of Universities for Research in Astronomy, the Association of Canadian Universities for Research in Astronomy, and the University of California.

The Caltech adaptive optics team is made up of Richard Dekany (team leader) and Viswa Velur, Rich Goeden, Bob Weber, and Khanh Bui. Professor Edward Kibblewhite, University of Chicago, built the Chicago sum-frequency laser used in this project. The JPL Palomar adaptive optics team includes Mitchell Troy (team leader), Gary Brack, Steve Guiwits, Dean Palmer, Jennifer Roberts, Fang Shi, Thang Trinh, Tuan Truong and Kent Wallace. Installation of the laser at the Hale Telescope was overseen by Andrew Pickles, Robert Thicksten, and Hal Petrie of Palomar Observatory, and supported by Merle Sweet, John Henning, and Steve Einer.

The Palomar adaptive optics instrument was built and continues to be supported by the Jet Propulsion Laboratory as part of a Caltech-JPL collaboration.

Support for the adaptive-optics research at Caltech's Palomar Observatory comes from the Gordon and Betty Moore Foundation, the Oschin Family Foundation, and the National Science Foundation Center for Adaptive Optics.

Writer: Jill Perry

A "Smoking Gun" For Nicotine Addiction

Embargoed for Release at 11 a.m. PST, Thursday, November 4, 2004

PASADENA, Calif. - Nicotine is responsible for more than four million smoking-related deaths each year. Yet people still smoke. Why? One reason is the stranglehold of addiction, started when nicotine enhances the release of a neurotransmitter called dopamine, a chemical messenger that induces a feeling of pleasure. That's what smoking, presumably, is all about.

Knowing specifically which receptor molecules are activated by nicotine in the dopamine-releasing cells would be a promising first step in developing a therapeutic drug to help people kick the habit. But that's a challenging goal, since there are many cell receptor proteins, each in turn comprising a set of "subunit proteins" that may respond to nicotine, or instead, to a completely different chemical signal. Reporting in the November 5 issue of the journal Science, California Institute of Technology postdoctoral scholar Andrew Tapper, Professor Allan Collins of the University of Colorado, seven other colleagues, and Henry A. Lester, the Bren Professor of Biology at Caltech, have determined that when receptors with a specific subunit known as alpha4 are activated by nicotine, it's sufficient for some addiction-related events, such as pleasure response, sensitization, and tolerance to repeated doses of nicotine. This research suggests that alpha4, and the molecules that are triggered in turn by alpha4, may prove to be useful targets for addiction therapies.

When cells communicate in the brain, nerve impulses jump chemically across a gap between two nerve cells called the synapse, using a neurotransmitter such as acetylcholine. Acetylcholine activates specific receptors on the post-synaptic nerve cell. This starts the firing of electrical impulses and, in cells that manufacture dopamine, that pleasure-inducing messenger is released as well. Having completed its task, acetylcholine is then rapidly broken down by an enzyme called acetylcholinesterase. It's a clever and wondrous biological machine, says Lester. But, he says, "nicotine is clever too, because it mimics acetylcholine." Worse, nicotine is not broken down by acetylcholinesterase. "So it persists at the synapse for minutes rather than milliseconds, and excites the post-synaptic neurons to fire rapidly for long periods, releasing large amounts of dopamine. Most scientists believe that's a key reason why nicotine is so addictive."

Previous work in several laboratories in the 1990s had suggested that, of the many so-called nicotinic acetylcholine receptors, one consisting of subunits called alpha4 and beta2 was important for nicotine addiction. This had been determined by the use of so-called "knockout" mice. In this method, scientists identify a gene that may affect a behavior, then interrupt or knock it out in a specially bred strain of mice. In this case, the effects of nicotine on the knockout mice and normal mice were compared, and the mice without the beta2 subunit lacked some responses to nicotine.

In work to identify receptor subunits that were sufficient to cause nicotine dependence, Lester's group examined the "partner" subunit, alpha4. But instead of experimenting with a knockout mouse, they developed "hypersensitive knock-in" mice. Lester's group replaced a naturally occurring bit of DNA with a mutation that changed a single amino acid in a single gene among the mouse's 30,000 genes. That change was enough to make the alpha4 subunit highly sensitive to nicotine.

Lester's group first selected the proper mutation with tests in frog eggs and cultures, then bred the strain of mice. As they hypothesized, the mice with the re-engineered alpha4 receptor proved to be highly sensitive even to very small doses of nicotine that didn't activate other nicotinic receptor types. This finding shows the alpha4 subunit is a prime candidate to be studied at the molecular and cellular level, says Lester, "and it may be a possible target for developing a medication that would reduce the release of dopamine caused by nicotine, and hopefully reduce nicotine's addictive grip."

Knockout mice leave nothing to measure, explains Lester. Knock-in mice have the advantage, he says, of isolating and amplifying the "downstream" chain of molecular signals within nerve cells that occur after nicotine activates its receptors. These signals, when repeatedly stimulated by nicotine, eventually cause the as-yet-unknown, long-term changes in nerve cells that are presumably the biological basis of addiction. Lester and his colleagues are now tracking down those signals, molecules, and newly activated genes, using their hypersensitive mice. The eventual hope: one of those signals might provide a molecular target that can be specifically blocked, providing a therapy against addiction. This strategy would resemble modern cancer drugs, such as Gleevec, which block only specific signaling molecules needed by proliferating cancer cells. Hypersensitive knock-in mice may also prove useful in gaining further insight into diseases such as epilepsy and Parkinson's disease.

Lester is optimistic about ultimately defeating nicotine's addictive power. "It's a complicated pathway that still must be broken down into individual steps before we can understand it fully," he says, "but I personally believe that nicotine addiction will be among the first addictions to be solved, because we already have so many tools to study it."

This research was supported by the California Tobacco-Related Disease Research Program, the W. M. Keck Foundation, the Plum Foundation, the National Institute of Mental Health, the National Institute of Neurological Disorders and Stroke, and the National Institute on Drug Abuse.

MEDIA CONTACT: Mark Wheeler (626) 395-8733 wheel@caltech.edu

Writer: MW

Size Does Matter - When it Comes to Reducing Environmental Pollution

PASADENA, Calif.-When it comes to mitigating the harmful impacts of environmental pollution--size does matter . . . or, at least, that's the hypothesis that California Institute of Technology professors Janet Hering and Richard Flagan will be testing.

Hering is professor of environmental science and engineering and executive officer for the Keck Laboratories. Flagan is the Irma and Ross McCollum Professor of Chemical Engineering and professor of environmental science and engineering, and serves as executive officer of chemical engineering.

In a study funded by the Camille and Henry Dreyfus Foundation, Hering and Flagan will examine whether the effectiveness of iron nanoparticles in pollution remediation is influenced by their size. The $120,000 grant, under the Dreyfus Foundation's 2004 Postdoctoral Program in Environmental Chemistry, will be used to recruit a postdoctoral scientist to conduct research in environmental chemistry.

Specifically, the researchers will utilize this grant to examine effective strategies for reduction and mitigation of environmental pollutants in aquatic ecosystems. Ultimately, the study seeks to help provide viable, cost-effective commercial technologies for the remediation of certain contaminants, including groundwater contaminants, chlorinated solvents, nitrates, pesticides, various chemical by-products, residue created in manufacturing, and other industrial or inorganic contaminants.

The study, "Use of Vapor-Phase Synthesized Iron Nanoparticles to Examine Nanoscale Reactivity," will investigate whether reactivity and effectiveness of iron nanoparticles, in pollution mitigation, are influenced by their size. The study will compare particles in different size classes to determine whether nanoparticles exhibit enhanced reactivity in the reduction of organic substrates based on their size when surface area effects are accounted for.

Elemental iron [Fe(0)], or zero-valent iron, has been demonstrated to be an effective reductant for a wide range of environmental contaminants, including both organic and inorganic contaminants. Upon reaction with Fe(0), some contaminants can be transformed to products that are non-toxic or immobile. Fe(0) can be delivered to the subsurface environment by injection of Fe(0) nanoparticles.

If research results yield a conclusion that the size of Fe(0) nanoparticles does make a difference in their reactivity or effectiveness, then this finding will have a significant effect on the application of Fe(0) materials in environmental remediation and will provide insight into the fundamental chemical properties and behavior of nanoparticles in these applications.

Created in 1946, the Camille and Henry Dreyfus Foundation bears the names of modern chemistry pioneers Drs. Camille Dreyfus and his brother Henry. The foundation's mandate is "to advance the science of chemistry, chemical engineering, and related sciences as a means of improving human relations and circumstances throughout the world." The foundation directs much of its resources to the support of excellence in teaching and research by outstanding chemistry faculty at universities and colleges.

Writer: DWH

Caltech Launches New Information Initiative

PASADENA, Calif. — Information is everywhere. Most of us think about it in terms of mere facts--facts gleaned from a teacher or a colleague, from the media, or from the words of a textbook or the Internet.

But there are other, near-infinite types of information--the instructions encoded in our genome that tell our cells when to divide and when to die, or the daily flow of data into the stock market that somehow motivates people to buy and sell.

Information constantly streams to scientists from around the world, and from other "worlds" as well, thanks to sensors and actuators in the sea or out in space.

What's needed is a way to harness and understand all of this data so that scientists and engineers can continue to unravel the secrets of nature and the human institutions in which we operate. In an unprecedented effort, the California Institute of Technology has launched a university-wide initiative called Information Science and Technology (IST)--drawing back the curtain on the nature of information itself and redefining the way we approach, understand, and use science and engineering. IST will cut across disciplines, eventually involving over 25 percent of all faculty and nearly 35 percent of students on campus, likely altering the Institute's intellectual and organizational landscape.

Caltech has committed to raising $100 million for IST as part of the Institute's five-year, $1.4 billion capital campaign. Nearly $50 million has been raised in the form of separate grants of $25 million from the Annenberg Foundation and $22.2 million from the Gordon and Betty Moore Foundation. The Annenberg Foundation gift will be used to construct the Walter and Leonore Annenberg Center for Information Science and Technology--a new building that will be the physical center of IST. The building will join the existing Watson and Moore laboratories in forming a core of buildings linking together IST researchers.

Funding from the Moore Foundation will provide seed money to establish four new interdisciplinary research centers within IST. These new centers will join two that already exist at Caltech, and together the six groups will anchor and organize Caltech's effort to lead the way in this new field.

IST evolved over the last 50 years from an activity that focused on enabling more efficient calculations to a major intellectual theme that spans disciplines in engineering and the sciences. While other universities created schools of computer science (or computer and information science), these are generally related to computer science and software--a limited view of information science and technology. At Caltech, IST serves as a new intellectual framework on which to build information-based research and instructional programs across the academic spectrum.

"To maintain preeminence in science, the U.S. needs new and unified ways of looking at, approaching, and exploiting information in and across the physical, biological, and social sciences, and engineering," says Jehoshua (Shuki) Bruck, the Gordon and Betty Moore Professor of Computation and Neural Systems and Electrical Engineering and the first director of IST. "Caltech is taking a leadership role by creating an Institute-wide initiative in the science and engineering of information. IST will transform the research and educational environment at Caltech and other universities around the world."

In the same way that the printing press heralded the start of the Renaissance, and the study of physics helped to foster the Industrial Revolution, technological advances in computation and communication in the 20th century have set the stage for the Age of Information. Yet, scientific and technological changes are accelerating so fast they are outpacing existing institutions such as schools, media, industry, and government--structures originally designed for the needs of the Industrial Age. "So we need a new intellectual framework to harness these new advances," says Bruck, "in order to provide for a stable and well-educated society that's prepared to meet the challenges of tomorrow."

"Some say biology is the science of the 21st century, but information science will provide the unity to all of the sciences," says Caltech president and Nobel Prize-winning biologist David Baltimore. "It will be like the physics of the 20th century in which Einstein went beyond the teachings of Newton--which were enough to put people on the moon--and allowed people's minds to reach into the atom or out into the cosmos. Information science, the understanding of what constitutes information, how it is transmitted, encoded, and retrieved, is in the throes of a revolution whose societal repercussions will be enormous. The new Albert Einstein has yet to emerge, but the time is ripe."

Annenberg Foundation Gift

The Annenberg gift is the first portion of a $100 million institutional commitment to IST, and is part of the Institute's capital campaign. Now in the design stage, the Annenberg Center is expected to be completed when the campaign ends in 2007.

"I am delighted that the Annenberg Foundation will be a part of this visionary enterprise," said Leonore Annenberg, foundation president and chairman. "As a publisher, broadcaster, diplomat, and philanthropist, Walter Annenberg was known for breaking new ground. Support for this important new initiative surely would have pleased him as much as it honors the work of the foundation."

Founded in 1989 by Walter H. Annenberg, the Annenberg Foundation exists to advance the public well-being through improved communication. As the principal means of achieving its goal, the foundation encourages the development of more effective ways to share ideas and knowledge.

Gordon and Betty Moore Foundation Gift

The Moore Foundation gift is part of a $300 million commitment the foundation made to Caltech in 2001.

The four centers funded by the Moore grant are the following: the Center for Biological Circuit Design, which will address how living things store, process, and share information; the Social and Information Sciences Center, which will investigate how social systems, such as markets, political processes, and organizations, efficiently process immense amounts of information and how this understanding can help to improve society; the Center for the Physics of Information, which will examine the physical qualities of information and will design the computers and materials for the next generation of information technology; and the Center for the Mathematics of Information, which will formulate a common understanding and language of information that unifies researchers from different fields.

The Moore Foundation seeks to develop outcome-based projects that will improve the quality of life for future generations. It organizes the majority of its grant-making around large-scale initiatives that concentrate on environmental conservation, science, higher education, and the San Francisco Bay Area.

Writer: JP

Observing the Roiling Earth

PASADENA, Calif. - In the 1960s the theory of plate tectonics rocked geology's world by determining that the first 60 miles or so of our planet--the lithosphere--is divided into about a dozen rigid plates that crawl along by centimeters each year. Most manifestations of the earth's dynamics, earthquakes and volcanoes for example, occur along the boundaries of these plates.

As a model, the theory of plate tectonics continues to serve us well, says Jean-Philippe Avouac, a professor of geology at the California Institute of Technology. But while plate tectonics provides a powerful description of the large-scale deformation of the earth's lithosphere over millions of years, it doesn't explain the physical forces that drive the movements of the plates. Also, contrary to the theory, it's now known that plates are not perfectly rigid and that plate boundaries sometimes form broad fault zones with diffuse seismicity.

Now, thanks to a $13,254,000 grant from the Gordon and Betty Moore Foundation, Caltech has established the Tectonics Observatory, under the direction of Avouac, with the ultimate goal, he says, of "providing a new view of how and why the earth's crust is deforming over timescales ranging from a few tens of seconds, the typical duration of an earthquake, to several tens of millions of years."

But it's not the only goal. "Most of the outstanding questions in earth science concern processes that take place at the boundaries of the earth's tectonic plates," says Avouac, so the observatory's scientific efforts will be centered around major field studies at a few key plate boundaries in western North America, Sumatra, Central America, and Taiwan, with the goal of answering a number of questions, including:

--Tectonic plates move gradually when viewed on large timescales, but then sometimes undergo sharp "jerks" in speed and direction. What's the cause?

--Because earthquakes can be damaging events to humans, it's important to know: what physical parameters control their timing, location, and size?

--Subduction zones, where oceanic plates sink into the earth's mantle, are needed to accommodate and perhaps drive plate motion. How do these subduction zones originate and grow?

"We plan to take advantage of a number of new technologies that will allow us to measure deformation of the earth's crust and image the earth's interior with unprecedented accuracy," says Avouac. The bulk of the grant will be spent on these new technologies, along with acquiring data that will be used to observe and model the boundary zones. In addition to seismometers, other equipment and data that's needed will include space-based GPS, which will allow geologists to measure the relative velocity of two points on the earth's surface to within a few millimeters each year; satellite images to map displacements of broad areas of the ground's surface over time; geochemical fingerprinting methods to analyze and date rocks that have been brought to the surface by volcanic eruptions or erosion, thus helping to characterize the composition of the earth far below; and of course, massive computation to analyze all the data, along with advanced computational techniques, "to allow us to develop models at the scale of the global earth," says Avouac.

"The breakthroughs we will achieve will probably result from the interactions among the various disciplines that will contribute to the project," he says. "We've already begun our effort, for example, by imaging and monitoring seismic activity and crustal deformation along a major subduction zone in Mexico. As I speak, geologists are in the field and continuing to install what will be a total of 50 seismometers."

Few institutions are capable of mounting this kind of sustained, diverse effort on a single plate boundary, he says, or of mining data from multiple disciplines to create dynamic models. "That's what Caltech is capable of doing," says Avouac. "We hope to breed a new generation of earth scientists. The Tectonics Observatory will offer students an exceptional environment with access to all of the modern techniques and analytical tools in our field, along with the possibility of interacting with a group of faculty with incredibly diversified expertise."

The Gordon and Betty Moore Foundation was established in September 2000 by Intel cofounder Gordon Moore and his wife, Betty. The foundation funds projects that will measurably improve the quality of life by creating positive outcomes for future generations. Grantmaking is concentrated in initiatives that support the Foundation's principal areas of concern: environmental conservation, science, higher education, and the San Francisco Bay Area.

MEDIA CONTACT: Mark Wheeler (626) 395-8733 wheel@caltech.edu

Visit the Caltech media relations web site: http://pr.caltech.edu/media

Writer: MW

Systems Biology Could Augur New Age for Predictive and Preventive Medicine

PASADENA, Calif./SEATTLE--The dream of monitoring a patient's physical condition through blood testing has long been realized. But how about detecting diseases in their very early stages, or evaluating how they are responding to treatment, with no more to work with than a drop of blood?

That dream is closer to realization than many of us think, according to several leading experts advocating a new approach known as systems biology. Writing in the current issue of the journal Science, Institute for Systems Biology immunologist and technologist Leroy Hood and California Institute of Technology chemist Jim Heath and their colleagues explain how a new approach to the way that biological information is gathered and processed could soon lead to breakthroughs in the prevention and early treatment of a number of diseases.

The lead author of the Science article is Leroy Hood, a former Caltech professor and now the founding director of the Institute for Systems Biology in Seattle. According to Hood, the focus of medicine in the next few years will shift from treating disease--often after it has already seriously compromised the patient's health--to preventing it before it even sets in.

Hood explains that systems biology essentially analyzes a living organism as if it were an electronic circuit. This approach requires a gigantic amount of information to be collected and processed, including the sequence of the organism's genome, and the mRNAs and proteins that it generates. The object is to understand how all of these molecular components of the system are interrelated, and then predict how the mRNAs or proteins, for example, are affected by disturbances such as genetic mutations, infectious agents, or chemical carcinogens. Therefore, systems biology should be useful for diseases resulting from genetics as well as from the environment.

"Patients' individual genome sequences, or at least sections of them, may be part of their medical files, and routine blood tests will involve thousands of measurements to test for various diseases and genetic predispositions to other conditions," Hood says. "I'll guarantee you we'll see this predictive medicine in 10 years or so."

"In this paper, we first describe a predictive model of how a single-cell yeast organism works," Heath explains, adding that the model covers a metabolic process that utilizes copious amounts from data such as messenger RNA concentrations from all the yeast's 6,000 genes, protein-DNA interactions, and the like.

"The yeast model taught us many lessons for human disease," Heath says. "For example, when yeast is perturbed either genetically or through exposure to some molecule, the mRNAs and proteins that are generated by the yeast provide a fingerprint of the perturbation. In addition, many of those proteins are secreted. The lesson is that a disease, such as a very early-stage cancer, also triggers specific biological responses in people. Many of those responses lead to secreted proteins, and so the blood provides a powerful window for measuring the fingerprint of the early-stage disease."

Heath and his colleagues write in the Science article that, with a sufficient number of measurements, "one can presumably identify distinct patterns for each of the distinct types of a particular cancer, the various stages in the progression of each disease type, the partition of the disease into categories defined by critical therapeutic targets, and the measurement of how drugs alter the disease patterns. The key is that the more questions you want answered, the more measurements you need to make. It is the systems biology approach that defines what needs to be measured to answer the questions."

In other words, the systems biology approach should allow therapists to catch diseases much earlier and treat them much more effectively. "This allows you to imagine the pathway toward predictive medicine rather than reactive medicine, which is what we have now," Heath says.

About 100,000 measurements on yeast were required to construct a predictive network hypothesis. The authors write that even 100,000,000 measurements do not yet enable such a hypothesis to be formulated for a human disease. In the conclusion of the Science article, the authors address the technologies that will be needed to fully realize the systems approach to medicine. Heath emphasizes that most of these technologies, ranging from microfluidics to nanotechnologies to molecular-imaging methods, have already been demonstrated, and some are already having a clinical impact. "It's not just a dream that we'll be diagnosing multiple diseases, including early stage detection, from a fingerprick of blood," Heath says.

"Early-stage versions of these technologies will be demonstrated very soon."

The other authors of the paper are Michael E. Phelps of the David Geffen School of Medicine at UCLA, and Biaoyang Lin of the Institute for Systems Biology.

Writer: Robert Tindol

Caltech Biologists Pursue Promising New Approach in Treatment of HIV/AIDS and Cancer

PASADENA, Calif.—In response to the arduously slow progress in finding cures for AIDS and cancer, Caltech researchers are now investigating a promising new approach in the treatment of these diseases.

With a $1.5 million matching grant from the Skirball Foundation in New York, Caltech biologists have established the Engineering Immunity project, designed to create a novel immunological approach to treating--and even someday preventing--HIV infection and some cancers, such as melanoma.

The immune system provides humans with a powerful defense against infectious diseases--but sometimes it fails. Utilizing an innovative, integrated approach, the Engineering Immunity project will combine gene therapy, stem cell biology, and immunotherapy to arm the immune system; this integrative methodology offers groundbreaking potential for the treatment of these diseases and others against which the immune system currently fails to provide defense.

Caltech President David Baltimore, who won the Nobel Prize in 1975 for his work in virology and cancer research, stated, "The Engineering Immunity project advocates a new approach to therapy for AIDS and cancer with revolutionary implications for the treatment of these and many other diseases. It is an innovative research project that holds special significance for the future of biomedical sciences."

In the fight against HIV, the virus that causes AIDS, T-cell immunity and T-cell-focused therapies and vaccines have been methods widely investigated and pursued. However, antibodies often provide the best protection against viruses, and virtually all vaccines for other viral diseases are designed to elicit antibody-based immunity. Antibodies against HIV do appear during HIV infections, but heretofore, they had not been able to provide therapeutic advantage to most patients. Rare neutralizing antibodies have been identified, but have not proven valuable because a general way to elicit their production in all patients has not been found. Moreover, most of them are effective only at very high concentrations that are hard to maintain in a person by conventional means. Thus, early attempts to elicit antibody-based immunity against HIV have largely failed.

The Engineering Immunity integrated methodology involves utilizing retroviruses, which are natural carriers of genes. Retrovirus vectors will be produced that encode antibodies found to be effective against HIV. Utilizing retroviruses, the Baltimore Laboratory at Caltech, in collaboration with Caltech structural biologist Pamela Bjorkman, will introduce specific genes into stem cells. These genes will encode specificity molecules on the immune cells, thereby arming the immune cells to kill selected agents or cells, i.e., the cells that are growing HIV or particular cancer cells.

The Engineering Immunity initiative will provide a new route to the production of antibodies with therapeutic, and even protective, ability, ultimately pointing toward potential cures for AIDS, melanoma, and other diseases.

The Skirball Foundation, an independent foundation created in 1950 by Jack H. Skirball, is dedicated primarily to medical research and care, educational and social needs of disadvantaged children, and advancing the highest values of the Jewish heritage. Among the many institutions that the Foundation has supported are the Skirball Cultural Center, the Salk Institute, the Venice Family Clinic, the Jewish Childcare Association in New York City, and the Skirball Institute of Biomolecular Medicine at New York University.

###

Contact: Deborah Williams-Hedges (626) 395-3227 debwms@caltech.edu

Visit the Caltech Media Relations Web site at: http://pr.caltech.edu/media

Writer: DWH

Fuel Cells: the Next Generation

PASADENA, Calif. — For several years now the Department of Energy (DOE) has been urging the fuel cell community to solve a major problem in the design of solid oxide fuel cells (SOFCs): heat. Such fuel cells could someday provide reliable power for homes and industry, dramatically cutting greenhouse gas emissions as well as other pollutants.

But SOFCs run hot, at temperatures as high as 1000 degrees Celsius (about 1800 degrees Fahrenheit). They're efficient at such temperatures, but only a few costly materials can withstand the heat. Using such materials makes the fuel cells expensive, which is why the DOE has been pushing for lower operating temperatures.

Sossina Haile, an associate professor of materials science and chemical engineering at the California Institute of Technology, is an expert in fuel cells, and she has been whittling away at the heat problem for years. Now she and her colleagues have not only solved the problem, they've smashed it. They've brought the temperature down to about 600 degrees Celsius (1100 degrees Fahrenheit), while achieving more power output than others are achieving at the higher temperatures--about 1 watt per square centimeter of fuel cell area.
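
(For a sense of scale, the short calculation below uses the reported power density; the household load figure is an assumption chosen purely for illustration.)

    # Illustrative scale calculation (not from the Nature paper): at about 1 W per square
    # centimeter, how much active cell area would a small household load require?
    power_density = 1.0        # W per cm^2 at roughly 600 degrees Celsius, as reported
    household_load = 5000.0    # W, a hypothetical average home demand (assumption)
    area_cm2 = household_load / power_density
    print(area_cm2)            # 5000 cm^2, i.e. about half a square meter of active cell area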

They accomplished this by changing the chemical composition of one component of a fuel cell called the cathode. The cathode is where air is fed into the fuel cell, and it's where the oxygen is electrochemically reduced to oxygen ions. The oxygen ions then migrate across the electrolyte (which conducts the ions) to react with fuel at the anode, another fuel cell component. The electrochemical reduction of oxygen is an essential step in the fuel cell's process of generating power. But the problem with running solid oxide fuel cells at 500 to 700 degrees Celsius is that the cathode becomes inactive when the temperature is less than about 800 degrees Celsius.

Haile and postdoctoral scholar Zongping Shao's insight was to switch out the conventional cathode and replace it with a compound that has a long chemical formula guaranteed to strike fear into the heart of every undergraduate, but that is abbreviated "BSCF."

What BSCF can do that standard cathodes can't is to allow the oxygen to diffuse through it very rapidly. "In conventional cathodes, the oxygen diffuses slowly, so that even if the electrochemical reaction is fast, the oxygen ions are slow in getting to the electrolyte," says Haile. "In BSCF the electrochemical reaction is fast and the oxygen ion transport is fast. You have the best combination of properties." This combination is what gives the very high power outputs from Haile's fuel cells.

The work was reported in a recent issue of the journal Nature. Because they are using relatively conventional anodes and electrolytes with this new cathode, says Haile, it would be easy to switch out cathodes in existing fuel cells. That will probably be their next step, says Haile: to partner with a company to produce the next generation of solid-oxide fuel cells.

Writer: MW

CBI Reveals Motion in the Remotest Seeds of Galaxy Clusters in the Very Early Universe

PASADENA, Calif.--Cosmologists from the California Institute of Technology have used observations probing back to the remote epoch of the universe when atoms were first forming to detect movements among the seeds that gave rise to clusters of galaxies. The new results show the motion of primordial matter on its way to forming galaxy clusters and superclusters. The observations were obtained with an instrument high in the Chilean Andes known as the Cosmic Background Imager (CBI), and they provide new confidence in the accuracy of the standard model of the early universe in which rapid inflation occurred a brief instant after the Big Bang.

The novel feature of these polarization observations is that they reveal directly the seeds of galaxy clusters and their motions as they proceeded to form the first clusters of galaxies.

Reporting in the October 7 online edition of Science Express, Anthony Readhead, Caltech's Rawn Professor of Astronomy and principal investigator on the CBI project, and his team say the new polarization results provide strong support for the standard model of the universe as a place in which dark matter and dark energy are much more prevalent than the everyday matter we know--a situation that poses a major problem for physics. A companion paper describing early polarization observations with the CBI has been submitted to the Astrophysical Journal.

The cosmic background observed by the CBI originates from the era just 400,000 years after the Big Bang and provides a wealth of information on the nature of the universe. At this remote epoch none of the familiar structures of the universe existed--there were no galaxies, stars, or planets. Instead there were only tiny density fluctuations, and these were the seeds out of which galaxies and stars formed under the hand of gravity.

Instruments prior to the CBI had detected fluctuations on large angular scales, corresponding to masses much larger than superclusters of galaxies. The high resolution of the CBI allowed the seeds of the structures we observe around us in the universe today to be observed for the first time in January 2000.

The expanding universe cooled, and by 400,000 years after the Big Bang it was cool enough for electrons and protons to combine to form atoms. Prior to this time, photons could not travel far before colliding with an electron, and the universe was like a dense fog. At this point, however, the universe became transparent, and since that time the photons have streamed freely across the universe to reach our telescopes today, 13.8 billion years later. Thus observations of the microwave background provide a snapshot of the universe as it was just 400,000 years after the Big Bang--long before the formation of the first galaxies, stars, and planets.

The new data were collected by the CBI between September 2002 and May 2004, and cover four patches of sky, encompassing a total area three hundred times the size of the moon and showing fine details only a fraction of the size of the moon. The new results are based on a property of light called polarization. This is a property that can be demonstrated easily with a pair of polarizing sunglasses. If one looks at light reflected off a pond through such sunglasses and then rotates the sunglasses, one sees the reflected light varying in brightness. This is because the reflected light is polarized, and the polarizing sunglasses only transmit light whose polarization is properly aligned with the glasses. The CBI likewise picks out the polarized light, and it is the details of this light that reveal the motion of the seeds of galaxy clusters.

In the total intensity we see a series of peaks and valleys, where the peaks are successive harmonics of a fundamental "tone." In the polarized emission we also see a series of peaks and valleys, but the peaks in the polarized emission coincide with the valleys in the total intensity, and vice versa. In other words, the polarized emission is exactly out of step with the total intensity. This property of the polarized emission being out of step with the total intensity indicates that the polarized emission arises from the motion of the material.
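
(The toy calculation below illustrates why motion-sourced polarization peaks fall at the intensity valleys. It is a schematic picture of the acoustic oscillations, not the CBI analysis itself.)

    import numpy as np

    # Toy illustration: for an acoustic mode, the photon-baryon density at last scattering
    # varies roughly as cos(k*r_s) while the fluid velocity varies as sin(k*r_s). Intensity
    # power tracks density squared and polarization power tracks velocity squared, so the
    # two sets of peaks interleave, exactly out of step with each other.
    k_rs = np.linspace(0.0, 4.0 * np.pi, 2001)
    intensity_power = np.cos(k_rs) ** 2      # peaks at k*r_s = 0, pi, 2*pi, ...
    polarization_power = np.sin(k_rs) ** 2   # peaks at pi/2, 3*pi/2, ... (the intensity valleys)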

The first detection of polarized emission by the Degree Angular Scale Interferometer (DASI), the sister project of the CBI, in 2002 provided dramatic evidence of motion in the early universe, as did the measurements by the Wilkinson Microwave Anisotropy Probe (WMAP) in 2003. The CBI results announced today significantly augment these earlier findings by demonstrating directly, and on the small scales corresponding to galaxy clusters, that the polarized emission is out of step with the total intensity.

Other data on the cosmic microwave background polarization were released just two weeks ago by the DASI team, whose three years of results show further compelling evidence that the polarization is indeed due to the cosmic background and is not contaminated by radiation from the Milky Way. The results of these two sister projects therefore complement each other beautifully, as was the intention of Readhead and John Carlstrom, the principal investigator of DASI and a coauthor on the CBI paper, when they planned these two instruments a decade ago.

According to Readhead, "Physics has no satisfactory explanation for the dark energy which dominates the universe. This problem presents the most serious challenge to fundamental physics since the quantum and relativistic revolutions of a century ago. The successes of these polarization experiments give confidence in our ability to probe fine details of the polarized cosmic background, which will eventually throw light on the nature of this dark energy."

"The success of these polarization experiments has opened a new window for exploring the universe which may allow us to probe the first instants of the universe through observations of gravitational waves from the epoch of inflation," says Carlstrom.

The analysis of the CBI data is carried out in collaboration with groups at the National Radio Astronomy Observatory (NRAO) and at the Canadian Institute for Theoretical Astrophysics (CITA).

"This is truly an exciting time in cosmological research, with a remarkable convergence of theory and observation, a universe full of mysteries such as dark matter and dark energy, and a fantastic array of new technology--there is tremendous potential for fundamental discoveries here" says Steve Myers of the NRAO, a coauthor and key member of the CBI team from its inception.

According to Richard Bond, director of CITA and a coauthor of the paper, "As a theorist in the early eighties, when we were first showing that the magnitude of the cosmic microwave background polarization would likely be a factor of a hundred down in power from the minute temperature variations that were themselves a heroic effort to discover, it seemed wishful thinking that even in some far distant future such minute signals would be revealed. With these polarization detections, the wished-for has become reality, thanks to remarkable technological advances in experiments such as CBI. It has been our privilege at CITA to be fully engaged as members of the CBI team in unveiling these signals and interpreting their cosmological significance for what has emerged as the standard model of cosmic structure formation and evolution."

The next step for Readhead and his CBI team will be to refine these polarization observations significantly by taking more data, and to test whether or not the polarized emission is exactly out of step with the total intensity with the goal of finding some clues to the nature of the dark matter and dark energy.

The CBI is a microwave telescope array comprising 13 separate antennas, each about three feet in diameter and operating in 10 frequency channels, set up in concert so that the entire instrument acts as a set of 780 interferometers. The CBI is located at Llano de Chajnantor, a high plateau in Chile at 16,800 feet, making it by far the most sophisticated scientific instrument ever used at such a high altitude. The telescope is so high, in fact, that members of the scientific team must each carry bottled oxygen to do their work.
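
(The figure of 780 interferometers follows from simple counting, as the sketch below shows: every pair of the 13 antennas forms a baseline, and each baseline is measured in each of the 10 frequency channels.)

    from math import comb

    # How 13 antennas and 10 frequency channels yield 780 simultaneous interferometers.
    baselines = comb(13, 2)        # 78 distinct antenna pairs
    print(baselines * 10)          # 780 baseline-channel combinations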

The upgrade of the CBI to polarization capability was supported by a generous grant from the Kavli Operating Institute, and the project is also the grateful recipient of continuing support from Barbara and Stanley Rawn Jr. The CBI is also supported by the National Science Foundation, the California Institute of Technology, and the Canadian Institute for Advanced Research, and has also received generous support from Maxine and Ronald Linde, Cecil and Sally Drinkward, and the Kavli Institute for Cosmological Physics at the University of Chicago.

In addition to the scientists mentioned above, today's Science Express paper is coauthored by C. Contaldi and J. L. Sievers of CITA; J. K. Cartwright and S. Padin, both of Caltech and the University of Chicago; B. S. Mason and M. Pospieszalski of the NRAO; C. Achermann, P. Altamirano, L. Bronfman, S. Casassus, and J. May, all of the University of Chile; C. Dickinson, J. Kovac, T. J. Pearson, and M. Shepherd of Caltech; W. L. Holzapfel of UC Berkeley; E. M. Leitch and C. Pryke of the University of Chicago; D. Pogosyan of the University of Toronto and the University of Alberta; and R. Bustos, R. Reeves, and S. Torres of the University of Concepción, Chile.

 

Writer: Robert Tindol
