Size Does Matter - When it Comes to Reducing Environmental Pollution

PASADENA, Calif.-When it comes to mitigating the harmful impacts of environmental pollution--size does matter . . . or, at least, that's the hypothesis that California Institute of Technology professors Janet Hering and Richard Flagan will be testing.

Hering is professor of environmental science and engineering and executive officer for Keck Laboratories. Flagan is executive officer of chemical engineering, the Irma and Ross McCollum Professor of Chemical Engineering, and professor of environmental science and engineering.

In a study funded by the Camille and Henry Dreyfus Foundation, Hering and Flagan will examine whether the effectiveness of iron nanoparticles in pollution remediation is influenced by their size. The $120,000 grant, under the Dreyfus Foundation's 2004 Postdoctoral Program in Environmental Chemistry, will be used to recruit a postdoctoral scientist to conduct research in environmental chemistry.

Specifically, the researchers will use the grant to examine effective strategies for the reduction and mitigation of environmental pollutants in aquatic ecosystems. Ultimately, the study seeks to help provide viable, cost-effective commercial technologies for the remediation of certain contaminants, including groundwater contaminants, chlorinated solvents, nitrates, pesticides, chemical by-products, manufacturing residues, and other industrial or inorganic contaminants.

The study, "Use of Vapor-Phase Synthesized Iron Nanoparticles to Examine Nanoscale Reactivity," will investigate whether the reactivity and effectiveness of iron nanoparticles in pollution mitigation are influenced by their size. The study will compare particles in different size classes to determine whether nanoparticles exhibit size-dependent enhancement of reactivity in the reduction of organic substrates once surface-area effects are accounted for.
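
For context, surface-area effects of this kind are commonly handled in the zero-valent iron literature by normalizing the observed rate to the amount of iron surface present in the system; the expression below is a hedged illustration of that standard practice, not necessarily the specific analysis Hering and Flagan will use. If a contaminant C disappears with pseudo-first-order kinetics, then

    \frac{d[C]}{dt} = -k_{obs}[C], \qquad k_{SA} = \frac{k_{obs}}{a_s \, \rho_m},

where a_s is the specific surface area of the particles (square meters per gram) and \rho_m is their mass loading (grams per liter). If smaller particles still show a larger surface-area-normalized rate constant k_{SA}, the enhancement can be attributed to particle size itself rather than simply to the extra surface area.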

Elemental iron [Fe(0)], or zero-valent iron, has been demonstrated to be an effective reductant for a wide range of environmental contaminants, including both organic and inorganic contaminants. Upon reaction with Fe(0), some contaminants can be transformed to products that are non-toxic or immobile. Fe(0) can be delivered to the subsurface environment by injection of Fe(0) nanoparticles.

If the results show that the size of Fe(0) nanoparticles does make a difference in their reactivity or effectiveness, the finding will have a significant effect on the application of Fe(0) materials in environmental remediation and will provide insight into the fundamental chemical properties and behavior of nanoparticles in these applications.

Created in 1946, the Camille and Henry Dreyfus Foundation bears the names of modern chemistry pioneers Camille Dreyfus and his brother, Henry. The foundation's mandate is "to advance the science of chemistry, chemical engineering, and related sciences as a means of improving human relations and circumstances throughout the world." The foundation directs much of its resources to the support of excellence in teaching and research by outstanding chemistry faculty at universities and colleges.

Writer: DWH

Caltech Launches New Information Initiative

PASADENA, Calif. — Information is everywhere. Most of us think about it in terms of mere facts--facts gleaned from a teacher or a colleague, from the media, or from a textbook or the Internet.

But there are other, near-infinite types of information--the instructions encoded in our genome that tell our cells when to divide and when to die, or the daily flow of data into the stock market that somehow motivates people to buy and sell.

Information constantly streams to scientists from around the world, and from other "worlds" as well, thanks to sensors and actuators in the sea or out in space.

What's needed is a way to harness and understand all of this data so that scientists and engineers can continue to unravel the secrets of nature and the human institutions in which we operate. In an unprecedented effort, the California Institute of Technology has launched a university-wide initiative called Information Science and Technology (IST)--drawing back the curtain on the nature of information itself and redefining the way we approach, understand, and use science and engineering. IST will cut across disciplines, eventually involving over 25 percent of all faculty and nearly 35 percent of students on campus, likely altering the Institute's intellectual and organizational landscape.

Caltech has committed to raising $100 million for IST as part of the Institute's five-year, $1.4 billion capital campaign. Nearly $50 million has been raised in the form of separate grants of $25 million from the Annenberg Foundation and $22.2 million from the Gordon and Betty Moore Foundation. The Annenberg Foundation gift will be used to construct the Walter and Leonore Annenberg Center for Information Science and Technology--a new building that will be the physical center of IST. The building will join the existing Watson and Moore laboratories in forming a core of buildings linking together IST researchers.

Funding from the Moore Foundation will provide seed money to establish four new interdisciplinary research centers within IST. These new centers will join two that already exist at Caltech, and together the six groups will anchor and organize Caltech's effort to lead the way in this new field.

IST evolved over the last 50 years from an activity that focused on enabling more efficient calculations to a major intellectual theme that spans disciplines in engineering and the sciences. While other universities created schools of computer science (or computer and information science), these are generally related to computer science and software--a limited view of information science and technology. At Caltech, IST serves as a new intellectual framework on which to build information-based research and instructional programs across the academic spectrum.

"To maintain preeminence in science, the U.S. needs new and unified ways of looking at, approaching, and exploiting information in and across the physical, biological, and social sciences, and engineering," says Jehoshua (Shuki) Bruck, the Gordon and Betty Moore Professor of Computation and Neural Systems and Electrical Engineering and the first director of IST. "Caltech is taking a leadership role by creating an Institute-wide initiative in the science and engineering of information. IST will transform the research and educational environment at Caltech and other universities around the world."

In the same way that the printing press heralded the start of the Renaissance, and the study of physics helped to foster the Industrial Revolution, technological advances in computation and communication in the 20th century have set the stage for the Age of Information. Yet, scientific and technological changes are accelerating so fast they are outpacing existing institutions such as schools, media, industry, and government--structures originally designed for the needs of the Industrial Age. "So we need a new intellectual framework to harness these new advances," says Bruck, "in order to provide for a stable and well-educated society that's prepared to meet the challenges of tomorrow."

"Some say biology is the science of the 21st century, but information science will provide the unity to all of the sciences," says Caltech president and Nobel Prize-winning biologist David Baltimore. "It will be like the physics of the 20th century in which Einstein went beyond the teachings of Newton--which were enough to put people on the moon--and allowed people's minds to reach into the atom or out into the cosmos. Information science, the understanding of what constitutes information, how it is transmitted, encoded, and retrieved, is in the throes of a revolution whose societal repercussions will be enormous. The new Albert Einstein has yet to emerge, but the time is ripe."

Annenberg Foundation Gift

The Annenberg gift is the first portion of a $100 million institutional commitment to IST, and is part of the Institute's capital campaign. Now in the design stage, the Annenberg Center is expected to be completed when the campaign ends in 2007.

"I am delighted that the Annenberg Foundation will be a part of this visionary enterprise," said Leonore Annenberg, foundation president and chairman. "As a publisher, broadcaster, diplomat, and philanthropist, Walter Annenberg was known for breaking new ground. Support for this important new initiative surely would have pleased him as much as it honors the work of the foundation."

Founded in 1989 by Walter H. Annenberg, the Annenberg Foundation exists to advance the public well-being through improved communication. As the principal means of achieving its goal, the foundation encourages the development of more effective ways to share ideas and knowledge.

Gordon and Betty Moore Foundation Gift

The Moore Foundation gift is part of a $300 million commitment the foundation made to Caltech in 2001.

The four centers funded by the Moore grant are the following: the Center for Biological Circuit Design, which will address how living things store, process, and share information; the Social and Information Sciences Center, which will investigate how social systems, such as markets, political processes, and organizations, efficiently process immense amounts of information and how this understanding can help to improve society; the Center for the Physics of Information, which will examine the physical qualities of information and will design the computers and materials for the next generation of information technology; and the Center for the Mathematics of Information, which will formulate a common understanding and language of information that unifies researchers from different fields.

The Moore Foundation seeks to develop outcome-based projects that will improve the quality of life for future generations. It organizes the majority of its grant-making around large-scale initiatives that concentrate on environmental conservation, science, higher education, and the San Francisco Bay Area.

Writer: JP

Observing the Roiling Earth

PASADENA, Calif. - In the 1960s the theory of plate tectonics rocked geology's world by establishing that the outermost 60 miles or so of our planet--the lithosphere--is divided into about a dozen rigid plates that crawl along by centimeters each year. Most manifestations of the earth's dynamics, earthquakes and volcanoes for example, occur along the boundaries of these plates.

As a model, the theory of plate tectonics continues to serve us well, says Jean-Philippe Avouac, a professor of geology at the California Institute of Technology. But while plate tectonics provides a powerful description of the large-scale deformation of the earth's lithosphere over millions of years, it doesn't explain the physical forces that drive the movements of the plates. Also, contrary to the theory, it's now known that plates are not perfectly rigid and that plate boundaries sometimes form broad fault zones with diffuse seismicity.

Now, thanks to a $13,254,000 grant from the Gordon and Betty Moore Foundation, Caltech has established the Tectonics Observatory, under the direction of Avouac, with the ultimate goal, he says, of "providing a new view of how and why the earth's crust is deforming over timescales ranging from a few tens of seconds, the typical duration of an earthquake, to several tens of millions of years."

But it's not the only goal. "Most of the outstanding questions in earth science concern processes that take place at the boundaries of the earth's tectonic plates," says Avouac, so the observatory's scientific efforts will be centered around major field studies at a few key plate boundaries in western North America, Sumatra, Central America, and Taiwan, with the goal of answering a number of questions, including:

--Tectonic plates move gradually when viewed on large timescales, but then sometimes undergo sharp "jerks" in speed and direction. What's the cause?

--Because earthquakes can be so damaging to humans, it's important to know: what physical parameters control their timing, location, and size?

--Subduction zones, where oceanic plates sink into the earth's mantle, are needed to accommodate and perhaps drive plate motion. How do these subduction zones originate and grow?

"We plan to take advantage of a number of new technologies that will allow us to measure deformation of the earth's crust and image the earth's interior with unprecedented accuracy," says Avouac. The bulk of the grant will be spent on these new technologies, along with acquiring data that will be used to observe and model the boundary zones. In addition to seismometers, other equipment and data that's needed will include space-based GPS, which will allow geologists to measure the relative velocity of two points on the earth's surface to within a few millimeters each year; satellite images to map displacements of broad areas of the ground's surface over time; geochemical fingerprinting methods to analyze and date rocks that have been brought to the surface by volcanic eruptions or erosion, thus helping to characterize the composition of the earth far below; and of course, massive computation to analyze all the data, along with advanced computational techniques, "to allow us to develop models at the scale of the global earth," says Avouac.

"The breakthroughs we will achieve will probably result from the interactions among the various disciplines that will contribute to the project," he says. "We've already begun our effort, for example, by imaging and monitoring seismic activity and crustal deformation along a major subduction zone in Mexico. As I speak, geologists are in the field and continuing to install what will be a total of 50 seismometers."

Few institutions are capable of mounting this kind of sustained, diverse effort on a single plate boundary, he says, or of mining data from multiple disciplines to create dynamic models. "That's what Caltech is capable of doing," says Avouac. "We hope to breed a new generation of earth scientist. The Tectonics Observatory will offer students an exceptional environment with access to all of the modern techniques and analytical tools in our field, along with the possibility of interacting with a group of faculty with an incredibly diversified expertise."

The Gordon and Betty Moore Foundation was established in September 2000 by Intel cofounder Gordon Moore and his wife, Betty. The foundation funds projects that will measurably improve the quality of life by creating positive outcomes for future generations. Grantmaking is concentrated in initiatives that support the Foundation's principal areas of concern: environmental conservation, science, higher education, and the San Francisco Bay Area.

MEDIA CONTACT: Mark Wheeler (626) 395-8733 wheel@caltech.edu

Visit the Caltech media relations web site: http://pr.caltech.edu/media

Writer: MW

Systems Biology Could Augur New Age for Predictive and Preventive Medicine

PASADENA, Calif./SEATTLE--The dream of monitoring a patient's physical condition through blood testing has long been realized. But how about detecting diseases in their very early stages, or evaluating how they are responding to treatment, with no more to work with than a drop of blood?

That dream is closer to realization than many of us think, according to several leading experts advocating a new approach known as systems biology. Writing in the current issue of the journal Science, Institute for Systems Biology immunologist and technologist Leroy Hood and California Institute of Technology chemist Jim Heath and their colleagues explain how a new approach to the way that biological information is gathered and processed could soon lead to breakthroughs in the prevention and early treatment of a number of diseases.

The lead author of the Science article is Leroy Hood, a former Caltech professor and now the founding director of the Institute for Systems Biology in Seattle. According to Hood, the focus of medicine in the next few years will shift from treating disease--often after it has already seriously compromised the patient's health--to preventing it before it even sets in.

Hood explains that systems biology essentially analyzes a living organism as if it were an electronic circuit. This approach requires a gigantic amount of information to be collected and processed, including the sequence of the organism's genome and the mRNAs and proteins that it generates. The object is to understand how all of these molecular components of the system are interrelated, and then predict how the mRNAs or proteins, for example, are affected by disturbances such as genetic mutations, infectious agents, or chemical carcinogens. Therefore, systems biology should be useful for diseases resulting from genetics as well as from the environment.

"Patients' individual genome sequences, or at least sections of them, may be part of their medical files, and routine blood tests will involve thousands of measurements to test for various diseases and genetic predispositions to other conditions," Hood says. "I'll guarantee you we'll see this predictive medicine in 10 years or so."

"In this paper, we first describe a predictive model of how a single-cell yeast organism works," Heath explains, adding that the model covers a metabolic process and draws on copious amounts of data, such as messenger RNA concentrations from all the yeast's 6,000 genes, protein-DNA interactions, and the like.

"The yeast model taught us many lessons for human disease," Heath says. "For example, when yeast is perturbed either genetically or through exposure to some molecule, the mRNAs and proteins that are generated by the yeast provide a fingerprint of the perturbation. In addition, many of those proteins are secreted. The lesson is that a disease, such as a very early-stage cancer, also triggers specific biological responses in people. Many of those responses lead to secreted proteins, and so the blood provides a powerful window for measuring the fingerprint of the early-stage disease."

Heath and his colleagues write in the Science article that, with a sufficient number of measurements, "one can presumably identify distinct patterns for each of the distinct types of a particular cancer, the various stages in the progression of each disease type, the partition of the disease into categories defined by critical therapeutic targets, and the measurement of how drugs alter the disease patterns. The key is that the more questions you want answered, the more measurements you need to make. It is the systems biology approach that defines what needs to be measured to answer the questions."

In other words, the systems biology approach should allow therapists to catch diseases much earlier and treat them much more effectively. "This allows you to imagine the pathway toward predictive medicine rather than reactive medicine, which is what we have now," Heath says.

About 100,000 measurements on yeast were required to construct a predictive network hypothesis, and the authors write that even 100,000,000 measurements do not yet enable such a hypothesis to be formulated for a human disease. In the conclusion of the Science article, the authors address the technologies that will be needed to fully realize the systems approach to medicine. Heath emphasizes that most of these technologies, ranging from microfluidics to nanotechnologies to molecular-imaging methods, have already been demonstrated, and some are already having a clinical impact. "It's not just a dream that we'll be diagnosing multiple diseases, including early-stage detection, from a fingerprick of blood," Heath says.

"Early-stage versions of these technologies will be demonstrated very soon."

The other authors of the paper are Michael E. Phelps of the David Geffen School of Medicine at UCLA, and Biaoyang Lin of the Institute for Systems Biology.

Writer: Robert Tindol

Caltech Biologists Pursue Promising New Approach in Treatment of HIV/AIDS and Cancer

PASADENA, Calif.—In response to the painfully slow progress in finding cures for AIDS and cancer, Caltech researchers are now investigating a promising new approach to the treatment of these diseases.

With a $1.5 million matching grant from the Skirball Foundation in New York, Caltech biologists have established the Engineering Immunity project, designed to create a novel immunological approach to treating--and even someday preventing--HIV infection and some cancers, such as melanoma.

The immune system provides humans with a powerful defense against infectious diseases--but sometimes it fails. Taking an innovative, integrated approach, the Engineering Immunity project will combine gene therapy, stem cell biology, and immunotherapy to arm the immune system, a methodology with groundbreaking potential for treating these diseases and others against which the immune system currently fails to provide a defense.

Caltech President David Baltimore, who won the Nobel Prize in 1975 for his work in virology and cancer research, stated, "The Engineering Immunity project advocates a new approach to therapy for AIDS and cancer with revolutionary implications for the treatment of these and many other diseases. It is an innovative research project that holds special significance for the future of biomedical sciences."

In the fight against HIV, the virus that causes AIDS, T-cell immunity and T-cell-focused therapies and vaccines have been widely investigated and pursued. However, antibodies often provide the best protection against viruses, and virtually all vaccines for other viral diseases are designed to elicit antibody-based immunity. Antibodies against HIV do appear during infection, but so far they have not provided a therapeutic advantage to most patients. Rare neutralizing antibodies have been identified, but they have not proven valuable because no general way to elicit their production in all patients has been found. Moreover, most of them are effective only at very high concentrations that are hard to maintain in a person by conventional means. Thus, early attempts to elicit antibody-based immunity against HIV have largely failed.

The Engineering Immunity approach relies on retroviruses, which are natural carriers of genes. Retrovirus vectors will be produced that encode antibodies found to be effective against HIV. Using these vectors, the Baltimore Laboratory at Caltech, in collaboration with Caltech structural biologist Pamela Bjorkman, will introduce specific genes into stem cells. These genes will encode specificity molecules on the immune cells, thereby arming the immune cells to kill selected agents or cells--that is, the cells harboring HIV or particular cancer cells.

Ultimately, the Engineering Immunity initiative will provide a new route to the production of antibodies with therapeutic, and even protective, ability--a potential path toward curing AIDS, melanoma, and other diseases.

The Skirball Foundation, an independent foundation created in 1950 by Jack H. Skirball, is dedicated primarily to medical research and care, educational and social needs of disadvantaged children, and advancing the highest values of the Jewish heritage. Among the many institutions that the Foundation has supported are the Skirball Cultural Center, the Salk Institute, the Venice Family Clinic, the Jewish Childcare Association in New York City, and the Skirball Institute of Biomolecular Medicine at New York University.

###

Contact: Deborah Williams-Hedges (626) 395-3227 debwms@caltech.edu

Visit the Caltech Media Relations Web site at: http://pr.caltech.edu/media

Writer: DWH

Fuel Cells: the Next Generation

PASADENA, Calif. — For several years now the Department of Energy (DOE) has been urging the fuel cell community to solve a major problem in the design of solid oxide fuel cells (SOFCs): heat. Such fuel cells could someday provide reliable power for homes and industry, dramatically cutting greenhouse gas emissions as well as other pollutants.

But SOFCs run hot, at temperatures as high as 1000 degrees Celsius (about 1800 degrees Fahrenheit). They're efficient at such temperatures, but only a few costly materials can withstand the heat, which drives up the price and is the reason the DOE is pushing for lower operating temperatures.

Sossina Haile, an associate professor of materials science and chemical engineering at the California Institute of Technology, is an expert in fuel cells, and she has been whittling away at the heat problem for years. Now she and her colleagues have not only solved the problem, they've smashed it. They've brought the temperature down to about 600 degrees Celsius (1100 degrees Fahrenheit), while achieving more power output than others are achieving at the higher temperatures--about 1 watt per square centimeter of fuel cell area.

They accomplished this by changing the chemical composition of one component of the fuel cell, the cathode. The cathode is where air is fed into the fuel cell, and it's where the oxygen is electrochemically reduced to oxygen ions. The oxygen ions then migrate across the electrolyte (which conducts the ions) to react with fuel at the anode, another fuel cell component. The electrochemical reduction of oxygen is an essential step in the fuel cell's generation of power. But the problem with running solid oxide fuel cells at 500 to 700 degrees Celsius is that the cathode becomes inactive when the temperature falls below about 800 degrees Celsius.
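
For readers who want the chemistry spelled out, the process just described corresponds to the standard textbook half-reactions for a solid oxide fuel cell running on hydrogen (shown here only as a schematic; practical SOFCs can also run on hydrocarbon fuels):

    \text{cathode: } O_2 + 4e^- \rightarrow 2\,O^{2-} \qquad \text{anode: } 2\,H_2 + 2\,O^{2-} \rightarrow 2\,H_2O + 4e^- \qquad \text{overall: } 2\,H_2 + O_2 \rightarrow 2\,H_2O

The oxide ions produced at the cathode are the very species that must cross the electrolyte, which is why a sluggish cathode starves the whole cell at lower temperatures.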

Haile and postdoctoral scholar Zongping Shao's insight was to switch out the conventional cathode and replace it with a compound whose long chemical formula is guaranteed to strike fear into the heart of every undergraduate, but which is abbreviated "BSCF."

What BSCF can do that standard cathodes can't is to allow the oxygen to diffuse through it very rapidly. "In conventional cathodes, the oxygen diffuses slowly, so that even if the electrochemical reaction is fast, the oxygen ions are slow in getting to the electrolyte," says Haile. "In BSCF the electrochemical reaction is fast and the oxygen ion transport is fast. You have the best combination of properties." This combination is what gives the very high power outputs from Haile's fuel cells.

The work was reported in a recent issue of the journal Nature. Because they are using relatively conventional anodes and electrolytes with this new cathode, says Haile, it would be easy to switch out cathodes in existing fuel cells. That will probably be their next step, says Haile: to partner with a company to produce the next generation of solid-oxide fuel cells.

Writer: MW

CBI Reveals Motion in the Remotest Seeds of Galaxy Clusters in the Very Early Universe

PASADENA, Calif.--Cosmologists from the California Institute of Technology have used observations probing back to the remote epoch of the universe when atoms were first forming to detect movements among the seeds that gave rise to clusters of galaxies. The new results show the motion of primordial matter on its way to forming galaxy clusters and superclusters. The observations were obtained with an instrument high in the Chilean Andes known as the Cosmic Background Imager (CBI), and they provide new confidence in the accuracy of the standard model of the early universe in which rapid inflation occurred a brief instant after the Big Bang.

The novel feature of these polarization observations is that they reveal directly the seeds of galaxy clusters and their motions as they proceeded to form the first clusters of galaxies.

Reporting in the October 7 online edition of Science Express, Anthony Readhead, Caltech's Rawn Professor of Astronomy and principal investigator on the CBI project, and his team say the new polarization results provide strong support for the standard model of the universe as a place in which dark matter and dark energy are much more prevalent than everyday matter as we know it--a picture that poses a major problem for physics. A companion paper describing early polarization observations with the CBI has been submitted to the Astrophysical Journal.

The cosmic background observed by the CBI originates from the era just 400,000 years after the Big Bang and provides a wealth of information on the nature of the universe. At this remote epoch none of the familiar structures of the universe existed--there were no galaxies, stars, or planets. Instead there were only tiny density fluctuations, and these were the seeds out of which galaxies and stars formed under the hand of gravity.

Instruments prior to the CBI had detected fluctuations on large angular scales, corresponding to masses much larger than superclusters of galaxies. The high resolution of the CBI made it possible, in January 2000, to observe for the first time the seeds of the structures we see around us in the universe today.

The expanding universe cooled, and by 400,000 years after the Big Bang it was cool enough for electrons and protons to combine to form atoms. Before this time photons could not travel far before colliding with an electron, and the universe was like a dense fog; at this point the universe became transparent, and since that time the photons have streamed freely across the universe to reach our telescopes today, 13.8 billion years later. Thus observations of the microwave background provide a snapshot of the universe as it was just 400,000 years after the Big Bang--long before the formation of the first galaxies, stars, and planets.

The new data were collected by the CBI between September 2002 and May 2004, and cover four patches of sky, encompassing a total area three hundred times the size of the moon and showing fine details only a fraction of the size of the moon. The new results are based on a property of light called polarization. This is a property that can be demonstrated easily with a pair of polarizing sunglasses. If one looks at light reflected off a pond through such sunglasses and then rotates the sunglasses, one sees the reflected light varying in brightness. This is because the reflected light is polarized, and the polarizing sunglasses only transmit light whose polarization is properly aligned with the glasses. The CBI likewise picks out the polarized light, and it is the details of this light that reveal the motion of the seeds of galaxy clusters.
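
The sunglasses demonstration is captured by Malus's law, a standard optics result included here only to make the analogy concrete: an ideal polarizer transmits an intensity

    I = I_0 \cos^2\theta,

where \theta is the angle between the light's polarization direction and the polarizer's axis--which is why the brightness of the reflected glare rises and falls as the glasses are rotated.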

In the total intensity we see a series of peaks and valleys, where the peaks are successive harmonics of a fundamental "tone." In the polarized emission we also see a series of peaks and valleys, but the peaks in the polarized emission coincide with the valleys in the total intensity, and vice versa. In other words, the polarized emission is exactly out of step with the total intensity. This property of the polarized emission being out of step with the total intensity indicates that the polarized emission arises from the motion of the material.
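
Schematically--this is the standard textbook picture of acoustic oscillations, not a detail drawn from the CBI paper itself--the temperature signal traces the density of the photon-baryon fluid at last scattering while the polarization traces its velocity, and the two oscillate a quarter-cycle apart:

    \Delta T \propto \cos(k r_s), \qquad \text{polarization} \propto \sin(k r_s),

where k is the wavenumber of a fluctuation and r_s is the sound horizon at last scattering. The sine peaks exactly where the cosine has valleys, which is the out-of-step pattern described above.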

The first detection of polarized emission by the Degree Angular Scale Interferometer (DASI), the sister project of the CBI, in 2002 provided dramatic evidence of motion in the early universe, as did the measurements by the Wilkinson Microwave Anisotropy Probe (WMAP) in 2003. The CBI results announced today significantly augment these earlier findings by demonstrating directly, and on the small scales corresponding to galaxy clusters, that the polarized emission is out of step with the total intensity.

Other data on the cosmic microwave background polarization were released just two weeks ago by the DASI team, whose three years of results show further compelling evidence that the polarization is indeed due to the cosmic background and is not contaminated by radiation from the Milky Way. The results of these two sister projects therefore complement each other beautifully, as was the intention of Readhead and John Carlstrom, the principal investigator of DASI and a coauthor on the CBI paper, when they planned these two instruments a decade ago.

According to Readhead, "Physics has no satisfactory explanation for the dark energy which dominates the universe. This problem presents the most serious challenge to fundamental physics since the quantum and relativistic revolutions of a century ago. The successes of these polarization experiments give confidence in our ability to probe fine details of the polarized cosmic background, which will eventually throw light on the nature of this dark energy."

"The success of these polarization experiments has opened a new window for exploring the universe which may allow us to probe the first instants of the universe through observations of gravitational waves from the epoch of inflation," says Carlstrom.

The analysis of the CBI data is carried out in collaboration with groups at the National Radio Astronomy Observatory (NRAO) and at the Canadian Institute for Theoretical Astrophysics (CITA).

"This is truly an exciting time in cosmological research, with a remarkable convergence of theory and observation, a universe full of mysteries such as dark matter and dark energy, and a fantastic array of new technology--there is tremendous potential for fundamental discoveries here" says Steve Myers of the NRAO, a coauthor and key member of the CBI team from its inception.

According to Richard Bond, director of CITA and a coauthor of the paper, "As a theorist in the early eighties, when we were first showing that the magnitude of the cosmic microwave background polarization would likely be a factor of a hundred down in power from the minute temperature variations that were themselves a heroic effort to discover, it seemed wishful thinking that even in some far distant future such minute signals would be revealed. With these polarization detections, the wished-for has become reality, thanks to remarkable technological advances in experiments such as CBI. It has been our privilege at CITA to be fully engaged as members of the CBI team in unveiling these signals and interpreting their cosmological significance for what has emerged as the standard model of cosmic structure formation and evolution."

The next step for Readhead and his CBI team will be to refine these polarization observations significantly by taking more data, and to test whether or not the polarized emission is exactly out of step with the total intensity with the goal of finding some clues to the nature of the dark matter and dark energy.

The CBI is a microwave telescope array comprising 13 separate antennas, each about three feet in diameter and operating in 10 frequency channels, set up in concert so that the entire instrument acts as a set of 780 interferometers. The CBI is located at Llano de Chajnantor, a high plateau in Chile at 16,800 feet, making it by far the most sophisticated scientific instrument ever used at such high altitudes. The telescope is so high, in fact, that members of the scientific team must each carry bottled oxygen to do the work.
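
The figure of 780 follows from simple counting--an inference from the numbers quoted here rather than an explicit statement in the release: 13 antennas form 13 x 12 / 2 = 78 distinct pairs (baselines), and each pair observing in 10 frequency channels gives 78 x 10 = 780 independent interferometers.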

The upgrade of the CBI to polarization capability was supported by a generous grant from the Kavli Operating Institute, and the project is also the grateful recipient of continuing support from Barbara and Stanley Rawn Jr. The CBI is also supported by the National Science Foundation, the California Institute of Technology, and the Canadian Institute for Advanced Research, and has also received generous support from Maxine and Ronald Linde, Cecil and Sally Drinkward, and the Kavli Institute for Cosmological Physics at the University of Chicago.

In addition to the scientists mentioned above, today's Science Express paper is coauthored by C. Contaldi and J. L. Sievers of CITA; J. K. Cartwright and S. Padin, both of Caltech and the University of Chicago; B. S. Mason and M. Pospieszalski of the NRAO; C. Achermann, P. Altamirano, L. Bronfman, S. Casassus, and J. May, all of the University of Chile; C. Dickinson, J. Kovac, T. J. Pearson, and M. Shepherd of Caltech; W. L. Holzapfel of UC Berkeley; E. M. Leitch and C. Pryke of the University of Chicago; D. Pogosyan of the University of Toronto and the University of Alberta; and R. Bustos, R. Reeves, and S. Torres of the University of Concepción, Chile.

 

Writer: Robert Tindol

Memory Lane in the Brain

PASADENA, Calif.- Biologist Erin Schuman is interested in how memories are formed--or forgotten. The landscape that Schuman, a professor of biology at the California Institute of Technology, explores is the hippocampus, the part of the brain known to be crucial for memory in humans and other animals.

In 2002, Schuman and Miguel Remondes, her graduate student, published a paper in the journal Nature that suggested a possible role for a well-known but poorly understood part of the brain known as the temporoammonic (TA) pathway. Using rat hippocampal slices, they suggested two possible roles for the TA pathway that were not previously known: to serve as a memory gatekeeper that can either enhance or diminish memories, and to provide information to help animals know where they are in their environments.

The researchers' next step was to prove their theories by looking at a possible role for the TA in memory at a behavioral level. That is, says Remondes, now a postdoctoral fellow at MIT, "to do the real test."

To understand how memories are formed, many scientists have focused on the "trisynaptic circuit," which involves three areas of the hippocampus: input from the senses is first sent from the cortex to the dentate gyrus, where this signal is processed by two sets of synapses, then sent back to the cortex. That's the circuit. An often overlooked separate input to the hippocampus, though, is the TA pathway. It makes direct contact with the neurons that are at the last station in the trisynaptic circuit, thus short-circuiting the traditional trisynaptic pathway.

Reporting in the October 7 issue of the journal Nature, Remondes and Schuman, who is also an associate investigator for the Howard Hughes Medical Institute, now show they were correct in their belief that the TA pathway is important in spatial, or location, memory. The scientists used rats as their experimental animals and the Morris Water Maze, a standard test of location memory in rodents. The animals swim in a pool of opaque water until they find a hidden goal--a platform that allows them to escape the water. To find the platform, the animals rely on the geometrical relationships of cues away from the pool (e.g., on the walls of the maze). In other words, says Remondes, "they have to navigate and remember where the platform is in order to escape the water."

The researchers tested both short-term (24 hours) and long-term memory (four weeks). The TA pathway was lesioned (disabled) in one set of rats; another set was used as a control. Having learned the location of the platform, both sets of rats still remembered where it was 24 hours later. But when tested four weeks later, only the control rats remembered where it was. The lesioned rats forgot, which showed that the TA pathway played some role in the retention of long-term memories. But what was the role?

"It led to a second question," says Schuman. "Because long-term memories require something called consolidation, an exchange of information between the cortex and hippocampus, we wanted to know if the TA pathway was working in the acquisition phase of memory or in its consolidation."

Using two other groups of rats, the pair conducted a second set of tests. After the rats' memory of the platform was confirmed at 24 hours, one group was immediately lesioned. These animals lost their long-term memory when tested four weeks later, indicating to Schuman and Remondes that ongoing TA pathway activity was required in the days after learning to stabilize, or consolidate, the long-term memory.

The second group of rats was also lesioned, but not until three weeks later. The researchers found that this group remembered the platform's location, showing that their memory had already been adequately consolidated after three weeks. This proved that the TA pathway is required to consolidate long-term location memory.

"These data indicate there must be a dialogue between the hippocampus and the cortex during long-term memory consolidation," says Schuman. "Clearly, the TA pathway plays an important role in this discussion." Further, she notes, "understanding the mechanisms of memory formation and retention may shed light on diseases like Alzheimer's, where memory is impaired."

Writer: MW

New Target for Future Therapeutic Drugs

PASADENA, Calif. - "Sometimes letting nature tell you what's important is the better way to go," says Raymond Deshaies, an associate professor of biology at the California Institute of Technology. Deshaies is referring to new work to come out of his lab and the lab of Randall King at Harvard that defies conventional thinking--they've discovered a chemical that stops a key cell function, but, more importantly, suggests a new possible target within a cell, once thought to be untenable, for future therapeutic drugs.

In a report in this week's issue of the journal Science, lead author Rati Verma, a Howard Hughes Medical Institute (HHMI) research specialist in the Deshaies lab; Deshaies, who is also an assistant investigator for the HHMI; and nine other authors report that a small molecule called ubistatin blocked an important step in the so-called cell cycle, the process fundamental to life in which a cell makes duplicate copies of its own DNA for distribution to two daughter cells. Knowing how to stop cell duplication is critical in combating diseases like cancer, in which mutated cells go out of control and proliferate madly. Further, ubistatin blocked the cell cycle by preventing two proteins from interacting. Prior to this work, it was thought unlikely that a compound with a low molecular weight like ubistatin--or any future drug--would have much impact on the interaction of proteins with each other.

While ubistatin has other properties that preclude it from being a drug candidate, its stoppage of the cell cycle provides an important clue for future drug development, says Deshaies. "We've found a chemical Achilles' heel in this cell pathway, at least from the viewpoint of these small molecules that comprise most therapeutic drugs."

Because the cell cycle is maddeningly complex, researchers usually pick a single pathway (a pathway is a series of chemical events within a cell that perform some task), then try to make a chemical to block it. They may find such a chemical, but often find it difficult to discover where in the pathway--the target--their drug hit. Finding the target is like finding the proverbial needle in a haystack.

Deshaies's colleague Rati Verma found the needle. Instead of using the typical "top down" approach of starting with a specific target and then looking for a drug to block it, the researchers took a "bottom up" approach: start with a drug and then search for the target it blocks. They decided to test a large number of molecules to see if any of them might block any step in one particular pathway, the ubiquitin-proteasome (UP) pathway. Within the cell cycle, when a protein's job is done, a chain of small proteins called ubiquitin attaches to it. That serves as a signal to a large protein complex called the proteasome. The proteasome, says Deshaies, is the biological equivalent of a Cuisinart. "It attaches to these ubiquitin-marked proteins, then ingests them and chews them up."

The researchers examined an entire cell, specifically that of a frog's egg. The King group decided to screen 110,000 molecules to see if any had an impact on the cell. First, they weeded out those molecules that had no effect on cellular function in the UP pathway. King attached a molecule of luciferase ("the stuff that makes a firefly light up," says Deshaies) to certain proteins that are normally destroyed during cell division. Next, he added this newly created protein (now a readily detectable biological "flashlight") to droplets of cellular material extracted from the frog's egg that had been placed in individual chambers. As the egg extract conducted its normal cell division, the luciferase flashlight was destroyed and the chambers went dark. That meant those proteins had been destroyed as part of the normal progress of the cell cycle. He then separately added the 110,000 small molecules to see if any of them would prevent the loss of the luciferase--essentially looking for a lit-up reaction chamber in a field of darkness.

Using this approach, the researchers eventually narrowed the molecules they were testing down to a few that were operating in a specific part of the pathway--downstream from where ubiquitin attaches to the soon-to-be doomed protein, but before the proteasome ingested and chewed it up. But given that numerous proteins are involved in this process, the question remained--where specifically was the molecule they were testing working? In short, where was the target?

To find out, Deshaies turned to work they had done over the last five years with ubiquitin, which examined how it interacted with various other proteins, including proteasome. Through a process of elimination, says Deshaies, "we figured out that these small molecules called ubistatins were blocking the recognition of the ubiquitin chain by the proteasome." Graphic evidence for how this occurs was provided by a 'picture' taken by David Fushman at the University of Maryland with a nuclear magnetic resonance spectrometer.

This step blocked by ubistatin involves a protein-protein interaction, a surprise to Deshaies. "One interesting thing about our discovery is that it is further evidence that you can affect a protein-protein interaction with a small molecule. The conventional thinking was that if you look at a footprint of a drug binding a protein, the drugs are small, but the footprint that corresponds to one protein binding to another is big. So most people thought that the idea of trying to block the huge footprint of protein-protein interaction with a tiny drug was extremely unlikely. So if I were asked to predict what we would find, I would never have proposed that a drug could prevent the ubiquitin chain from binding to the proteasome, because I was also influenced by this conventional wisdom."

Writer: MW

Caltech Bioinformatics Experts Develop New Literature Search Engine for Biologists

PASADENA, Calif.—When it comes to finding a used book on the Internet, one merely needs to Google the title, and a few suitable items for sale will soon be just a click away. But for the biologist or medical researcher looking for information on how two nematode genes interrelate in hopes of better understanding human disease, there is a clear need for a more focused search engine.

Bioinformatics experts from the California Institute of Technology are formally announcing today the Textpresso search engine, which they hope will revolutionize the way that genetic information is retrieved by researchers worldwide. The Textpresso search engine is specifically built to serve researchers who work on the small worm known as C. elegans, but the basic design should lead to the creation of new search engines for researchers who specialize in other living organisms that are intensively studied.

In the current issue of the journal PLoS Biology, published by the Public Library of Science, Caltech biology professor Paul Sternberg and his colleagues--research associate Hans-Michael Muller and bioinformatics specialist Eimear Kenny--write that the new "text-mining system for scientific literature" will be invaluable to specialists trying to cope with the vast amount of information now available on C. elegans. This information has grown enormously in recent years, thanks to large-scale gene-sequencing initiatives as well as more traditional small-scale projects by individual researchers. As a result, the need for a way to scan this vast literature has become much more pressing.

"Textpresso gives me, as a researcher, more time to actually read papers because I don't have to skim papers anymore," says Sternberg, who is leader of the federally funded WormBase project that has already put online the entire genome sequence of C. elegans and the closely related organism C. briggsae, as well as genes for some 20 other nematode species.

The four-year-old WormBase project, a linchpin in the worldwide effort to better understand how genes interrelate, also makes a host of other information freely available in addition to the 100.2 million base-pairs that make up the millimeter-long worm's genome. There are now 28,000 gene-disruption experiments in WormBase, along with 2 million DNA expression ("chip") microarray observations, as well as detailed information on the expression of more than 1,700 of the worm's 20,000 genes. The Textpresso search engine is a logical product for the WormBase team to develop in the ongoing quest to put genetic information to work in curing and preventing human disease.

Lest anyone assume that the genes of a millimeter-long nematode have little to do with humans, it should be pointed out that the two organisms are similar in about 40 percent of their genes. A very real motivation for funding the genome sequencing of the fruit fly, the small mustardlike plant known as Arabidopsis, the chimp, and various other species has been the expectation of finding underlying common mechanisms.

Thus, a cancer researcher who discovers that a certain gene is expressed in cancer cells can use the WormBase to see if the gene exists in nematodes, and if so, what is known about the gene's function. And now that Textpresso is available, the researcher can do so much more efficiently.

"The idea is distilling down the information so it can be extracted easier," says Muller, the lead author of the paper and codeveloper of Textpresso with Kenny. The idea for the name of the search engine, in fact, comes from its resemblance to "espresso," which is a process used to get the caffeine and flavor out of coffee in a minimal volume.

According to Kenny, the search engine is designed with a special kind of search in mind, which establishes categories of terms and organizes them as an ontology--that is, a catalog of types of objects and concepts and their relationships. For example, if the researcher wants to find out whether any other researcher has worked on the relationship between the nematode gene called "lin-12" and the anchor cell, then typing the two terms into a conventional search engine like Google results in more than 400 hits. And if the researcher wants to know which genes are important in the anchor cell, the task is even more arduous. But Textpresso is designed to get the information in a much simpler, more efficient, more straightforward way.

Textpresso is a text-processing system that splits research papers into sentences, and sentences into words or phrases. All words and phrases are labeled so that they are searchable, and the labels are then condensed into 33 ontological categories. So far, the database includes 4,420 scientific papers on C. elegans, as well as bibliographic information from WormBase, information on various scientific meetings, the "Worm Breeder's Gazette," and various other links and WormBase information. Therefore, the engine already searches through millions of sentences to allow researchers to find a paper of interest or information of interest with great efficiency.
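
As a rough sketch of the kind of category-aware, sentence-level retrieval described above--a minimal illustration of the general idea in Python, not the actual Textpresso code; the tiny lexicon, category labels, and example sentences are invented for the demonstration:

    import re

    # Hypothetical mini-lexicon mapping terms to ontological categories
    # (the real Textpresso condenses its labels into 33 categories).
    LEXICON = {
        "lin-12": "gene",
        "anchor cell": "cell",
        "regulates": "regulation",
    }

    def categories_in(sentence):
        """Return the set of categories whose terms appear in the sentence."""
        found = set()
        for term, category in LEXICON.items():
            if re.search(re.escape(term), sentence, flags=re.IGNORECASE):
                found.add(category)
        return found

    def search(sentences, required):
        """Return sentences containing at least one term from every required category."""
        return [s for s in sentences if required <= categories_in(s)]

    corpus = [
        "lin-12 activity regulates the fate of the anchor cell.",
        "The anchor cell induces vulval development.",
    ]

    # Find sentences that mention both a gene and a cell type.
    for hit in search(corpus, {"gene", "cell"}):
        print(hit)

Because matching happens within individual sentences rather than across whole documents, a query such as the lin-12/anchor-cell example above returns only sentences in which the two categories actually co-occur.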

Finally, the Textpresso search engine should be a useful prototype for search engines to serve other biological databases--some of which have even larger piles of data for the specialist to cope with. "Yeast currently has 25,000 papers," Kenny says.

Textpresso can be accessed at www.textpresso.org or via WormBase at www.wormbase.org.

Writer: Robert Tindol
