New Study Suggests That State Proposal Would Drive Electricity Prices Higher

PASADENA, Calif.--A new study on the organization of the wholesale electricity market, conducted by California Institute of Technology and Purdue University economists, suggests that a plan being considered by the California Energy Commission (CEC) to require electric utility companies to make public their procurement strategies would result in higher costs for utility customers.

The study uses new laboratory experimental techniques that were developed by Caltech Professor Charles Plott to study the intricate ways in which the basic laws of supply and demand work in different forms of market organization. Use of the techniques has spread rapidly around the world. One of the first to use laboratory experimental methods in economics was Caltech alumnus Vernon Smith, who was awarded the Nobel Prize for his work.

The proposal being considered by the CEC would require a utility company to disclose its electricity needs, as dictated by its customers, to the companies from which it must buy that electricity. While the workings of the law of supply and demand are intricate and present many challenges to science, the principles operating in this case are rather transparent. Common sense tells us that there are situations in life where letting our competitors in on our game plan is a sure way of decreasing our advantage.

For example, if you're a quarterback, the best way to make sure the fans see your team score a bunch of exciting touchdowns is certainly not to invite opposing team members into your huddle. Just as you know you should withhold information from the other football team, you also know that you should hide your cards from your poker competitors, and that you should avoid telling the used-car salesman how much money you can spend on a car.

Surprisingly, this common-sense approach to withholding information in competitive situations is still open to debate in the world of resource regulation. The proposal currently being considered by the CEC would require an openness that may seem like a good idea, but the Caltech and Purdue research shows that it flies in the face of science.

"A presumption exists in regulatory circles that openness of economic processes promotes efficiency in protecting the public interest," says Charles Plott, who is the Harkness Professor of Economics and Political Science at Caltech. "That may be true in political and bureaucratic processes, but it's not true in markets."

Plott and Timothy Cason of Purdue are the authors of a new study in the journal Economic Inquiry showing that forced disclosure of information is bad for consumers in utility markets. Their work addresses, in part, the CEC's announcement that it "does not believe that the California ratepayers will be harmed by a more transparent system."

Plott says that this is a long-standing and fallacious assumption that contradicts the basic laws of supply and demand. Nonetheless, it seems to persist because of a confusion between the desire for information that is characteristic of regulators and the efficient workings of a market.

"At face value, openness sounds good," explains Plott. "The argument is that the public needs to know as much as possible, and that by knowing more information the public is better able to monitor a company's behavior.

"But this is just not true, and common sense eventually tells you so," Plott says. "If you think about it, forcing a utility to reveal information about its plans about procurement of power from the wholesale markets doesn't make any more sense than forcing one player to play cards face-up in a poker game."

But the science argues against such disclosure, too, Plott says. Laboratory results from Plott's Caltech experimental economics lab show that forcing the utilities to reveal confidential information regarding their energy demands to suppliers will lead to higher prices for the consumer.

In short, if the rules are changed, California consumers can expect to pay higher average prices for electricity, Plott says. The exact amount of the price increase will be dictated by events that are unknown now, but it is easy to imagine situations in which prices could rise on the order of 7 to 8 percent, and just as easy to imagine circumstances in which the impact of the disclosures could increase prices two or three times more. Under no circumstances would the disclosure lower prices.

In the experiments described in the Economic Inquiry paper, the researchers set up the procedure so that the volunteer test subjects would be financially motivated. The experiments were rather involved, but the objective was to test the influence of the wholesale power supplier's possession of pertinent information on the eventual price.

The experimental work showed that the manipulation of information strongly controlled the movement of the pricing equilibrium: prices always shifted in favor of the informed side. In everyday English, this means that a lower price results when buyers in a competitive market are not forced to "tip their hands."

Or to put it another way, the current system that requires each competitor to guess what the other is doing will result in their trying to beat each other out for the lowest price. The party that benefits from the competition is the consumer.

On the other hand, disclosure of information results in a lack of competition. The likely result is a higher price for the consumer, which could be especially burdensome in the future if the supply and demand for power is as unpredictable as it has been in the last couple of years. In fact, the paper concludes, the disclosure of information could work even greater hardships on the public if demand is unpredictable.

The title of the paper is "Forced Information Disclosure and the Fallacy of Transparency in Markets." The paper will appear in an upcoming issue.




Robert Tindol

Caltech Neuroscientists Unlock Secrets of How the Brain Is Wired for Sex

PASADENA--There are two brain structures that a mouse just can't do without when it comes to hooking up with the mate of its dreams--and trying to stay off the lunch menu of the neighborhood cat. These are the amygdala, which is involved in the initial response to cues that signal love or war, and the hypothalamus, which coordinates the innate reproductive or defensive behaviors triggered by these cues.

Now, neuroscientists have traced out the wiring between the amygdala and hypothalamus, and think they may have identified the genes involved in laying down the wiring itself. The researchers have also made inroads in understanding how the circuitry works to make behavioral decisions, such as when a mouse is confronted simultaneously with an opportunity to reproduce and an imminent threat.

Reporting in the May 19 issue of the journal Neuron, David Anderson, Caltech's Roger W. Sperry Professor of Biology and a Howard Hughes Medical Institute investigator, his graduate student Gloria Choi, and their colleagues describe their discovery that the neural pathway between the amygdala and hypothalamus thought to govern reproductive behaviors is marked by a gene with the rather unromantic name of Lhx6.

For a confirmation that their work was on track, the researchers checked to see what the suspected neurons were doing when the mice were sexually aroused. In male mice, the smell of female mouse urine containing pheromones was already known to be a sexual stimulus, evoking such behaviors as ultrasonic vocalization, a sort of "courtship song." Therefore, the detection of neural activity in the pathway when the mouse smelled the pheromones was the giveaway.

The idea that Lhx6 actually specifies the wiring of the pathway is still based on inference, because when the researchers knocked out the gene, the mutation caused mouse embryos to die of other causes too early to detect an effect on brain wiring. But the Lhx6 gene encodes a transcription factor in a family of genes whose members are known to control the pathfinding of axons, which are tiny wires that jut out from neurons and send messages to other neurons.

The pathway between the amygdala and hypothalamus that is involved in danger avoidance appears to be marked by other genes in the same family, called Lhx9 and Lhx5. However, the function of the circuits marked by these factors is not as clear, because a test involving smells to confirm the pathways was more ambiguous than the one involving sexual attraction. The smell of a cat did not clearly light up Lhx9- or Lhx5-positive cells. Nevertheless, the fact that those cells are found in brain regions implicated in defensive behaviors suggests they might be involved in other forms of behaviors, such as aggression between male mice.

The researchers also succeeded in locating the part of the mouse brain where a circuit-overriding mechanism exists when a mouse is both exposed to a potential mate and perceives danger. This wiring is a place in the hypothalamus where the pathways involved in reproduction and danger avoidance converge. The details of the way the axons are laid down show that a mouse is clearly hard-wired to get out of harm's way, even though a mating opportunity simultaneously presents itself.

"We also have a behavioral confirmation, because it is known that male mice 'sing' in an ultrasonic frequency when they're sexually attracted," Anderson explains. "But when they're exposed to danger signals like predator odors, they freeze or hide.

"When we exposed the mice to both cat odor and female urine simultaneously, the male mice stopped their singing, as we predicted from the wiring diagram," he says. "So the asymmetry in the cross-talk suggests that the system is prioritized for survival first, mating second."

The inevitable question is whether this applies to humans as well. Anderson's answer is that similarities are likely, and that the same genes may even be involved.

"The brains of mice and humans have both of these structures, and we, like mice, are likely to have some hard-wired circuits for reproductive behavior and for defense," he says. "So it's not unreasonable to assume that some of the genes involved in these behaviors in mice are also involved in humans."

However, humans can also make conscious decisions and override the hard-wired circuitry. For example, two teenagers locked in an amorous embrace in a theater can ignore a horrid monster on the screen and continue with the activity at hand. In real-life circumstances, they would be more inclined to postpone the groping until they were out of danger.

"We obviously have the conscious ability to interrupt the circuit-overriding mechanism, to see if the threat is really important," Anderson says.

Gloria Choi, a doctoral student in biology, did most of the lab work involved in the study. The other collaborators are Hongwei Dong and Larry Swanson, a professor at USC who in the past has comprehensively mapped the neural wiring of the rat brain, and Andrew Murphy, David Valenzuela, and George Yancopoulos at Regeneron Pharmaceuticals, in Tarrytown, New York, who generated the genetically modified mice using a new high-throughput system that they developed, called Velocigene.



Robert Tindol

Research on Sumatran Earthquakes Uncovers New Mysteries about Workings of Earth

PASADENA, Calif.--The Sumatra-Andaman earthquake of December 26 was an unmitigated human disaster. But three new papers by an international group of experts show that the huge data return could help scientists better understand extremely large earthquakes and the disastrous tsunamis that can be associated with them.

Appearing in a themed issue of this week's journal Science, the three papers are all co-authored by California Institute of Technology seismologists. The papers describe in unprecedented detail the rupture process of the magnitude-9 earthquake, the nature of the faulting, and the global oscillations that resulted when the earthquake "delivered a hammer blow to our planet." The work also shows that the odd sequence of ground motions in the Andaman Islands will motivate geophysicists to further investigate the physical processes involved in earthquakes.

"For the first time it is possible to do a thorough seismological study of a magnitude-9 earthquake," says Hiroo Kanamori, who is the Smits Professor of Geophysics at Caltech and a co-author of all three papers. "Since the occurrence of similar great earthquakes in the 1960s, seismology has made good progress in instrumentation, theory, and computational methods, all of which allowed us to embark on a thorough study of this event."

"The analyses show that the Global Seismic Network, which was specifically designed to record such large earthquakes, performed exactly according to design standards," adds Jeroen Tromp, who is McMillan Professor of Geophysics and director of the Caltech Seismology Lab. "The network enables a broadband analysis of the rupture process, which means that there is considerable information over a broad range of wave frequencies, allowing us to study the earthquake in great detail."

In fact, Kanamori points out, the data have already motivated tsunami experts to investigate how tsunamis are generated by seismic deformation. In the past, seismic deformation was treated as instantaneous uplift of the sea floor, but because of the extremely long rupture length (1200 km), slow deformation, and the large horizontal displacements as well as vertical deformation, the Sumatra-Andaman earthquake forced tsunami experts to rethink their traditional approach. Experts and public officials are now incorporating these details into modeling so that they can more effectively mitigate the human disaster of future tsunamis.

Another oddity contained in the data is the rate at which the ground moved in the Andaman Islands. Following the rapid seismic rupture, significant slip even larger than the co-seismic slip (in other words, the slip that occurred during the actual earthquake) continued beneath the islands over the next few days.

"We have never seen this kind of behavior," says Kanamori. "If slip can happen over a few days following the rapid co-seismic slip, then important hitherto unknown deformational processes in the Earth's crust must have been involved; this will be the subject of future investigations."

As for the "ringing" of Earth for literally weeks after the initial shock, the scientists say that the information will provide new insights into the planet's interior composition, mineralogy, and dynamics. In addition, the long-period free oscillations of such a large earthquake provide information on the earthquake itself.

The first of the papers is "The Great Sumatra-Andaman Earthquake of 26 December 2004." In addition to Kanamori, the other authors are Thorne Lay (the lead author) and Steven Ward of UC Santa Cruz; Charles Ammon of Penn State; Meredith Nettles and Göran Ekström of Harvard; Richard Aster and Susan Bilek of the New Mexico Institute of Mining and Technology; Susan Beck of the University of Arizona; Michael Brudzinski of the University of Wisconsin and Miami University; Rhett Butler of the IRIS Consortium; Heather DeShon of the University of Wisconsin; Kenji Satake of the Geological Survey of Japan; and Stuart Sipkin of the US Geological Survey's National Earthquake Information Center.

The second paper is "Rupture Process of the 2004 Sumatra-Andaman Earthquake." The Caltech co-authors are Chen Ji, Sidao Ni, Vala Hjorleifsdottir, Hiroo Kanamori, and Donald Helmberger, the Smits Family Professor of Geological and Planetary Sciences.

The other authors are Charles Ammon (the lead author) of Penn State; David Robinson and Shamita Das of the University of Oxford; Thorne Lay of UC Santa Cruz; Hong-Kie Thio and Gene Ichinose of URS Corporation; Jascha Polet of the Institute for Crustal Studies; and David Wald of the National Earthquake Information Center.

The third paper is "Earth's Free Oscillations Excited by the 26 December 2004 Sumatra-Andaman Earthquake," of which Jeffrey Park of Yale University is lead author. The Caltech coauthors are Teh-Ru Alex Song, Jeroen Tromp, and Hiroo Kanamori. The other authors are Emile Okal and Seth Stein of Northwestern University; Genevieve Roult and Eric Clevede of the Institut de Physique du Globe, Paris; Gabi Laske, Peter Davis, and Jon Berger of the Scripps Institution of Oceanography; Carla Braitenberg of the University of Trieste; Michel Van Camp of the Royal Observatory of Belgium; Xiang'e Lei, Heping Sun, and Houze Xu of the Chinese Academy of Sciences' Institute of Geodesy and Geophysics; and Severine Rosat of the National Astronomical Observatory of Japan.

The second paper contains web references to three animations that help to illustrate various aspects of this great earthquake:

Global movie of the vertical velocity wave field. The computation includes periods of 20 seconds and longer and shows a total duration of 3 hours. The largest amplitudes seen in this movie are the Rayleigh waves traveling around the globe. Global seismic stations are shown as yellow triangles.

Animation of the vertical velocity wave field in the source region. The computation includes periods of 12 seconds and longer with a total duration of about 13 minutes. As the rupture front propagates northward the wave-field gets compressed and amplified in the north and drawn out to the south. The radiation from patches of large slip shows up as circles that are offset from each other due to the rupture propagation (a Doppler-like effect).

Evolution of uplift and subsidence above the megathrust with time. The duration of the rupture is 550 seconds. This movie shows the history of the uplift at each point around the fault and, as a result, the dynamic part of the motion is visible (as wiggling contour lines). The simulation includes periods of 12 seconds and longer. The final frame of the movie shows the static field.

All animations were produced by Seismo Lab graduate student Vala Hjorleifsdottir with the assistance of Santiago Lombeyda at Caltech's Center for Advanced Computing Research. The simulations were performed on 150 nodes of Caltech's Division of Geological & Planetary Sciences' Dell cluster.

Robert Tindol

Seismic Experiments Provide New Clues to Earthquake Wave Directionality and Growth Speed

PASADENA, Calif.--In recent years, seismologists thought they were getting a handle on how an earthquake tends to rupture in a preferred direction along big strike-slip faults like the San Andreas. This is important because the direction of rupture has a profound influence on the distribution of ground shaking. But a new study could undermine their confidence a bit.

Reporting in the April 29 issue of the journal Science, researchers from the California Institute of Technology and Harvard University discuss new controlled laboratory experiments using dissimilar polymer plates to mimic Earth's crust. The results show that the direction of rupture that controls the pattern of destruction is less predictable than recently thought.

The findings explain puzzling observations from last year's Parkfield earthquake, in which a northwestward rupture occurred. A southeastward rupture had been predicted on the basis of the two past earthquakes in the area and on numerical simulations. Also, during the recent large earthquakes in Turkey, some ruptures have occurred in the direction opposite to what happened in the past and are thought to involve unusually high speeds along that direction.

The phenomenon has to do with the basic ways rupture fronts (generating seismic waves) are propagated along a boundary between two materials with different wave speeds--an area of research that is yielding interesting and important results in the engineering laboratory.

The reason this is important is that geophysicists, knowing the wave speeds of the materials in different tectonic plates and the stresses acting on them, could someday have an improved ability to predict which areas along a major fault might be more powerfully hit. In effect, a better fundamental knowledge of the workings of Earth's plates could lead to a better ability to prepare for major earthquakes.

In the experiment, Caltech's von Kármán Professor of Aeronautics and Mechanical Engineering Ares Rosakis (the director of the Graduate Aeronautical Laboratories); his cross-campus colleague, Smits Professor of Geophysics Hiroo Kanamori; Professor James Rice of Harvard University; and Caltech grad student Kaiwen Xia, prepared polymer plates to mimic the effects of major strike-slip faults. These are faults in which two plates are rammed against each other by forces coming in at an angle, and which then spontaneously snap (or slide) to move sideways.

Because such a breaking of lab materials is similar on a smaller scale to the slipping of tectonic plates, the measurement of the waves in the polymer materials provides a good indication of what happens in earthquakes.

The team fixed the plates so that force was applied to them at an acute angle relative to the "fault" between them. The researchers then set off a small plasma explosion with a wire running to the center of the two polymer plates (the "hypocenter"), which caused the two plates to quickly slide apart, just as two tectonic plates would slide apart during an earthquake.

The clear polymer plates were made of two different materials especially selected so that their stress fringes could be photographed. In other words, the waves and rupture fronts that propagate through the system due to this "laboratory earthquake event" showed up as clearly visible waves on the photographic plates.

What's more, if the rupture fronts are super-shear, i.e., faster than the shear speed in the plates, they produce a shock-wave pattern that looks something like the Mach cone of a jet fighter breaking the sound barrier.
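The jet-fighter analogy can be made quantitative. For a rupture front moving at speed v greater than the shear wave speed c_s of the material, the shock front trails at the Mach half-angle θ given by sin θ = c_s / v. A minimal sketch, using hypothetical speeds rather than values from the experiment:

```python
import math

def mach_angle_deg(shear_speed, rupture_speed):
    """Half-angle of the shear shock cone: sin(theta) = c_s / v.

    Only defined for super-shear ruptures (v > c_s); a sub-shear
    rupture produces no Mach cone at all.
    """
    if rupture_speed <= shear_speed:
        raise ValueError("no Mach cone: rupture is sub-shear")
    return math.degrees(math.asin(shear_speed / rupture_speed))

# Illustrative crustal numbers: shear speed 3.5 km/s, rupture at 5.0 km/s.
print(f"{mach_angle_deg(3.5, 5.0):.1f} degrees")   # about 44.4 degrees
```

The faster the rupture relative to the shear speed, the narrower the cone, just as with supersonic aircraft.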

"Previously, it was generally thought that, if there is a velocity contrast, the rupture preferentially goes toward the direction of the slip in the low-velocity medium," explains Kanamori. In other words, if the lower-velocity medium is the plate shifting to the west, then the preferred direction of rupture would typically be to the west.

"What we see, when the force is small and the angle is small, is that we simultaneously generate ruptures to the west and to the east, and that the rupture fronts in both sides go with sub-shear speed," Rosakis explains. "But as the pressure increases substantially, the westward direction stays the same, but the other, eastward direction, becomes super-shear. This super-shear rupture speed is very close to the p-wave speed of the slower of the two materials."

To complicate matters even further, the results show that, when the experiment is done at forces below those required for super-shear, the directionality of the rupture is unpredictable. Both waves are at sub-shear speed, but waves in either direction can be devastating.

This, in effect, explains why the Parkfield earthquake last year ruptured in the direction opposite to that of past events. The experiment also strongly suggests that, if the earthquake had been sufficiently large, the super-shear waves would have traveled northwest, even though the preferred direction was southeast.

But the question remains whether super-shear is necessarily a bad thing, Kanamori says. "It's scientifically an interesting result, but I can't say what the exact implications are. It's at least important to be aware of these things.

"But it could also mean that earthquake ruptures are less predictable than ever," he adds.

Contact: Robert Tindol (626) 395-3631


Scientists Use fMRI to Catch Test Subjects in the Act of Trusting One Another

PASADENA, Calif.--Who do you trust? The question may seem distinctly human--and limited only to "quality" humans, at that--but it turns out that trust is handled by the human brain in pretty much the same way that obtaining a food reward is handled by the brain of an insect. In other words, it's all a lot more primitive than we think.

But there's more. The research also suggests that we can actually trust each other a fair amount of time without getting betrayed, and can do so just because of the biological creatures we are.

In a new milestone for neuroscience, experimenters at the California Institute of Technology and the Baylor College of Medicine for the first time have simultaneously scanned interacting brains using a new technique called "hyperscanning" brain imaging to probe how trust builds as subjects learn about one another. This new technique allowed the team to see for the first time how interacting brains influence each other as subjects played an economic game and built trusting relationships. The research has implications for further understanding the evolution of the brain and social behavior, and could also lead to new insights into maladies such as autism and schizophrenia, in which a person's interaction with others is severely compromised.

Reporting in Friday's issue of the journal Science, the Caltech and Baylor researchers describe the results they obtained by hooking up volunteers to functional magnetic resonance imaging (fMRI) machines in Pasadena and Houston, respectively. One volunteer in one locale would interact with another volunteer he or she did not know, and the two would play an economic game in which trustworthiness had to be balanced with the profit motive. At the time the volunteers were playing the game, their brain activity was continually monitored to see what was going on with their neurons.

According to Steve Quartz, associate professor of philosophy and director of the Social Cognitive Neuroscience Laboratory at Caltech, who led the Caltech effort and does much of his work on the social interactions of decision making using fMRI, the results show that trust involves a region of the brain known as the head of the caudate nucleus. As with all fMRI images of the brain, the idea was to pick up evidence of a rush of blood to a specific part of the brain, which is taken to indicate that the brain region is at that moment engaged in mental activity.

The important finding, however, was not just that the caudate nucleus is involved, but that trust tended to shift backward in time as the game progressed. In other words, the expectation of a reward was intimately involved in an individual's assessment of the other individual's trustworthiness, and the recipient tended to become more trusting before the reward arrived--provided, of course, that there was no backstabbing.

Colin Camerer, the Axline Professor of Business Economics at Caltech and the other Caltech faculty author of the paper, adds that the study is also a breakthrough in showing that game theory continues to reward researchers who study human behavior.

"The theory about games such as the one we used in this study is developed around mathematics," Camerer says. "But a mathematical model of self-interest can be overly simplified. These results show that game theory can draw together the social and biological sciences for new and deeper understandings of human behavior. A better mathematical model will result."

The game is a multiround version of an economic exchange, in which one player (the "investor") is given $20 and told that he can either hold on to the money, or give some or all of it to the person on the other end of the game 1,500 miles away. The game is anonymous, and it is further assumed that the players will never meet each other, in order to keep other artifacts of social interaction from coming into play.

The person on the receiving end of the transaction (the "trustee") immediately has any gift that he receives tripled. The trustee can then give some or all of it back to the investor.

In ideal circumstances, the investor gives the entire $20 to the trustee, who then has his money tripled to $60 and then gives $30 back to the investor so that both have profited. That's assuming that greed hasn't made the trustee keep all the money for himself, of course, or that stinginess or lack of trust has persuaded the investor to keep the original investment all to himself. And this is the reason that trust is involved, and furthermore, the reason that there is brain activity during the course of the game for the experimenters to image.
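The payoff arithmetic of the exchange described above can be sketched in a few lines of Python. This is a hypothetical illustration of the game's rules, not the researchers' experimental software; the function name and parameters are invented for clarity:

```python
def trust_round(investment, fraction_returned, endowment=20, multiplier=3):
    """One round of the investor-trustee game described above.

    The investor starts with `endowment` dollars and sends `investment`
    of it to the trustee; the gift is tripled in transit, and the
    trustee sends back `fraction_returned` of the tripled sum.
    Returns (investor_payoff, trustee_payoff).
    """
    assert 0 <= investment <= endowment
    received = multiplier * investment          # trustee's gift is tripled
    repayment = fraction_returned * received    # trustee returns a share
    investor_payoff = endowment - investment + repayment
    trustee_payoff = received - repayment
    return investor_payoff, trustee_payoff

# Full trust, half returned: both players profit.
print(trust_round(20, 0.5))   # -> (30.0, 30.0)
# No trust: the investor keeps the endowment, the trustee gets nothing.
print(trust_round(0, 0.5))    # -> (20.0, 0.0)
```

The first case is the ideal outcome described in the article: the investor ends with $30 instead of $20, and the trustee pockets $30 as well.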

The findings are that trust is delayed in the early rounds of the game (there are 10 in all), and that the players begin determining the costs and benefits of the interchange and soon begin anticipating the rewards before they are even bestowed. Before the game is finished, one player is showing brain activity in the head of the caudate nucleus that demonstrates he has an "intention to trust." Once the players know each other by reputation, they begin showing their intentions to trust about 14 seconds earlier than in the early rounds of the game.

The results are interesting on several levels, say Camerer and Quartz. For one, the results show the neuroscience of economic behavior.

"Neoclassical economics starts with the assumption that rational self-interest is the motivator of all our economic behavior," says Quartz. "The further assumption is that you can only get trust if you penalize people for non-cooperation, but these results show that you can build trust through social interaction, and question the traditional model of economic man."

"The results show that you can trust people for a fair amount of time, which contradicts the assumptions of classical economics," Camerer adds.

This is good news for us humans who must do business with each other, Quartz explains, because trustworthiness reduces transaction costs. In other words, if we can trust people, then transactions are cheaper and simpler: there are fewer laws to encumber us, fewer lawyers to pay to ensure that all the documents pertaining to the deal are written in an airtight manner, and so on.

"It's the same as if you could have a business deal on a handshake," Quartz says. "You don't have to pay a bunch of lawyers to write up what you do at every step. Thus, trust is of great interest from the level of our everyday interactions all the way up to the economic prosperity of a country where trust is thought of in terms of social capital."

The research findings are also interesting in their similarity to classical conditioning experiments, in which a certain behavioral response is elicited through a reward. Just as a person is rewarded for trusting a trustworthy person--and begins trusting the person even earlier if the reward can honestly be expected--so, too, does a lab animal begin anticipating a food reward for pecking a mirror, tripping a switch, slobbering when a buzzer sounds, or running quickly through a maze.

"This is another striking demonstration of the brain re-using ancient centers for new purposes. That trust rides on top of the basic reward centers of the brain is something we had never anticipated and demonstrates how surprising brain imaging can be," Quartz notes.

And finally, the research could have implications for better understanding the neurology of individuals with severely compromised abilities to interact with other people, such as those afflicted with autism, borderline personality disorders, and schizophrenia. "The inability to predict others is a key facet of many mental disorders. These new results could help us better understand these conditions, and may ultimately guide new treatments," suggests Quartz.

The other authors of the article are Brooks King-Casas, Damon Tomlin and P. Read Montague (the lead author), all of the Baylor College of Medicine, and Cedric Anen of Caltech. The title of the paper is "Getting to Know You: Reputation and Trust in a Two-Person Economic Exchange."

Robert Tindol

Caltech Physics Team Invents Device for Weighing Individual Molecules

PASADENA, Calif.-Physicists at the California Institute of Technology have created the first nanodevices capable of weighing individual biological molecules. This technology may lead to new forms of molecular identification that are cheaper and faster than existing methods, as well as revolutionary new instruments for proteomics.

According to Michael Roukes, professor of physics, applied physics, and bioengineering at Caltech and the founding director of Caltech's Kavli Nanoscience Institute, the technology his group has announced this week shows the immense potential of nanotechnology for creating transformational new instrumentation for the medical and life sciences. The new devices are at the nanoscale, he explains, since their principal component is significantly less than a millionth of a meter in width.

The Caltech devices are "nanoelectromechanical resonators"--essentially tiny tuning forks about a micron in length and a hundred or so nanometers wide that have a very specific frequency at which they vibrate when excited. Just as a bronze bell rings at a certain frequency based on its size, shape, and composition, these tiny tuning forks ring at their own fundamental frequency of mechanical vibration, although at such a high pitch that the "notes" are nearly as high in frequency as microwaves.

The researchers set up electronic circuitry to continually excite and monitor the frequency of the vibrating bar. Intermittently, a shutter is opened to expose the nanodevice to an atomic or molecular beam, in this case a very fine "spray" of xenon atoms or nitrogen molecules. Because the nanodevice is cooled, the molecules condense on the bar and add their mass to it, thereby lowering its frequency. In other words, the mechanical vibrations of the now slightly more massive nanodevice become slightly lower in frequency--just as an instrument's thicker, heavier strings sound lower notes than its lighter ones.

Because frequency can be measured so precisely in physics labs, the researchers are then able to evaluate extremely subtle changes in mass of the nanodevice, and therefore, the weight of the added atoms or molecules.
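The arithmetic behind this is straightforward: for a small added load, the fractional drop in frequency is about half the fractional increase in mass. A minimal sketch, using made-up device numbers rather than the actual Caltech device parameters:

```python
# Illustrative sketch: inferring added mass from a resonator's frequency shift.
# For a small load, delta_f / f0 ~ -delta_m / (2 * m_eff), so
# delta_m ~ -2 * m_eff * delta_f / f0.

def added_mass(m_eff_g, f0_hz, delta_f_hz):
    """Estimate adsorbed mass (grams) from the downward frequency shift."""
    return -2.0 * m_eff_g * delta_f_hz / f0_hz

# Hypothetical numbers: a resonator with 1e-13 g effective mass ringing
# at 100 MHz whose frequency drops by 10 Hz.
dm = added_mass(1e-13, 100e6, -10.0)
print(dm)  # 2e-20 g, i.e. 20 zeptograms
```

This is why precise frequency measurement translates directly into precise mass measurement: a 10 Hz shift out of 100 MHz, easily resolved electronically, corresponds here to only tens of zeptograms.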

Roukes says that their current generation of devices is sensitive to added mass at the level of a few zeptograms--a few billionths of a trillionth of a gram. In their experiments this represents about thirty xenon atoms, which is also roughly the mass of a typical individual protein molecule.

"We hope to transform this chip-based technology into systems that are useful for picking out and identifying specific molecules, one-by-one--for example certain types of proteins secreted in the very early stages of cancer," Roukes says.

"The fundamental problem with identifying these proteins is that one must sort through millions of molecules to make the measurement. You need to be able to pick out the 'needle' from the 'haystack,' and that's hard to do, among other reasons because 95 percent of the proteins in the blood have nothing to do with cancer."

The new method might ultimately permit the creation of microchips, each possessing arrays of miniature mass spectrometers, which are devices for identifying molecules based on their weight. Today, high-throughput proteomics searches are often done at facilities possessing arrays of conventional mass spectrometers that fill an entire laboratory and can cost upwards of a million dollars each, Roukes adds. By contrast, future nanodevice-based systems should cost a small fraction of today's technology, and an entire massively-parallel nanodevice system will probably ultimately fit on a desktop.

Roukes says his group has technology in hand to push mass-sensing technology to even more sensitive levels, probably to the point that individual hydrogen atoms can be weighed. Such an intricately accurate method of determining atomic-scale masses would be quite useful in areas such as quantum optics, in which individual atoms are manipulated.

The next step for Roukes' team at Caltech is to engineer the interfaces so that individual biological molecules can be weighed. For this, the team will likely collaborate with various proteomics labs for side-by-side comparisons of already known information on the mass of biological molecules with results obtained with the new method.

Roukes announced the technology in Los Angeles on Wednesday, March 24, at a news conference during the annual American Physical Society convention. Further results will be published in the near future.

The Caltech team behind the zepto result included Dr. Ya-Tang Yang, former graduate student in applied physics, now at Applied Materials; Dr. Carlo Callegari, former postdoctoral associate, now a professor at the University of Graz, Austria; Xiaoli Feng, current graduate student in electrical engineering; and Dr. Kamil Ekinci, former postdoctoral associate, now a professor at Boston University.

Robert Tindol

Scientists Discover What You Are Thinking

PASADENA, Calif. - By decoding signals coming from neurons, scientists at the California Institute of Technology have confirmed that an area of the brain known as the ventrolateral prefrontal cortex (vPF) is involved in the planning stages of movement, that instantaneous flicker of time when we contemplate moving a hand or other limb. The work has implications for the development of a neural prosthesis, a brain-machine interface that will give paralyzed people the ability to move and communicate simply by thinking.

By piggybacking on therapeutic work being conducted on epileptic patients, Daniel Rizzuto, a postdoctoral scholar in the lab of Richard Andersen, the Boswell Professor of Neuroscience, was able to predict where a target the patient was looking at was located, and also where the patient was going to move his hand. The work currently appears in the online version of Nature Neuroscience.

Most research in this field involves tapping into the areas of the brain that directly control motor actions, hoping that this will give patients the rudimentary ability to move a cursor, say, or a robotic arm with just their thoughts. Andersen, though, is taking a different tack. Instead of the primary motor areas, he taps into the planning stages of the brain, the posterior parietal and premotor areas.

Rizzuto looked at another area of the brain to see if planning could take place there as well. Until this work, the idea that spatial processing or movement planning took place in the ventrolateral prefrontal cortex has been a highly contested one. "Just the fact that these spatial signals are there is important," he says. "Based upon previous work in monkeys, people were saying this was not the case." Rizzuto's work is the first to show these spatial signals exist in humans.

Rizzuto took advantage of clinical work being performed by Adam Mamelak, a neurosurgeon at Huntington Memorial Hospital in Pasadena. Mamelak was treating three patients who suffered from severe epilepsy, trying to identify the brain areas where the seizures occurred and then surgically removing that area of the brain. Mamelak implanted electrodes into the vPF as part of this process.

"So for a couple of weeks these patients are lying there, bored, waiting for a seizure," says Rizzuto, "and I was able to get their permission to do my study, taking advantage of the electrodes that were already there." The patients watched a computer screen for a flashing target, remembered the target location through a short delay, then reached to that location. "Obviously a very basic task," he says.

"We were looking for the brain regions that may be contributing to planned movements. And what I was able to show is that a part of the brain called the ventrolateral prefrontal cortex is indeed involved in planning these movements." Just by analyzing the brain activity from the implanted electrodes using software algorithms that he wrote, Rizzuto was able to tell with very high accuracy where the target was located while it was on the screen, and also what direction the patient was going to reach to when the target wasn't even there.

Unlike most labs doing this type of research, Andersen's lab is looking at the planning areas of the brain rather than the primary motor area of the brain, because they believe the planning areas are less susceptible to damage. "In the case of a spinal cord injury," says Rizzuto, "communication to and from the primary motor cortex is cut off." But the brain still performs the computations associated with planning to move. "So if we can tap into the planning computations and decode where a person is thinking of moving," he says, then it just becomes an engineering problem--the person can be hooked up to a computer where he can move a cursor by thinking, or can even be attached to a robotic arm.

Andersen notes, "Dan's results are remarkable in showing that the human ventral prefrontal cortex, an area previously implicated in processing information about objects, also processes the intentions of subjects to make movements. This research adds ventral prefrontal cortex to the list of candidate brain areas for extracting signals for neural prosthetics applications."

In Andersen's lab, Rizzuto's goal is to take the technology they've perfected in animal studies to human clinical trials. "I've already met with our first paralyzed patient, and graduate student Hilary Glidden and I are now doing noninvasive studies to see how the brain reorganizes after paralysis," he says. If it does reorganize, he notes, all the technology that has been developed in non-paralyzed humans may not work. "This is why we think our approach may be better, because we already know that the primary motor area shows pathological reorganization and degeneration after paralysis. We think our area of the brain is going to reorganize less, if at all. After this we hope to implant paralyzed patients with electrodes so that they may better communicate with others and control their environment."


New study provides insights into the brain's remembrance of emotional events

PASADENA, Calif.--Those of us who are old enough to remember the Kennedy assassination are usually able to remember the initial announcement almost as if it's a movie running in our heads. That's because there is a well-known tendency for people to have enhanced memory of a highly emotional event, and further, a memory that focuses especially on the "gist" of the event.

In other words, people who remember the words "President Kennedy is dead" will remember the news extraordinarily well. But at the same time, they will likely have no more recollection of extraneous details such as what they were wearing or what they were doing an hour before hearing the news than they would for any other day in 1963. Neurobiologists have known both these phenomena to be true for some time, and a new study now explains how the brain achieves this effect.

In the new study, researchers from the California Institute of Technology and the University of Iowa College of Medicine show how the recollections of gist and details of emotional events are related to specific parts of the brain. In an article appearing in this month's Nature Neuroscience, the authors report that patients with damage to an area of the brain known as the amygdala are unable to remember the gist of an emotional stimulus, even though there is nothing otherwise faulty in their memory. The study shows that the amygdala somehow focuses the brain's processing resources on the gist of an emotional event.

"During a highly emotional event, like the Kennedy assassination, 9/11, or the Challenger accident, you remember the gist much better than you would remember the gist of some other neutral event," says Ralph Adolphs, a professor of psychology and neuroscience at Caltech and lead author of the study. "But people with damage to the amygdala have a failure to put this special tag on the gist of emotional memories. In other words, they remember the gist of an emotional event no better than the gist of a neutral event."

To test their hypothesis, Adolphs and his colleagues at the University of Iowa College of Medicine showed a group of normal control subjects and a group of test subjects known to have amygdala damage a series of pictures accompanied by fabricated stories. One type of series involved fairly mundane episodes in which, for example, a family was depicted driving somewhere and returning home uneventfully. But in the other series, the story would relate a tragic event, such as the family having been involved in a fatal auto accident on the way home, accompanied by gruesome pictures of amputated limbs.

As expected, the normal control subjects had enhanced recall of the emotional stories and pictures, and far more vague recall of the mundane stories. The test subjects with amygdala damage, however, possessed no better recall of the gist of the emotional story than of the mundane stories. On the other hand, both the control group and the group with amygdala damage showed about equal ability to remember details from stories with no emotional content.

The findings suggest that the amygdala is responsible for our ability to have strong recollections of emotional events, Adolphs says. Further study could point to how the amygdala is involved in impaired real-life emotional memories seen in patients with post-traumatic stress disorder and Alzheimer's disease, he adds.

The other authors of the article are Daniel Tranel and Tony W. Buchanan, both of the University of Iowa College of Medicine's Department of Neurology.

Robert Tindol

Negative Impacts of Dam Construction on Human Populations Can Be Reduced, Author Says

PASADENA, Calif.--Despite the adverse impacts of large dam construction on ecosystems and human settlements, more and more dams are likely to be built in the 21st century wherever there is a need to store water for irrigated agriculture, urban water supplies, and power generation. But world societies and governments would do well to evaluate the consequences of dam construction as an integral part of the planning process, a leading authority writes in a new book.

The book, The Future of Large Dams, is the latest work by California Institute of Technology anthropologist Thayer Scudder, who is arguably the world's foremost expert on the impact of dam construction on human societies living along major world rivers. Published by Earthscan, the book argues that early analysis by affected stakeholders of the impact of a dam's proposed construction is a worthwhile undertaking--and not only worthwhile, but quite possible to accomplish with established research techniques.

According to Scudder, large dams are a "flawed yet still necessary development option." Flaws include both the shortcomings of the dam itself as well as ecological and social impacts. In terms of the former, Scudder says that dams, on average, can be expected to get clogged with sediment at a rate of about 0.5 to 1 percent per year. And in terms of the latter, changing habitat caused by the flooding of land behind and below dams is certain to change the habits of nearby humans and animals alike--if not devastate both.
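The cited sedimentation rate implies a concrete lifespan for a reservoir. A simplified linear model, assuming the rate applies to the reservoir's original storage capacity (one plausible reading of the figure, not Scudder's own calculation):

```python
# Illustrative sketch of the sedimentation figure above: if a fixed
# fraction of a reservoir's original storage fills with sediment each
# year (the cited 0.5-1 percent rate), storage declines linearly.

def remaining_capacity(initial, annual_loss_fraction, years):
    """Storage left after `years` of sedimentation, floored at zero."""
    return max(0.0, initial * (1.0 - annual_loss_fraction * years))

# At 1 percent per year, half the original storage is gone in 50 years:
print(remaining_capacity(1.0, 0.01, 50))  # 0.5
```

On these assumptions, a dam silting up at 1 percent per year loses half its storage within a typical project-planning horizon, which is part of why Scudder calls dams a flawed option even where they remain necessary.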

"Although dams have their problems, they're unfortunately still necessary because of the growing needs of humans for water storage," says Scudder. "That's the dilemma."

Given that governments throughout the world-- the United States included--will continue to dam rivers, Scudder says it's important to take into consideration that hundreds of millions of people have been adversely affected by dams in the last century. Somewhere between 40 and 80 million people have been forcibly relocated by the flooding of the land on which they live to create the reservoirs above the dams. Furthermore, even larger numbers of people have had their lives and livelihoods disrupted by the change of the river flow below dams.

"Lots of people in many places in the world are dependent on the natural flow of rivers, and the consequences can be the sort of things you might not normally even take into account," he says. "For example, a settlement that depends on an annual flooding of agricultural land when the river rises can be wiped out if the regulated flow of the dam causes the annual flooding to cease."

Scudder, in fact, wrote his doctoral dissertation many years ago on such an instance, in which the construction of a dam obliterated the most productive component of an upstream farming system.

"But the book argues that, despite these adverse impacts, there are state-of-the-art ways of addressing them," he says. "For example, if local populations downstream have been depending on an annual inundation of an agricultural flood plain, then the authorities in charge and other stakeholders should consider a controlled release of water that recreates the flooding conditions. Experiments have been done with coordinating hydropower generation and flood recession irrigation needs with the release of 'environmental flows'--that is, releases of water to protect habitats and communities. This approach has been tried in several African countries, and research has shown in other cases that managed floods would be a 'win-win' option."

In general, the way to make dams work for humans everywhere, Scudder suggests, is to address the social and environmental impacts both downstream and upstream of any dam project before the structure is even built, and to evaluate the situations in river basins where dams have already been constructed.

Finally, the political and institutional considerations of dam construction should be addressed, Scudder says. Too often, a dam project is undertaken at a specific locale because of its political expedience, and this is not the best way to minimize the negative human and ecological impact. Restructuring the governmental departments that oversee dams can also help minimize negative environmental, agricultural, or other impacts.

"We should all be able to benefit from the dams that are to be built in the future rather than suffer from them," he concludes.

Review copies of the book are available from Earthscan Sales and Marketing Administrator Michael Fell by e-mailing him at or calling +44 (0)20 7121 3154.


Robert Tindol

Neuroscientists discover that humans evaluate emotions by looking at the eyes

PASADENA, Calif.--If your mother ever told you to watch out for strangers with shifty eyes, you can start taking her advice to heart. Neuroscientists exploring a region of the brain associated with the recognition of emotional expressions have concluded that it is the eye region that we scan when our brains process information about other people's emotions.

Reporting in the January 6 issue of the journal Nature, California Institute of Technology neuroscientist Ralph Adolphs and colleagues at the University of Iowa, University of Montreal, and University of Glasgow describe new results they have obtained with a patient suffering from a rare genetic malady that has destroyed her brain's amygdala. The amygdala is found on each side of the brain in the medial temporal lobe and is known to process information about facial emotions. The patient, who has been studied by the researchers at the University of Iowa for a decade, shows an intriguing inability to recognize fear and other emotions from facial expressions.

"The fact that the amygdala is involved in fear recognition has been borne out by a large number of studies," explains Adolphs. "But until now the mechanisms through which amygdala damage compromises fear recognition have not been identified."

Although Adolphs and his colleagues have known for years that the woman is unable to recognize fear from facial expressions in others, they didn't know until recently that her problem was an inability to focus on the eye region of others when judging their emotions. They discovered this by carefully recording the way her eyes focused on pictures of faces.

In normal test subjects, a person's eyes dart from area to area of a face in a quick, largely unconscious program of evaluating facial expressions to recognize emotions. The woman, by contrast, tended to stare straight ahead at the photographs, displaying no tendency to regard the eyes at all. As a result, she was nonjudgmental in her interpersonal dealings, often trusting even those individuals who didn't deserve the benefit of the doubt.

However, the good news is that the woman could be trained to look at the eyes in the photographs, even though she had no natural inclination to do so. When she deliberately looked at the eyes upon being instructed to do so, she had a normal ability to recognize fear in the faces.

According to Adolphs, the study is a step forward in better understanding the human brain's perceptual mechanisms, and also a practical key in possible therapies to help certain patients with defective emotional perception lead more normal lives.

In terms of the former, Adolphs says that the amygdala's role in fear recognition will probably be better understood with additional research such as that now going on in Caltech's new magnetic resonance imaging lab. "It would be naïve to ascribe these findings to one single brain structure," he says. "Many parts of the brain work together, so a more accurate picture will probably relate cognitive abilities to a network of brain structures.

"Therefore, the things the amygdala do together with other parts of the brain are going to be a complex matter that will take a long time to figure out."

However, the very fact that the woman could be trained to evaluate fear in other people's faces is encouraging news for individuals with autism and other maladies that cause problems in their recognizing other people's emotions, Adolphs says.

"Maybe people with autism could be helped if they were trained how to look at the world and how to look at people's faces to improve their social functioning," he says.

Adolphs is a professor of psychology and neuroscience at Caltech, and holds a joint appointment at the University of Iowa College of Medicine. The other authors of the paper are Frederic Gosselin, Tony Buchanan, Daniel Tranel, Philippe Schyns, and Antonio Damasio.

Robert Tindol

