Reducing 20/20 Hindsight Bias

PASADENA, Calif.—You probably know it as Monday-morning quarterbacking or 20/20 hindsight: failures often look obvious and predictable after the fact—whether it's an interception thrown by a quarterback under pressure, a surgeon's mistake, a slow response to a natural disaster, or friendly fire in the fog of war.

In legal settings, this tendency to underestimate the challenges faced by someone else—called hindsight bias—can lead to unfair judgments, punishing people who made an honest, unavoidable mistake.

"Hindsight bias is fueled by the fact that you weren't there—you didn't see the fog and confusion," says Colin Camerer, the Robert Kirby Professor of Behavioral Economics at the California Institute of Technology (Caltech). Furthermore, hindsight bias exists even if you were there. The bias is strong enough to alter your own memories, giving you an inflated sense that you saw the result coming. "We know a lot about the nature of these types of judgmental biases," he says. "But in the past, they weren't understood well enough to prevent them."

In a new study, recently published online in the journal Psychological Science, a team led by Camerer and Shinsuke Shimojo, the Gertrude Baltimore Professor of Experimental Psychology, not only found a way to predict the severity of the bias, but also identified a technique that successfully reduces it—a strategy that could help produce fairer assessments in situations such as medical malpractice suits and reviewing police or military actions.

Hindsight bias likely stems from the fact that when given new information, the brain tends to file away the old data and ignore it, Camerer explains. Once we know the outcome of a decision or event, we can't easily retrieve those old files, so we can't accurately evaluate something after the fact. The wide-ranging influence of hindsight bias has been observed in many previous studies, but research into the underlying mechanisms is difficult because these kinds of judgment are complex.

But by using experimental techniques from behavioral economics and visual psychophysics—the study of how visual stimuli affect perception—the Caltech researchers say they were able to probe more deeply into how hindsight emerges during decision making.

In the study, the researchers gave volunteers a basic visual task: to look for humans in blurry pictures. The visual system is among the most heavily studied parts of the brain, and researchers have developed many techniques and tools to understand it. In particular, the Caltech experiment used eye-tracking methods to monitor where the subjects were looking as they evaluated the photos, giving the researchers a window into the subjects' thought processes.

Subjects were divided into those who would do the task—the "performers"—and those who would judge the performers after the fact—the "evaluators." The performers saw a series of blurry photos and were told to guess which ones had humans in them. The evaluators' job was to estimate how many performers guessed correctly for each picture. To examine hindsight bias, some evaluators were shown clear versions of the photos before they saw the blurry photos—a situation analogous to how a jury in a medical malpractice case would already know the correct diagnosis before seeing the X-ray evidence.

The experiment found clear hindsight bias. Evaluators who had been primed by a clear photo greatly overestimated the percentage of people who would correctly identify the human. In other words, because the evaluators already knew the answer, they thought the task was easier than it really was. Furthermore, the measurements were similar to those from the first study of hindsight bias in 1975, which examined how people evaluated the probabilities of various geopolitical events before and after President Nixon's trip to China and the USSR. The fact that results from such disparate kinds of studies are so consistent shows that the high-level thinking involved in the earlier study and the low-level processes of visual perception in the new study are connected, the researchers say.

In the second part of the study, the researchers tracked the subjects' eye movements and found that hindsight bias depended on how the performers and evaluators inspected the photos. Evaluators were often looking at different parts of the photos compared to the performers, and when that happened there was more hindsight bias. But when both groups' gazes fell on similar locations on the photos, the evaluators were less biased. Seeing the wandering gazes of the first group as they tried to make sense of the blurry images seemed to allow the evaluators to internalize the first group's struggles. In other words, when the two groups literally saw eye to eye, the evaluators were less biased and gave a more accurate estimate of the first group's success rate.
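
To make the gaze-comparison idea concrete, here is a minimal sketch of one way such an analysis could look, using invented fixation coordinates and accuracy numbers; the overlap measure and the data below are illustrative assumptions, not the paper's actual method.

```python
# Illustrative sketch (not the paper's analysis): quantify how much an
# evaluator's gaze overlaps a performer's gaze on the same photo, and compare
# that overlap with the evaluator's hindsight bias. All numbers are made up.
import math

def gaze_overlap(performer_fixations, evaluator_fixations, radius=50.0):
    """Fraction of evaluator fixations landing within `radius` pixels
    of at least one performer fixation."""
    def near(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1]) <= radius
    hits = sum(any(near(e, p) for p in performer_fixations)
               for e in evaluator_fixations)
    return hits / len(evaluator_fixations)

# Hypothetical data for one blurry photo (pixel coordinates).
performer_fix = [(120, 200), (140, 215), (300, 180)]
evaluator_fix = [(125, 205), (400, 100), (310, 175)]

actual_accuracy = 0.40      # fraction of performers who guessed correctly
estimated_accuracy = 0.75   # evaluator's estimate after seeing the clear photo

overlap = gaze_overlap(performer_fix, evaluator_fix)
bias = estimated_accuracy - actual_accuracy   # positive = hindsight bias

print(f"gaze overlap: {overlap:.2f}, hindsight bias: {bias:+.2f}")
```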

Based on these results, the researchers suspected that if they could show the evaluators where people in the first group had looked—indicated by dots jiggling on the screen—then perhaps the evaluators' gazes would be drawn there as well, reducing any potential hindsight bias. When they did the experiment, that's exactly what happened.

Other studies have shown that merely telling people that they should be aware of hindsight bias is not effective, Camerer says. Something more tangible—such as dots that draw the evaluators' attention—is needed.

Although the experiments were done in a very specific context, the researchers say that these results may be used to reduce hindsight bias in real-life situations. "We think it's a very promising step toward engineering something useful," Camerer says.

For example, eye-tracking technology could be used to record how doctors evaluate X-ray or MRI images. If a doctor happens to make a mistake, showing eye-tracking data could reduce hindsight bias when determining whether the error was honest and unavoidable or if the doctor was negligent. Lowering the likelihood of hindsight bias, Camerer says, could also decrease defensive medicine, in which doctors perform excessive and costly procedures—or decline to perform a procedure altogether—for fear of being sued for malpractice even when they have done nothing wrong.

As technology advances, our activities are being increasingly monitored and recorded, says Daw-An Wu, the first author of the paper and a former postdoctoral scholar at Caltech who now works at the Caltech Brain Imaging Center. But the study shows that having visual records alone doesn't solve the problem of fair and unbiased accountability. "For there to be some fair judgment afterward, you would hope that the other component of reality is also being recorded—which is not just what is seen, but how people look at it," he says.

The Psychological Science paper is titled "Shared Visual Attention Reduces Hindsight Bias." In addition to Camerer, Shimojo, and Wu, the other author is Stephanie Wang, a former postdoctoral scholar at Caltech who is now an assistant professor at the University of Pittsburgh. This research collaboration was initiated and funded by Trilience Research, with additional support from the Gordon and Betty Moore Foundation, the Tamagawa-Caltech Global COE Program, and the CREST program of the Japan Science and Technology Agency.

Writer: Marcus Woo

High-Energy Physicists Smash Records for Network Data Transfer

New methods for efficient use of long-range networks will support cutting-edge science

PASADENA, Calif.—Physicists led by the California Institute of Technology (Caltech) have smashed yet another series of records for data-transfer speed. The international team of high-energy physicists, computer scientists, and network engineers reached a transfer rate of 339 gigabits per second (Gbps)—equivalent to moving four million gigabytes (or one million full-length movies) per day, nearly doubling last year's record. The team also reached a new record for a two-way transfer on a single link by sending data at 187 Gbps between Victoria, Canada, and Salt Lake City.
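
As a back-of-the-envelope check on those figures, the arithmetic works out roughly as follows (assuming 1 GB = 10^9 bytes and about 3.7 GB per movie file, which are our assumptions, not the team's):

```python
# Quick arithmetic check of the quoted transfer figures.
rate_gbps = 339                                   # gigabits per second
bytes_per_day = rate_gbps * 1e9 / 8 * 86_400      # bits/s -> bytes -> per day
gigabytes_per_day = bytes_per_day / 1e9

print(f"{gigabytes_per_day:,.0f} GB per day")            # ~3.7 million GB
print(f"{gigabytes_per_day / 3.7:,.0f} movies per day")  # ~1 million movies at ~3.7 GB each
```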

The achievements, the researchers say, pave the way for the next level of data-intensive science—in fields such as high-energy physics, astrophysics, genomics, meteorology, and global climate tracking. For example, last summer's discovery at the Large Hadron Collider (LHC) in Geneva of a new particle that may be the long-sought Higgs boson was made possible by a global network of computational and data-storage facilities that transferred more than 100 petabytes (100 million gigabytes) of data in the past year alone. As the LHC continues to slam protons together at higher rates and with more energy, the experiments will produce an even larger flood of data—reaching the exabyte range (a billion gigabytes).

The researchers, led by Caltech, the University of Victoria, and the University of Michigan, together with Brookhaven National Lab, Vanderbilt University, and other partners, demonstrated their achievement at the SuperComputing 2012 (SC12) conference, November 12–16 in Salt Lake City. They used wide-area network circuits connecting Caltech, the University of Victoria Computing Center in British Columbia, the University of Michigan, and the Salt Palace Convention Center in Utah. While setting the records, they also demonstrated other state-of-the-art methods such as software-defined intercontinental networks and direct interconnections between computer memories over the network between Pasadena and Salt Lake City.

"By sharing our methods and tools with scientists in many fields, we aim to further enable the next round of scientific discoveries, taking full advantage of 100-Gbps networks now, and higher-speed networks in the near future," says Harvey Newman, professor of physics at Caltech and the leader of the team. "In particular, we hope that these developments will afford physicists and students throughout the world the opportunity to participate directly in the LHC's next round of discoveries as they emerge."

As the demand for "Big Data" continues to grow exponentially—both in major science projects and in the world at large—the team says they look forward to next year's round of tests using network and data-storage technologies that are just beginning to emerge. Armed with these new technologies and methods, the Caltech team estimates that they may reach 1-terabit-per-second (1,000 Gbps) data transfers over long-range networks by next fall.

More information about the demonstration can be found at http://supercomputing.caltech.edu/.
 


An Eye for Science: In the Lab of Markus Meister

Take one look around Markus Meister's new lab and office space on the top floor of the Beckman Behavioral Biology building, and you can tell that he has an eye for detail. Curving, luminescent walls change color every few seconds, wrapping around lab space and giving a warm glow to the open, airy offices that line the east wall. A giant picture of neurons serves as wallpaper, and a column is wrapped in an image from the inside of a retina. And while he may have picked up some tips from his architect wife to help direct the design of his lab, Meister is the true visionary—a biologist studying the details of the eye.

"Since we study the visual system, light is a natural interest for me," says Meister, of the lab he helped plan before joining the Caltech faculty in July. "The architecture team responded to this in so many creative ways. They even installed windows inside the lab space, so that researchers at the bench could see straight through the offices to the nature outside."

Exactly how our eyes process those images of the world around us forms the basis of Meister's research. In particular, he investigates the circuits of nerve cells in the retina—a light-sensitive tissue that lines the inner surface of your eyeball and essentially captures images as they come in through the cornea and lens at the front of your eye. A traditional view of the retina is that it acts like the film in a camera, simply absorbing light and then sending a signal directly to the brain about the image you are viewing.

But Meister, who earned his PhD at Caltech in 1987 and was the Tarr Professor of Molecular and Cellular Biology at Harvard before moving back west, sees things a bit differently.

"There is a lot of preprocessing that occurs in the retina, so the signal that gets sent to the brain ultimately is quite different from the raw image that is projected onto the retina by the lens in your eye," he says. "And it's not just one kind of processing. There are on the order of 20 different ways in which the retina manipulates a raw image and then sends those results onto the brain through the optic nerve. An ongoing puzzle is trying to figure out how that is possible with the limited circuitry that exists in the retina."

Meister and his lab are dedicated to finding new clues to help decode that puzzle. In a recent study, published in Nature Neuroscience, he and Hiroki Asari, a postdoctoral scholar in biology at Caltech, studied the connections between particular cells within the retina. Their specific discovery, he explains, has to do with the associations between bipolar cells, which are the neurons in the middle of the retina, and ganglion cells, which are the very last neurons in the retina that send signals to the brain. What they found is that the connections between these bipolar cells and ganglion cells are much more diverse than had been expected.

"Each upstream bipolar cell can make different neural circuits to do particular kinds of computations before it sends signals to ganglion cells," says Asari, who began his postdoctoral work at Harvard and moved to Caltech with the Meister lab.

The team was also able to show that in many cases, the processing of information in the retina involves amacrine cells, a type of cell in the eye that seems to be involved in fine-tuning the properties of individual bipolar cell actions.

"It's a little bit like electronics, where you have a transistor—one wire controlling the connection between two other wires—that is absolutely central to everything," says Meister. "In a way, this connection between the amacrine cells and the bipolar cell and the ganglion cell looks a little bit like a transistor, in that the amacrine cell can control how the signal flows from the bipolar to the ganglion cell. That's an analogy that I think will help us understand the intricacy of the signal flow in the retina. A goal that we have is to ultimately understand these neural circuits in the same way that we understand electronic circuits."

The next step in this particular line of research is to figure out exactly where the amacrine cells are making their impact on bipolar cells. The researchers believe most of the action happens at the synapses, the connection points between the cells. Studying this area requires new technology to get a good look at the tiny connectors. Luckily, Meister's new lab includes an in-house shop room—complete with a milling machine, a band saw, and other power tools needed to build things like microscopes.

"In this lab, we're doing things on many levels—from the giant milling machine all the way down to measurements on the micron scale," says Meister.

He also plans to expand his research focus. The team has started to study the visual behavior of mice, evaluating, for example, the kinds of innate reactions—those that don't require any other knowledge about the environment—they have to certain visual stimuli. Ultimately, the researchers would like to know which of the pathways that come out of the retina control which behaviors, and if they can find a link between the processing of vision that occurs early in the eye and how the animal functions in its environment using its visual system.

"In my new lab at Caltech, I'm trying to branch out further into the visual system to leverage the understanding we have about the front end—namely processing in the retina—to better understand the different actions that occur in the brain, all the way to certain behaviors of the animal that are based on visual stimuli," he says.

Meister says that he's also excited to get back into an environment that's more focused on math, physics, and engineering—something he hopes to take advantage of at both the faculty/colleague level and the student level.

"Our research subject is one that I feel connects me with so many different areas of science—from molecular genetics to theoretical physics," he explains. "You can rely on a wide range of collaborators and people who are interested in different aspects of the subject. To me, that's been the most satisfying part of my career. I have collaborative projects with neurosurgeons, a theoretical physicist who develops models of visual processing, a particle physicist who builds miniature detector electronics, a molecular genetics expert. These interactions really keep you broadly connected."

And he's hoping to connect to even more people now that he's settled in his Beckman Behavioral Biology lab—even if it's just for a friendly visit.

"I know we're in a remote corner of the building at the north end of the top floor, but we try to keep our door open at all times," says Meister. "There are only five people in the lab right now, and it gets kind of lonely. We're going to build the group up, but in the meantime it would be nice if people came to visit us."

Writer: Katie Neith

Nano Insights Could Lead to Improved Nuclear Reactors

Caltech researchers examine self-healing abilities of some materials

PASADENA, Calif.—In order to build the next generation of nuclear reactors, materials scientists are trying to unlock the secrets of certain materials that are radiation-damage tolerant. Now researchers at the California Institute of Technology (Caltech) have brought new understanding to one of those secrets—how the interfaces between two carefully selected metals can absorb, or heal, radiation damage.

"When it comes to selecting proper structural materials for advanced nuclear reactors, it is crucial that we understand radiation damage and its effects on materials properties. And we need to study these effects on isolated small-scale features," says Julia R. Greer, an assistant professor of materials science and mechanics at Caltech. With that in mind, Greer and colleagues from Caltech, Sandia National Laboratories, UC Berkeley, and Los Alamos National Laboratory have taken a closer look at radiation-induced damage, zooming in all the way to the nanoscale—where lengths are measured in billionths of meters. Their results appear online in the journals Advanced Functional Materials and Small.

During nuclear irradiation, energetic particles like neutrons and ions displace atoms from their regular lattice sites within the metals that make up a reactor, setting off cascades of collisions that ultimately damage materials such as steel. One of the byproducts of this process is the formation of helium bubbles. Since helium does not dissolve within solid materials, it forms pressurized gas bubbles that can coalesce, making the material porous, brittle, and therefore susceptible to breakage.  

Some nano-engineered materials are able to resist such damage and may, for example, prevent helium bubbles from coalescing into larger voids. For instance, some metallic nanolaminates—materials made up of extremely thin alternating layers of different metals—are able to absorb various types of radiation-induced defects at the interfaces between the layers because of the mismatch that exists between their crystal structures.

"People have an idea, from computations, of what the interfaces as a whole may be doing, and they know from experiments what their combined global effect is. What they don't know is what exactly one individual interface is doing and what specific role the nanoscale dimensions play," says Greer. "And that's what we were able to investigate."

Peri Landau and Guo Qiang, both postdoctoral scholars in Greer's lab at the time of this study, used a chemical procedure called electroplating to grow either miniature pillars of pure copper or pillars containing exactly one interface—in which an iron crystal sits atop a copper crystal. Then, working with partners at Sandia and Los Alamos, in order to replicate the effect of helium irradiation, they implanted those nanopillars with helium ions, both directly at the interface and, in separate experiments, throughout the pillar.

The researchers then used a one-of-a-kind nanomechanical testing instrument, called the SEMentor, which is located in the subbasement of the W. M. Keck Engineering Laboratories building at Caltech, to both compress the tiny pillars and pull on them as a way to learn about the mechanical properties of the pillars—how their length changed when a certain stress was applied, and where they broke, for example. 

"These experiments are very, very delicate," Landau says. "If you think about it, each one of the pillars—which are only 100 nanometers wide and about 700 nanometers long—is a thousand times thinner than a single strand of hair. We can only see them with high-resolution microscopes."

The team found that once they inserted a small amount of helium into a pillar at the interface between the iron and copper crystals, the pillar's strength increased by more than 60 percent compared to a pillar without helium. That much was expected, Landau explains, because "irradiation hardening is a well-known phenomenon in bulk materials." However, she notes, such hardening is typically linked with embrittlement, "and we do not want materials to be brittle."

Surprisingly, the researchers found that in their nanopillars, the increase in strength did not come along with embrittlement, either when the helium was implanted at the interface, or when it was distributed more broadly. Indeed, Greer and her team found, the material was able to maintain its ductility because the interface itself was able to deform gradually under stress.

This means that in a metallic nanolaminate material, small helium bubbles are able to migrate to an interface, which is never more than a few tens of nanometers away, essentially healing the material. "What we're showing is that it doesn't matter if the bubble is within the interface or uniformly distributed—the pillars don't ever fail in a catastrophic, abrupt fashion," Greer says. She notes that the implanted helium bubbles—which are described in the Advanced Functional Materials paper—were one to two nanometers in diameter; in future studies, the group will repeat the experiment with larger bubbles at higher temperatures in order to represent additional conditions related to radiation damage.

In the Small paper, the researchers showed that even nanopillars made entirely of copper, with no layering of metals, exhibited irradiation-induced hardening. That stands in stark contrast to the results from previous work by other researchers on proton-irradiated copper nanopillars, which exhibited the same strengths as those that had not been irradiated. Greer says that this points to the need to evaluate different types of irradiation-induced defects at the nanoscale, because they may not all have the same effects on materials.

While no one is likely to be building nuclear reactors out of nanopillars anytime soon, Greer argues that it is important to understand how individual interfaces and nanostructures behave. "This work is basically teaching us what gives materials the ability to heal radiation damage—what tolerances they have and how to design them," she says. That information can be incorporated into future models of material behavior that can help with the design of new materials.

Along with Greer, Landau, and Qiang, Khalid Hattar of Sandia National Laboratories is also a coauthor on the paper "The Effect of He Implantation on the Tensile Properties and Microstructure of Cu/Fe Nano-bicrystals," which appears online in Advanced Functional Materials. Peter Hosemann of UC Berkeley and Yongqiang Wang of Los Alamos National Laboratory are coauthors on the paper "Helium Implantation Effects on the Compressive Response of Cu Nanopillars," which appears online in the journal Small. The work was supported by the U.S. Department of Energy and carried out, in part, in the Kavli Nanoscience Institute at Caltech.

Writer: Kimm Fesenmaier

A Fresh Look at Psychiatric Drugs

Caltech researchers propose a new approach to understanding common treatments

Drugs for psychiatric disorders such as depression and schizophrenia often require weeks to take full effect. "What takes so long?" has long been one of psychiatry's most stubborn mysteries. Now a fresh look at previous research on quite a different drug—nicotine—is providing answers. The new ideas may point the way toward new generations of psychiatric drugs that work faster and better.

For several years, Henry Lester, Bren Professor of Biology at Caltech, and his colleagues have worked to understand nicotine addiction by repeatedly exposing nerve cells to the drug and studying the effects. At first glance, it's a simple story: nicotine binds to, and activates, specific nicotine receptors on the surface of nerve cells within a few seconds of being inhaled. But nicotine addiction develops over weeks or months; and so the Caltech team wanted to know what changes in the nerve cell during that time, hidden from view.

The story that developed is that nicotine infiltrates deep into the cell, entering a protein-making structure called the endoplasmic reticulum and increasing its output of the same nicotine receptors. These receptors then travel to the cell's surface. In other words, nicotine acts "inside out," directing actions that ultimately fuel and support the body's addiction to nicotine.

"That nicotine works 'inside out' was a surprise a few years ago," says Lester. "We originally thought that nicotine acted only from the outside in, and that a cascade of effects trickled down to the endoplasmic reticulum and the cell's nucleus, slowly changing their function."

In a new research review paper, published in Biological Psychiatry, Lester—along with senior research fellow Julie M. Miwa and postdoctoral scholar Rahul Srinivasan—proposes that psychiatric medications may work in the same "inside-out" fashion—and that this process explains how it takes weeks rather than hours or days for patients to feel the full effect of such drugs.

"We've known what happens within minutes and hours after a person takes Prozac, for example," explains Lester. "The drug binds to serotonin uptake proteins on the cell surface, and prevents the neurotransmitter serotonin from being reabsorbed by the cell. That's why we call Prozac a selective serotonin reuptake inhibitor, or SSRI." While the new hypothesis preserves that idea, it also presents several arguments for the idea that the drugs also enter into the bodies of the nerve cells themselves.

There, the drugs would enter the endoplasmic reticulum similarly to nicotine and then bind to the serotonin uptake proteins as they are being synthesized. The result, Lester hypothesizes, is a collection of events within neurons that his team calls "pharmacological chaperoning, matchmaking, escorting, and abduction." These actions—such as providing more stability for various proteins—could improve the function of those cells, leading to therapeutic effects in the patient. But those beneficial effects would occur only after the nerve cells have had time to make their intracellular changes and to transport those changes to the ends of axons and dendrites.

"These 'inside-out' hypotheses explain two previously mysterious actions," says Lester. "On the one hand, the ideas explain the long time required for the beneficial actions of SSRIs and antischizophrenic drugs. But on the other hand, newer, and very experimental, antidepressants act within hours. Binding within the endoplasmic reticulum of dendrites, rather than near the nucleus, might underlie those actions."

Lester and his colleagues first became interested in nicotine's effects on neural disorders because of a striking statistic: a long-term tobacco user has a roughly twofold lower chance of developing Parkinson's disease. Because there is no medical justification for using tobacco, Lester's group wanted more information about this inadvertent beneficial action of nicotine. They knew that stresses on the endoplasmic reticulum, if continued for years, could harm a cell. Earlier this year, they reported that nicotine's "inside-out" action appears to reduce endoplasmic reticulum stress, which could prevent or forestall the onset of Parkinson's disease.

Lester hopes to test the details of "inside-out" hypotheses for psychiatric medication. First steps would include investigating the extent to which psychiatric drugs enter cells and bind to their nascent receptors in the endoplasmic reticulum. The major challenge is to discover which other proteins and genes, in addition to the targets, participate in "matchmaking, escorting, and abduction."

"Present-day psychiatric drugs have a lot of room for improvement," says Lester. "Systematic research to produce better psychiatric drugs has been hampered by our ignorance of how they work. If the hypotheses are proven and the intermediate steps clarified, it may become possible to generate better medications."

 

Writer: Caltech Communications

How I Landed on Mars

Caltech graduate students on the MSL mission drive science, the rover, and their careers

Caltech geology graduate student Katie Stack says her Caltech experience has provided her with the best of both worlds. Literally.

As one of five Caltech graduate students currently staffing the Mars Science Laboratory mission, Stack is simultaneously exploring the geologic pasts of both Mars and Earth. She and her student colleagues apply their knowledge of Earth's history and environment—gleaned from Caltech classes and field sites across the globe—to the analysis of Curiosity's discoveries as well as the hunt for evidence of past life on the Red Planet.

"Mars exploration is that perfect combination of understanding what is close to home and far afield," says Stack, who studies sedimentology and stratigraphy in the lab of John Grotzinger, the mission's project scientist and Caltech's Fletcher Jones Professor of Geology.

"The mission is providing a different perspective for seeing the world . . . as well as for seeing myself," she adds. "As a graduate student, you often struggle with your place in your academic community, and taking part in the mission is one of the ways that we are just thrown into the mix. We are working on the same level as a bunch of senior scientists, who have a lot of experience, and yet they are asking us questions—seeking our expertise. That's an experience you don't often get to have."

Caltech's graduate student participants on the MSL—who include Stack, Kirsten Siebach, Lauren Edgar, Jeff Marlow, and Hayden Miller, all from the Division of Geological and Planetary Sciences—represent the largest contingent of students from any one institution in a mission that has more than 400 participating scientists. Caltech's strong student presence is aided in large part by the leadership role that faculty are playing in the mission as well as the Institute's close proximity to mission control at the Jet Propulsion Laboratory (JPL), which Caltech manages for NASA.

Caltech's graduate students are among the mission personnel responsible for sequencing the scientific plan and programming the rover each day, as well as for documenting the scientific discussion and decisions at each step of the mission. As the mission's blogger, Marlow also helps share the science team's work with the public.

"The graduate students are the heart of the mission," Grotzinger says. "They are the keepers of the plan and are able to efficiently operate the technology to run the rover every day, especially when senior scientists are unable to do so."

Making the science plans for the rover, says graduate student Kirsten Siebach, is "as close as I get to driving the rover. I can help program it to take pictures, analyze samples, and shoot the ChemCam laser."

"It's always fun when something that I helped command the rover to do, like take a picture, ends up making the news," she adds. "I helped command it to take one such picture of the Hottah outcrop that showed evidence of an ancient streambed."

In addition to staffing operations for the mission, Caltech's students are also key contributors to the scientific analysis of the data and help make decisions about where Curiosity goes.

Before Curiosity landed, for instance, Stack and Lauren Edgar helped compile a geologic map of the Gale Crater landing ellipse, using orbital images to identify the geologic diversity and relationships among rocks. Their work has continued to serve as a "road map" for the rover's research. Meanwhile, Siebach has been exploring the history of water on Mars, looking at the geomorphology of channel structures and fractures on the planet.

"We really have grown up in the golden age of Mars exploration," Edgar says, noting that while at Caltech, she's had the opportunity to contribute to three Mars rover missions—Spirit, Opportunity, and now Curiosity. "They just keep getting better and better."

In addition to the graduate students, several undergraduate students have taken part in the mission, participating through Caltech's Summer Undergraduate Research Fellowships (SURF) program. This past summer, a student working with Bethany Ehlmann, an assistant professor of planetary science at Caltech and an MSL science team member, helped to characterize and classify hundreds of Earth rock samples for potential comparison with Mars specimens that will be analyzed by the ChemCam instrument. Meanwhile, over the past two summers, Solomon Chang, a Caltech sophomore studying computer science, worked with JPL engineers to model Curiosity's mobility to ensure that it would actually move on Mars as it had been programmed to do.

Those summer projects have ended, but for the Caltech grad students on the MSL team, the work continues. Indeed, says Grotzinger, because many of the mission's scientists will be leaving Pasadena to return to their home institutions during the coming months, the grad students will be called upon to fill additional roles in the rover's daily operation and science.

"One of the great things about working on a mission as a student is that science is a fairly merit-driven process," says Ehlmann, who participated in Mars exploration missions as both an undergraduate and graduate student. "So if you have a good idea and you are there, you can contribute to deciding what measurements to make, can develop hypothesis about what's going on. It's a very inspiring and empowering experience."


Developmental Bait and Switch

Caltech-led team discovers enzyme responsible for neural crest cell development

PASADENA, Calif.—During the early developmental stages of vertebrates—animals that have a backbone and spinal column, including humans—cells undergo extensive rearrangements, and some cells migrate over large distances to populate particular areas and assume novel roles as differentiated cell types. Understanding how and when such cells switch their purpose in an embryo is an important and complex goal for developmental biologists. A recent study, led by researchers at the California Institute of Technology (Caltech), provides new clues about this process—at least in the case of neural crest cells, which give rise to most of the peripheral nervous system, to pigment cells, and to large portions of the facial skeleton.

"There has been a long-standing mystery regarding why some cells in the developing embryo start out as part of the future central nervous system, but leave to populate peripheral parts of the body," says Marianne Bronner, the Albert Billings Ruddock Professor of Biology at Caltech and corresponding author of the paper, published in the November 1 issue of the journal Genes & Development. "In this paper, we find that an important type of enzyme called DNA-methyltransferase, or DNMT, acts as a switch, determining which cells will remain part of the central nervous system, and which will become neural crest cells."

According to Bronner, DNMT arranges this transition by silencing expression of the genes that promote central nervous system (CNS) identity, thereby giving the cells the green light to become neural crest, migrate, and do new things, like help build a jaw bone. The team came to this conclusion after analyzing the actions of one type of DNMT—DNMT3A—at different stages of development in a chicken embryo.

This is important, says Bronner, because while most scientists who study the function of DNMTs use embryonic stem cells that can be maintained in culture, her team is "studying events that occur in living embryos as opposed to cells grown under artificial conditions," she explains.

"It is somewhat counterintuitive that this kind of shutting off of genes is essential for promoting neural crest cell fate," she says. "Embryonic development often involves switches in the types of inputs that a cell receives. This is an example of a case where a negative factor must be turned off—essentially a double negative—in order to achieve a positive outcome."

Bronner says it was also surprising to see that an enzyme like DNMT has such a specific function at a specific time. DNMTs are sometimes thought to act in every cell, she says, yet the researchers have discovered a function for this enzyme that is exquisitely controlled in space and time.

"It is amazing how an enzyme, at a given time point during development, can play such a specific role of making a key developmental decision within the embryo," says Na Hu, a graduate student in Bronner's lab and lead author of the paper. "Our findings can be applied to stem cell therapy, by giving clues about how to engineer other cell types or stem cells to become neural crest cells."

Bronner points out that their work relates to the discovery, which won a recent Nobel Prize in Physiology or Medicine, that it is possible to "reprogram" cells taken from adult tissue. These induced pluripotent stem (iPS) cells are similar to embryonic stem cells, and many investigators are attempting to define the conditions needed for them to differentiate into particular cell types, including neural crest derivatives.

"Our results showing that DNMT is important for converting CNS cells to neural crest cells will be useful in defining the steps needed to reprogram such iPS cells," she says. "The iPS cells may in turn be useful for repair in human diseases such as familial dysautonomia, a disease in which there is depletion of autonomic and sensory neurons that are neural crest–derived; for repair of jaw bones lost in osteonecrosis; and for many other potential treatments."

In the short term, the team will explore the notion that DNMT enzymes may have different functions in the embryo at different places and times. That's why the next step in their research, says Bronner, is to examine the later role of these enzymes in nervous-system development, such as whether or not they affect the length of time during which the CNS is able to produce neural crest cells.

Additional authors on the paper, titled "DNA methyltransferase3A as a molecular switch mediating the neural tube-to-neural crest fate transition," are Pablo Strobl-Mazzulla from the Laboratorio de Biología del Desarrollo in Chascomús, Argentina, and Tatjana Sauka-Spengler from the Weatherall Institute of Molecular Medicine at the University of Oxford. The work was supported by the National Institutes of Health and the United States Public Health Service.

Writer: Katie Neith

Progress for Paraplegics

Caltech investigators expand project to restore functions to people with spinal-cord injuries

In May 2011, a new therapy created in part by Caltech engineers enabled a paraplegic man to stand and move his legs voluntarily. Now those same researchers are working on a way to automate their system, which provides epidural electrical stimulation to the lower spinal cord. Their goal is for the system to soon be made available to rehab clinics—and thousands of patients—worldwide.

That first patient—former athlete Rob Summers, who had been completely paralyzed below the chest following a 2006 accident—performed remarkably well with the electromechanical system. Although it wasn't initially part of the testing protocol established by the Food and Drug Administration, the FDA allowed Summers to take the entire system with him when he left the Frazier Rehab Institute in Louisville—where his postsurgical physical therapy was done—provided he returns every three months for a checkup.

Joel Burdick, the Richard L. and Dorothy M. Hayman Professor of Mechanical Engineering and Bioengineering at Caltech, and Yu-Chong Tai, a Caltech professor of electrical engineering and mechanical engineering, helped create the therapy, which involves the use of a sheetlike array of electrodes that stimulate Summers' neurons and thus activate the circuits in his lower spinal cord that control standing and stepping. The approach has subsequently been successfully tested on a second paraplegic, and therapists are about to finish testing a third subject, who has shown positive results.

But Tai and Burdick want to keep the technology, as well as the subjects, moving forward. To that end, Tai is developing new versions of the electrode array currently approved for human implantation; these will improve patients' stepping motions, among other advances, and they will be easier to implant. Burdick is also working on a way to let a computer control the pattern of electrical stimulation applied to the spinal cord.

"We need to go further," Burdick says. "And for that, we need new technology."

Because spinal-cord injuries vary from patient to patient, deploying the system has required constant individualized adjustments by clinicians and researchers at the Frazier Institute, a leading center for spinal-cord rehabilitation. "Right now there are 16 electrodes in the array, and for each individual electrode, we send a pulse, which can be varied for amplitude and frequency to cause a response in the patient," Burdick says. Using the current method, he notes, "it takes substantial effort to test all the variables to find the optimum setting for a patient for each of the different functions we want to activate."

The team of investigators, which also includes researchers from UCLA and the University of Louisville, has until now used intelligent guesswork to determine which stimuli might work best. But soon, using a new algorithm developed by Burdick, they will be able to rely on a computer to determine the optimum stimulation levels, based on the patient's response to previous stimuli. This would allow patients to go home after the extensive rehab process with a system that could be continually adjusted by computer—saving Summers and the other patients many of those inconvenient trips back to Louisville. Doctors and technicians could monitor patients' progress remotely.
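
As a rough illustration of what such closed-loop tuning might look like—emphatically not Burdick's actual algorithm—here is a toy search over a hypothetical grid of amplitudes and frequencies, in which each new setting is proposed near the best response measured so far. The response model and all parameter values are placeholders.

```python
# Toy closed-loop tuning sketch (placeholder, not the team's method): propose
# a stimulation setting near the current best, measure the response, and keep
# the new setting if the response improves.
import random

amplitudes = [1.0, 2.0, 3.0, 4.0, 5.0]   # mA, hypothetical grid
frequencies = [10, 20, 30, 40]           # Hz, hypothetical grid
settings = [(a, f) for a in amplitudes for f in frequencies]

def measure_response(amp, freq):
    """Stand-in for the patient's measured motor response (e.g., an EMG score)."""
    return -((amp - 3.2) ** 2 + 0.01 * (freq - 30) ** 2) + random.gauss(0, 0.1)

def neighbors(setting):
    """Settings at most one grid step away in amplitude and/or frequency."""
    a_i, f_i = amplitudes.index(setting[0]), frequencies.index(setting[1])
    out = []
    for da in (-1, 0, 1):
        for df in (-1, 0, 1):
            if 0 <= a_i + da < len(amplitudes) and 0 <= f_i + df < len(frequencies):
                out.append((amplitudes[a_i + da], frequencies[f_i + df]))
    return out

current = random.choice(settings)
best_score = measure_response(*current)
for trial in range(20):
    candidate = random.choice(neighbors(current))   # try a nearby setting
    score = measure_response(*candidate)
    if score > best_score:                          # keep it if the response improves
        current, best_score = candidate, score

print(f"best setting found: {current[0]} mA at {current[1]} Hz (score {best_score:.2f})")
```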

In addition to providing the subjects with continued benefits from the use of the device, there are other practical reasons for wanting to automate the system. An automated system would be easier to share with other hospitals and clinics around the world, Burdick says, and without a need for intensive training, it could lower the cost.

The FDA has approved testing the system in five spinal-cord injury patients, including the three already enrolled in the trial; Burdick is planning to test the new computerized version in the fourth patient, as well as in Rob Summers during 2013. Once the investigators have completed testing on all five patients, Burdick says, the team will spend time analyzing the data before deciding how to improve the system and expand its use.

The strategy is not a cure for paraplegics, but a tool that can be used to help improve the quality of their health, Burdick says. The technology could also complement stem-cell therapies or other methods that biologists are working on to repair or circumvent the damage to nervous systems that results from spinal-cord injury.

"There's not going to be one silver bullet for treating spinal-cord injuries," Burdick says. "We think that our technique will play a role in the rehabilitation of spinal-cord injury patients, but a more permanent cure will likely come from biological solutions."

Even with the limitations of the current system, Burdick says, the results have exceeded his expectations.

"All three subjects stood up within 48 hours of turning on the array," Burdick says. "This shows that the first patient wasn't a fluke, and that many aspects of the process are repeatable." In some ways, the second and third patients are performing even better than Summers, though it will be some time before the team can fully analyze those results. "We were expecting variations because of the distinct differences in the patients' injuries. Rob gave us a starting point, and now we've learned how to tune the array for each patient and to make adjustments as each patient changes over time.

"I do this work because I love it," Burdick says. "When you work with these people and get to know them and see how they are improving, it's personally inspiring."


Reconsidering the Global Thermostat

Caltech researcher and colleagues show outcome of geoengineering can be tunable

PASADENA, Calif.—From making clouds whiter and injecting aerosols into the stratosphere, to building enormous sunshades in space, people have floated many ideas about how the planet's climate could be manipulated to counteract the effects of global warming—a concept known as geoengineering. Many of the ideas involve deflecting incoming sunlight before it has a chance to further warm the earth. Because this could affect areas of the planet inequitably, geoengineering has often raised an ethical question: Whose hand would control the global thermostat?

Now a team of researchers from the California Institute of Technology (Caltech), Harvard University, and the Carnegie Institution says there doesn't have to be just a single global control. Using computer modeling, they have shown that varying the amount of sunlight deflected away from the earth by season and by region can significantly improve the parity of the situation. The results appear in an advance online publication of the journal Nature Climate Change.

Previous geoengineering studies have typically assumed uniform deflection of sunlight everywhere on the planet. But the pattern of temperature and precipitation effects that would result from such efforts would never compensate perfectly for the complex pattern of changes that have resulted from global warming. Some areas would end up better off than others, and the climate effects are complex. For example, as the planet warms, the poles are heating up more than the tropics. However, in models where sunlight is deflected uniformly, when enough sunlight is redirected to compensate for this polar warming, the tropics end up colder than they were before man-made activities pumped excess carbon dioxide into the atmosphere.

In the new study, the researchers worked with a climate model of relatively coarse resolution. Rather than selecting one geoengineering strategy, they mimicked the desired effect of many projects by simply "turning down the sun"—decreasing the amount of sunlight reaching the planet. Instead of turning down the sun uniformly, they tailored when and where they reduced incoming sunlight, looking at 15 different combinations. In one, for example, they turned down the sun between January and March while also turning it down more at the poles than at the tropics.

"That essentially gives us 15 knobs that we can tune in order to try to minimize effects at the worst-off regions on the planet," says Doug MacMartin, a senior research associate at Caltech and lead author of the new paper. "In our model, we were able to reduce the residual climate changes (after geoengineering) in the worst-off regions by about 30 percent relative to what could be achieved using a uniform reduction in sunlight."

The group also found that by varying where and when sunlight was reduced, they needed to turn down the sun just 70 percent as much as they would in uniform reflectance to get a similar result. "Based on this work, it's at least plausible that there are ways that you could implement a geoengineering solution that would have less severe consequences, such as a reduced impact on ozone," MacMartin says.

The researchers also used the tuning approach to focus on recovering Arctic sea ice. In their model, it took only one-fifth as much solar reduction as in the uniform reflectance models to recover the Arctic sea ice to the extent typical of pre-Industrial years.

"These results indicate that varying geoengineering efforts by region and over different periods of time could potentially improve the effectiveness of solar geoengineering and reduce climate impacts in at-risk areas," says Ken Caldeira of the Carnegie Institution. "For example, these approaches may be able to reverse long-term changes in the Arctic sea ice."

The group acknowledges that geoengineering ideas are untested and could come with serious consequences, such as making the skies whiter and depleting the ozone layer, not to mention the unintended consequences that tend to arise when dealing with such a complicated system as the planet. They also say that the best solution would be to reduce greenhouse gas emissions. "I'm approaching it as an engineering problem," MacMartin says. "I'm interested in whether we can come up with a better way of doing the geoengineering that minimizes the negative consequences."  

In addition to MacMartin and Caldeira, David Keith of Harvard University and Ben Kravitz, formerly of the Carnegie Institution but now at the DOE's Pacific Northwest National Lab, are also coauthors on the paper, "Management of trade-offs in geoengineering through optimal choice of non-uniform radiative forcing."

Writer: Kimm Fesenmaier

Technology Has Improved Voting Procedures

New report assesses voting procedures over the last decade

PASADENA, Calif.—Thanks to better voting technology over the last decade, the country's election process has seen much improvement, according to a new report released today by researchers at Caltech and MIT. However, the report notes, despite this progress, some problems remain.

Spurred by the debacle of hanging chads and other voting problems during the 2000 presidential election, the Voting Technology Project (VTP) was started by Caltech and MIT to bring together researchers from across disciplines to figure out how to improve elections. The VTP issued its first report in 2001.

"Since that report came out and since our project was formed, a lot of progress has been made in improving how American elections are run," says Michael Alvarez, professor of political science at Caltech and codirector of the VTP.

For example, the report found that getting rid of outdated voting machines has caused a drop in the number of votes lost to ballot errors. To assess how many votes are lost in each election due to voting mistakes, the researchers calculate the number of residual votes—the difference between the total number of ballots cast and the number of votes counted for a particular office. If there are no voting errors, there should be no residual votes.
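
As a concrete illustration of that calculation, here is a small sketch with invented county-level totals:

```python
# Residual vote rate: share of ballots on which no vote was counted for the
# office in question. The county totals below are invented for the example.
counties = [
    # (ballots cast, votes counted for the office at the top of the ticket)
    (120_000, 118_700),
    ( 45_000,  44_650),
    ( 80_500,  79_900),
]

ballots = sum(cast for cast, _ in counties)
counted = sum(votes for _, votes in counties)
residual_rate = (ballots - counted) / ballots

print(f"residual vote rate: {residual_rate:.2%}")   # votes lost to ballot errors
```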

In their first report in 2001, the researchers found that older voting technology—like punch cards—led to a high residual vote rate. But their new research now shows that the rate has dropped. In particular, Charles Stewart III, a professor of political science at MIT and the other codirector of the VTP, and his colleagues found that the residual vote rate decreased from 2 percent in 2000 to 1 percent in 2006 and 2008, meaning that fewer votes were lost due to voting errors. The drop was greater in states that instituted more modern voting technology.

"As we moved away from punch cards, lever machines, and paper ballots and towards optical scan systems and electronic systems that have voter verification, we have seen the voter residual rate plummet," Alvarez says. Voter-verification technology gives voters immediate feedback if they make a mistake—by filling in a circle incorrectly, for example—and a chance to correct their error to ensure that their votes are counted.

In addition, the report urges officials to continue and expand election auditing to study the accuracy of registration and voting procedures. For example, after an election, officials can recount ballots to make sure the electronic ballot counters are accurate. "Postelection ballot auditing is a great idea and states need to continue their efforts to use those election ballot-auditing procedures to increase the amount of confidence and integrity of elections," Alvarez says.

The researchers also describe concern with the rise of absentee and early voting, since voter verification is much harder to do via mail. Unlike with in-person voting, these methods offer no immediate feedback about whether a ballot was filled out correctly or if it got counted at all. Once you put your ballot in the mailbox, it's literally out of your hands.

The report also weighs in on voter-identification laws, which have been proposed in many states and subsequently challenged in court. Proponents say they are necessary to prevent voter fraud while opponents argue that there is little evidence that such fraud exists. Moreover, opponents say, voter identification laws make it much more difficult for people without government-issued IDs to vote. But, the report says, technology may resolve the conflict.

"Technology may help ensure voter authentication while alleviating or mitigating the costs that are imposed on voters by laws requiring state-issued identification," says Jonathan Katz, the Kay Sugahara Professor of Social Sciences and Statistics and coauthor of the VTP report.

For example, polling places can have access to a database of registered voters that is also linked to the state's database of DMV photos. A voter's identification can then be confirmed without them having to carry a photo ID. For voters who do not have an ID, the polling place can be equipped with a camera to take an ID picture immediately. The photo can then be entered into the database to verify identification in future elections.

The complete report and more information about the VTP are available online.

In addition to Alvarez, Stewart, and Katz, the other authors of the Caltech/MIT VTP report are Stephen Ansolabehere of Harvard, Thad Hall of the University of Utah, and Ronald Rivest of MIT. The report was supported by the Carnegie Corporation of New York. The project has been supported by the John S. and James L. Knight Foundation and the Pew Charitable Trusts.

Writer: Marcus Woo