How I Landed on Mars

Caltech graduate students on the MSL mission drive science, the rover, and their careers

Caltech geology graduate student Katie Stack says her Caltech experience has provided her with the best of both worlds. Literally.

As one of five Caltech graduate students currently staffing the Mars Science Laboratory mission, Stack is simultaneously exploring the geologic pasts of both Mars and Earth. She and her student colleagues apply their knowledge of Earth's history and environment—gleaned from Caltech classes and field sites across the globe—to the analysis of Curiosity's discoveries as well as the hunt for evidence of past life on the Red Planet.

"Mars exploration is that perfect combination of understanding what is close to home and far afield," says Stack, who studies sedimentology and stratigraphy in the lab of John Grotzinger, the mission's project scientist and Caltech's Fletcher Jones Professor of Geology.

"The mission is providing a different perspective for seeing the world . . . as well as for seeing myself," she adds. "As a graduate student, you often struggle with your place in your academic community, and taking part in the mission is one of the ways that we are just thrown into the mix. We are working on the same level as a bunch of senior scientists, who have a lot of experience, and yet they are asking us questions—seeking our expertise. That's an experience you don't often get to have."

Caltech's graduate student participants on the MSL—who include Stack, Kirsten Siebach, Lauren Edgar, Jeff Marlow, and Hayden Miller, all from the Division of Geological and Planetary Sciences—represent the largest contingent of students from any one institution in a mission that has more than 400 participating scientists. Caltech's strong student presence is aided in large part by the leadership role that faculty are playing in the mission as well as the Institute's close proximity to mission control at the Jet Propulsion Laboratory (JPL), which Caltech manages for NASA.

Caltech's graduate students are among the mission personnel responsible for sequencing the scientific plan and programming the rover each day, as well as for documenting the scientific discussion and decisions at each step of the mission. As the mission's blogger, Marlow also helps share the science team's work with the public.

"The graduate students are the heart of the mission," Grotzinger says. "They are the keepers of the plan and are able to efficiently operate the technology to run the rover every day, especially when senior scientists are unable to do so."

Making the science plans for the rover, says graduate student Kirsten Siebach, is "as close as I get to driving the rover. I can help program it to take pictures, analyze samples, and shoot the ChemCam laser."

"It's always fun when something that I helped command the rover to do, like take a picture, ends up making the news," she adds. "I helped command it to take one such picture of the Hottah outcrop that showed evidence of an ancient streambed."

In addition to staffing operations for the mission, Caltech's students are also key contributors to the scientific analysis of the data and help make decisions about where Curiosity goes.

Before Curiosity landed, for instance, Stack and Edgar helped compile a geologic map of the Gale Crater landing ellipse, using orbital images to identify the geologic diversity and relationships among rocks. Their work has continued to serve as a "road map" for the rover's research. Meanwhile, Siebach has been exploring the history of water on Mars, looking at the geomorphology of channel structures and fractures on the planet.

"We really have grown up in the golden age of Mars exploration," Edgar says, noting that while at Caltech, she's had the opportunity to contribute to three Mars rover missions—Spirit, Opportunity, and now Curiosity. "They just keep getting better and better."

In addition to the graduate students, several undergraduate students have taken part in the mission, participating through Caltech's Summer Undergraduate Research Fellowships (SURF) program. This past summer, a student working with Bethany Ehlmann, an assistant professor of planetary science at Caltech and an MSL science team member, helped to characterize and classify hundreds of Earth rock samples for potential comparison with Mars specimens that will be analyzed by the ChemCam instrument. Meanwhile, over the past two summers, Solomon Chang, a Caltech sophomore studying computer science, worked with JPL engineers to model Curiosity's mobility to ensure that it would actually move on Mars as it had been programmed to do.

Those summer projects have ended, but for the Caltech grad students on the MSL team, the work continues. Indeed, says Grotzinger, because many of the mission's scientists will be leaving Pasadena to return to their home institutions during the coming months, the grad students will be called upon to fill additional roles in the rover's daily operation and science.

"One of the great things about working on a mission as a student is that science is a fairly merit-driven process," says Ehlmann, who participated in Mars exploration missions as both an undergraduate and graduate student. "So if you have a good idea and you are there, you can contribute to deciding what measurements to make and can develop hypotheses about what's going on. It's a very inspiring and empowering experience."

Exclude from News Hub: 
News Type: 
Research News

Developmental Bait and Switch

Caltech-led team discovers enzyme responsible for neural crest cell development

PASADENA, Calif.—During the early developmental stages of vertebrates—animals that have a backbone and spinal column, including humans—cells undergo extensive rearrangements, and some cells migrate over large distances to populate particular areas and assume novel roles as differentiated cell types. Understanding how and when such cells switch their purpose in an embryo is an important and complex goal for developmental biologists. A recent study, led by researchers at the California Institute of Technology (Caltech), provides new clues about this process—at least in the case of neural crest cells, which give rise to most of the peripheral nervous system, to pigment cells, and to large portions of the facial skeleton.

"There has been a long-standing mystery regarding why some cells in the developing embryo start out as part of the future central nervous system, but leave to populate peripheral parts of the body," says Marianne Bronner, the Albert Billings Ruddock Professor of Biology at Caltech and corresponding author of the paper, published in the November 1 issue of the journal Genes & Development. "In this paper, we find that an important type of enzyme called DNA-methyltransferase, or DNMT, acts as a switch, determining which cells will remain part of the central nervous system, and which will become neural crest cells."

According to Bronner, DNMT arranges this transition by silencing expression of the genes that promote central nervous system (CNS) identity, thereby giving the cells the green light to become neural crest, migrate, and do new things, like help build a jaw bone. The team came to this conclusion after analyzing the actions of one type of DNMT—DNMT3A—at different stages of development in a chicken embryo.

This is important, says Bronner, because while most scientists who study the function of DNMTs use embryonic stem cells that can be maintained in culture, her team is "studying events that occur in living embryos as opposed to cells grown under artificial conditions."

"It is somewhat counterintuitive that this kind of shutting off of genes is essential for promoting neural crest cell fate," she says. "Embryonic development often involves switches in the types of inputs that a cell receives. This is an example of a case where a negative factor must be turned off—essentially a double negative—in order to achieve a positive outcome."

Bronner says it was also surprising to see that an enzyme like DNMT has such a specific function at a specific time. DNMTs are sometimes thought to act in every cell, she says, yet the researchers have discovered a function for this enzyme that is exquisitely controlled in space and time.

"It is amazing how an enzyme, at a given time point during development, can play such a specific role of making a key developmental decision within the embryo," says Na Hu, a graduate student in Bronner's lab and lead author of the paper. "Our findings can be applied to stem cell therapy, by giving clues about how to engineer other cell types or stem cells to become neural crest cells."

Bronner points out that their work relates to the discovery, which won a recent Nobel Prize in Physiology or Medicine, that it is possible to "reprogram" cells taken from adult tissue. These induced pluripotent stem (iPS) cells are similar to embryonic stem cells, and many investigators are attempting to define the conditions needed for them to differentiate into particular cell types, including neural crest derivatives.

"Our results showing that DNMT is important for converting CNS cells to neural crest cells will be useful in defining the steps needed to reprogram such iPS cells," she says. "The iPS cells may in turn be useful for repair in human diseases such as familial dysautonomia, a disease in which there is depletion of autonomic and sensory neurons that are neural crest–derived; for repair of jaw bones lost in osteonecrosis; and for many other potential treatments."

In the short term, the team will explore the notion that DNMT enzymes may have different functions in the embryo at different places and times. That's why the next step in their research, says Bronner, is to examine the later role of these enzymes in nervous-system development, such as whether they affect the length of time during which the CNS is able to produce neural crest cells.

Additional authors on the paper, titled "DNA methyltransferase3A as a molecular switch mediating the neural tube-to-neural crest fate transition," are Pablo Strobl-Mazzulla from the Laboratorio de Biología del Desarrollo in Chascomús, Argentina, and Tatjana Sauka-Spengler from the Weatherall Institute of Molecular Medicine at the University of Oxford. The work was supported by the National Institutes of Health and the United States Public Health Service.

Katie Neith

Progress for Paraplegics

Caltech investigators expand project to restore functions to people with spinal-cord injuries

In May 2011, a new therapy created in part by Caltech engineers enabled a paraplegic man to stand and move his legs voluntarily. Now those same researchers are working on a way to automate their system, which provides epidural electrical stimulation to the lower spinal cord. Their goal is for the system to soon be made available to rehab clinics—and thousands of patients—worldwide.

That first patient—former athlete Rob Summers, who had been completely paralyzed below the chest following a 2006 accident—performed remarkably well with the electromechanical system. Although it wasn't initially part of the testing protocol established by the Food and Drug Administration, the FDA allowed Summers to take the entire system with him when he left the Frazier Rehab Institute in Louisville—where his postsurgical physical therapy was done—provided he returned every three months for a checkup.

Joel Burdick, the Richard L. and Dorothy M. Hayman Professor of Mechanical Engineering and Bioengineering at Caltech, and Yu-Chong Tai, a Caltech professor of electrical engineering and mechanical engineering, helped create the therapy, which involves the use of a sheetlike array of electrodes that stimulate Summers' neurons and thus activate the circuits in his lower spinal cord that control standing and stepping. The approach has subsequently been successfully tested on a second paraplegic, and therapists are about to finish testing a third subject, who has shown positive results.

But Tai and Burdick want to keep the technology, as well as the subjects, moving forward. To that end, Tai is developing new versions of the electrode array currently approved for human implantation; these will improve patients' stepping motions, among other advances, and they will be easier to implant. Burdick is also working on a way to let a computer control the pattern of electrical stimulation applied to the spinal cord.

"We need to go further," Burdick says. "And for that, we need new technology."

Because spinal-cord injuries vary from patient to patient, deploying the system has required constant individualized adjustments by clinicians and researchers at the Frazier Institute, a leading center for spinal-cord rehabilitation. "Right now there are 16 electrodes in the array, and for each individual electrode, we send a pulse, which can be varied for amplitude and frequency to cause a response in the patient," Burdick says. Using the current method, he notes, "it takes substantial effort to test all the variables to find the optimum setting for a patient for each of the different functions we want to activate."

The team of investigators, which also includes researchers from UCLA and the University of Louisville, has until now used intelligent guesswork to determine which stimuli might work best. But soon, using a new algorithm developed by Burdick, they will be able to rely on a computer to determine the optimum stimulation levels, based on the patient's response to previous stimuli. This would allow patients to go home after the extensive rehab process with a system that could be continually adjusted by computer—saving Summers and the other patients many of those inconvenient trips back to Louisville. Doctors and technicians could monitor patients' progress remotely.

In addition to providing the subjects with continued benefits from the use of the device, there are other practical reasons for wanting to automate the system. An automated system would be easier to share with other hospitals and clinics around the world, Burdick says, and without a need for intensive training, it could lower the cost.

The FDA has approved testing the system in five spinal-cord injury patients, including the three already enrolled in the trial; Burdick is planning to test the new computerized version in the fourth patient, as well as in Rob Summers during 2013. Once the investigators have completed testing on all five patients, Burdick says, the team will spend time analyzing the data before deciding how to improve the system and expand its use.

The strategy is not a cure for paraplegics, but a tool that can be used to help improve the quality of their health, Burdick says. The technology could also complement stem-cell therapies or other methods that biologists are working on to repair or circumvent the damage to nervous systems that results from spinal-cord injury.

"There's not going to be one silver bullet for treating spinal-cord injuries," Burdick says. "We think that our technique will play a role in the rehabilitation of spinal-cord injury patients, but a more permanent cure will likely come from biological solutions."

Even with the limitations of the current system, Burdick says, the results have exceeded his expectations.

"All three subjects stood up within 48 hours of turning on the array," Burdick says. "This shows that the first patient wasn't a fluke, and that many aspects of the process are repeatable." In some ways, the second and third patients are performing even better than Summers, though it will be some time before the team can fully analyze those results. "We were expecting variations because of the distinct differences in the patients' injuries. Rob gave us a starting point, and now we've learned how to tune the array for each patient and to make adjustments as each patient changes over time."

"I do this work because I love it," Burdick says. "When you work with these people and get to know them and see how they are improving, it's personally inspiring."


Reconsidering the Global Thermostat

Caltech researcher and colleagues show outcome of geoengineering can be tunable

PASADENA, Calif.—From making clouds whiter and injecting aerosols into the stratosphere, to building enormous sunshades in space, people have floated many ideas about how the planet's climate could be manipulated to counteract the effects of global warming—a concept known as geoengineering. Many of the ideas involve deflecting incoming sunlight before it has a chance to further warm the earth. Because this could affect areas of the planet inequitably, geoengineering has often raised an ethical question: Whose hand would control the global thermostat?

Now a team of researchers from the California Institute of Technology (Caltech), Harvard University, and the Carnegie Institution says there doesn't have to be just a single global control. Using computer modeling, they have shown that varying the amount of sunlight deflected away from the earth by season and by region can significantly improve the parity of the situation. The results appear in an advance online publication of the journal Nature Climate Change.

Previous geoengineering studies have typically assumed uniform deflection of sunlight everywhere on the planet. But the pattern of temperature and precipitation effects that would result from such efforts would never compensate perfectly for the complex pattern of changes that have resulted from global warming. Some areas would end up better off than others. For example, as the planet warms, the poles are heating up more than the tropics. However, in models where sunlight is deflected uniformly, when enough sunlight is redirected to compensate for this polar warming, the tropics end up colder than they were before man-made activities pumped excess carbon dioxide into the atmosphere.

In the new study, the researchers worked with a climate model of relatively coarse resolution. Rather than selecting one geoengineering strategy, they mimicked the desired effect of many projects by simply "turning down the sun"—decreasing the amount of sunlight reaching the planet. Instead of turning down the sun uniformly, they tailored when and where they reduced incoming sunlight, looking at 15 different combinations. In one, for example, they turned down the sun between January and March while also turning it down more at the poles than at the tropics.

"That essentially gives us 15 knobs that we can tune in order to try to minimize effects at the worst-off regions on the planet," says Doug MacMartin, a senior research associate at Caltech and lead author of the new paper. "In our model, we were able to reduce the residual climate changes (after geoengineering) in the worst-off regions by about 30 percent relative to what could be achieved using a uniform reduction in sunlight."
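The knob-tuning idea MacMartin describes can be illustrated with a toy optimization. This is a hypothetical linear response model with invented numbers, not the study's climate model: each "knob" is a pattern of sunlight reduction that cools each region by a different amount, and the objective is the residual change in the worst-off region.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_regions, n_knobs = 10, 15  # 15 tunable patterns, as in the study

# Hypothetical linear response (illustrative numbers only): regional
# warming, and the cooling each knob contributes per unit setting.
warming = rng.uniform(0.5, 2.0, n_regions)              # deg C per region
cooling = rng.uniform(0.0, 0.3, (n_regions, n_knobs))   # deg C per knob unit

def worst_off(knobs):
    """Residual climate change in the worst-off region."""
    residual = warming - cooling @ knobs
    return np.abs(residual).max()

uniform = np.full(n_knobs, 1.0)  # roughly "turn down the sun the same everywhere"
tuned = minimize(worst_off, x0=uniform, method="Nelder-Mead").x
print(worst_off(uniform), worst_off(tuned))  # tuned knobs do no worse
```

The real study's 30 percent improvement came from a climate model, not a linear fit like this; the sketch only shows why independent seasonal and regional knobs give the optimizer room that a single uniform dial does not.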

The group also found that by varying where and when sunlight was reduced, they needed to turn down the sun just 70 percent as much as they would in uniform reflectance to get a similar result. "Based on this work, it's at least plausible that there are ways that you could implement a geoengineering solution that would have less severe consequences, such as a reduced impact on ozone," MacMartin says.

The researchers also used the tuning approach to focus on recovering Arctic sea ice. In their model, returning Arctic sea ice to the extent typical of preindustrial years required only one-fifth the solar reduction needed in the uniform-reflectance models.

"These results indicate that varying geoengineering efforts by region and over different periods of time could potentially improve the effectiveness of solar geoengineering and reduce climate impacts in at-risk areas," says Ken Caldeira of the Carnegie Institution. "For example, these approaches may be able to reverse long-term changes in the Arctic sea ice."

The group acknowledges that geoengineering ideas are untested and could come with serious consequences, such as making the skies whiter and depleting the ozone layer, not to mention the unintended consequences that tend to arise when dealing with such a complicated system as the planet. They also say that the best solution would be to reduce greenhouse gas emissions. "I'm approaching it as an engineering problem," MacMartin says. "I'm interested in whether we can come up with a better way of doing the geoengineering that minimizes the negative consequences."  

In addition to MacMartin and Caldeira, David Keith of Harvard University and Ben Kravitz, formerly of the Carnegie Institution but now at the DOE's Pacific Northwest National Lab, are also coauthors on the paper, "Management of trade-offs in geoengineering through optimal choice of non-uniform radiative forcing."

Kimm Fesenmaier

Technology Has Improved Voting Procedures

New report assesses voting procedures over the last decade

PASADENA, Calif.—Thanks to better voting technology over the last decade, the country's election process has seen much improvement, according to a new report released today by researchers at Caltech and MIT. However, the report notes, despite this progress, some problems remain.

Spurred by the debacle of hanging chads and other voting problems during the 2000 presidential election, the Voting Technology Project (VTP) was started by Caltech and MIT to bring together researchers from across disciplines to figure out how to improve elections. The VTP issued its first report in 2001.

"Since that report came out and since our project was formed, a lot of progress has been made in improving how American elections are run," says Michael Alvarez, professor of political science at Caltech and codirector of the VTP.

For example, the report found that getting rid of outdated voting machines has caused a drop in the number of votes lost to ballot errors. To assess how many votes are lost in each election due to voting mistakes, the researchers calculate the number of residual votes—or the difference between the number of votes that are counted for a particular office and the total number of votes cast. If there are no voting errors, there should be no residual votes.
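The residual-vote measure described above is a simple ratio; a minimal sketch, with made-up ballot totals (the figures below are illustrative, not from the report):

```python
def residual_vote_rate(ballots_cast, votes_counted):
    """Fraction of ballots cast that produced no counted vote for an
    office -- the report's proxy for votes lost to ballot errors."""
    return (ballots_cast - votes_counted) / ballots_cast

# Hypothetical jurisdiction: 100,000 ballots cast, 98,000 votes counted.
print(f"{residual_vote_rate(100_000, 98_000):.1%}")  # prints "2.0%"
```

In practice the measure mixes true errors with deliberate abstentions, which is why the researchers track how the rate changes as voting technology changes rather than treating any single value as an error count.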

In their first report in 2001, the researchers found that older voting technology—like punch cards—led to a high residual vote rate. But their new research now shows that the rate has dropped. In particular, Charles Stewart III, a professor of political science at MIT and the other codirector of the VTP, and his colleagues found that the residual vote rate decreased from 2 percent in 2000 to 1 percent in 2006 and 2008, meaning that fewer votes were lost due to voting errors. The drop was greater in states that instituted more modern voting technology.

"As we moved away from punch cards, lever machines, and paper ballots and towards optical scan systems and electronic systems that have voter verification, we have seen the voter residual rate plummet," Alvarez says. Voter-verification technology gives voters immediate feedback if they make a mistake—by filling in a circle incorrectly, for example—and a chance to correct their error to ensure that their votes are counted.

In addition, the report urges officials to continue and expand election auditing to study the accuracy of registration and voting procedures. For example, after an election, officials can recount ballots to make sure the electronic ballot counters are accurate. "Postelection ballot auditing is a great idea and states need to continue their efforts to use those election ballot-auditing procedures to increase the amount of confidence and integrity of elections," Alvarez says.

The researchers also describe concern with the rise of absentee and early voting, since voter verification is much harder to do via mail. Unlike with in-person voting, these methods offer no immediate feedback about whether a ballot was filled out correctly or if it got counted at all. Once you put your ballot in the mailbox, it's literally out of your hands.

The report also weighs in on voter-identification laws, which have been proposed in many states and subsequently challenged in court. Proponents say they are necessary to prevent voter fraud while opponents argue that there is little evidence that such fraud exists. Moreover, opponents say, voter identification laws make it much more difficult for people without government-issued IDs to vote. But, the report says, technology may resolve the conflict.

"Technology may help ensure voter authentication while alleviating or mitigating the costs that are imposed on voters by laws requiring state-issued identification," says Jonathan Katz, the Kay Sugahara Professor of Social Sciences and Statistics and coauthor of the VTP report.

For example, polling places can have access to a database of registered voters that is also linked to the state's database of DMV photos. A voter's identification can then be confirmed without them having to carry a photo ID. For voters who do not have an ID, the polling place can be equipped with a camera to take an ID picture immediately. The photo can then be entered into the database to verify identification in future elections.

The complete report, along with more information about the VTP, is available online.

In addition to Alvarez, Stewart, and Katz, the other authors of the Caltech/MIT VTP report are Stephen Ansolabehere of Harvard, Thad Hall of the University of Utah, and Ronald Rivest of MIT. The report was supported by the Carnegie Corporation of New York. The project has been supported by the John S. and James L. Knight Foundation and the Pew Charitable Trusts.

Marcus Woo

Caltech Modeling Feat Sheds Light on Protein Channel's Function

PASADENA, Calif.—Chemists at the California Institute of Technology (Caltech) have managed, for the first time, to simulate the biological function of a channel called the Sec translocon, which allows specific proteins to pass through membranes. The feat required bridging timescales from the realm of nanoseconds all the way up to full minutes, exceeding the scope of earlier simulation efforts by more than six orders of magnitude. The result is a detailed molecular understanding of how the translocon works.

Modeling behavior across very different timescales is a major challenge in modern simulation research. "Computer simulations often provide almost uselessly detailed information on a timescale that is way too short, from which you get a cartoon, or something that might raise as many questions as it answers," says Thomas Miller, an assistant professor of chemistry at Caltech. "We've managed to go significantly beyond that, to create a tool that can actually be compared against experiments and even push experiments—to predict things that they haven't been able to see."

The new computational model and the findings based on its results are described by Miller and graduate student Bin Zhang in the current issue of the journal Cell Reports.

The Sec translocon is a channel in cellular membranes involved in the targeting and delivery of newly made proteins. Such channels are needed because the proteins that are synthesized at ribosomes must travel to other regions of the cell or outside the cell in order to perform their functions; however, the cellular membranes prevent even the smallest of molecules, including water, from passing through them willy-nilly. In many ways, channels such as the Sec translocon serve as gatekeepers—once the Sec translocon determines that a given protein should be allowed to pass through, it opens up and allows the protein to do one of two things: to be integrated into the membrane, or to be secreted completely out of the cell.

Scientists have disagreed about how the fate of a given protein entering the translocon is determined. Based on experimental evidence, some have argued that a protein's amino-acid sequence is what matters—that is, how many of its amino acids interact favorably with water and how many clash. This argument treats the process as one in equilibrium, where the extremely slow rate at which a ribosome adds proteins to the channel can be considered infinitely slow.  Other researchers have shown that slowing down the rate of protein insertion into the channel actually changes the outcome, suggesting that kinetic effects can also play a role.

"There was this equilibrium picture, suggesting that only the protein sequence is really important. And then there was an alternative picture, suggesting that kinetic effects are critical to understanding the translocon," Miller says. "So we wondered, could both pictures, in some sense, be right? And that turns out to be the case."

In 2010 and earlier this year, Miller and Zhang published papers in the Proceedings of the National Academy of Sciences and the Journal of the American Chemical Society describing atomistic simulations of the Sec translocon. These computer simulations attempt to account for every motion of every single atom in a system—and typically require so much computing time that they can only model millionths of seconds of activity, at most. Meanwhile, actual biological processes involving proteins in the translocon last many seconds or minutes.

Miller and Zhang were able to use their atomistic simulations to determine which parts of the translocon are most important and to calculate how much energy it costs those parts to move in ways that allow proteins to pass through. In this way, they were able to build a simpler version of the simulation that modeled important groupings of atoms, rather than each individual atom. Using the simplified simulation, they could simulate the translocon's activity over the course of more than a minute.

The researchers ran that simplified model tens of thousands of times and observed the different ways in which proteins move through the channel. In the simulation, any number of variables could be changed—including the protein's amino-acid sequence, its electronic charge, the rate at which it is inserted into the translocon, the length of its tail, and more. The effect of these alterations on the protein's fate was then studied, revealing that proteins move so slowly within the tightly confined environment of the translocon that the pace at which they are added to the channel during translation—a process that might seem infinitely slow—can become important. At the same time, Miller and Zhang saw that other relatively fast processes give rise to the results associated with the equilibrium behavior.
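The interplay the simulations revealed—insertion rate changing a protein's fate—can be caricatured with a toy kinetic model. This is an illustrative sketch, not the authors' coarse-grained model; the fate rule and the rates are invented for the example:

```python
import math
import random

def protein_fate(hydrophobicity, channel_time, rng):
    """Toy rule: the longer a segment dwells in the channel (slow
    insertion), the more likely it reaches its equilibrium fate
    (membrane integration, for a hydrophobic segment)."""
    p_integrate = hydrophobicity * (1.0 - math.exp(-channel_time))
    return "membrane" if rng.random() < p_integrate else "secreted"

rng = random.Random(42)
trials = 20_000  # echo the paper's "tens of thousands" of runs
slow = sum(protein_fate(0.8, 3.0, rng) == "membrane" for _ in range(trials))
fast = sum(protein_fate(0.8, 0.3, rng) == "membrane" for _ in range(trials))
print(slow / trials, fast / trials)  # slow insertion -> more integration
```

Even in this caricature, the same sequence ends up in the membrane far more often when insertion is slow, which is the qualitative signature of kinetic control layered on top of an equilibrium preference.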

"In fact, both equilibrium and kinetically controlled processes are happening—but in a way that was not obvious until we could actually see everything working together," Miller says.

Beyond elucidating how the translocon works and reconciling seemingly disparate experimental results, the new simulation also lets the researchers perform experiments computationally that have yet to be tried in the lab. For example, they have run simulations with longer proteins and observed that at such lengths—unlike what has been seen with shorter proteins—the equilibrium picture begins to be affected by kinetic effects.  "This could bring the two experimental camps together, and to have led that would be kind of exciting," Miller says.

The new Cell Reports paper is titled "Long-timescale dynamics and regulation of Sec-facilitated protein translocation." The work was supported by the U.S. Office of Naval Research and the Alfred P. Sloan Foundation, with computational resources provided by the U.S. Department of Energy, the National Science Foundation, and the National Institute of General Medical Sciences.

Kimm Fesenmaier

Developing the Next Generation of Microsensors

Caltech researchers engineer microscale optical accelerometer

PASADENA, Calif.—Imagine navigating through a grocery store with your cell phone. As you turn down the bread aisle, ads and coupons for hot dog buns and English muffins pop up on your screen. The electronics industry would like to make such personal navigators a reality, but to do so, it needs the next generation of microsensors.

Thanks to an ultrasensitive accelerometer—a type of motion detector—developed by researchers at the California Institute of Technology (Caltech) and the University of Rochester, this new class of microsensors is a step closer to reality. Beyond consumer electronics, such sensors could help with oil and gas exploration deep within the earth, could improve the stabilization systems of fighter jets, and could even be used in some biomedical applications where more traditional sensors cannot operate.

Caltech professor of applied physics Oskar Painter and his team describe the new device and its capabilities in an advance online publication of the journal Nature Photonics.

Rather than using an electrical circuit to gauge movements, their accelerometer uses laser light. And despite the device's tiny size, it is an extremely sensitive probe of motion. Thanks to its low mass, it can also operate at a large range of frequencies, meaning that it is sensitive to motions that occur in tens of microseconds, thousands of times faster than the motions that the most sensitive sensors used today can detect.

"The new engineered structures we made show that optical sensors of very high performance are possible, and one can miniaturize them and integrate them so that they could one day be commercialized," says Painter, who is also codirector of Caltech's Kavli Nanoscience Institute.

Although the average person may not notice them, microchip accelerometers are quite common in our daily lives. They are used in vehicle airbag deployment systems, in navigation systems, and in conjunction with other types of sensors in cameras and cell phones. They have successfully moved into commercial use because they can be made very small and at low cost.

Accelerometers work by using a sensitive displacement detector to measure the motion of a flexibly mounted mass, called a proof mass. Most commonly, that detector is an electrical circuit. But because laser light is one of the most sensitive ways to measure position, there has been interest in making such a device with an optical readout. For example, projects such as the Laser Interferometer Gravitational-Wave Observatory (LIGO) rely on optical interferometers, which use laser light reflecting off mirrors separated by kilometers of distance to sensitively measure relative motion of the end mirrors. Lasers can have very little intrinsic noise—meaning that their intensity fluctuates little—and are typically limited by the quantum properties of light itself, so they make it much easier to detect very small movements.
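The proof-mass principle can be made concrete with the textbook spring-mass relation: well below its resonance frequency, a proof mass displaces by x = a/ω₀² under an acceleration a. A minimal sketch with illustrative numbers (the 100 kHz resonance below is an assumption, not a figure from the paper):

```python
import math

def static_displacement(accel_m_s2, resonance_hz):
    """Displacement x = a / omega0**2 of a spring-mounted proof mass for
    accelerations well below its mechanical resonance frequency."""
    omega0 = 2.0 * math.pi * resonance_hz
    return accel_m_s2 / omega0 ** 2

# A stiff microscale sensor resonating at 100 kHz moves only tens of
# picometers under 1 g—hence the need for an extremely sensitive readout.
x = static_displacement(9.81, 1e5)
```

The relation also captures the trade-off discussed below: halving ω₀—a heavier mass or softer mount, i.e., a bigger sensor—quadruples the displacement and makes the acceleration easier to detect.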

People have tried, with limited success, to make miniature versions of these large-scale interferometers. One stumbling block for miniaturization has been that, in general, the larger the proof mass, the larger the resulting motion when the sensor is accelerated. So it is typically easier to detect accelerations with larger sensors. Also, when dealing with light rather than electrons—as in optical accelerometers—it is a challenge to integrate all the components (the lasers, detectors, and interferometer) into a micropackage.

"What our work really shows is that we can take a silicon microchip and scale this concept of a large-scale optical interferometer all the way down to the nanoscale," Painter says. "The key is this little optical cavity we engineered to read out the motion."

The optical cavity is only about 20 microns (millionths of a meter) long, a single micron wide, and a few tenths of a micron thick. It consists of two silicon nanobeams, situated like the two sides of a zipper, with one side attached to the proof mass. When laser light enters the system, the nanobeams act like a "light pipe," guiding the light into an area where it bounces back and forth between holes in the nanobeams. When the tethered proof mass moves, it changes the gap between the two nanobeams, resulting in a change in the intensity of the laser light being reflected out of the system. The reflected laser signal is in fact tremendously sensitive to the motion of the proof mass, with displacements as small as a few femtometers (roughly the diameter of a proton) being probed on the timescale of a second.
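The readout described above can be sketched with a Lorentzian cavity response: a gap change shifts the optical resonance, and a probe laser parked on the side of the resonance converts that shift into an intensity change. The coupling and linewidth values below are illustrative assumptions, not the device's measured parameters.

```python
def lorentzian(detuning_hz, linewidth_hz):
    """Normalized Lorentzian cavity response at a given detuning."""
    return 1.0 / (1.0 + (2.0 * detuning_hz / linewidth_hz) ** 2)

def readout_signal(displacement_m, g_hz_per_m=1e18, linewidth_hz=1e9):
    """Change in the cavity response when a gap change of `displacement_m`
    shifts the resonance by g * displacement. The probe laser sits half a
    linewidth off resonance, on the steep slope of the Lorentzian."""
    bias = linewidth_hz / 2.0
    shift = g_hz_per_m * displacement_m
    return lorentzian(bias + shift, linewidth_hz) - lorentzian(bias, linewidth_hz)

# A 1 fm gap change with a coupling of 1 GHz per nm gives roughly a
# part-per-million intensity change—small, but resolvable when averaged
# over about a second of measurement.
signal = readout_signal(1e-15)
```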

It turns out that because the cavity and proof mass are so small, the light bouncing back and forth in the system pushes the proof mass—and in a special way: when the proof mass moves away, the light helps push it further, and when the proof mass moves closer, the light pulls it in. In short, the laser light softens and damps the proof mass's motion.

"Most sensors are completely limited by thermal noise, or mechanical vibrations—they jiggle around at room temperature, and applied accelerations get lost in that noise," Painter says. "In our device, the light applies a force that tends to reduce the thermal motion, cooling the system." This cooling—down to a temperature of three kelvins (about –270°C) in the current devices—increases the range of accelerations that the device can measure, making it capable of measuring both extremely small and extremely large accelerations.
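The cooling Painter describes is the standard cold-damping effect in optomechanics: the light adds damping without adding much thermal force, so the mode's effective temperature is diluted by the ratio of intrinsic to total damping. A sketch of that relation, with illustrative damping rates (not values from the paper):

```python
def effective_temperature(t_bath_k, gamma_mech_hz, gamma_opt_hz):
    """Effective temperature of an optically damped mechanical mode:
    T_eff = T_bath * gamma_mech / (gamma_mech + gamma_opt)."""
    return t_bath_k * gamma_mech_hz / (gamma_mech_hz + gamma_opt_hz)

# Reaching ~3 K from room temperature requires the optical damping to be
# roughly 99 times the intrinsic mechanical damping.
t_eff = effective_temperature(300.0, 1.0, 99.0)
```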

"We made a very sensitive sensor that, at the same time, can also measure very large accelerations, which is valuable in many applications," Painter says.

The team envisions its optical accelerometers becoming integrated with lasers and detectors in silicon microchips. Microelectronics companies have been working for the past 10 or 15 years to try to integrate lasers and optics into their silicon microelectronics. Painter says that a lot of engineering work still needs to be done to make this happen, but adds that "because of the technological advancements that have been made by these companies, it looks like one can actually start making microversions of these very sensitive optical interferometers."

"Professor Painter's research in this area nicely illustrates how the Engineering and Applied Science faculty at Caltech are working at the edges of fundamental science to invent the technologies of the future," says Ares Rosakis, chair of Caltech's Division of Engineering and Applied Science.  "It is very exciting to envision the ways this research might transform the microelectronics industry and our daily lives."

The lead authors on the paper, titled "A high-resolution microchip optomechanical accelerometer," have all worked in Painter's lab. Alexander Krause and Tim Blasius are currently graduate students at Caltech, while Martin Winger is a former postdoctoral scholar who now works for a sensor company called Sensirion in Zurich, Switzerland. This work was performed in collaboration with Qiang Lin, a former postdoctoral scholar of the Painter group, who now leads his own research group at the University of Rochester. The work is supported by the Defense Advanced Research Projects Agency (DARPA) QuASaR program, the National Science Foundation Graduate Research Fellowship Program, and Intellectual Ventures.

Kimm Fesenmaier

How I Spent My Summer Vacation

A SURF Video Diary

Last summer, Caltech junior Julie Jester worked on a project that might one day partially counteract blindness caused by a deteriorating retina. Her job: to help Assistant Professor of Electrical Engineering Azita Emami and her graduate students create the communications link between a tiny camera and a novel wireless neural stimulator that can be surgically inserted into the eye.

Now in its 34th year, Caltech's Summer Undergraduate Research Fellowships (SURF) program has paired nearly 7,000 students with real-world, hands-on projects in the labs of Caltech faculty and JPL staff.


Doug Smith

Traveling with Purpose

Biologist spends summer vacation volunteering in India

Pamela Bjorkman has been studying HIV at Caltech since 2005. In the lab, she has made significant gains in the fight against the virus, developing antibodies that neutralize most strains. But years spent at the bench were beginning to make her feel disconnected from the possible impact of her work. So this summer she visited India, spending time with HIV-positive women and others who are at risk.

"What I wanted to do was see the real side of HIV, where it affects people," says Bjorkman, the Max Delbrück Professor of Biology and an investigator with the Howard Hughes Medical Institute. "We work in the lab where we have no contact with HIV-infected people—the human impact of the disease is very removed from what we think about in our work."

This was not her first trip to the nation of over 1.2 billion people, where nearly 30 percent of the population lives in poverty. She first visited in 1985 and returned with her teenage daughter in 2008 to work at an orphanage in the Jaipur area called Udayan. The home for children is part of an umbrella organization called Vatsalya that also runs an HIV-education program for female sex workers, among other projects aimed at empowering women and teaching street children vocational skills.

"The orphanage is really incredible," says Bjorkman, whose daughter accompanied her on her most recent trip as well. "There are an estimated 18 million children living on the street in India—a lot who are not actually orphans, but on the street anyway. The organization takes in as many children as it can—around 60—and those kids are never adopted. When they come to the orphanage, the group there becomes their family."

The mission of the organization—founded in 1995 by Jaimala and Hitesh Gupta, both of whom have backgrounds in public health—is to "provide a caring environment where our disadvantaged and vulnerable people can develop their capabilities with dignity." The orphanage is a nearly self-sufficient compound that includes a school, a farm, a garden, and dormitories. They even have a psychologist who visits with the children, many of whom suffered abuse at very young ages.

"It's really an amazing place," says Bjorkman. "Here these kids are, all living with the most horrible back stories, and they are full of joy and respectful and helpful. It makes you realize how incredibly privileged we are here in Pasadena and that we take a lot for granted."

Bjorkman and her daughter stayed at Udayan for two weeks each time they visited, helping to teach the children English and math, participating in art and dance projects, and helping with gardening and cooking. This summer, Bjorkman also traveled to Ajmer, where the group's HIV-education program is located. There, she met with women struggling with the stigma of HIV, particularly because they rely on sex work to support their children and send them to private school; public schools in many impoverished areas of India are notoriously bad.

"The organization identifies women in the community who are sex workers and are interested in learning some other trade, or who need help because of HIV infection," she explains. "The terrible thing is that when they find out they are HIV infected, many of the women start working more because their futures are more uncertain.  Plus, they hesitate to take medication because if anyone finds out that they are positive, they will lose customers." 

The organization provides counseling, runs a female condom education program, offers training classes for those wanting to become proficient at another job, and works to get HIV-positive women on antiretroviral medications. While visiting with the women, Bjorkman talked with them about how the virus works and why it's so tough to treat once it's in the body.

"This is the reason that I'm doing the HIV research," she says. "It's not to get our own papers out first, it's to actually do something that might make a difference. Meeting the women put a lot of the competition and the unpleasantness associated with the rat race of science into perspective."

Bjorkman plans to return to India, but in the meantime she's doing all she can to raise awareness for Vatsalya and their various projects. Like any nonprofit, the organization could use monetary donations, but she hopes that her story inspires others at Caltech to donate their time. Anyone, she says, can volunteer through Vatsalya and receive room, board, and meals at the orphanage for a nominal daily donation.

"Caltech undergrad and grad students don't necessarily have that much money, but they may have time and this would be an amazing way to get to know another culture," she says. "These people are really doing a great job—both with the orphanage and with the HIV program that I had direct experience with. Once you see the way it works, it's really inspiring."

For more information on Vatsalya and the work they do, visit their website, or contact Pamela Bjorkman to find out how you can become directly involved with the organization.

Katie Neith

Caltech Biologist Named MacArthur Fellow

PASADENA, Calif.—Sarkis Mazmanian, a microbiology expert at the California Institute of Technology (Caltech) whose studies of human gut bacteria have revealed new insights into how these microbes can be beneficial, was named a MacArthur Fellow and awarded a five-year, $500,000 "no strings attached" grant. Each year, the John D. and Catherine T. MacArthur Foundation awards the unrestricted fellowships—also known as "genius" grants—to individuals who have shown "extraordinary originality and dedication in their creative pursuits and a marked capacity for self-direction," according to the foundation's website.

"I was in a state of shock when I heard the news," says Mazmanian, a professor of biology at Caltech, who was tricked into taking the award announcement call; he thought he was simply being added to a prescheduled conference call. "It's not the kind of thing you ever expect—I do what I do because I love science and it makes me happy, so this is terrific and a nice reward. At the same time, I never think of awards as goals of mine because they seem so unattainable. My goals are to make discoveries, so I was just in absolute disbelief."

Long before he was named a 2012 MacArthur Fellow, Mazmanian was showing the attributes that the foundation seeks to reward, particularly a capacity for self-direction. As a graduate student in the early 2000s, he decided to stray from the normal path of study and try something new.

"I had been studying microbial pathogenesis—or bacteria that make us sick—which is what 99.9 percent of the field of microbiology does to this day," says Mazmanian. "Toward the end of my PhD, I decided that I wanted to study organisms that didn't necessarily cause disease, but were associated with our bodies. Ten years ago, this was completely on the fringe of science—we knew that the organisms existed in our intestines and all over our bodies, but had no idea what they were doing."  

Today, Mazmanian's work examines some of the trillions of bacteria living in our bodies that make up complex communities of microbes and regulate processes like digestion and immunity. His main focus is to understand how "good" bacteria promote human health—work that has transformed a quickly evolving field of research investigating the relationship between gut bacteria and both disease and health.

His research helped lay the groundwork for the Human Microbiome Project (HMP), an initiative of the National Institutes of Health that aims to characterize, for the first time, "the microbial communities found at several different sites on the human body, including nasal passages, oral cavities, skin, gastrointestinal tract, and urogenital tract, and to analyze the role of these microbes in human health and disease," according to the HMP website.


His laboratory was the first to demonstrate that specific gut bacteria direct the development of the mammalian immune system and provide protection from intestinal diseases. This means, he says, that fundamental aspects of health are absolutely dependent on microbial interaction within our bodies. In addition, he says that many disorders whose incidences are increasing in Western countries—such as inflammatory bowel disease, multiple sclerosis, and asthma—involve a common immunologic defect believed to be caused by the absence of intestinal bacteria. An understanding of the beneficial immune responses promoted by gut bacteria may lead to the development of natural therapeutics for immunologic and perhaps neurologic diseases, says Mazmanian.

"This award is extremely well-deserved—Sarkis has revolutionized the way we think about the interactions between microorganisms and people," says Stephen L. Mayo, William K. Bowes Jr. Foundation Chair of Caltech's Division of Biology, and Bren Professor of Biology and Chemistry. "His research has had an important impact in making the connection between personal hygiene and the immune system, and even neurological diseases like autism."

When the award announcement went public, Mazmanian was in Armenia, his native homeland, teaching a one-week course on host-microbial interaction to PhD students at a molecular biology institute. He travels to the country once a year to volunteer his services. The timing, he says, couldn't be better, as he hopes to use some of the prize money to develop an international educational outreach program.

"I think that when you have a windfall like this, the least you can do is help people who are in need," says Mazmanian, who credits the members of his lab for his research success. "In many countries, they are in need of education and resources, like lab equipment, textbooks, you name it. It would be terrific if I could use the money to help advance science in countries where there is hardship."

Mazmanian received his bachelor's degree in 1995 and his PhD in microbiology in 2002, both from UCLA. Following a postdoctoral fellowship at Harvard, he joined the Caltech faculty as an assistant professor in 2006. In 2012, he was promoted to professor of biology. In 2011, Mazmanian was the recipient of a Burroughs Wellcome Fund award for research in the pathogenesis of infectious disease, and in 2008 he was awarded a Searle Scholarship and was named one of Discover magazine's "20 Best Brains Under 40," which highlighted young innovators in science.

This year's crop of 23 Fellows includes stringed-instrument bow maker Benoît Rolland and mathematician Maria Chudnovsky; Mazmanian joins the ranks of Caltech's previous MacArthur Fellows, including 2010 awardee John Dabiri.

For more information on the 2012 MacArthur Fellows, visit the foundation's website.

Katie Neith

