Creating Indestructible Self-Healing Circuits

Caltech engineers build electronic chips that repair themselves

PASADENA, Calif.—Imagine that the chips in your smart phone or computer could repair and defend themselves on the fly, recovering in microseconds from problems ranging from less-than-ideal battery power to total transistor failure. It might sound like the stuff of science fiction, but a team of engineers at the California Institute of Technology (Caltech), for the first time ever, has developed just such self-healing integrated chips.

The team, made up of members of the High-Speed Integrated Circuits laboratory in Caltech's Division of Engineering and Applied Science, has demonstrated this self-healing capability in tiny power amplifiers. The amplifiers are so small, in fact, that 76 of the chips—including everything they need to self-heal—could fit on a single penny. In perhaps the most dramatic of their experiments, the team destroyed various parts of their chips by zapping them multiple times with a high-power laser, and then observed as the chips automatically developed a work-around in less than a second.

"It was incredible the first time the system kicked in and healed itself. It felt like we were witnessing the next step in the evolution of integrated circuits," says Ali Hajimiri, the Thomas G. Myers Professor of Electrical Engineering at Caltech. "We had literally just blasted half the amplifier and vaporized many of its components, such as transistors, and it was able to recover to nearly its ideal performance."

The team's results appear in the March issue of IEEE Transactions on Microwave Theory and Techniques.

Until now, even a single fault has often rendered an integrated-circuit chip completely useless. The Caltech engineers wanted to give integrated-circuit chips a healing ability akin to that of our own immune system—something capable of detecting and quickly responding to any number of possible assaults in order to keep the larger system working optimally. The power amplifier they devised employs a multitude of robust, on-chip sensors that monitor temperature, current, voltage, and power. The information from those sensors feeds into a custom-made application-specific integrated-circuit (ASIC) unit on the same chip, a central processor that acts as the "brain" of the system. The brain analyzes the amplifier's overall performance and determines if it needs to adjust any of the system's actuators—the changeable parts of the chip.

Interestingly, the chip's brain does not operate based on algorithms that know how to respond to every possible scenario. Instead, it draws conclusions based on the aggregate response of the sensors. "You tell the chip the results you want and let it figure out how to produce those results," says Steven Bowers, a graduate student in Hajimiri's lab and lead author of the new paper. "The challenge is that there are more than 100,000 transistors on each chip. We don't know all of the different things that might go wrong, and we don't need to. We have designed the system in a general enough way that it finds the optimum state for all of the actuators in any situation without external intervention."
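
To make the idea concrete, here is a minimal, hypothetical sketch in Python of the kind of goal-driven loop described above: sensor readings are compared against a stated performance goal, and a gradient-free search nudges the actuators until the goal is met. The sensor model, cost function, and search strategy are illustrative stand-ins, not the Caltech team's actual on-chip algorithm.

```python
import random

# Minimal, hypothetical sketch of a goal-driven self-healing loop.
# The "chip" below is a toy model: its output power depends on two
# actuator settings, and a fault simply leaves those settings far from
# their optimum. None of these names come from the actual paper.

TARGET_OUTPUT_POWER = 1.0  # the "result you want" (normalized, illustrative)

def read_sensors(actuators):
    """Stand-in for the on-chip temperature/current/voltage/power sensors."""
    bias, gain = actuators
    output_power = 1.0 - (bias - 0.6) ** 2 - (gain - 0.4) ** 2  # toy response
    return {"power": output_power}

def cost(actuators):
    """Distance between measured performance and the stated goal."""
    return abs(TARGET_OUTPUT_POWER - read_sensors(actuators)["power"])

def self_heal(actuators, iterations=500, step=0.05):
    """Gradient-free search: try small random actuator changes,
    keep any change that moves performance closer to the goal."""
    best = list(actuators)
    for _ in range(iterations):
        candidate = [x + random.uniform(-step, step) for x in best]
        if cost(candidate) < cost(best):
            best = candidate
    return best

if __name__ == "__main__":
    damaged = [0.1, 0.9]               # actuator state after a simulated fault
    healed = self_heal(damaged)
    print("healed actuator settings:", [round(x, 3) for x in healed])
    print("residual cost:", round(cost(healed), 6))
```

The sketch simply illustrates why no enumeration of failure scenarios is needed: only a goal and a way to measure progress toward it.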

Looking at 20 different chips, the team found that the amplifiers with the self-healing capability consumed about half as much power as those without, and their overall performance was much more predictable and reproducible. "We have shown that self-healing addresses four very different classes of problems," says Kaushik Dasgupta, another graduate student working on the project. Those classes are static variation, a product of manufacturing differences from one component to the next; long-term aging, which arises gradually as repeated use changes the internal properties of the system; short-term variations induced by environmental conditions such as changes in load, temperature, and supply voltage; and, finally, accidental or deliberate catastrophic destruction of parts of the circuits.

The Caltech team chose to demonstrate this self-healing capability first in a power amplifier for millimeter-wave frequencies. Such high-frequency integrated chips are at the cutting edge of research and are useful for next-generation communications, imaging, sensing, and radar applications. By showing that the self-healing capability works well in such an advanced system, the researchers hope to show that the self-healing approach can be extended to virtually any other electronic system.

"Bringing this type of electronic immune system to integrated-circuit chips opens up a world of possibilities," says Hajimiri. "It is truly a shift in the way we view circuits and their ability to operate independently. They can now both diagnose and fix their own problems without any human intervention, moving one step closer to indestructible circuits."

Along with Hajimiri, Bowers, and Dasgupta, former Caltech postdoctoral scholar Kaushik Sengupta (PhD '12), who is now an assistant professor at Princeton University, is also a coauthor on the paper, "Integrated Self-Healing for mm-Wave Power Amplifiers." A preliminary report of this work won the best paper award at the 2012 IEEE Radio Frequency Integrated Circuits Symposium. The work was funded by the Defense Advanced Research Projects Agency and the Air Force Research Laboratory.

Writer: Kimm Fesenmaier

A Window Into Europa's Ocean Lies Right at the Surface

Caltech and JPL researchers find evidence that a jovian ocean is not isolated

PASADENA, Calif.—If you could lick the surface of Jupiter's icy moon Europa, you would actually be sampling a bit of the ocean beneath. So says Mike Brown, an astronomer at the California Institute of Technology (Caltech). Brown—known as the Pluto killer for discovering a Kuiper-belt object that led to the demotion of Pluto from planetary status—and Kevin Hand from the Jet Propulsion Laboratory (JPL) have found the strongest evidence yet that salty water from the vast liquid ocean beneath Europa's frozen exterior actually makes its way to the surface.

The finding, based on some of the first data of its kind since NASA's Galileo mission (1989–2003) to study Jupiter and its moons, suggests that there is a chemical exchange between the ocean and surface, making the ocean a richer chemical environment, and implies that learning more about the ocean could be as simple as analyzing the moon's surface. The work is described in a paper that has been accepted for publication in the Astronomical Journal.

"We now have evidence that Europa's ocean is not isolated—that the ocean and the surface talk to each other and exchange chemicals," says Brown, the Richard and Barbara Rosenberg Professor and professor of planetary astronomy at Caltech. "That means that energy might be going into the ocean, which is important in terms of the possibilities for life there. It also means that if you'd like to know what's in the ocean, you can just go to the surface and scrape some off."

"The surface ice is providing us a window into that potentially habitable ocean below," says Hand, deputy chief scientist for solar system exploration at JPL.

Since the days of the Galileo mission, when the spacecraft showed that Europa was covered with an icy shell, scientists have debated the composition of Europa's surface. The infrared spectrometer aboard Galileo was not capable of providing the detail needed to definitively identify some of the materials present on the surface. Now, using current technology on ground-based telescopes, Brown and Hand have identified a spectroscopic feature on Europa's surface that indicates the presence of a magnesium sulfate salt, a mineral called epsomite, that could only originate from the ocean below.

"Magnesium should not be on the surface of Europa unless it's coming from the ocean," Brown says. "So that means ocean water gets onto the surface, and stuff on the surface presumably gets into the ocean water."

Europa's ocean is thought to be 100 kilometers deep and to cover the entire globe. The moon is tidally locked to Jupiter, so the same hemisphere always leads in its orbit while the other trails. The leading hemisphere has a yellowish appearance, while the trailing hemisphere appears splattered and streaked with a red material.

The spectroscopic data from that red side has been a cause of scientific debate for 15 years. It is thought that Io, one of Jupiter's largest moons, spews volcanic sulfur into space, and that Jupiter's strong magnetic field sends some of that sulfur hurtling toward the trailing hemisphere of Europa, where it sticks. It is also clear from Galileo's data that there is something other than pure water ice on the trailing hemisphere's surface. The debate has focused on what that other something is—i.e., what has caused the spectroscopic data to deviate from the signature of pure water ice.

"From Galileo's spectra, people knew something was there besides water. They argued for years over what it might be—sodium sulfate, hydrogen sulfate, sodium hydrogen carbonate, all these things that look more or less similar in this range of the spectrum," says Brown. "But the really difficult thing was that the spectrometer on the Galileo spacecraft was just too coarse."

Brown and Hand decided that the latest spectrometers on ground-based telescopes could improve the data pertaining to Europa, even from a distance of about 400 million miles. Using the Keck II telescope on Mauna Kea—which is outfitted with adaptive optics to adjust for the blurring effect of Earth's atmosphere—and its OH-Suppressing Infrared Integral Field Spectrograph (OSIRIS), they first mapped the distribution of pure water ice versus anything else on the moon. The spectra showed that even Europa's leading hemisphere contains significant amounts of nonwater ice. Then, at low latitudes on the trailing hemisphere—the area with the greatest concentration of the nonwater ice material—they found a tiny dip in the spectrum that had never been detected before.

"We now have the best spectrum of this thing in the world," Brown says. "Nobody knew there was this little dip in the spectrum because no one had the resolution to zoom in on it before."

The two researchers racked their brains to come up with materials that might explain the new spectroscopic feature, and then tested everything from sodium chloride to Drano in Hand's lab at JPL, where he tries to simulate the environments found on various icy worlds. "We tried to think outside the box to consider all sorts of other possibilities, but at the end of the day, the magnesium sulfate persisted," Hand says.

Some scientists had long suspected that magnesium sulfate was on the surface of Europa. But, Brown says, "the interesting twist is that it doesn't look like the magnesium sulfate is coming from the ocean." Since the mineral he and Hand found is only on the trailing side, where the moon is being bombarded with sulfur from Io, they believe that there is a magnesium-bearing mineral everywhere on Europa that produces magnesium sulfate in combination with sulfur. The pervasive magnesium-bearing mineral might also be what makes up the nonwater ice detected on the leading hemisphere's surface.

Brown and Hand believe that this mystery magnesium-bearing mineral is magnesium chloride. But magnesium is not the only unexpected element on the surface of Europa. Fifteen years ago, Brown showed that Europa is surrounded by an atmosphere of atomic sodium and potassium, presumably originating from the surface. The researchers reason that the sodium and potassium chlorides are actually the dominant salts on the surface of Europa, but that they are not detectable because they have no clear spectral features.

The scientists combined this information with the fact that Europa's ocean can only be one of two types—either sulfate-rich or chlorine-rich. Having ruled out the sulfate-rich version since magnesium sulfate was found only on the trailing side, Brown and Hand hypothesize that the ocean is chlorine-rich and that the sodium and potassium must be present as chlorides.

Therefore, Brown says, they believe the composition of Europa's sea closely resembles the salty ocean of Earth. "If you could go swim down in the ocean of Europa and taste it, it would just taste like normal old salt," he says.

Hand emphasizes that, from an astrobiology standpoint, Europa is considered a premier target in the search for life beyond Earth; a NASA-funded study team led by JPL and the Johns Hopkins University Applied Physics Laboratory has been working with the scientific community to identify options to explore Europa further. "If we've learned anything about life on Earth, it's that where there's liquid water, there's generally life," Hand says. "And of course our ocean is a nice salty ocean. Perhaps Europa's salty ocean is also a wonderful place for life."

The Astronomical Journal paper is titled "Salts and radiation products on the surface of Europa." The work was supported, in part, by the NASA Astrobiology Institute through the Astrobiology of Icy Worlds node at JPL.

Writer: Kimm Fesenmaier

Visualizing Biological Networks in 4D

A unique microscope invented at Caltech captures the motion of DNA structures in space and time

PASADENA, Calif.—Every great structure, from the Empire State Building to the Golden Gate Bridge, depends on specific mechanical properties to remain strong and reliable. Rigidity—a material's stiffness—is of particular importance for maintaining the robust functionality of everything from colossal edifices to the tiniest of nanoscale structures. In biological nanostructures, like DNA networks, it has been difficult to measure this stiffness, which is essential to their properties and functions. But scientists at the California Institute of Technology (Caltech) have recently developed techniques for visualizing the behavior of biological nanostructures in both space and time, allowing them to directly measure stiffness and map its variation throughout the network.

The new method is outlined in the February 4 early edition of the Proceedings of the National Academy of Sciences (PNAS).

"This type of visualization is taking us into domains of the biological sciences that we did not explore before," says Nobel Laureate Ahmed Zewail, the Linus Pauling Professor of Chemistry and professor of physics at Caltech, who coauthored the paper with Ulrich Lorenz, a postdoctoral scholar in Zewail's lab. "We are providing the methodology to find out—directly—the stiffness of a biological network that has nanoscale properties."

Knowing the mechanical properties of DNA structures is crucial to building sturdy biological networks, among other applications. According to Zewail, this type of visualization of biomechanics in space and time should be applicable to the study of other biological nanomaterials, including the abnormal protein assemblies that underlie diseases like Alzheimer's and Parkinson's.

Zewail and Lorenz were able to see, for the first time, the motion of DNA nanostructures in both space and time using the four-dimensional (4D) electron microscope developed at Caltech's Physical Biology Center for Ultrafast Science and Technology. The center is directed by Zewail, who created it in 2005 to advance understanding of the fundamental physics of chemical and biological behavior.

"In nature, the behavior of matter is determined by its structure—the arrangements of its atoms in the three dimensions of space—and by how the structure changes with time, the fourth dimension," explains Zewail. "If you watch a horse gallop in slow motion, you can follow the time of the gallops, and you can see in detail what, for example, each leg is doing over time. When we get to the nanometer scale, that is a different story—we need to improve the spatial resolution to a billion times that of the horse in order to visualize what is happening."

Zewail was awarded the 1999 Nobel Prize in Chemistry for his development of femtochemistry, which uses ultrashort laser flashes to observe fundamental chemical reactions occurring at the timescale of the femtosecond (one millionth of a billionth of a second). Although femtochemistry can capture atoms and molecules in motion, giving the time dimension, it cannot concurrently show the dimensions of space, and thus the structure of the material. This is because it utilizes laser light with wavelengths that far exceed the dimension of a nanostructure, making it impossible to resolve and image nanoscale details in tiny physical structures such as DNA.

To overcome this major hurdle, the 4D electron microscope employs a stream of individual electrons that scatter off objects to produce an image. The electrons are accelerated so that their wavelengths shrink to picometers, or trillionths of a meter, allowing the microscope to resolve structure in space roughly a thousand times more finely than the nanometer scale of the structures themselves, with a time resolution of femtoseconds or longer.
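
As a back-of-the-envelope check that such beams really do reach picometer wavelengths, the relativistic de Broglie relation can be evaluated for a typical (assumed, not quoted in the article) beam energy of 200 keV:

```latex
\lambda = \frac{h}{p}, \qquad pc = \sqrt{E_k^{2} + 2\,E_k\,m_e c^{2}}
% Assumed beam energy E_k = 200 keV, with m_e c^2 = 511 keV:
pc = \sqrt{200^{2} + 2(200)(511)}\ \mathrm{keV} \approx 494\ \mathrm{keV}
\lambda = \frac{hc}{pc} \approx \frac{1240\ \mathrm{eV\,nm}}{4.94\times 10^{5}\ \mathrm{eV}} \approx 2.5\ \mathrm{pm}
```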

The experiments reported in PNAS began with a structure created by stretching DNA over a hole embedded in a thin carbon film. Using the microscope's electron beam, the scientists cut several DNA filaments away from the carbon film, leaving a free-standing, three-dimensional structure under the 4D microscope.

Next, the scientists employed laser heat to excite oscillations in the DNA structure, which they imaged with the electron pulses as a function of time—the fourth dimension. By observing the frequency and amplitude of these oscillations, the researchers obtained a direct measure of the structure's stiffness.
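
In the simplest picture (a textbook harmonic-oscillator relation, offered here for orientation rather than as the paper's full mechanical analysis), the measured resonance frequency f translates into a stiffness k once the effective vibrating mass m_eff is known:

```latex
k = m_{\mathrm{eff}}\,(2\pi f)^{2}
```

Higher-frequency oscillations of the same filament therefore signal a stiffer structure.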

"It was surprising that we could do this with a complex network," says Zewail. "And yet by cutting and probing, we could go into a selective area of the network and find out about its behavior and properties."

Using 4D electron microscopy, Zewail's group has begun to visualize protein assemblies called amyloids, which are believed to play a role in many neurodegenerative diseases, and they are continuing their investigation of the biomechanical properties of these networks. He says that this technique has the potential for broad applications not only to biological assemblies, but also in the materials science of nanostructures.

Funding for the research outlined in the PNAS paper, "Biomechanics of DNA structures visualized by 4D electron microscopy," was provided by the National Science Foundation and the Air Force Office of Scientific Research. The Physical Biology Center for Ultrafast Science and Technology at Caltech is supported by the Gordon and Betty Moore Foundation.

Writer: Katie Neith

Creating New Quantum Building Blocks

Caltech researcher says diamond defects could serve as nodes of a quantum network

PASADENA, Calif.—Scientists have long dreamed of creating a quantum computer—a device rooted in the bizarre phenomena that transpire at the level of the very small, where quantum mechanics rules the scene. It is believed that such computers could solve, in seconds, certain problems that are effectively intractable for today's conventional machines.

Researchers have tried using various quantum systems, such as atoms or ions, as the basic, transistor-like units in simple quantum computation devices. Now, laying the groundwork for an on-chip optical quantum network, a team of researchers, including Andrei Faraon from the California Institute of Technology (Caltech), has shown that defects in diamond can be used as quantum building blocks that interact with one another via photons, the basic units of light.

The device is simple enough—it involves a tiny ring resonator and a tunnel-like optical waveguide, which both funnel light. Both structures, each only a few hundred nanometers wide, are etched in a diamond membrane and positioned close together atop a glass substrate. Within the resonator lies a nitrogen-vacancy center (NV center)—a defect in the structure of diamond in which a nitrogen atom replaces a carbon atom, and in which a nearby spot usually reserved for another carbon atom is simply empty. Such NV centers are photoluminescent, meaning they absorb and emit photons.

"These NV centers are like the building blocks of the network, and we need to make them interact—like having an electrical current connecting one transistor to another," explains Faraon, lead author on a paper describing the work in the New Journal of Physics. "In this case, photons do that job."

In recent years, diamond has become a heavily researched material for use in quantum photonic devices in part because the diamond lattice is able to protect impurities from excessive interactions. The so-called quietness it affords enables impurities—such as NV centers—to store information unaltered for relatively long periods of time.  

To begin their experiment, the researchers first cool the device below 10 Kelvin (−441.67 degrees Fahrenheit) and then shine green laser light on the NV center, causing it to reach an excited state and then emit red light. As the red light circles within the resonator, it constructively interferes with itself, increasing its intensity. Slowly, the light then leaks into the nearby waveguide, which channels the photons out through gratings at either end, scattering the light out of the plane of the chip.
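
The constructive interference mentioned above happens when a whole number of wavelengths fits around the ring; the standard textbook resonance condition for such a resonator (not a formula specific to this paper) is:

```latex
m\,\lambda = n_{\mathrm{eff}}\,L, \qquad m = 1, 2, 3, \ldots
```

Here L is the ring's circumference, n_eff is the effective refractive index of the diamond waveguide, and λ is the free-space wavelength of the emitted red light. Only wavelengths satisfying this condition build up inside the ring before leaking into the neighboring waveguide.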

The emitted photons have the property of being correlated, or entangled, with the NV center from which they came. This mysterious quality of entanglement, which makes two quantum states inextricably linked in such a way that any information you learn about one provides information about the other, is a necessary ingredient for quantum computation. It enables a large amount of information to be stored and processed by fewer components that take up a small amount of space.

"Right now we only have one nitrogen-vacancy center that's emitting photons, but in the future we envision creating multiple NV centers that emit photons on the same chip," Faraon says. "By measuring these photons we could create entanglement among multiple NV centers on the chip."

And that's important because, in order to make a quantum computer, you would need millions—maybe billions—of these units. "As you can see, we're just working at making one or a few," Faraon says. "But there are other applications down the line that are easier to achieve." For example, a quantum network with a couple hundred units could simulate the behavior of a complex molecule—a task that conventional computers struggle with.

Going forward, Faraon plans to investigate whether other materials can behave similarly to diamond in an optical quantum network.

In addition to Faraon, the authors on the paper, "Quantum photonic devices in single-crystal diamond," are Charles Santori, Zhihong Huang, Kai-Mei Fu, Victor Acosta, David Fattal, and Raymond Beausoleil of Hewlett-Packard Laboratories, in Palo Alto, California. Fu is now an assistant professor at the University of Washington in Seattle, Washington. The work was supported by the Defense Advanced Research Projects Agency and The Regents of the University of California.  

Writer: Kimm Fesenmaier

Caltech Senior Wins Churchill Scholarship

Caltech senior Andrew Meng has been selected to receive a Churchill Scholarship, which will fund his graduate studies at the University of Cambridge for the next academic year. Meng, a chemistry and physics major, was one of only 14 students nationwide who were chosen to receive the fellowship this year.

Taking full advantage of Caltech's strong tradition of undergraduate research, Meng has worked since his freshman year in the lab of Nate Lewis, the George L. Argyros Professor and professor of chemistry. Over the course of three Summer Undergraduate Research Fellowships (SURFs) and several terms in the lab, Meng has investigated various applications of silicon microwire solar cells. Lewis's group has shown that arrays of these ultrathin wires hold promise as a cost-effective way to construct solar cells that can convert light into electricity with relatively high efficiencies.

Meng, who grew up in Baton Rouge, Louisiana, first studied some of the fundamental limitations of silicon microwires in fuel-forming reactions. In these applications, it is believed that the microwires can harness energy from the sun to drive chemical reactions such as the production of hydrogen and oxygen from splitting water. Meng's work showed that the geometry of the microwires would not limit the fuel-forming reaction as some had expected.
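
For reference, the overall sunlight-driven water-splitting reaction alluded to here, together with its two half-reactions, is textbook chemistry (not a result from Meng's own measurements):

```latex
2\,\mathrm{H_2O} \xrightarrow{\;h\nu\;} 2\,\mathrm{H_2} + \mathrm{O_2}
% oxidation (anode) and reduction (cathode) half-reactions:
2\,\mathrm{H_2O} \rightarrow \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^- \qquad
4\,\mathrm{H^+} + 4\,e^- \rightarrow 2\,\mathrm{H_2}
```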

More recently, Meng has turned his attention to using silicon microwires to generate electricity. He is developing an inexpensive electrical contact to silicon microwire chips, using a method that facilitates scale-up and can be applied to flexible solar cells.

"Andrew is one of the best undergraduates that I have had the pleasure of working with in over a decade," says Lewis. "He excels in academics, in leadership, and in research. I believe he is truly worthy of the distinction of receiving a Churchill Fellowship. " 

As he pursues a Master of Philosophy degree in chemistry at the University of Cambridge over the next year, Meng will work in the group of theoretical chemist Michiel Sprik. He plans to apply computational methods to his studies of fuel-forming reactions using solar-energy materials.

"I'm very grateful for this opportunity to learn a computational perspective, since up until now I've been doing experimental work," Meng says. "I'm very excited, and most importantly, I'd like to thank Caltech and all of my mentors and co-mentors, without whom I would not be in this position today."

According to the Winston Churchill Foundation's website, the Churchill Scholarship program "offers American citizens of exceptional ability and outstanding achievement the opportunity to pursue graduate studies in engineering, mathematics, or the sciences at Cambridge. One of the newer colleges at the University of Cambridge, Churchill College was built as the national and Commonwealth tribute to Sir Winston, who in the years after the Second World War presciently recognized the growing importance of science and technology for prosperity and security. Churchill College focuses on the sciences, engineering, and mathematics." The first Churchill Scholarships were awarded in 1963, and this year's recipients bring the total to 479 Churchill Scholars.

Each year, a select group of universities, including Caltech, is eligible to nominate students for consideration for the scholarship. Meng is the seventh Caltech student to have won the award since the year 2000. A group of Caltech faculty members and researchers work with Lauren Stolper, director of fellowships advising, to identify and nominate candidates. This year, the members of the group were Churchill Scholar alumni John Brady, the Chevron Professor of Chemical Engineering and professor of mechanical engineering; Mitchio Okumura, professor of chemical physics; Alan Cummings, senior research scientist; and Eric Rains, professor of mathematics.

Writer: Kimm Fesenmaier

Sorting Out Stroking Sensations

Caltech biologists find individual neurons in the skin that react to massage

PASADENA, Calif.—The skin is a human being's largest sensory organ, helping to distinguish between a pleasant contact, like a caress, and a negative sensation, like a pinch or a burn. Previous studies have shown that these sensations are carried to the brain by different types of sensory neurons that have nerve endings in the skin. Only a few of those neuron types have been identified, however, and most of those detect painful stimuli. Now biologists at the California Institute of Technology (Caltech) have identified in mice a specific class of skin sensory neurons that reacts to an apparently pleasurable stimulus.

More specifically, the team, led by David J. Anderson, the Seymour Benzer Professor of Biology at Caltech, was able to pinpoint individual neurons that were activated by massage-like stroking of the skin. The team's results are outlined in the January 31 issue of the journal Nature.

"We've known a lot about the neurons that detect things that make us hurt or feel pain, but we've known much less about the identity of the neurons that make us feel good when they are stimulated," says Anderson, who is also an investigator with the Howard Hughes Medical Institute. "Generally it's a lot easier to study things that are painful because animals have evolved to become much more sensitive to things that hurt or are fearful than to things that feel good. Showing a positive influence of something on an animal model is not that easy."

In fact, the researchers had to develop new methods and technologies to get their results. First, Sophia Vrontou, a postdoctoral fellow in Anderson's lab and the lead author of the study, developed a line of genetically modified mice that had tags, or molecular markers, on the neurons that the team wanted to study. Then she placed a molecule in this specific population of neurons that fluoresced, or lit up, when the neurons were activated.

"The next step was to figure out a way of recording those flashes of light in those neurons in an intact mouse while stroking and poking its body," says Anderson. "We took advantage of the fact that these sensory neurons are bipolar in the sense that they send one branch into the skin that detects stimuli, and another branch into the spinal cord to relay the message detected in the skin to the brain."

The team obtained the needed data by placing the mouse under a special microscope with very high magnification and recording the level of fluorescent light in the fibers of neurons in the spinal cord as the animal was stroked, poked, tickled, and pinched. Through a painstaking process of applying stimuli to one tiny area of the animal's body at a time, they were able to confirm that certain neurons lit up only when stroked. A different class of neurons, by contrast, was activated by poking or pinching the skin, but not by stroking.
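
A minimal sketch of the kind of quantification this sort of recording typically calls for (with synthetic numbers, not the study's data) is the relative fluorescence change, dF/F, computed per stimulus:

```python
import numpy as np

# Hypothetical analysis sketch: with a fluorescent activity reporter, a common
# way to decide whether a neuron responded to a stimulus is the relative
# fluorescence change, dF/F = (F_stimulus - F_baseline) / F_baseline.

rng = np.random.default_rng(1)
baseline = rng.normal(100.0, 2.0, size=200)         # fluorescence before stimulus
during_stroking = rng.normal(130.0, 2.0, size=200)  # this neuron lights up
during_pinching = rng.normal(101.0, 2.0, size=200)  # ...but not for pinching

def df_over_f(during, before):
    f0 = before.mean()
    return (during.mean() - f0) / f0

for label, trace in [("stroking", during_stroking), ("pinching", during_pinching)]:
    change = df_over_f(trace, baseline)
    responded = change > 0.1   # arbitrary illustrative threshold
    print(f"{label}: dF/F = {change:.2f} -> responded: {responded}")
```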

"Massage-like stroking is a stimulus that, if were we to experience it, would feel good to us, but as scientists we can't just assume that because something feels good to us, it has to also feel good to an animal," says Anderson. "So we then had to design an experiment to show that artificially activating just these neurons—without actually stroking the mouse—felt good to the mouse."

The researchers did this by creating a box that contained left, right, and center rooms connected by little doors. The left and right rooms were different enough that a mouse could distinguish them through smell, sight, and touch. In the left room, the mouse received an injection of a drug that selectively activated the neurons shown to detect massage-like stroking. In the room on the right, the mouse received a control injection of saline. After a few sessions in each outer room, the animal was placed in the center, with the doors open to see which room it preferred. It clearly favored the room where the massage-sensitive neurons were activated. According to Anderson, this was the first time anyone has used this type of conditioned place-preference experiment to show that activating a specific population of neurons in the skin can actually make an animal experience a pleasurable or rewarding state—in effect, to "feel good."

The team's findings are significant for several reasons, he says. First, the methods that they developed give scientists who have discovered a new kind of neuron a way to find out what activates that neuron in the skin.

"Since there are probably dozens of different kinds of neurons that innervate the skin, we hope this will advance the field by making it possible to figure out all of the different kinds of neurons that detect various types of stimuli," explains Anderson. The second reason the results are important, he says, "is that now that we know these neurons detect massage-like stimuli, the results raise new sets of questions about which molecules in those neurons help the animal detect stroking but not poking."

The other benefit of their new methods, Anderson says, is that they will allow researchers to, in principle, trace the circuitry from those neurons up into the brain to ask why and how activating these neurons makes the animal feel good, whereas activating other neurons that are literally right next to them in the skin makes the animal feel bad.

"We are now most interested in how these neurons communicate to the brain through circuits," says Anderson. "In other words, what part of the circuit in the brain is responsible for the good feeling that is apparently produced by activating these neurons? It may seem frivolous to be identifying massage neurons in a mouse, but it could be that some good might come out of this down the road."

Allan M. Wong, a senior research fellow in biology at Caltech, and Kristofer K. Rau and Richard Koerber from the University of Pittsburgh were also coauthors on the Nature paper, "Genetic identification of C fibers that detect massage-like stroking of hairy skin in vivo." Funding for this research was provided by the National Institutes of Health, the Human Frontiers Science Program, and the Helen Hay Whitney Foundation.

Writer: Katie Neith

TEDxCaltech: Advancing Humanoid Robots

This week we will be highlighting the student speakers who auditioned and were selected to give five-minute talks about their brain-related research at TEDxCaltech: The Brain, a special event that will take place on Friday, January 18, in Beckman Auditorium. 

In the spirit of ideas worth spreading, TED has created a program of local, self-organized events called TEDx. Speakers are asked to give the talk of their lives. Live video coverage of the TEDxCaltech experience will be available during the event at http://tedxcaltech.caltech.edu.

When Matanya Horowitz started his undergraduate work in 2006 at the University of Colorado at Boulder, he knew that he wanted to work in robotics—mostly because he was disappointed that technology had not yet made good on his sci-fi–inspired dreams of humanoid robots.

"The best thing we had at the time was the Roomba, which is a great product, but compared to science fiction it seemed really diminutive," says Horowitz. He therefore decided to major in not just electrical engineering, but also economics, applied math, and computer science. "I thought that the answer to better robots would lie somewhere in the middle of these different subjects, and that maybe each one held a different key," he explains.

Now a doctoral student at Caltech—he earned his master's in the same four years as his multiple undergrad degrees—Horowitz is putting his range of academic experience to work in the labs of engineers Joel Burdick and John Doyle to help advance robotics and intelligent systems. As a member of the control and dynamical systems group, he is active in several Defense Advanced Research Projects Agency (DARPA) challenges that seek to develop better control mechanisms for robotic arms, as well as to develop humanoid robots that can perform human-like tasks in dangerous situations, such as disabling bombs or entering nuclear power plants during an emergency.

But beneficial advances in robotics also bring challenges. Inspired as a kid by the robot tales of Isaac Asimov, Horowitz has long been interested in how society might be affected by robots.

"As I began programming just on my own, I saw how easy it was to create something that at least seemed to act with intelligence," he says. "It was interesting to me that we were so close to humanoid robots and that doing these things was so easy. But we also have all these implications we need to think about."

Horowitz's TEDx talk will explore some of the challenges of building and controlling something that needs to interact in the physical world. He says he's thrilled to have the opportunity to speak at TEDx, not just for the chance to talk to a general audience about his work, but also to hopefully inspire others by his enthusiasm for the field.

"Recently, there has been such a monumental shift from what robots were capable of even just five years ago, and people should be really excited about this," says Horowitz. "We've been hearing about robots for 30, 40 years—they've always been 'right around the corner.' But now we can finally point to one and say, 'Here it is, literally coming around a corner.'"

Writer: Katie Neith

TEDxCaltech: If You Click a Cookie with a Mouse

This week we will be highlighting the student speakers who auditioned and were selected to give five-minute talks about their brain-related research at TEDxCaltech: The Brain, a special event that will take place on Friday, January 18, in Beckman Auditorium. 

In the spirit of ideas worth spreading, TED has created a program of local, self-organized events called TEDx. Speakers are asked to give the talk of their lives. Live video coverage of the TEDxCaltech experience will be available during the event at http://tedxcaltech.caltech.edu.

When offered spinach or a cookie, how do you decide which to eat? Do you go for the healthy choice or the tasty one? To study the science of decision making, researchers in the lab of Caltech neuroeconomist Antonio Rangel analyze what happens inside people's brains as they choose between various kinds of food. The researchers typically use functional magnetic resonance imaging (fMRI) to measure the changes in oxygen flow through the brain; these changes serve as proxies for spikes or dips in brain activity. Recently, however, investigators have started using a new technique that may better tease out how you choose between the spinach and the cookie—a decision that's often made in a fraction of a second.

While fMRI is a powerful method, it can only measure changes in brain activity down to the scale of a second or so. "That's not fast enough because these decisions are made sometimes within half a second," says Caltech senior Joy Lu, who will be talking about her research in Rangel's lab at TEDxCaltech. Instead of using fMRI, Lu—along with postdoctoral scholar Cendri Hutcherson and graduate student Nikki Sullivan—turned to the standard old computer mouse.

During the experiments—which are preliminary, as the researchers are still conducting and refining them—volunteers rate 250 kinds of food for healthiness and tastiness. The choices range from spinach and cookies to broccoli and chips. Then, the volunteers are given a choice between two of those items, represented by pictures on a computer screen. When they decide which option they want, they click with their mouse. But while they mull over their choices, the paths of their mouse cursor are being tracked—the idea being that the cursor paths may reveal how the volunteers arrive at their final decisions.

For example, if the subject initially feels obligated to be healthy, the cursor may hover over the spinach a moment before finally settling on the cookie. Or, if the person is immediately drawn to the sweet treat before realizing that health is a better choice, the cursor may hover over the cookie first.

Lu, Hutcherson, and Sullivan are using computer models to find cursor-path patterns or trends that may offer insight into the factors that influence such decisions. Do the paths differ between those who value health over taste and those who favor taste more?
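
One simple, commonly used mouse-tracking measure that such models can build on is the maximum deviation of the cursor path toward the unchosen option. The sketch below (hypothetical trajectory, not the lab's actual code) computes it in Python:

```python
import numpy as np

# Hypothetical sketch of a standard mouse-tracking measure: how far the
# cursor path bends away from the straight line to the chosen item.

def max_deviation(path):
    """Maximum perpendicular distance of the cursor path from the straight
    line connecting its start point to its end point (the chosen item)."""
    path = np.asarray(path, dtype=float)
    start, end = path[0], path[-1]
    direction = end - start
    direction /= np.linalg.norm(direction)
    offsets = path - start
    # perpendicular component of each point relative to the straight path
    perp = offsets - np.outer(offsets @ direction, direction)
    return np.linalg.norm(perp, axis=1).max()

# Invented trajectory: the cursor drifts toward the "cookie" on the left
# before ending on the "spinach" on the right.
trajectory = [(0, 0), (-0.2, 0.3), (-0.1, 0.6), (0.4, 0.8), (1.0, 1.0)]
print(f"maximum deviation toward the unchosen option: {max_deviation(trajectory):.2f}")
```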

Although the researchers are still refining their computer algorithms and continuing their experiments, they have some preliminary results. They found that with many people, for example, the cursor first curves toward one choice before ending up at the other. The time it takes for someone's health consciousness to kick in seems to be longer than the time it takes for people to succumb to cravings for something delicious.

After graduation, Lu plans to go to graduate school in marketing, where she'll use not only neuroscience techniques but also field studies to investigate consumer behavior. She might even compare the two methods. "Using neuroscience in marketing is a very new thing," she says. "That's what draws me toward it. We can't answer all the questions we want to answer just using field studies. You have to look at what's going on in a person's mind."

Writer: Marcus Woo

Research Update: Atomic Motions Help Determine Temperatures Inside Earth

In December 2011, Caltech mineral-physics expert Jennifer Jackson reported that she and a team of researchers had used diamond-anvil cells to compress tiny samples of iron—the main element of the earth's core. By squeezing the samples to reproduce the extreme pressures felt at the core, the team was able to get a closer estimate of the melting point of iron. At the time, the measurements that the researchers made were unprecedented in detail. Now, they have taken that research one step further by adding infrared laser beams to the mix.

The lasers serve as a source of heat: focused on the compressed iron samples, they warm the iron to the point of melting. And because the earth's core consists of a solid inner region surrounded by a liquid outer shell, the melting temperature of iron at high pressure provides an important reference point for the temperature distribution within the earth's core.

"This is the first time that anyone has combined Mössbauer spectroscopy and heating lasers to detect melting in compressed samples," says Jackson, a professor of mineral physics at Caltech and lead author of a recent paper in the journal Earth and Planetary Science Letters that outlined the team's new method. "What we found is that iron, compared to previous studies, melts at higher temperatures than what has been reported in the past."

Earlier research by other teams done at similar compressions—around 80 gigapascals—reported a range of possible melting points that topped out around 2600 Kelvin (K). Jackson's latest study indicates an iron melting point at this pressure of approximately 3025 K, suggesting that the earth's core is likely warmer than previously thought.

Knowing more about the temperature, composition, and behavior of the earth's core is essential to understanding the dynamics of the earth's interior, including the processes responsible for maintaining the earth's magnetic field. While iron makes up roughly 90 percent of the core, the rest is thought to be nickel and light elements—like silicon, sulfur, or oxygen—that are alloyed, or mixed, with the iron.

To develop and perform these experiments, Jackson worked closely with the Inelastic X-ray and Nuclear Resonant Scattering Group at the Advanced Photon Source at Argonne National Laboratory in Illinois. By laser heating the iron sample in a diamond-anvil cell and monitoring the dynamics of the iron atoms via a technique called synchrotron Mössbauer spectroscopy (SMS), the researchers were able to pinpoint a melting temperature for iron at a given pressure. The SMS signal is sensitively related to the dynamical behavior of the atoms, and can therefore detect when a group of atoms is in a molten state.
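
In simplified terms (a standard textbook relation rather than a formula quoted from the paper), the strength of the recoilless Mössbauer signal falls off exponentially with how far the iron atoms move about:

```latex
f_{\mathrm{LM}} = \exp\!\left(-k^{2}\,\langle u^{2}\rangle\right)
```

Here k is the wavenumber of the resonant X-rays and ⟨u²⟩ is the mean-square displacement of the iron atoms. When the sample melts and the atoms begin to diffuse, ⟨u²⟩ grows rapidly, f_LM collapses, and the SMS signal essentially vanishes, which is what flags the melting point.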

She and her team have begun experiments on iron alloys at even higher pressures, using their new approach.

"What we're working toward is a very tight constraint on the temperature of the earth's core," says Jackson. "A number of important geophysical quantities, such as the movement and expansion of materials at the base of the mantle, are dictated by the temperature of the earth's core."

"Our approach is a very elegant way to look at melting because it takes advantage of the physical principle of recoilless absorption of X-rays by nuclear resonances—the basis of the Mössbauer effect—for which Rudolf Mössbauer was awarded the Nobel Prize in Physics," says Jackson. "This particular approach to study melting has not been done at high pressures until now."

Jackson's findings not only tell us more about our own planet, but could indicate that other planets with iron-rich cores, like Mercury and Mars, may have warmer internal temperatures as well.

Her paper, "Melting of compressed iron by monitoring atomic dynamics," was published in Earth and Planetary Science Letters on January 8, 2013.

Writer: Katie Neith

TEDxCaltech: Surmounting the Blood-Brain Barrier

This week we will be highlighting the student speakers who auditioned and were selected to give five-minute talks about their brain-related research at TEDxCaltech: The Brain, a special event that will take place on Friday, January 18, in Beckman Auditorium. 

In the spirit of ideas worth spreading, TED has created a program of local, self-organized events called TEDx. Speakers are asked to give the talk of their lives. Live video coverage of the TEDxCaltech experience will be available during the event at http://tedxcaltech.caltech.edu.

The brain needs its surroundings to be just right. That is, unlike an internal organ such as the liver, which can process just about anything that comes its way, the brain needs to be protected and to have a chemical environment with the right balance of proteins, sugars, salts, and other metabolites.

That fact stood out to Caltech MD/PhD candidate and TEDxCaltech speaker Devin Wiley when he was studying medicine at the Keck School of Medicine of USC. "In certain cases, one bacterium detected in the brain can be a medical emergency," he says. "So the microenvironment needs to be highly protected and regulated for the brain to function correctly."

Fortunately, a semipermeable divide, known as the blood-brain barrier, is very good at maintaining such an environment for the brain. This barricade—made up of tightly packed blood-vessel cells—is effective at precisely controlling which molecules get into and out of the brain. Because the blood-brain barrier regulates the molecular traffic into the brain, it presents a significant challenge for anyone wanting to deliver therapeutics to the brain. 

At Caltech, Wiley has been working with his advisor, Mark Davis, the Warren and Katharine Schlinger Professor of Chemical Engineering, to develop a work-around—a way to sneak therapeutics past the barrier and into the brain to potentially treat neurologic diseases such as Alzheimer's and Parkinson's. The scientists' strategy is to deliver large-molecule therapeutics (which are being developed by the Davis lab as well as other research groups) tucked inside nanoparticles that have proteins attached to their surface. These proteins will bind specifically to receptors on the blood-brain barrier, allowing the nanoparticles and their therapeutic cargo to be shuttled across the barrier and released into the brain.

"In essence, this is like a Trojan horse," Wiley explains. "You're tricking the blood-brain barrier into transporting drugs to the brain that normally wouldn't get in."

During his five-minute TEDxCaltech talk on Friday, January 18, Wiley will describe this approach and his efforts to design nanoparticles that can transport and release therapeutics into the brain.

For Wiley, the issue of delivering therapeutics to the brain is more than a fascinating research problem. His grandmother recently passed away from Alzheimer's disease, and his wife's grandmother also suffers from the neurodegenerative disorder.

"This is something that affects a lot of people," Wiley says. "Treatments for cardiovascular diseases, cancer, and infectious diseases are really improving. However, better treatments for brain diseases are not being discovered as quickly. So what are the issues? I want to tell the story of one of them."

Writer: Kimm Fesenmaier
