Research Update: Battling Infection With Microbes

A relationship between gut bacteria and blood cell development helps the immune system fight infection, Caltech researchers say.

The human relationship with microbial life is complicated. At almost any supermarket, you can pick up both antibacterial soap and probiotic yogurt during the same shopping trip. Although there are types of bacteria that can make us sick, Caltech professor of biology and biological engineering Sarkis Mazmanian and his team are most interested in the thousands of other bacteria—many already living inside our bodies—that actually keep us healthy. His past work in mice has shown that restoring populations of beneficial bacteria can help alleviate the symptoms of inflammatory bowel disease, multiple sclerosis, and even autism. Now, he and his team have found that these good bugs might also prepare the immune cells in our blood to fight infections from harmful bacteria.

In the recent study, published on March 12 in the journal Cell Host & Microbe, the researchers found that beneficial gut bacteria were necessary for the development of innate immune cells—specialized types of white blood cells that serve as the body's first line of defense against invading pathogens.

In addition to circulating in the blood, reserve stores of immune cells are also kept in the spleen and in the bone marrow. When the researchers looked at the immune cell populations in these areas in so-called germ-free mice, born without gut bacteria, and in healthy mice with a normal population of microbes in the gut, they found that germ-free mice had fewer immune cells—specifically macrophages, monocytes, and neutrophils—than healthy mice.

Germ-free mice also had fewer granulocyte and monocyte progenitor cells, stemlike cells that can eventually differentiate into a few types of mature immune cells. And the innate immune cells that were in the spleen were defective—never fully reaching the proportions found in healthy mice with a diverse population of gut microbes.

"It's interesting to see that these microbes are having an immune effect beyond where they live in the gut," says Arya Khosravi, a graduate student in Mazmanian's lab, and first author on the recent study. "They're affecting places like your blood, spleen, and bone marrow—places where there shouldn't be any bacteria."

Khosravi and his colleagues next wanted to see if the reduction in immune cells in the blood would make the germ-free mice less able to fight off an infection by the harmful bacterium Listeria monocytogenes—a well-studied human pathogen often used to study immune responses in mice. While the healthy mice were able to bounce back after being injected with Listeria, the infection was fatal to germ-free mice. When gut microbes that would normally be present were introduced into germ-free mice, the immune cell population increased and the mice were able to survive the Listeria infection.

The researchers also gave injections of Listeria to healthy mice after those mice were dosed with broad-spectrum antibiotics that killed off both harmful and beneficial bacteria. Interestingly, these mice also had trouble fighting the Listeria infection. "We didn't look at clinical data in this study, but we hypothesize that this might also happen in the clinic," says Mazmanian. "For example, when patients are put on antibiotics for something like hip surgery, are you damaging their gut microbe population and making them more susceptible to an infection that had nothing to do with their hip surgery?"

More importantly, the research also suggests that a healthy population of gut microbes can actually provide a preventative alternative to antibiotics, Khosravi says. "Today there are more and more antibiotic-resistant superbugs out there, and we're running out of ways to treat them. Limiting our susceptibility to infection could be a good protective strategy."

These results appear in a paper titled "Gut Microbiota Promote Hematopoiesis to Control Bacterial Infection."

Bending the Light with a Tiny Chip

A silicon chip developed by Caltech researchers acts as a lens-free projector—and could one day end up in your cell phone.

Imagine that you are in a meeting with coworkers or at a gathering of friends. You pull out your cell phone to show a presentation or a video on YouTube. But you don't use the tiny screen; your phone projects a bright, clear image onto a wall or a big screen. Such a technology may be on its way, thanks to a new light-bending silicon chip developed by researchers at Caltech.

The chip was developed by Ali Hajimiri, Thomas G. Myers Professor of Electrical Engineering, and researchers in his laboratory. The results were presented at the Optical Fiber Communication (OFC) conference in San Francisco on March 10.

Traditional projectors—like those used to project a film or classroom lecture notes—pass a beam of light through a tiny image, using lenses to map each point of the small picture to corresponding, yet expanded, points on a large screen. The Caltech chip eliminates the need for bulky and expensive lenses and bulbs and instead uses a so-called integrated optical phased array (OPA) to project the image electronically with only a single laser diode as light source and no mechanically moving parts.

Hajimiri and his colleagues were able to bypass traditional optics by manipulating the coherence of light—a property that allows the researchers to "bend" the light waves on the surface of the chip without lenses or any mechanical movement. If two waves are coherent and in phase—meaning that the peaks and troughs of one wave are exactly aligned with those of the second wave—the waves combine into a single beam with twice the amplitude and four times the energy of either initial wave, moving in the waves' common direction of propagation.
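
In equation form (a textbook superposition identity, not a formula from the paper), two in-phase waves of equal amplitude E_0 add as

\[
E_0\cos(\omega t) + E_0\cos(\omega t) = 2E_0\cos(\omega t),
\qquad I \propto (2E_0)^2 = 4E_0^2,
\]

so the combined beam carries twice the amplitude and four times the intensity of either wave alone.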

"By changing the relative timing of the waves, you can change the direction of the light beam," says Hajimiri. For example, if 10 people kneeling in line by a swimming pool slap the water at the exact same instant, they will make one big wave that travels directly away from them. But if the 10 separate slaps are staggered—each person hitting the water a half a second after the last—there will still be one big, combined wave, but with the wave bending to travel at an angle, he says.

Using a series of pipes for the light—called phase shifters—the OPA chip similarly slows down or speeds up the timing of the waves, thus controlling the direction of the light beam. To form an image, electronic data from a computer are converted into multiple electrical currents; applying stronger or weaker currents to a phase shifter changes the number of electrons in that light path, which, in turn, changes the timing of the light wave traveling through it. The timed light waves are then delivered to tiny array elements arranged in a grid on the chip and projected from each element, the individual beams combining coherently in the air to form a single light beam and a spot on the screen.
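
The steering geometry can be sketched in a few lines of code. This is a generic phased-array model with an assumed wavelength, element count, and pitch, not the specifications of the Caltech chip:

```python
import numpy as np

# Generic optical-phased-array beam steering (illustrative parameters only;
# not the specifications of the Caltech chip).
wavelength = 1.55e-6          # an infrared wavelength, in meters
d = wavelength / 2            # spacing between neighboring array elements
n_elements = 16               # emitters along one row of the grid
k = 2 * np.pi / wavelength    # wavenumber

def far_field_intensity(theta, dphi):
    """Coherently sum all element fields at observation angle theta when a
    fixed phase step dphi is applied between neighboring elements."""
    n = np.arange(n_elements)
    total_phase = k * d * np.sin(theta) * n + dphi * n
    return abs(np.exp(1j * total_phase).sum()) ** 2

# A linear phase gradient steers the peak to sin(theta) = -dphi / (k * d),
# so purely electronic changes to dphi swing the beam with no moving parts.
dphi = 0.5                    # extra delay per element, in radians
predicted = np.degrees(np.arcsin(-dphi / (k * d)))
angles = np.linspace(-np.pi / 2, np.pi / 2, 20001)
measured = np.degrees(angles[np.argmax([far_field_intensity(t, dphi)
                                        for t in angles])])
print(predicted, measured)    # both are about -9.1 degrees
```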

As the electronic signal rapidly steers the beam left, right, up, and down, the light acts as a very fast pen, drawing an image made of light on the projection surface. Because the direction of the light beam is controlled electronically—not mechanically—the beam can sweep across the surface very quickly. Since it redraws the image many times per second, the eye sees the process as a single steady image instead of a moving light beam, says Hajimiri.

"The new thing about our work is really that we can do this on a tiny, one-millimeter-square silicon chip, and the fact that we can do it very rapidly—rapidly enough to form images, since we phase-shift electronically in two dimensions," says Behrooz Abiri, a graduate student in Hajimiri's group and a coauthor on the paper. So far, the images Hajimiri and his team can project with the current version of the chip are somewhat simple—a triangle, a smiley face, or single letters, for example. However, the researchers are currently experimenting with larger chips that include more light-delivering array elements that—like using a larger lens on a camera—can improve the resolution and increase the complexity of the projected images.

In their recent experiments, Hajimiri and his colleagues have used the silicon chip to project images in infrared light, but additional work with different types of semiconductors will also allow the researchers to expand the tiny projector's capabilities into the visible spectrum. "Right now we are using silicon technology, which works better with infrared light. If you want to project visible light, you can take the exact same architecture and do it in what's called compound semiconductor III-V technology," says Firooz Aflatouni, another coauthor on the paper, who in January finished his two-year postdoctoral appointment at Caltech and joined the University of Pennsylvania as an assistant professor. "Silicon is good because it can be easily integrated into electronics, but these other compound semiconductors could be used to do the same thing."

"In the future, this can be incorporated into a phone, and since there is no need for a lens, you can have a phone that acts as a projector all by itself," Hajimiri says. However, although the chip could easily be incorporated into a cell phone, he points out that a tiny projection device can have many applications—including light-based radar systems (called "LIDAR"), which are used in positioning, robotics, geographical measurements, and mapmaking. Such equipment already exists, but current LIDAR technology requires complex, bulky, and expensive equipment—equipment that could be streamlined and simplified to a single chip at a much lower cost.

"But I don't want to limit the device to just a few purposes. The beauty of this thing is that these chips are small and can be made at a very low cost—and this opens up lots of interesting possibilities," he says.

These results were described in a presentation titled "Electronic Two-Dimensional Beam Steering for Integrated Optical Phased Arrays." Along with Hajimiri, Abiri, and Aflatouni, Caltech senior Angad Rekhi also contributed to the work. The study was funded by grants from the Caltech Innovation Initiative and the Information Science and Technology initiative at Caltech.

Building Artificial Cells Will Be a Noisy Business

Engineers like to make things that work. And if one wants to make something work using nanoscale components—the size of proteins, antibodies, and viruses—mimicking the behavior of cells is a good place to start, since cells carry an enormous amount of information in a very tiny packet. As Erik Winfree, professor of computer science, computation and neural systems, and bioengineering, explains, "I tend to think of cells as really small robots. Biology has programmed natural cells, but now engineers are starting to think about how we can program artificial cells. We want to program something about a micron in size, finer than the dimension of a human hair, that can interact with its chemical environment and carry out the spectrum of tasks that biological things do, but according to our instructions."

Getting tiny things to behave is, however, a daunting task. A central problem bioengineers face when working at this scale is that when biochemical circuits, such as the one Winfree has designed, are restricted to an extremely small volume, they may cease to function as expected, even though the circuit works well in a regular test tube. Smaller populations of molecules simply do not behave the same as larger populations of the same molecules, as a recent paper in Nature Chemistry demonstrates.

Winfree and his coauthors began their investigation of the effect of small sample size on biochemical processes with a biochemical oscillator designed in Winfree's lab at Caltech. This oscillator is a solution composed of small synthetic DNA molecules that are activated by RNA transcripts and enzymes. When the DNA molecules are activated by the other components in the solution, a biological circuit is created. This circuit fluoresces in a rhythmic pulse for approximately 15 hours until its chemical reactions slow and eventually stop.

The researchers then "compartmentalized" the oscillator by reducing it from one large system in a test tube to many tiny oscillators. Using an approach developed by Maximilian Weitz and colleagues at the Technical University of Munich and former Caltech graduate student Elisa Franco, currently an assistant professor of mechanical engineering at UC Riverside, an aqueous solution of the DNA, RNA, and enzymes that make up the biochemical oscillator was mixed with oil and shaken until small portions of the solution, each containing a tiny oscillator, were isolated within droplets surrounded by oil.

"After the oil is added and shaken, the mixture turns into a cream, called an emulsion, that looks somewhat like a light mayonnaise," says Winfree. "We then take this cream, pour it on a glass slide and spread it out, and observe the patterns of pulsing fluorescence in each droplet under a microscope." 

When a large sample of the solution is active, it fluoresces in regular pulses. The largest droplets behaved as the entire solution does, fluorescing mostly in phase with one another, as though separate but still acting in concert. The behavior of the smaller droplets, however, was much less consistent: their pulses of fluorescence quickly moved out of phase with those of the larger droplets.

Researchers had expected that the various droplets, especially the smaller ones, would behave differently from one another due to an effect known as stochastic reaction dynamics. The specific reactions that make up a biochemical circuit may happen at slightly different times in different parts of a solution. If the solution sample is large enough, this effect is averaged out, but if the sample is very small, these slight differences in the timing of reactions will be amplified. The sensitivity to droplet size can be even more significant depending on the nature of the reactions. As Winfree explains, "If you have two competing reactions—say x could get converted to y, or x could get converted to z, each at the same rate—then if you have a test tube–sized sample, you will end up with something that is half y and half z. But if you only have four molecules in a droplet, then perhaps they will all convert to y, and that's that: there's no z to be found."
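
That thought experiment is easy to simulate. The sketch below is an illustration of small-number statistics, not the oscillator's actual chemistry: each x molecule goes to y or z with equal probability, in droplet-sized and test-tube-sized populations:

```python
import numpy as np

# Each x molecule independently converts to y or z with probability 1/2.
rng = np.random.default_rng(0)

def fraction_y(n_molecules, n_droplets=10_000):
    """Fraction of molecules that became y, across many simulated droplets."""
    return rng.binomial(n_molecules, 0.5, size=n_droplets) / n_molecules

tiny = fraction_y(4)          # droplets holding only 4 molecules
big = fraction_y(1_000_000)   # a test-tube-sized population
print(tiny.std(), big.std())  # roughly 0.25 versus 0.0005
print((tiny == 1.0).mean())   # about 6% of tiny droplets are all y, no z
```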

In their experiments on the biochemical oscillator, however, Winfree and his colleagues discovered that this source of noise—stochastic reaction dynamics—was relatively small compared to a source of noise that they did not anticipate: partitioning effects. In other words, the molecules that were captured in each droplet were not exactly the same. Some droplets initially had more molecules, while others had fewer; also, the ratio between the various elements was different in different droplets. So even before the differential timing of reactions could create stochastic dynamics, these tiny populations of molecules started out with dissimilar features. The differences between them were then further amplified as the biochemical reactions proceeded.
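
Partitioning noise can be sketched the same way. Assuming simple random (Poisson) loading of molecules into droplets, an illustrative model rather than the paper's analysis, the starting composition of small droplets scatters widely before any reaction begins:

```python
import numpy as np

# Random capture of two components, A and B, into droplets of two sizes.
rng = np.random.default_rng(1)
n_droplets = 10_000

for mean_copies in (5, 5000):                 # tiny versus large droplets
    a = rng.poisson(mean_copies, n_droplets)  # copies of A captured
    b = rng.poisson(mean_copies, n_droplets)  # copies of B captured
    ratio = a / np.maximum(b, 1)              # initial A:B ratio per droplet
    print(mean_copies, round(float(ratio.std()), 3))
# Tiny droplets begin with widely scattered A:B ratios, while large droplets
# begin nearly identical: the divergence among small droplets is seeded at
# the moment of partitioning, before any reaction timing comes into play.
```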

"To make an artificial cell work," says Winfree, "you need to know what your sources of noise are. The dominant thought was that the noise you're confronted with when you're engineering nanometer-scale components has to do with randomness of chemical reactions at that scale. But this experience has taught us that these stochastic reaction dynamics are really the next-level challenge. To get to that next level, first we have to learn how to deal with partitioning noise."

For Winfree, this is an exciting challenge: "When I program my computer, I can think entirely in terms of deterministic processes. But when I try to engineer what is essentially a program at the molecular scale, I have to think in terms of probabilities and stochastic (random) processes. This is inherently more difficult, but I like challenges. And if we are ever to succeed in creating artificial cells, these are the sorts of problems we need to address."

Coauthors of the paper, "Diversity in the dynamical behaviour of a compartmentalized programmable biochemical oscillator," include Maximilian Weitz, Korbinian Kapsner, and Friedrich C. Simmel of the Technical University of Munich; Jongmin Kim of Caltech; and Elisa Franco of UC Riverside. The project was funded by the National Science Foundation, UC Riverside, the European Commission, the German Research Foundation Cluster of Excellence Nanosystems Initiative Munich, and the Elite Network of Bavaria.

Writer: Cynthia Eller

Detection of Water Vapor in the Atmosphere of a Hot Jupiter

Caltech researchers develop a new technique to find water vapor on extrasolar planets

Although liquid water covers a majority of Earth's surface, scientists are still searching for planets outside of our solar system that contain water. Researchers at Caltech and several other institutions have used a new technique to analyze the gaseous atmospheres of such extrasolar planets and have made the first detection of water in the atmosphere of the Jupiter-mass planet orbiting the nearby star tau Boötis. With further development and more sensitive instruments, this technique could help researchers learn about how many planets with water—like Earth—exist within our galaxy.

The new results are described in the February 24 online version of The Astrophysical Journal Letters.

Scientists have previously detected water vapor on a handful of other planets, but these detections could only take place under very specific circumstances, says graduate student Alexandra Lockwood, the first author of the study. "When a planet transits—or passes in orbit in front of—its host star, we can use information from this event to detect water vapor and other atmospheric compounds," she says. "Alternatively, if the planet is sufficiently far away from its host star, we can also learn about a planet's atmosphere by imaging it."

However, a significant portion of the population of extrasolar planets does not fit either of these criteria, and until now there was no good way to learn about the atmospheres of these planets. Looking to resolve this problem, Lockwood and her adviser Geoffrey Blake, professor of cosmochemistry and planetary sciences and professor of chemistry, applied a novel technique for finding water in a planetary atmosphere. Other researchers had previously used similar approaches to detect carbon monoxide in tau Boötis b.

The method utilized the radial velocity (RV) technique—commonly applied in the visible region of the spectrum, to which our eyes are sensitive—for discovering non-transiting exoplanets. Using the Doppler effect, RV detection traditionally determines the motion of a star due to the gravitational pull of a companion planet: the star moves opposite to the orbital motion of the planet, and its spectral features shift in wavelength. A more massive planet, or one closer to its host star, produces a larger shift.
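
The underlying relation is the standard Doppler formula: a source moving with radial velocity v_r shifts a spectral feature of rest wavelength lambda by

\[
\frac{\Delta\lambda}{\lambda} = \frac{v_r}{c}.
\]

For a hot Jupiter orbiting at roughly 100 kilometers per second (an illustrative figure, not a number from the paper), the fractional shift is a few parts in ten thousand, small but well within reach of a high-resolution spectrograph.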

Lockwood, Blake, and their colleagues expanded the RV technique into the infrared to determine the orbit of tau Boötis b around its star, and added further analysis of the shifted light via spectroscopy—an analysis of the light's spectrum. Since every compound absorbs and emits light at its own characteristic set of wavelengths, this unique signature allows the researchers to identify molecules that make up the planet's atmosphere. Using data on tau Boötis b from the Near Infrared Echelle Spectrograph (NIRSPEC) at the W. M. Keck Observatory in Hawaii, the researchers were able to compare the molecular signature of water to the light spectrum emitted by the planet, confirming that the atmosphere did indeed include water vapor.

"The information we get from the spectrograph is like listening to an orchestra performance; you hear all of the music together, but if you listen carefully, you can pick out a trumpet or a violin or a cello, and you know that those instruments are present," she says. "With the telescope, you see all of the light together, but the spectrograph allows you to pick out different pieces; like this wavelength of light means that there is sodium, or this one means that there's water."

In addition to using the spectrographic technique to study the planet's atmospheric composition, the method also provides a way for researchers to analyze the mass of planets. "They're actually two separate findings, but they're both very exciting," says Lockwood. "When you're doing calculations to look for the atmospheric signature—which tells you that there's water present—you also determine the 3-D motion of the star and the planet in the system. With this information, if you also know the mass of the star, you can determine the mass of the planet," she says.

Previous RV methods for measuring a planet's mass could only determine its minimum mass, because the unknown tilt of the orbit leaves the true mass undetermined and possibly much larger. This new technique provides a way to measure the true mass of a planet, since light from both the star and the planet is detected—information that is critical for understanding how planets and planetary systems form and evolve.
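
The logic follows from the two-body problem (a standard result, not spelled out in the article): star and planet orbit their common center of mass, so their velocity semi-amplitudes are inversely proportional to their masses,

\[
\frac{M_p}{M_*} = \frac{K_*}{K_p},
\]

and because the unknown orbital inclination scales both observed velocities by the same factor, it cancels in the ratio. Detecting the planet's own Doppler shift (K_p) alongside the star's (K_*) therefore yields the true mass once the stellar mass is known.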

Although the technique promises to augment how planetary scientists analyze the properties of extrasolar planets, it has limitations, the researchers say. For example, the technique is presently limited to so-called "hot Jupiter" gas giant planets like tau Boötis b—those that are large and orbit very close to their host star.

"The technique is limited by the light-collecting power and wavelength range of the telescope, and even with the incredible collecting area of the Keck Observatory mirror on the high, dry summit of Mauna Kea we can basically only analyze hot planets that are orbiting bright stars, but that could be expanded in the future as telescopes and infrared spectrographs improve," Lockwood says. In the future, in addition to analyzing cooler planets and dimmer stars, the researchers plan to continue looking for and analyzing the abundance of other molecules that might be present in the atmosphere of tau Boötis b.

"While the current state of the technique cannot detect earthlike planets around stars like the Sun, with Keck it should soon be possible to study the atmospheres of the so-called 'super-Earth' planets being discovered around nearby low-mass stars, many of which do not transit," Blake says. "Future telescopes such as the James Webb Space Telescope and the Thirty Meter Telescope (TMT) will enable us to examine much cooler planets that are more distant from their host stars and where liquid water is more likely to exist."

The findings appear in The Astrophysical Journal Letters in a paper titled "Near-IR Direct Detection of Water Vapor in tau Boötis b." Other coauthors include collaborators from Pennsylvania State University, the Naval Research Laboratory, the University of Arizona, and the Harvard-Smithsonian Center for Astrophysics. The work was funded by the National Science Foundation Graduate Research Fellowship Program, the David and Lucile Packard and Alfred P. Sloan foundations, and the Penn State Center for Exoplanets and Habitable Worlds.

A New Laser for a Faster Internet

A new laser developed by a research group at Caltech holds the potential to increase by orders of magnitude the rate of data transmission in the optical-fiber network—the backbone of the Internet.

The study was published the week of February 10–14 in the online edition of the Proceedings of the National Academy of Sciences. The work is the result of a five-year effort by researchers in the laboratory of Amnon Yariv, Martin and Eileen Summerfield Professor of Applied Physics and professor of electrical engineering; the project was led by postdoctoral scholar Christos Santis (PhD '13) and graduate student Scott Steger.

Light is capable of carrying vast amounts of information—approximately 10,000 times more bandwidth than microwaves, the earlier carrier of long-distance communications. But to utilize this potential, the laser light needs to be as spectrally pure—as close to a single frequency—as possible. The purer the tone, the more information it can carry, and for decades researchers have been trying to develop a laser that comes as close as possible to emitting just one frequency.
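
The factor of 10,000 follows directly from the carrier frequencies (representative round numbers, not figures from the study):

\[
\frac{f_{\text{optical}}}{f_{\text{microwave}}} \approx
\frac{2\times 10^{14}\,\mathrm{Hz}}{2\times 10^{10}\,\mathrm{Hz}} = 10^{4},
\]

since, roughly speaking, a carrier can support a usable bandwidth proportional to its own frequency.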

Today's worldwide optical-fiber network is still powered by a laser known as the distributed-feedback semiconductor (S-DFB) laser, developed in the mid 1970s in Yariv's research group. The S-DFB laser's unusual longevity in optical communications stemmed from its, at the time, unparalleled spectral purity—the degree to which the light emitted matched a single frequency. The laser's increased spectral purity directly translated into a larger information bandwidth of the laser beam and longer possible transmission distances in the optical fiber—with the result that more information could be carried farther and faster than ever before.

At the time, this unprecedented spectral purity was a direct consequence of the incorporation of a nanoscale corrugation within the multilayered structure of the laser. The washboard-like surface acted as a sort of internal filter, discriminating against spurious "noisy" waves contaminating the ideal wave frequency. Although the old S-DFB laser had a successful 40-year run in optical communications—and was cited as the main reason for Yariv receiving the 2010 National Medal of Science—the spectral purity, or coherence, of the laser no longer satisfies the ever-increasing demand for bandwidth.

"What became the prime motivator for our project was that the present-day laser designs—even our S-DFB laser—have an internal architecture which is unfavorable for high spectral-purity operation. This is because they allow a large and theoretically unavoidable optical noise to comingle with the coherent laser and thus degrade its spectral purity," he says.

The old S-DFB laser consists of continuous crystalline layers of materials called III-V semiconductors—typically gallium arsenide and indium phosphide—that convert into light the applied electrical current flowing through the structure. Once generated, the light is stored within the same material. Since III-V semiconductors are also strong light absorbers—and this absorption leads to a degradation of spectral purity—the researchers sought a different solution for the new laser.

The high-coherence new laser still converts current to light using the III-V material, but in a fundamental departure from the S-DFB laser, it stores the light in a layer of silicon, which does not absorb light. Spatial patterning of this silicon layer—a variant of the corrugated surface of the S-DFB laser—causes the silicon to act as a light concentrator, pulling the newly generated light away from the light-absorbing III-V material and into the near absorption-free silicon.

This newly achieved high spectral purity—a 20 times narrower range of frequencies than possible with the S-DFB laser—could be especially important for the future of fiber-optic communications. Originally, laser beams in optic fibers carried information in pulses of light; data signals were impressed on the beam by rapidly turning the laser on and off, and the resulting light pulses were carried through the optic fibers. However, to meet the increasing demand for bandwidth, communications system engineers are now adopting a new method of impressing the data on laser beams that no longer requires this "on-off" technique. This method is called coherent phase communication.

In coherent phase communications, the data resides in small delays in the arrival time of the waves; the delays—a tiny fraction of a second (on the order of 10⁻¹⁶ seconds) in duration—can then accurately relay the information even over thousands of miles. The digital electronic bits carrying video, data, or other information are converted at the laser into these small delays in the otherwise rock-steady light wave. But the number of possible delays, and thus the data-carrying capacity of the channel, is fundamentally limited by the degree of spectral purity of the laser beam. This purity can never be absolute—a limitation of the laws of physics—but with the new laser, Yariv and his team have tried to come as close to absolute purity as is possible.
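
A toy model makes the link between spectral purity and capacity concrete. In the sketch below (an illustrative simulation with assumed numbers, not the paper's analysis), data symbols are encoded as evenly spaced phases of a carrier; the same phase noise that leaves a coarse constellation readable makes a dense one fail:

```python
import numpy as np

# Phase-encoded symbols blurred by laser phase noise (toy model).
rng = np.random.default_rng(2)

def symbol_error_rate(n_levels, phase_noise_rad, n_symbols=100_000):
    """Send evenly spaced phases, add Gaussian phase noise, decode to the
    nearest constellation phase, and count the mistakes."""
    sent = rng.integers(0, n_levels, n_symbols)
    step = 2 * np.pi / n_levels
    received = sent * step + rng.normal(0.0, phase_noise_rad, n_symbols)
    decoded = np.round(received / step).astype(int) % n_levels
    return (decoded != sent).mean()

for levels in (4, 16, 64):    # more phase levels means more bits per symbol
    print(levels, symbol_error_rate(levels, phase_noise_rad=0.05))
# With identical phase noise, 4 levels decode essentially error-free while
# 64 levels fail frequently: the purer the laser, the denser the usable
# constellation and the higher the data rate.
```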

These findings were published in a paper titled "High-coherence semiconductor lasers based on integral high-Q resonators in hybrid Si/III-V platforms." In addition to Yariv, Santis, and Steger, other Caltech coauthors include graduate student Yaakov Vilenchik and former graduate student Arseny Vasilyev (PhD '13). The work was funded by the Army Research Office, the National Science Foundation, and the Defense Advanced Research Projects Agency. The lasers were fabricated at the Kavli Nanoscience Institute at Caltech.

NuSTAR Reveals Radioactive Matter in Supernova Remnant

New details suggest how massive stars explode

Using its X-ray vision to observe what is left of a massive star that exploded long ago, NASA's Nuclear Spectroscopic Telescope Array (NuSTAR) spacecraft has shed new light on an old question: How exactly do stars go out with such a bang? For the first time, NuSTAR has mapped radioactive material from the core of such a supernova explosion. The results suggest that the core of the star actually sloshes around before shock waves rip it apart.

Between August 2012 and June 2013, NuSTAR trained its eyes multiple times on the Cassiopeia A (Cas A) remnant—the leftovers of a star that collapsed and exploded more than 11,000 years ago. With the observatory's sensitivity to high-energy X-rays, it was able to image and then map the distribution in Cas A of radioactive titanium-44, an atom produced at the core of the exploding star. Members of the NuSTAR team report the observations in the February 20 issue of the journal Nature.

"We are excited about these new results. Probing supernova explosions is one of the things that NuSTAR was specifically designed to do," says Fiona Harrison, the Benjamin M. Rosen Professor of Physics and Astronomy at Caltech and NuSTAR's principal investigator. "NuSTAR is the only spacecraft currently capable of making the measurements that led to these new insights."

Although other powerful X-ray telescopes, such as NASA's Chandra X-ray Observatory and the European Space Agency's XMM-Newton, have imaged the Cas A remnant before, those observatories can only detect material that has been heated by the explosion. NuSTAR's specially coated optics and newly developed detectors allow it to image at higher energies. So what is particularly exciting about the NuSTAR map is that it shows all of the titanium-44, revealing both the heated and unheated material from the heart of the explosion.

"With NuSTAR we have a new forensic tool to investigate the explosion," says Brian Grefenstette, lead author of the paper, also from Caltech. "Previously, it was hard to interpret what was going on in Cas A because the material that we could see only glows in X-rays when it's heated up. Now that we can see the radioactive material, which glows in X-rays no matter what, we are getting a more complete picture of the core of the explosion."

[Image caption: NuSTAR has provided the first observational evidence in support of a theory that exploding stars slosh around before detonating. The theory, which involves mild asymmetries in the collapsing core, is shown in a simulation by Christian Ott, professor of theoretical astrophysics at Caltech.]

The distribution of titanium-44 that NuSTAR observed suggests that supernova explosions of Cas A's kind are not completely symmetric, nor are they driven by powerful jets, as some had hypothesized. Instead, computer simulations that match the NuSTAR data suggest that stars like Cas A slosh around before exploding and therefore disperse the radioactive material at their cores in a mildly asymmetric way.

"When we try to recreate supernovas with spherical models, the shock wave from the initial collapse of the star's core stalls out," explains Harrison. "Our new results point to strong distortions of a spherical shape as key to the process of reenergizing the blast wave. The exploding star literally sloshes around before detonating."

As revealing as the NuSTAR findings are, they have also created a new mystery for scientists to ponder. Since both the iron and titanium in the remnant originated in the star's core, the researchers had expected to find significant overlap between the titanium-44 map and a previous map based on Chandra's observations of iron in the remnant. Instead, the two did not match up well. So, the researchers say, the case of the Cas A remnant is far from closed.

NuSTAR is a Small Explorer mission led by Caltech and managed by NASA's Jet Propulsion Laboratory (JPL) for NASA's Science Mission Directorate in Washington. Along with Harrison and Grefenstette, additional Caltech coauthors on the paper, "Mapping Cassiopeia A in Radioactive 44Ti: Probing the Explosion's Engine," are Kristin Madsen, Hiromasa Miyasaka, Vikram Rana, and JPL researcher Daniel Stern.

Writer: Kimm Fesenmaier

A Changing View of Bone Marrow Cells

Caltech researchers show that the cells are actively involved in sensing infection

In the battle against infection, immune cells are the body's offense and defense—some cells go on the attack while others block invading pathogens. It has long been known that a population of blood stem cells that resides in the bone marrow generates all of these immune cells. But most scientists have believed that blood stem cells participate in battles against infection in a delayed way, replenishing immune cells on the front line only after they become depleted.

Now, using a novel microfluidic technique, researchers at Caltech have shown that these stem cells might be more actively involved, sensing danger signals directly and quickly producing new immune cells to join the fight.

"It has been most people's belief that the bone marrow has the function of making these cells but that the response to infection is something that happens locally, at the infection site," says David Baltimore, president emeritus and the Robert Andrews Millikan Professor of Biology at Caltech. "We've shown that these bone marrow cells themselves are sensitive to infection-related molecules and that they respond very rapidly. So the bone marrow is actually set up to respond to infection."

The study, led by Jimmy Zhao, a graduate student in the UCLA-Caltech Medical Scientist Training Program, will appear in the April 3 issue of the journal Cell Stem Cell.

In the work, the researchers show that blood stem cells have all the components needed to detect an invasion and to mount an inflammatory response. They show, as others have previously, that these cells have on their surface a type of receptor called a toll-like receptor. The researchers then identify an entire internal response pathway that can translate activation of those receptors by infection-related molecules, or danger signals, into the production of cytokines, signaling molecules that can crank up immune-cell production. Interestingly, they show for the first time that the transcription factor NF-κB, known to be the central organizer of the immune response to infection, is part of that response pathway.

To examine what happens to a blood stem cell once it is activated by a danger signal, the Baltimore lab teamed up with chemists from the lab of James Heath, the Elizabeth W. Gilloon Professor and professor of chemistry at Caltech. They devised a microfluidic chip—printed in flexible silicone on a glass slide, complete with input and output ports, control valves, and thousands of tiny wells—that would enable single-cell analysis. At the bottom of each well, they attached DNA molecules in strips and introduced a flow of antibodies—pathogen-targeting proteins of the immune system—tagged with complementary DNA. They then added the stem cells along with infection-related molecules and incubated the whole sample. Because each antibody was selected for its ability to bind a certain cytokine, the strips specifically captured any of those cytokines released by the cells after activation. When the researchers added a secondary antibody and a dye, the captured cytokines lit up. "They all light up the same color, but you can tell which is which because you've attached the DNA in an orderly fashion," explains Baltimore. "So you've got both visualization and localization that tells you which molecule was secreted." In this way, they were able to measure, for example, that the cytokine IL-6 was secreted most frequently—by 21.9 percent of the cells tested.
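
The readout logic of the barcoded wells can be expressed as a short sketch. The strip layout, cytokine panel, and signal values below are invented for illustration; they are not the design or data reported in the paper:

```python
# One well of the chip: DNA strips at known positions capture specific
# cytokines, and a dye signal above background marks a secretion event.
strip_to_cytokine = {0: "IL-6", 1: "TNF-alpha", 2: "IL-3"}  # ordered strips
well_fluorescence = {0: 873, 1: 112, 2: 45}  # dye counts read from one well

threshold = 200  # counts above background imply the cytokine was captured
secreted = [strip_to_cytokine[strip]
            for strip, signal in well_fluorescence.items()
            if signal > threshold]
print(secreted)  # -> ['IL-6']: position alone identifies the molecule
```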

"The experimental challenges here were significant—we needed to isolate what are actually quite rare cells, and then measure the levels of a dozen secreted proteins from each of those cells," says Heath. "The end result was sort of like putting on a new pair of glasses—we were able to observe functional properties of these stem cells that were totally unexpected."

The team found that blood stem cells produce a surprising number and variety of cytokines very rapidly. In fact, the stem cells are even more potent generators of cytokines than other previously known cytokine producers of the immune system. Once the cytokines are released, it appears that they are able to bind to their own cytokine receptors or those on other nearby blood stem cells. This stimulates the bound cells to differentiate into the immune cells needed at the site of infection.

"This does now change the view of the potential of bone marrow cells to be involved in inflammatory reactions," says Baltimore.

Heath notes that the collaboration benefited greatly from Caltech's support of interdisciplinary work. "It is a unique and fertile environment," he says, "one that encourages scientists from different disciplines to harness their disparate areas of expertise to solve tough problems like this one."

Additional coauthors on the paper, "Conversion of danger signals into cytokine signals by hematopoietic stem and progenitor cells for regulation of stress-induced hematopoiesis," are Chao Ma, Ryan O'Connell, Arnav Mehta, and Race DiLoreto. The work was supported by grants from the National Institute of Allergy and Infectious Diseases, the National Institutes of Health, a National Research Service Award, the UCLA-Caltech Medical Scientist Training Program, a Rosen Fellowship, a Pathway to Independence Award, and an American Cancer Society Research Grant.

Writer: Kimm Fesenmaier

NOvA Sees First Long-distance Neutrinos

The NOvA experiment, centered at the Department of Energy's Fermi National Accelerator Laboratory (Fermilab) near Chicago, has detected its first neutrinos.

Ryan Patterson, assistant professor of physics at Caltech and principal investigator for the Caltech NOvA team of eight researchers, states, "With these first neutrinos in hand, we celebrate the official start of our physics run. The data we collect with NOvA will provide a brand-new window on how neutrino masses arise and relate to one another, and whether there are new physical laws lurking in the neutrino sector of the standard model of particle physics."

Neutrinos are curious particles that travel near the speed of light, rarely interacting with matter. The NOvA experiment, a collaboration of 208 scientists from 38 institutions, is scheduled to run for six years. It includes the Fermilab accelerator and two detectors, one located near Fermilab, the other some 500 miles away in Ash River, Minnesota, near the Canadian border.

Is Natural Gas a Solution to Mitigating Climate Change?

Methane, a key greenhouse gas, has more than doubled in concentration in Earth's atmosphere since 1750. Its increase is believed to be a leading contributor to climate change. But where is the methane coming from? Research by atmospheric chemist Paul Wennberg of the California Institute of Technology (Caltech) suggests that losses of natural gas—our "cleanest" fossil fuel—into the atmosphere may be a larger source than previously recognized.

Radiation from the sun warms Earth's surface, which then radiates heat back into the atmosphere. Greenhouse gases trap some of this heat. It is this process that makes life on Earth possible for beings such as ourselves, who could not tolerate the lower temperatures Earth would have if not for its "blanket" of greenhouse gases. However, as Goldilocks would tell you, there is "too hot" as well as "too cold," and the precipitous increase in greenhouse gases since the beginning of the Industrial Revolution is driving climate change, altering weather patterns, and raising sea levels. Carbon dioxide is the most prevalent greenhouse gas in Earth's atmosphere, but there are others as well, among them methane.

Those who are concerned about greenhouse gases have a very special enemy to fear in atmospheric methane. Methane has a trifecta of effects on the atmosphere. First, like other greenhouse gases, methane works directly to trap Earth's radiation in the atmosphere. Second, when methane oxidizes in Earth's atmosphere, it is broken into components that are also greenhouse gases: carbon dioxide and ozone. Third, the breakdown of methane in the atmosphere produces water vapor, which also functions as a greenhouse gas. Increased humidity, especially in the otherwise arid stratosphere where approximately 10 percent of methane is oxidized, further increases greenhouse-gas-induced climate change.
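
The net bookkeeping of the second and third effects can be written as a single overall reaction (a simplification: the real atmospheric chain runs through OH radicals, and the ozone arises from side chemistry involving nitrogen oxides):

\[
\mathrm{CH_4} + 2\,\mathrm{O_2} \longrightarrow \mathrm{CO_2} + 2\,\mathrm{H_2O},
\]

so every methane molecule that oxidizes leaves behind one molecule of carbon dioxide and two of water vapor.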

Fully one-third of the increase in radiative forcing (the ability of the atmosphere to retain radiation from the sun) since 1750 is estimated to be due to the presence and effects of methane. Because of the many potential sources of atmospheric methane, from landfills to wetlands to petroleum processing, it can be difficult to quantify which sources are making the greatest contribution. But according to Paul Wennberg, Caltech's R. Stanton Avery Professor of Atmospheric Chemistry and Environmental Science and Engineering, and his colleagues, it is possible that a significant source of methane, at least in the Los Angeles basin, is fugitive emissions—leaks—from the natural-gas supply line.

"This was a surprise," Wennberg explains of the results of his research on methane in the Los Angeles atmosphere. In an initial study conducted in 2008, Wennberg's team analyzed measurements from the troposphere, the lowest portion of Earth's atmosphere, via an airplane flying less than a mile above the ground over the Los Angeles basin.

In analyzing chemical signatures of the preliminary samples, Wennberg's team made an intriguing discovery: the signatures bore a striking similarity to the chemical profile of natural gas. Methane from fossil fuel sources is normally accompanied by ethane gas—the second most common component of natural gas—while methane from biogenic sources (such as livestock and wastewater) is not. Indeed, the researchers found that the ratio of methane to ethane in the L.A. air samples was characteristic of the samples of natural gas provided by the Southern California Gas Company, the leading supplier of natural gas to the region.
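
The fingerprinting argument amounts to a two-source mixing calculation. The sketch below uses invented numbers; the study relied on measured air samples and the gas company's own assays:

```python
# Two-source mixing of the ethane-to-methane ratio (hypothetical values).
pipeline_ratio = 0.06   # mol ethane per mol methane in delivered natural gas
biogenic_ratio = 0.0    # biogenic methane carries essentially no ethane
observed_ratio = 0.045  # supposed excess-ethane/excess-methane in L.A. air

# Fraction f of the excess methane that looks pipeline-like.
f = (observed_ratio - biogenic_ratio) / (pipeline_ratio - biogenic_ratio)
print(f"{f:.0%} of the excess methane matches the natural-gas signature")
```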

Wennberg hesitates to pinpoint natural-gas leaks as the sole source of the L.A. methane, however. "Even though it looks like the methane/ethane could come from fugitive natural-gas emissions, it's certainly not all coming from this source," he says. "We're still drilling for oil in L.A., and that yields natural gas that includes ethane too."

The Southern California Gas Company reports very low losses in the delivery of natural gas (approximately 0.1 percent), and yet atmospheric data suggest that the source of methane from either the natural-gas infrastructure or petroleum production is closer to 2 percent of the total gas delivered to the basin. One possible way to reconcile these vastly different estimates is that significant losses of natural gas may occur after consumer metering in the homes, offices, and industrial plants that purchase natural gas. This loss of fuel is small enough to have no immediate negative impact on household users, but cumulatively it could be a major contributor to the concentration of methane in the atmosphere.

The findings of Wennberg and his colleagues have led to a more comprehensive study of greenhouse gases in urban settings, the Megacities Carbon Project, based at JPL. The goal of the project, which is focusing initially on ground-based measurements in Los Angeles and Paris, is to quantify greenhouse gases in the megacities of the world. Such cities—places like Hong Kong, Berlin, Jakarta, Johannesburg, Seoul, São Paulo, and Tokyo—are responsible for up to 75 percent of global carbon emissions, despite representing only 3 percent of the world's landmass. Documenting the types and sources of greenhouse gases in megacities will provide valuable baseline measurements that can be used in efforts to reduce greenhouse gas emissions.

If the findings of the Megacities Carbon Project are consistent with Wennberg's study of methane in Los Angeles, natural gas may be less of a panacea in the search for a "green" fuel. Natural gas has a cleaner emissions profile and a higher efficiency than coal (that is, it produces more power per molecule of carbon dioxide), but, as far as climate change goes, methods of extraction and distribution are key. "You have to dig it up, put it in the pipe, and burn it without losing more than a few percent," Wennberg says. "Otherwise, it's not nearly as helpful as you would think."

Wennberg's research was published in an article titled "On the Sources of Methane to the Los Angeles Atmosphere" in Environmental Science & Technology. Data for this study were provided by the Southern California Gas Company, NASA, NOAA, and the California Air Resources Board. The research was funded by NASA, the California Energy Commission's Public Interest Environmental Research program, the California Air Resources Board, and the U.S. Department of Energy.

Writer: Cynthia Eller

Caltech-Developed Method for Delivering HIV-Fighting Antibodies Proven Even More Promising

In 2011, biologists at the California Institute of Technology (Caltech) demonstrated a highly effective method for delivering HIV-fighting antibodies to mice—a treatment that protected the mice from infection by a laboratory strain of HIV delivered intravenously. Now the researchers, led by Nobel Laureate David Baltimore, have shown that the same procedure is just as effective against a strain of HIV found in the real world, even when transmitted across mucosal surfaces.

The findings, which appear in the February 9 advance online publication of the journal Nature Medicine, suggest that the delivery method might be effective in preventing vaginal transmission of HIV between humans.

"The method that we developed has now been validated in the most natural possible setting in a mouse," says Baltimore, president emeritus and the Robert Andrews Millikan Professor of Biology at Caltech. "This procedure is extremely effective against a naturally transmitted strain and by an intravaginal infection route, which is a model of how HIV is transmitted in most of the infections that occur in the world."

The new delivery method—called Vectored ImmunoProphylaxis, or VIP for short—is not exactly a vaccine. Vaccines introduce substances such as antigens into the body to try to get the immune system to mount an appropriate attack—to generate antibodies that can block an infection or T cells that can attack infected cells. In the case of VIP, a small, harmless virus is injected and delivers genes to the muscle tissue, instructing it to generate specific antibodies.  

The researchers emphasize that the work was done in mice and that the leap from mice to humans is large. The team is now working with the Vaccine Research Center at the National Institutes of Health to begin clinical evaluation.

The study, "Vectored immunoprophylaxis protects humanized mice from mucosal HIV transmission," was supported by the UCLA Center for AIDS Research, the National Institutes of Health, and the Caltech-UCLA Joint Center for Translational Medicine. Caltech biology researchers Alejandro B. Balazs, Yong Ouyang, Christin H. Hong, Joyce Chen, and Steven M. Nguyen also contributed to the study, as well as Dinesh S. Rao of the David Geffen School of Medicine at UCLA and Dong Sung An of the UCLA AIDS Institute.

Writer: Kimm Fesenmaier
