Picking Apart Photosynthesis

New insights from Caltech chemists could lead to better catalysts for water splitting

PASADENA, Calif.—Chemists at the California Institute of Technology (Caltech) and the Lawrence Berkeley National Laboratory believe they can now explain one of the remaining mysteries of photosynthesis, the chemical process by which plants convert sunlight into usable energy and generate the oxygen that we breathe. The finding suggests a new way of approaching the design of catalysts that drive the water-splitting reactions of artificial photosynthesis.

"If we want to make systems that can do artificial photosynthesis, it's important that we understand how the system found in nature functions," says Theodor Agapie, an assistant professor of chemistry at Caltech and principal investigator on a paper in the journal Nature Chemistry that describes the new results.

One of the key pieces of biological machinery that enables photosynthesis is a conglomeration of proteins and pigments known as photosystem II. Within that system lies a small cluster of atoms, called the oxygen-evolving complex, where water molecules are split and molecular oxygen is made. Although this oxygen-producing process has been studied extensively, the role that various parts of the cluster play has remained unclear. 

The oxygen-evolving complex performs a reaction that requires the transfer of electrons, making it an example of what is known as a redox, or oxidation-reduction, reaction. The cluster can be described as a "mixed-metal cluster" because in addition to oxygen, it includes two types of metals—one that is redox active, or capable of participating in the transfer of electrons (in this case, manganese), and one that is redox inactive (calcium).

"Since calcium is redox inactive, people have long wondered what role it might play in this cluster," Agapie says.

It has been difficult to solve that mystery in large part because the oxygen-evolving complex is just a cog in the much larger machine that is photosystem II; it is hard to study the smaller piece because there is so much going on with the whole. To get around this, Agapie's graduate student Emily Tsui prepared a series of compounds that are structurally related to the oxygen-evolving complex. She built upon an organic scaffold in a stepwise fashion, first adding three manganese centers and then attaching a fourth metal. By varying that fourth metal to be calcium and then different redox-inactive metals, such as strontium, sodium, yttrium, and zinc, Tsui was able to compare the effects of the metals on the chemical properties of the compound.

"When making mixed-metal clusters, researchers usually mix simple chemical precursors and hope the metals will self-assemble in desired structures," Tsui says. "That makes it hard to control the product. By preparing these clusters in a much more methodical way, we've been able to get just the right structures."

It turns out that the redox-inactive metals affect the way electrons are transferred in such systems. To make molecular oxygen, the manganese atoms must activate the oxygen atoms connected to the metals in the complex. In order to do that, the manganese atoms must first transfer away several electrons. Redox-inactive metals that tug more strongly on the electrons of the oxygen atoms make it more difficult for manganese to do this. But calcium does not draw electrons strongly toward itself. Therefore, it allows the manganese atoms to transfer away electrons and activate the oxygen atoms that go on to make molecular oxygen.

A number of the catalysts currently being developed to drive artificial photosynthesis are mixed-metal oxide catalysts. As in the natural system, it has been unclear what role the redox-inactive metals in these mixed catalysts play. The new findings suggest that the redox-inactive metals affect the way the electrons are transferred. "If you pick the right redox-inactive metal, you can tune the reduction potential to bring the reaction to the range where it is favorable," Agapie says. "That means we now have a more rational way of thinking about how to design these sorts of catalysts because we know how much the redox-inactive metal affects the redox chemistry."
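
One simple way to picture the tuning is to treat the cluster's reduction potential as a linear function of the redox-inactive metal's Lewis acidity, commonly proxied by the pKa of its aqua ion. The short Python sketch below assumes that linear form; the pKa values are rounded literature figures and the potentials are placeholders for illustration, not measurements from the paper:

    import numpy as np

    # Approximate pKa values of the metal aqua ions (a standard proxy for
    # Lewis acidity); rounded literature values, for illustration only.
    pka = {"Ca2+": 12.7, "Sr2+": 13.3, "Zn2+": 9.0, "Y3+": 7.7}

    # Hypothetical cluster reduction potentials (V), placeholders for fitting.
    potential = {"Ca2+": -0.70, "Sr2+": -0.76, "Zn2+": -0.33, "Y3+": -0.20}

    metals = list(pka)
    slope, intercept = np.polyfit([pka[m] for m in metals],
                                  [potential[m] for m in metals], 1)

    def predicted_potential(pka_value):
        # Linear model: a more acidic metal (lower pKa) pulls harder on the
        # oxygen electrons, giving a less negative reduction potential.
        return slope * pka_value + intercept

    print(f"slope = {slope:.3f} V per pKa unit")
    print(f"hypothetical metal with pKa 14.8: {predicted_potential(14.8):+.2f} V")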

The paper in Nature Chemistry is titled "Redox-inactive metals modulate the reduction potential in heterometallic manganese-oxido clusters." Along with Agapie and Tsui, Rosalie Tran and Junko Yano of the Lawrence Berkeley National Laboratory are also coauthors. The work was supported by the Searle Scholars Program, an NSF CAREER award, and the NSF Graduate Research Fellowship Program. X-ray spectroscopy work was supported by the NIH and the DOE Office of Basic Energy Sciences. Synchrotron facilities were provided by the Stanford Synchrotron Radiation Lightsource, operated by the DOE Office of Biological and Environmental Research. 

Writer: Kimm Fesenmaier

Counting White Blood Cells at Home

Caltech engineers lead development of a new portable counter

PASADENA, Calif.—White blood cells, or leukocytes, are the immune system's warriors. So when an infection or disease attacks the body, the system typically responds by sending more white blood cells into the fray. This means that checking the number of these cells is a relatively easy way to detect and monitor such conditions.

Currently, most white blood cell counts are performed with large-scale equipment in central clinical laboratories. If a physician collects blood samples from a patient in the office—usually requiring a full vial of blood for each test—it can take days to get the results. But now engineers at the California Institute of Technology (Caltech), working with a collaborator from the Jerusalem-based company LeukoDx, have developed a portable device to count white blood cells that needs less than a pinprick's worth of blood and takes just minutes to run.

"The white blood cell counts from our new system closely match the results from tests conducted in hospitals and other central clinical settings," says Yu-Chong Tai, professor of electrical engineering and mechanical engineering at Caltech and the project's principal investigator. "This could make point-of-care testing possible for the first time."

Portable white blood cell counters could improve outpatient monitoring of patients with chronic conditions such as leukemia or other cancers. The counters could be used in combination with telemedicine to bring medical care to remote areas. The devices could even enable astronauts to evaluate their long-term exposure to radiation while they are still in space. The researchers describe the work in the April 7 issue of the journal Lab on a Chip.

There are five subtypes of white blood cells, and each serves a different function, which means it's useful to know the count for all of them. In general, lymphocytes use antibodies to attack certain viruses and bacteria; neutrophils are especially good at combating bacteria; eosinophils target parasites and certain infections; monocytes respond to inflammation and replenish white blood cells within bodily tissue; and basophils, the rarest of the subtypes, attack certain parasites.
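
A differential count is, in effect, a table of those five subtypes. The Python sketch below encodes commonly cited adult reference ranges, as fractions of all white blood cells, and flags values that fall outside them; the ranges are approximate textbook figures, not numbers from the paper:

    # Approximate adult reference ranges for each subtype, as a fraction of
    # all white blood cells (ballpark textbook values, for illustration only).
    REFERENCE_RANGES = {
        "neutrophils": (0.40, 0.70),
        "lymphocytes": (0.20, 0.40),
        "monocytes":   (0.02, 0.08),
        "eosinophils": (0.01, 0.04),
        "basophils":   (0.00, 0.01),
    }

    def flag_differential(counts):
        # counts: dict mapping subtype name to number of cells detected.
        total = sum(counts.values())
        flags = {}
        for subtype, (lo, hi) in REFERENCE_RANGES.items():
            frac = counts.get(subtype, 0) / total
            flags[subtype] = "ok" if lo <= frac <= hi else "abnormal"
        return flags

    print(flag_differential({"neutrophils": 620, "lymphocytes": 290,
                             "monocytes": 50, "eosinophils": 20,
                             "basophils": 20}))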

"If we can give you a quick white blood cell count right in the doctor's office," says Wendian Shi, a graduate student in Tai's lab and lead author of the new paper, "you can know right away if you're dealing with a viral infection or a bacterial infection, and the doctor can prescribe the right medication."

The prototype device is able to count all five subtypes of white blood cells within a sample. It provides an accurate differential of the four major subtypes—lymphocytes, monocytes, eosinophils, and neutrophils. In addition, it could be used to flag an abnormally high level of the fifth subtype, basophils, which are normally too rare (representing less than one percent of all white blood cells) for accurate detection in clinical tests.

The entire new system fits in a small suitcase (12" x 9" x 5") and could easily be made into a handheld device, the engineers say.

A major development reported in the new paper is the creation of a detection assay that uses three dyes to stain white blood cells so that they emit light, or fluoresce, brightly in response to laser light. Blood samples are treated with this dye assay before measurement in the new device. The first dye binds strongly to the DNA found in the nucleus of white blood cells, making it simple to distinguish between white blood cells and the red blood cells that surround and outnumber them. The other two dyes help differentiate between the subtypes.

The heart of the new device is a 50-micrometer-long transparent channel made out of a silicone material with a cross section of only 32 micrometers by 28 micrometers—small enough to ensure that only one white blood cell at a time can flow through the detection region. The stained blood sample flows through this microfluidic channel to the detection region, where it is illuminated with a laser, causing it to fluoresce. The resulting emission of the sample is then split by a mirror into two beams, representing the green and red fluorescence.

Thanks to the dye assay, the white blood cell subtypes emit characteristic amounts of red and green light. Therefore, by determining the intensity of the emissions for each detected cell, the device can generate highly accurate differential white blood cell counts.
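
The paper does not publish its classification code, but the logic described—sorting each detected cell by its red and green fluorescence intensities—has the shape of a nearest-centroid classifier. A minimal Python sketch, in which the centroid values are hypothetical stand-ins for what a real calibration would provide:

    # Hypothetical (red, green) fluorescence centroids for each subtype,
    # in arbitrary detector units; real values would come from calibration.
    CENTROIDS = {
        "lymphocytes": (1.0, 4.0),
        "monocytes":   (2.5, 6.0),
        "neutrophils": (3.0, 2.0),
        "eosinophils": (6.0, 2.5),
    }

    def classify(red, green):
        # Assign the cell to the subtype with the nearest centroid.
        return min(CENTROIDS,
                   key=lambda s: (red - CENTROIDS[s][0]) ** 2
                                 + (green - CENTROIDS[s][1]) ** 2)

    def differential_count(events):
        # events: iterable of (red, green) intensity pairs, one per cell.
        counts = {s: 0 for s in CENTROIDS}
        for red, green in events:
            counts[classify(red, green)] += 1
        return counts

    print(differential_count([(1.1, 3.8), (3.2, 2.1), (5.8, 2.6)]))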

Shi says his ultimate goal is to develop a portable device that can help patients living with chronic diseases at home. "For these patients, who struggle to find a balance between their treatment and their normal quality of life, we would like to offer a device that will help them monitor their conditions at home," he says. "It would be nice to limit the number of trips they need to make to the hospital for testing."

The Lab on a Chip paper is titled "Four-part leukocyte differential count based on sheathless microflow cytometer and fluorescent dye assay." In addition to Tai and Shi, the coauthors on the paper are Luke Guo, a graduate student at MIT who worked on the project as an undergraduate student at Caltech, and Harvey Kasdan of LeukoDx Inc. in Jerusalem, Israel. The work was supported by the National Space Biomedical Research Institute under a NASA contract.

Writer: Kimm Fesenmaier

Developing Our Sense of Smell

Caltech biologists pinpoint the origin of olfactory nerve cells

PASADENA, Calif.—When our noses pick up a scent, whether the aroma of a sweet rose or the sweat of a stranger at the gym, two types of sensory neurons are at work in sensing that odor or pheromone. These sensory neurons are particularly interesting because they are the only neurons in our bodies that regenerate throughout adult life—as some of our olfactory neurons die, they are soon replaced by newborns. Just where those neurons come from in the first place has long perplexed developmental biologists.

Previous hypotheses about the origin of these olfactory nerve cells have given credit to embryonic cells that develop into skin or the central nervous system, where ear and eye sensory neurons, respectively, are thought to originate. But biologists at the California Institute of Technology (Caltech) have now found that neural-crest stem cells—multipotent, migratory cells unique to vertebrates that give rise to many structures in the body such as facial bones and smooth muscle—also play a key role in building olfactory sensory neurons in the nose.

"Olfactory neurons have long been thought to be solely derived from a thickened portion of the ectoderm; our results directly refute that concept," says Marianne Bronner, the Albert Billings Ruddock Professor of Biology at Caltech and corresponding author of a paper published in the journal eLIFE on March 19 that outlines the findings.

The two main types of sensory neurons in the olfactory system are ciliated neurons, which detect volatile scents, and microvillous neurons, which usually sense pheromones. Both of these types are found in the tissue lining the inside of the nasal cavity and transmit sensory information to the central nervous system for processing.

In the new study, the researchers showed that during embryonic development, neural-crest stem cells differentiate into the microvillous neurons, which had long been assumed to arise from the same source as the odor-sensing ciliated neurons. Moreover, they demonstrated that different factors are necessary for the development of these two types of neurons. By eliminating a gene called Sox10, they were able to show that formation of microvillous neurons is blocked whereas ciliated neurons are unaffected.

They made this discovery by studying the development of the olfactory system in zebrafish—a useful model organism for developmental biology studies due to the optical clarity of the free-swimming embryo. Understanding the origins of olfactory neurons and the process of neuron formation is important for developing therapeutic applications for conditions like anosmia, or the inability to smell, says Bronner.

"A key question in developmental biology—the extent of neural-crest stem cell contribution to the olfactory system—has been addressed in our paper by multiple lines of experimentation," says Ankur Saxena, a postdoctoral scholar in Bronner's laboratory and lead author of the study. "Olfactory neurons are unique in their renewal capacity across species, so by learning how they form, we may gain insights into how neurons in general can be induced to differentiate or regenerate. That knowledge, in turn, may provide new avenues for pursuing treatment of neurological disorders or injury in humans."

Next, the researchers will examine what other genes, in addition to Sox10, play a role in the process by which neural-crest stem cells differentiate into microvillous neurons. They also plan to look at whether or not neural-crest cells give rise to new microvillous neurons during olfactory regeneration that happens after the embryonic stage of development.

Funding for the research outlined in the eLife paper, "Sox10-dependent neural crest origin of olfactory microvillous neurons in zebrafish," was provided by the National Institutes of Health and the Gordon Ross Postdoctoral Fellowship. Brian N. Peng, a former undergraduate student (BS '12) at Caltech, also contributed to the study. A new open-access, high-impact journal, eLife is backed by three of the most prestigious biomedical research funders in the world: the Howard Hughes Medical Institute, the Max Planck Society, and the Wellcome Trust.

Writer: Katie Neith

Two Decades of Discoveries

Keck Observatory marks 20th anniversary

Although Keith Matthews was about to make history, he went about his tasks as he would on any other night. It was the night of March 16, 1993, nearly 14,000 feet above sea level on Mauna Kea in Hawaii, and he had just installed the first instrument on the brand-new 10-meter telescope at W. M. Keck Observatory. Matthews, who built the instrument—a near-infrared camera, abbreviated NIRC—was set to make the first scientific observations using the newly crowned Biggest Telescope in the World.

This Saturday marks the 20th anniversary of those inaugural observations. Speaking at a symposium on March 7 commemorating the anniversary, Tom Soifer, chair of the Division of Physics, Mathematics and Astronomy, called those initial observations "one of the greatest events in astronomy." "It's been a remarkable 20 years of exploration and discovery," he said.

At the time of that first observing run, the telescope had yet to be officially commissioned and wasn't yet optimized, but Matthews—now chief instrument scientist at Caltech—was there to see just what the telescope could do. "Fortunately, it worked right off the bat," he recalls.

The observatory was the culmination of more than a decade of planning, designing, and building made possible by unprecedented financial contributions from the Keck Foundation ($70 million for the first telescope) and by cutting-edge technology. But Matthews didn't feel much reason to jump for joy when he saw that first star, sharp and bright on the computer screen. He was too busy to be excited, he says, and those observations were just another set of steps in a long process that had begun more than a decade prior, when he joined the telescope design team in 1979. Caltech would become an official partner of the observatory in 1985, joining the University of California and the University of Hawaii (NASA would join in 1996).

To be sure, Matthews was happy that everything was working relatively smoothly. But, he says, throughout the whole process of making the telescope and the instrument a reality, there was always something else he needed to focus on and get done. He was observing alone—a rarity these days—operating on four hours of sleep a night for roughly nine nights. He slept at 9,000 feet and had to make the drive up to the summit into the sun's glare every day. The altitude at the summit made the work even more grueling.

And there were still the inevitable bugs and problems. For one, the Dewar—the container that housed the infrared camera and kept it cold—was leaking liquid helium. Matthews tried everything from rubber cement to glycerol to control the leak. The computers also kept crashing, the monitors going blank one by one. "It was funny," he recalls. "All the screens started to go like a house of cards."

He eventually found stopgap measures to control the leak, and the computers were simply rebooted. The observing run demonstrated that even before the telescope was fully optimized, it was already able to achieve better resolution than the 200-inch Hale Telescope at Palomar Observatory, supplanting the 200-inch as the world's most powerful telescope—a title the 200-inch had held since 1948. A second, identical Keck telescope was built in 1996.

In the two decades since, Keck has become arguably the most prominent and productive observatory in astronomy, helping scientists learn how the universe has evolved since the Big Bang, how galaxies form, and how stars are born. The twin telescopes, Keck I and Keck II, have studied dark matter—the mysterious, unseen stuff that makes up most of the universe's mass—as well as dark energy, the cosmic force that's pushing the universe apart. The telescopes have peered into other planetary systems and revealed insights into the origin of our own solar system. In describing how Keck has surpassed expectations, Caltech's Richard Ellis said at last week's symposium, "Unlike politicians, astronomers deliver much more than they predicted."

The symposium highlighted the fact that Keck has proved indispensable, as a powerful telescope in its own right and as an essential complement to other telescopes. "Keck has been fundamental in establishing partnerships with space telescopes," Ellis said. For example, he has used Keck with the Hubble Space Telescope and the Spitzer Space Telescope to probe some of the most distant galaxies ever observed, revealing a poorly understood period of cosmic history roughly a billion years after the Big Bang. With the help of Keck, Fiona Harrison—a Caltech astronomer and principal investigator of the NuSTAR mission, a space telescope that detects high-energy X-rays—discovered bright flares emanating from the supermassive black hole at the center of our galaxy. The flares, she said, could be due to asteroids being ripped apart by the black hole.

And even though NASA's Kepler Space Telescope has been revolutionary in identifying thousands of candidate planets, a ground-based telescope like Keck is needed to verify and characterize those worlds. Caltech's John Johnson, for example, has used Keck to characterize what he says is the most typical kind of planetary system in the galaxy. From his analysis, he estimates that there are at least 100 billion planets in the Milky Way. Keck has also allowed Caltech's Mike Brown to measure detailed spectra of Jupiter's moon Europa, finding evidence that suggests its subsurface ocean may bubble up to its frozen surface.
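
The 100-billion figure follows from simple arithmetic: an estimated number of stars in the Milky Way multiplied by an average number of planets per star. A one-line Python version, with round, illustrative inputs rather than Johnson's exact numbers:

    stars_in_milky_way = 1e11    # rough, commonly cited estimate
    planets_per_star = 1.0       # "at least one" per the analysis described
    print(f"{stars_in_milky_way * planets_per_star:.0e} planets, at minimum")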

Of course, none of these discoveries would have been possible without Keck's technological advances. Constructing a telescope as large as Keck using a single mirror would be prohibitively expensive and difficult to engineer. Instead, the Keck telescopes each consist of 36 hexagonal mirrors, forming a total aperture of 10 meters. No one had ever attempted a segmented-mirror telescope before Keck. Ellis was at Cambridge University while the telescope was being developed. "We were looking at this plan with total incredulity," he recalled at the symposium. "The idea of a finely segmented telescope was crazy to us, frankly."

One of the difficulties, for example, was in polishing the mirrors. Because a spherical mirror has rotational symmetry, it's relatively easy to polish. But because each of Keck's segments forms just a part of a parabolic curve, each mirror is asymmetrical, making it nearly impossible to polish. The solution? Force each segment into a spherical shape. Once polished, the mirror is released and pops back into its original, irregular form.

The telescope is fitted with a suite of instruments that have been continually upgraded and replaced over the last 20 years—and Caltech has played a leading role in many of them, including NIRC, NIRC2 (the second-generation NIRC), and MOSFIRE (co-led by Caltech's Chuck Steidel), a new spectrometer that was installed just last year. Matthews led NIRC2, played a significant role in MOSFIRE, and is now leading the effort on a new instrument, a near-infrared spectrometer called NIRES.

As technology improves, telescopes get bigger and more powerful. Keck's eventual replacement, the Thirty Meter Telescope (TMT), in which Caltech is a partner, won't be ready for at least 10 years. In the meantime, Keck will continue to hold its status as the biggest telescope in the world. And, as Caltech's Judith Cohen pointed out in her symposium talk, even after the TMT is built Keck will remain a useful facility—in much the same way that Palomar Observatory remains productive more than 60 years after it was built. In the last two decades, Keck has had a good run in helping astronomers explore the cosmos—but that run is far from over.

Writer: Marcus Woo

Bursts of Star Formation in the Early Universe

PASADENA, Calif.—Galaxies have been experiencing vigorous bursts of star formation from much earlier in cosmic history than previously thought, according to new observations by a Caltech-led team.

These so-called starburst galaxies produce stars at a prodigious rate—creating the equivalent of a thousand new suns per year. Now the astronomers have found starbursts that were churning out stars when the universe was just a billion years old. Previously, astronomers didn't know whether galaxies could form stars at such high rates so early in time.

The discovery enables astronomers to study the earliest bursts of star formation and to deepen their understanding of how galaxies formed and evolved. The team describes its findings in a paper being published online on March 13 in the journal Nature and in two others that have been accepted for publication in the Astrophysical Journal.

Shining with the energy of over a hundred trillion suns, these newly discovered galaxies represent what the most massive galaxies in our cosmic neighborhood looked like in their star-making youth. "I find that pretty amazing," says Joaquin Vieira, a postdoctoral scholar at Caltech and leader of the study. "These aren't normal galaxies. They were forming stars at an extraordinary rate when the universe was very young—we were very surprised to find galaxies like this so early in the history of the universe."

The astronomers found dozens of these galaxies with the South Pole Telescope (SPT), a 10-meter dish in Antarctica that surveys the sky in millimeter-wavelength light—which is between radio waves and infrared on the electromagnetic spectrum. The team then took a more detailed look using the new Atacama Large Millimeter Array (ALMA) in Chile's Atacama Desert.

The new observations represent some of ALMA's most significant scientific results yet, Vieira says. "We couldn't have done this without the combination of SPT and ALMA," he adds. "ALMA is so sensitive, it is going to change our view of the universe in many different ways."

The astronomers used only the first 16 of the 66 dishes that will eventually form ALMA, which is already the most powerful telescope ever constructed for observing at millimeter and submillimeter wavelengths.

With ALMA, the astronomers found that more than 30 percent of the starburst galaxies are from a time period just 1.5 billion years after the Big Bang. Previously, only nine such galaxies were known to exist, and it wasn't clear whether galaxies could produce stars at such high rates so early in cosmic history. Now, with the new discoveries, the number of such galaxies has nearly doubled, providing valuable data that will help other researchers constrain and refine theoretical models of star and galaxy formation in the early universe.

But what's particularly special about the new findings, Vieira says, is that the team determined the cosmic distance to these dusty starburst galaxies by directly analyzing the star-forming dust itself. Previously, astronomers had to rely on a cumbersome combination of indirect optical and radio observations using multiple telescopes to study the galaxies. But because of ALMA's unprecedented sensitivity, Vieira and his colleagues were able to make their distance measurements in one step, he says. The newly measured distances are therefore more reliable and provide the cleanest sample yet of these distant galaxies.

The measurements were also made possible by the unique properties of these objects, the astronomers say. For one, the observed galaxies were selected because they are gravitationally lensed—a phenomenon predicted by Einstein in which a galaxy in the foreground bends the light from a background galaxy like a magnifying glass. This lensing effect makes the background galaxies appear brighter, cutting the telescope time needed to observe them by a factor of 100.
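
That factor of 100 is consistent with background-limited observing, where the integration time needed to reach a given signal-to-noise ratio scales as the inverse square of the source flux, so a tenfold magnification buys a hundredfold saving. A minimal Python check, with the magnification value chosen purely for illustration:

    mu = 10.0                    # illustrative lensing magnification
    # Background-limited case: S/N ~ flux * sqrt(t), so t ~ 1 / flux**2.
    time_ratio = 1.0 / mu**2
    print(f"integration time reduced to {time_ratio:.0%} of the unlensed case")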

Second, the astronomers took advantage of a fortuitous feature of these galaxies' spectra—the rainbows of light they emit—dubbed the "negative K correction." Normally, galaxies appear dimmer the farther away they are, just as a lightbulb appears fainter at greater distance. But the expanding universe shifts the spectra in such a way that light at millimeter wavelengths does not dim with distance. As a result, the galaxies appear just as bright in these wavelengths no matter how far away they are—like a magic lightbulb that shines equally brightly from any distance.
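
The effect can be reproduced with a toy dust spectrum. The Python sketch below assumes a modified blackbody (illustrative dust temperature and emissivity, not the team's fitted values) observed at a fixed 1.4-millimeter wavelength in an assumed flat Lambda-CDM cosmology, and shows that the received flux stays within a small factor from redshift 1 to 6:

    import numpy as np
    from scipy.integrate import quad

    H0, OM, OL = 70.0, 0.3, 0.7        # assumed cosmological parameters
    C_KMS = 299792.458                 # speed of light, km/s
    H_PLANCK, K_B = 6.626e-34, 1.381e-23

    def lum_dist(z):
        # Luminosity distance in Mpc for a flat universe.
        comoving, _ = quad(lambda x: 1.0 / np.sqrt(OM * (1 + x)**3 + OL), 0, z)
        return (1 + z) * (C_KMS / H0) * comoving

    def observed_flux(z, lam_obs_m=1.4e-3, t_dust=35.0, beta=1.5):
        # Modified blackbody dust spectrum: L_nu ~ nu**beta * B_nu(T).
        # Redshift pushes the rest frequency up the steep Rayleigh-Jeans
        # slope, roughly offsetting the growing distance. Arbitrary units.
        nu_rest = (1 + z) * 3.0e8 / lam_obs_m
        planck = nu_rest**3 / (np.exp(H_PLANCK * nu_rest / (K_B * t_dust)) - 1)
        return (1 + z) * nu_rest**beta * planck / lum_dist(z)**2

    for z in (1, 2, 3, 4, 5, 6):
        print(f"z = {z}: relative flux = {observed_flux(z) / observed_flux(1):.2f}")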

"To me, these results are really exciting because they confirm the expectation that when ALMA is fully available, it can really allow astronomers to probe star formation all the way up to the edge of the observable universe," says Fred Lo, who, while not a participant in the study, was recently a Moore Distinguished Scholar at Caltech. Lo is a Distinguished Astronomer and Director Emeritus at the National Radio Astronomy Observatory, the North American partner of ALMA.

Additionally, observing the gravitational lensing effect will help astronomers map the dark matter—the mysterious unseen mass that makes up nearly a quarter of the universe—in the foreground galaxies. "Making high-resolution maps of the dark matter is one of the future directions of this work that I think is particularly cool," Vieira says.

These results represent only about a quarter of the total number of sources discovered by Vieira and his colleagues with the SPT, and they anticipate finding additional distant, dusty, starburst galaxies as they continue analyzing their data set. The ultimate goal for astronomers, Lo says, is to observe galaxies at all wavelengths throughout the history of the universe, piecing together the complete story of how galaxies have formed and evolved. So far, astronomers have made much progress in creating computer models and simulations of early galaxy formation, he says. But only with data—such as these new galaxies—will we ever truly piece together cosmic history. "Simulations are simulations," he says. "What really counts is what you see."

In addition to Vieira, the other Caltech authors on the Nature paper are Jamie Bock, professor of physics; Matt Bradford, visiting associate in physics; Martin Lueker-Boden, postdoctoral scholar in physics; Stephen Padin, senior research associate in astrophysics; Erik Shirokoff, a postdoctoral scholar in astrophysics with the Keck Institute for Space Studies; and Zachary Staniszewski, a visitor in physics. There are a total of 70 authors on the paper, which is titled "High-redshift, dusty, starburst galaxies revealed by gravitational lensing." This research was funded by the National Science Foundation, the Kavli Foundation, the Gordon and Betty Moore Foundation, NASA, the Natural Sciences and Engineering Research Council of Canada, the Canadian Research Chairs program, and the Canadian Institute for Advanced Research.

The work to measure the distances to the galaxies is described in the Astrophysical Journal paper "ALMA redshifts of millimeter-selected galaxies from the SPT survey: The redshift distribution of dusty star-forming galaxies," by Axel Weiss of the Max-Planck-Institut für Radioastronomie, and others. The study of the gravitational lensing is described in the Astrophysical Journal paper "ALMA observations of strongly lensed dusty star-forming galaxies," by Yashar Hezaveh of McGill University, and others.

ALMA, an international astronomy facility, is a partnership of Europe, North America, and East Asia in cooperation with the Republic of Chile. ALMA construction and operations are led on behalf of Europe by the European Southern Observatory (ESO) organization, on behalf of North America by the National Radio Astronomy Observatory (NRAO), and on behalf of East Asia by the National Astronomical Observatory of Japan (NAOJ). The Joint ALMA Observatory (JAO) provides the unified leadership and management of the construction, commissioning, and operation of ALMA.

The South Pole Telescope (SPT) is a 10-meter telescope located at the National Science Foundation (NSF) Amundsen-Scott South Pole Station, which lies within one kilometer of the geographic south pole. The SPT is designed to conduct low-noise, high-resolution surveys of the sky at millimeter and submillimeter wavelengths, with the particular design goal of making ultrasensitive measurements of the cosmic microwave background (CMB). The first major survey with the SPT was completed in October 2011 and covers 2,500 square degrees of the southern sky in three millimeter-wave observing bands. This is the deepest large millimeter-wave data set in existence and has already led to many groundbreaking science results, including the first detection of galaxy clusters through their Sunyaev-Zel'dovich effect signature, the most sensitive measurement yet of the small-scale CMB power spectrum, and the discovery of a population of ultrabright, high-redshift, star-forming galaxies. The SPT is funded primarily by the Division of Polar Programs in NSF's Geoscience Directorate. Partial support also is provided by the Kavli Institute for Cosmological Physics (KICP), an NSF-funded Physics Frontier Center; the Kavli Foundation; and the Gordon and Betty Moore Foundation. The SPT collaboration is led by the University of Chicago and includes research groups at Argonne National Laboratory, the California Institute of Technology, Cardiff University, Case Western Reserve University, Harvard University, Ludwig-Maximilians-Universität, the Smithsonian Astrophysical Observatory, McGill University, the University of Arizona, the University of California at Berkeley, the University of California at Davis, the University of Colorado at Boulder, and the University of Michigan, as well as individual scientists at several other institutions, including the European Southern Observatory and the Max-Planck-Institut für Radioastronomie in Bonn, Germany.

Writer: Marcus Woo

Astronomers Observe Planets Around Another Star Like Never Before

New instrument reveals exotic nature of four planets orbiting the same nearby star

PASADENA, Calif.—Thanks to a new high-tech gadget, astronomers have observed four planets orbiting a star relatively close to the sun in unprecedented detail, revealing the roughly ten-Jupiter-mass planets to be among the most exotic ones known.

The team, which includes several researchers from the California Institute of Technology (Caltech), describes its findings in a paper accepted for publication by the Astrophysical Journal.

The findings were made possible by a first-of-its-kind telescope imaging system that allowed the astronomers to pick out the planets amidst the bright glare of their parent star and measure their spectra—the rainbows of light that reveal the chemical signatures of planetary atmospheres. The system, dubbed Project 1640, enables astronomers to observe and characterize these kinds of planetary systems quickly and routinely, which has never been done before, the researchers say.

"These warm, red planets are unlike any other known objects in our universe," says Ben R. Oppenheimer, an astronomer at the American Museum of Natural History and the paper's lead author. And the planets are very different from one another as well. "All four planets have different spectra and all four are peculiar."

Astronomers had previously taken images of these four planets, which orbit a star called HR 8799, located 128 light years away. But because a star's light is tens of millions to billions of times brighter than the light from that star's own planets, distinguishing planet light from starlight so as to directly measure the spectra from the planets alone is difficult. "It's like taking a single picture of the Empire State Building from an airplane that reveals the height of the building but also a bump on the sidewalk next to it that is as high as a couple of bacteria," Oppenheimer explains. "Furthermore, we've been able to do this over a range of wavelengths in order to make a spectrum."

In the past, astronomers have been able to take spectra of some planets that pass in front of, or transit, their stars. But with Project 1640, which uses the Hale Telescope at Caltech's Palomar Observatory in Southern California, astronomers can now take the direct spectra of planets orbiting other stars—called exoplanets—that are not transiting. The device blocks the otherwise overwhelming starlight, picks out the faint specks that are planets, and obtains their spectra.  Project 1640 allowed the team to take spectra of all four of the planets around HR 8799 simultaneously, which had never been done for any planetary system before.  

The planets around HR 8799 are at about the same distance from that star as the solar system's gas giants (Jupiter, Saturn, Uranus, and Neptune) are from our sun. But since it's easier to detect transiting planets that are close to their stars, the transiting systems that astronomers have observed have small orbits—often smaller than Mercury's. The new results, therefore, represent the first spectra to be taken of gas giants located so far from their stars—a distance at which the influence of the stars' radiation, flares, and other features are weaker.

"We are now technically capable of obtaining spectra of giant planets in planetary systems like our own, improving on the close-in transiting planet studies done previously," says Lynne Hillenbrand, professor of astronomy at Caltech and a coauthor of the paper. And what the spectra show is that the planets are quite strange. "A remarkable thing about these planets is their unexpected spectroscopic diversity," Hillenbrand says.

One of the most striking abnormalities is an apparent chemical imbalance. Under most circumstances, ammonia and methane should naturally coexist in a planet's atmosphere—where there is one, there is usually the other—unless they are generated in extremely cold or hot environments. Yet the spectra of the HR 8799 planets, all of which have "lukewarm" temperatures of about 1,000 Kelvin (1,340 degrees Fahrenheit), show either methane or ammonia alone, with little or no sign of its chemical partner. There is also evidence of other chemicals such as acetylene—which has never before been detected on any exoplanet—and carbon dioxide.

The planets also are "redder"—they emit longer wavelengths of light—than celestial objects with similar temperatures. This could be explained by the presence of significant but patchy cloud cover on the planets, the authors say.

HR 8799 itself is very different from our sun, with 1.6 times its mass and five times its brightness. The brightness of this distant star can vary by as much as 8 percent over a period of two days; it produces about 1,000 times more ultraviolet light than the sun. All of these factors could induce complex weather and sooty hazes that would, in turn, cause periodic changes in the spectra.

More data are needed to further explore this planetary system's unusual characteristics, the scientists say.

"The spectra of these four worlds clearly show that they are far too toxic and hot to sustain life as we know it," says coauthor Ian Parry, a senior lecturer at the Institute of Astronomy, Cambridge University. "But the really exciting thing is that, one day, the techniques we've developed will give us our first secure evidence of the existence of life on a planet outside our solar system."

The techniques used by Project 1640 require the coordinated operation of four major instruments: the world's most advanced adaptive-optics system, which can make millions of tiny adjustments to the device's two six-inch mirrors every second; a coronagraph that optically dims the star but not other celestial objects in the field of view; an imaging spectrograph that records 30 images in a rainbow of colors simultaneously; and a specialized wavefront sensor that distinguishes between residual starlight that sneaks through the coronagraph and light from planets, allowing scientists to filter out background starlight more effectively.

With these instruments working in concert, the project is able to reveal celestial objects 1 million to 10 million times fainter than the star at the center of one of its images, with only an hour of observations. It is also capable of measuring the orbital motion of objects.
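
For readers used to astronomical magnitudes, the quoted contrast of 1 million to 10 million corresponds to roughly 15 to 17.5 magnitudes, via the standard relation delta-m = 2.5 log10(contrast). A quick Python check:

    import math

    for contrast in (1e6, 1e7):
        delta_mag = 2.5 * math.log10(contrast)
        print(f"contrast {contrast:.0e} -> {delta_mag:.1f} magnitudes")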

"Our young century has seen seminal advances in exoplanet science, but almost exclusively from indirect observations," says Richard Dekany, a coauthor and associate director for development for Caltech Optical Observatories. "Project 1640 has now added to this revolution the scientific gold standard: directly measured spectra of young giant planets. Our initial findings suggest each of these strange and wonderful giant planets may have a unique story to share."

The researchers are already collecting more data on this system so as to look for changes in the planets over time; they are also surveying other young stars. During its three-year survey at Palomar, which started in June, Project 1640 aims to survey 200 stars within about 150 light years of our solar system.

"In the 19th century, it was thought impossible to know the composition of stars, but the invention of astronomical spectroscopy has revealed detailed information about nearby stars and distant galaxies," says Charles Beichman, executive director of the NASA Exoplanet Science Institute at Caltech. "Now, with Project 1640, we are beginning to turn this tool to the investigation of neighboring exoplanets to learn about the composition, temperature, and other characteristics of their atmospheres."

The Astrophysical Journal paper is titled "Reconnaissance of the HR 8799 exosolar system I: Near IR spectroscopy." In addition to Hillenbrand, Dekany, and Beichman, the other Caltech authors are postdocs Christoph Baranec and Sasha Hinkley; former postdoc Justin Crepp (now at the University of Notre Dame); and programmer David Hale, along with a similar number of JPL authors. This work was supported by the National Science Foundation and NASA.

This story is adapted from a press release by Kendra Snyder of the American Museum of Natural History.

Writer: Marcus Woo

Creating Indestructible Self-Healing Circuits

Caltech engineers build electronic chips that repair themselves

PASADENA, Calif.—Imagine that the chips in your smart phone or computer could repair and defend themselves on the fly, recovering in microseconds from problems ranging from less-than-ideal battery power to total transistor failure. It might sound like the stuff of science fiction, but a team of engineers at the California Institute of Technology (Caltech), for the first time ever, has developed just such self-healing integrated chips.

The team, made up of members of the High-Speed Integrated Circuits laboratory in Caltech's Division of Engineering and Applied Science, has demonstrated this self-healing capability in tiny power amplifiers. The amplifiers are so small, in fact, that 76 of the chips—including everything they need to self-heal—could fit on a single penny. In perhaps the most dramatic of their experiments, the team destroyed various parts of their chips by zapping them multiple times with a high-power laser, and then observed as the chips automatically developed a work-around in less than a second.

"It was incredible the first time the system kicked in and healed itself. It felt like we were witnessing the next step in the evolution of integrated circuits," says Ali Hajimiri, the Thomas G. Myers Professor of Electrical Engineering at Caltech. "We had literally just blasted half the amplifier and vaporized many of its components, such as transistors, and it was able to recover to nearly its ideal performance."

The team's results appear in the March issue of IEEE Transactions on Microwave Theory and Techniques.

Until now, even a single fault has often rendered an integrated-circuit chip completely useless. The Caltech engineers wanted to give integrated-circuit chips a healing ability akin to that of our own immune system—something capable of detecting and quickly responding to any number of possible assaults in order to keep the larger system working optimally. The power amplifier they devised employs a multitude of robust, on-chip sensors that monitor temperature, current, voltage, and power. The information from those sensors feeds into a custom-made application-specific integrated-circuit (ASIC) unit on the same chip, a central processor that acts as the "brain" of the system. The brain analyzes the amplifier's overall performance and determines if it needs to adjust any of the system's actuators—the changeable parts of the chip.

Interestingly, the chip's brain does not operate based on algorithms that know how to respond to every possible scenario. Instead, it draws conclusions based on the aggregate response of the sensors. "You tell the chip the results you want and let it figure out how to produce those results," says Steven Bowers, a graduate student in Hajimiri's lab and lead author of the new paper. "The challenge is that there are more than 100,000 transistors on each chip. We don't know all of the different things that might go wrong, and we don't need to. We have designed the system in a general enough way that it finds the optimum state for all of the actuators in any situation without external intervention."
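
The paper's exact search procedure isn't spelled out here, but the behavior Bowers describes—converging on a good actuator state from sensor feedback alone, with no per-fault rules—has the shape of a derivative-free optimization loop. A minimal Python sketch, assuming a simple randomized local search and an entirely made-up sensor model:

    import random

    def sensed_performance(actuators):
        # Stand-in for the on-chip sensor readings (temperature, current,
        # voltage, power) reduced to one figure of merit; hypothetical model.
        ideal = [3, 7, 1, 5]
        return -sum((a - b) ** 2 for a, b in zip(actuators, ideal))

    def self_heal(n_actuators=4, levels=8, iterations=500):
        # Start from an arbitrary actuator state and keep any random
        # single-actuator change that does not degrade performance.
        state = [random.randrange(levels) for _ in range(n_actuators)]
        best = sensed_performance(state)
        for _ in range(iterations):
            trial = list(state)
            trial[random.randrange(n_actuators)] = random.randrange(levels)
            score = sensed_performance(trial)
            if score >= best:
                state, best = trial, score
        return state, best

    print(self_heal())   # converges toward the "ideal" actuator settings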

Looking at 20 different chips, the team found that the amplifiers with the self-healing capability consumed about half as much power as those without, and their overall performance was much more predictable and reproducible. "We have shown that self-healing addresses four very different classes of problems," says Kaushik Dasgupta, another graduate student working on the project. The classes include static variation, a product of component-to-component differences; long-term aging, which arises gradually as repeated use changes the internal properties of the system; short-term variations induced by environmental conditions such as changes in load, temperature, and supply voltage; and, finally, accidental or deliberate catastrophic destruction of parts of the circuit.

The Caltech team chose to demonstrate this self-healing capability first in a power amplifier for millimeter-wave frequencies. Such high-frequency integrated chips are at the cutting edge of research and are useful for next-generation communications, imaging, sensing, and radar applications. By showing that the self-healing capability works well in such an advanced system, the researchers hope to show that the self-healing approach can be extended to virtually any other electronic system.

"Bringing this type of electronic immune system to integrated-circuit chips opens up a world of possibilities," says Hajimiri. "It is truly a shift in the way we view circuits and their ability to operate independently. They can now both diagnose and fix their own problems without any human intervention, moving one step closer to indestructible circuits."

Along with Hajimiri, Bowers, and Dasgupta, former Caltech postdoctoral scholar Kaushik Sengupta (PhD '12), who is now an assistant professor at Princeton University, is also a coauthor on the paper, "Integrated Self-Healing for mm-Wave Power Amplifiers." A preliminary report of this work won the best paper award at the 2012 IEEE Radio Frequency Integrated Circuits Symposium. The work was funded by the Defense Advanced Research Projects Agency and the Air Force Research Laboratory.

Writer: Kimm Fesenmaier

A Window Into Europa's Ocean Lies Right at the Surface

Caltech and JPL researchers find evidence that a jovian ocean is not isolated

PASADENA, Calif.—If you could lick the surface of Jupiter's icy moon Europa, you would actually be sampling a bit of the ocean beneath. So says Mike Brown, an astronomer at the California Institute of Technology (Caltech). Brown—known as the Pluto killer for discovering a Kuiper-belt object that led to the demotion of Pluto from planetary status—and Kevin Hand from the Jet Propulsion Laboratory (JPL) have found the strongest evidence yet that salty water from the vast liquid ocean beneath Europa's frozen exterior actually makes its way to the surface.

The finding, based on some of the first data of its kind since NASA's Galileo mission (1989–2003) to study Jupiter and its moons, suggests that there is a chemical exchange between the ocean and surface, making the ocean a richer chemical environment, and implies that learning more about the ocean could be as simple as analyzing the moon's surface. The work is described in a paper that has been accepted for publication in the Astronomical Journal.

"We now have evidence that Europa's ocean is not isolated—that the ocean and the surface talk to each other and exchange chemicals," says Brown, the Richard and Barbara Rosenberg Professor and professor of planetary astronomy at Caltech. "That means that energy might be going into the ocean, which is important in terms of the possibilities for life there. It also means that if you'd like to know what's in the ocean, you can just go to the surface and scrape some off."

"The surface ice is providing us a window into that potentially habitable ocean below," says Hand, deputy chief scientist for solar system exploration at JPL.

Since the days of the Galileo mission, when the spacecraft showed that Europa was covered with an icy shell, scientists have debated the composition of Europa's surface. The infrared spectrometer aboard Galileo was not capable of providing the detail needed to definitively identify some of the materials present on the surface. Now, using current technology on ground-based telescopes, Brown and Hand have identified a spectroscopic feature on Europa's surface that indicates the presence of a magnesium sulfate salt, a mineral called epsomite, that could only originate from the ocean below.

"Magnesium should not be on the surface of Europa unless it's coming from the ocean," Brown says. "So that means ocean water gets onto the surface, and stuff on the surface presumably gets into the ocean water."

Europa's ocean is thought to be 100 kilometers deep and to cover the entire globe. The moon is tidally locked to Jupiter, so the same hemisphere always leads in its orbit while the other trails. The leading hemisphere has a yellowish appearance, while the trailing hemisphere is splattered and streaked with a red material.

The spectroscopic data from that red side have been a source of scientific debate for 15 years. Io, the most volcanically active of Jupiter's large moons, spews sulfur into space, and Jupiter's strong magnetic field sends some of that sulfur hurtling toward the trailing hemisphere of Europa, where it sticks. It is also clear from Galileo's data that there is something other than pure water ice on the trailing hemisphere's surface. The debate has focused on what that something is—that is, on what has caused the spectroscopic data to deviate from the signature of pure water ice.

"From Galileo's spectra, people knew something was there besides water. They argued for years over what it might be—sodium sulfate, hydrogen sulfate, sodium hydrogen carbonate, all these things that look more or less similar in this range of the spectrum," says Brown. "But the really difficult thing was that the spectrometer on the Galileo spacecraft was just too coarse."

Brown and Hand decided that the latest spectrometers on ground-based telescopes could improve the data pertaining to Europa, even from a distance of about 400 million miles. Using the Keck II telescope on Mauna Kea—which is outfitted with adaptive optics to adjust for the blurring effect of Earth's atmosphere—and its OH-Suppressing Infrared Integral Field Spectrograph (OSIRIS), they first mapped the distribution of pure water ice versus anything else on the moon. The spectra showed that even Europa's leading hemisphere contains significant amounts of nonwater ice. Then, at low latitudes on the trailing hemisphere—the area with the greatest concentration of the nonwater ice material—they found a tiny dip in the spectrum that had never been detected before.

"We now have the best spectrum of this thing in the world," Brown says. "Nobody knew there was this little dip in the spectrum because no one had the resolution to zoom in on it before."

The two researchers racked their brains to come up with materials that might explain the new spectroscopic feature, and then tested everything from sodium chloride to Drano in Hand's lab at JPL, where he tries to simulate the environments found on various icy worlds. "We tried to think outside the box to consider all sorts of other possibilities, but at the end of the day, the magnesium sulfate persisted," Hand says.

Some scientists had long suspected that magnesium sulfate was on the surface of Europa. But, Brown says, "the interesting twist is that it doesn't look like the magnesium sulfate is coming from the ocean." Since the mineral he and Hand found is only on the trailing side, where the moon is being bombarded with sulfur from Io, they believe that there is a magnesium-bearing mineral everywhere on Europa that produces magnesium sulfate in combination with sulfur. The pervasive magnesium-bearing mineral might also be what makes up the nonwater ice detected on the leading hemisphere's surface.

Brown and Hand believe that this mystery magnesium-bearing mineral is magnesium chloride. But magnesium is not the only unexpected element on the surface of Europa. Fifteen years ago, Brown showed that Europa is surrounded by an atmosphere of atomic sodium and potassium, presumably originating from the surface. The researchers reason that the sodium and potassium chlorides are actually the dominant salts on the surface of Europa, but that they are not detectable because they have no clear spectral features.

The scientists combined this information with the fact that Europa's ocean can only be one of two types—either sulfate-rich or chlorine-rich. Having ruled out the sulfate-rich version since magnesium sulfate was found only on the trailing side, Brown and Hand hypothesize that the ocean is chlorine-rich and that the sodium and potassium must be present as chlorides.

Therefore, Brown says, they believe the composition of Europa's sea closely resembles the salty ocean of Earth. "If you could go swim down in the ocean of Europa and taste it, it would just taste like normal old salt," he says.

Hand emphasizes that, from an astrobiology standpoint, Europa is considered a premier target in the search for life beyond Earth; a NASA-funded study team led by JPL and the Johns Hopkins University Applied Physics Laboratory has been working with the scientific community to identify options for exploring Europa further. "If we've learned anything about life on Earth, it's that where there's liquid water, there's generally life," Hand says. "And of course our ocean is a nice salty ocean. Perhaps Europa's salty ocean is also a wonderful place for life."

The Astronomical Journal paper is titled "Salts and radiation products on the surface of Europa." The work was supported, in part, by the NASA Astrobiology Institute through the Astrobiology of Icy Worlds node at JPL.

Writer: Kimm Fesenmaier

Visualizing Biological Networks in 4D

A unique microscope invented at Caltech captures the motion of DNA structures in space and time

PASADENA, Calif.—Every great structure, from the Empire State Building to the Golden Gate Bridge, depends on specific mechanical properties to remain strong and reliable. Rigidity—a material's stiffness—is of particular importance for maintaining the robust functionality of everything from colossal edifices to the tiniest of nanoscale structures. In biological nanostructures, like DNA networks, it has been difficult to measure this stiffness, which is essential to their properties and functions. But scientists at the California Institute of Technology (Caltech) have recently developed techniques for visualizing the behavior of biological nanostructures in both space and time, allowing them to directly measure stiffness and map its variation throughout the network.

The new method is outlined in the February 4 early edition of the Proceedings of the National Academy of Sciences (PNAS).

"This type of visualization is taking us into domains of the biological sciences that we did not explore before," says Nobel Laureate Ahmed Zewail, the Linus Pauling Professor of Chemistry and professor of physics at Caltech, who coauthored the paper with Ulrich Lorenz, a postdoctoral scholar in Zewail's lab. "We are providing the methodology to find out—directly—the stiffness of a biological network that has nanoscale properties."

Knowing the mechanical properties of DNA structures is crucial to building sturdy biological networks, among other applications. According to Zewail, this type of visualization of biomechanics in space and time should be applicable to the study of other biological nanomaterials, including the abnormal protein assemblies that underlie diseases like Alzheimer's and Parkinson's.

Zewail and Lorenz were able to see, for the first time, the motion of DNA nanostructures in both space and time using the four-dimensional (4D) electron microscope developed at Caltech's Physical Biology Center for Ultrafast Science and Technology. The center is directed by Zewail, who created it in 2005 to advance understanding of the fundamental physics of chemical and biological behavior.

"In nature, the behavior of matter is determined by its structure—the arrangements of its atoms in the three dimensions of space—and by how the structure changes with time, the fourth dimension," explains Zewail. "If you watch a horse gallop in slow motion, you can follow the time of the gallops, and you can see in detail what, for example, each leg is doing over time. When we get to the nanometer scale, that is a different story—we need to improve the spatial resolution to a billion times that of the horse in order to visualize what is happening."

Zewail was awarded the 1999 Nobel Prize in Chemistry for his development of femtochemistry, which uses ultrashort laser flashes to observe fundamental chemical reactions occurring at the timescale of the femtosecond (one millionth of a billionth of a second). Although femtochemistry can capture atoms and molecules in motion, giving the time dimension, it cannot concurrently show the dimensions of space, and thus the structure of the material. This is because it utilizes laser light with wavelengths that far exceed the dimensions of a nanostructure, making it impossible to resolve and image nanoscale details in tiny physical structures such as DNA.

To overcome this major hurdle, the 4D electron microscope employs a stream of individual electrons that scatter off objects to produce an image. The electrons are accelerated to wavelengths of picometers, or trillionths of a meter, making it possible to visualize structures in space with a resolution a thousand times finer than the nanometer scale, and with a time resolution of femtoseconds or longer.
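To put a rough number on that comparison: the relativistically corrected de Broglie wavelength of an electron accelerated through a voltage V is h / sqrt(2·m·e·V·(1 + eV/2mc²)). The short sketch below works this out for an assumed 200-kilovolt accelerating voltage (a typical figure for electron microscopes, not one quoted in the paper) and compares it with the diffraction limit of visible light:

```python
import math

# Physical constants (SI units)
h = 6.62607015e-34       # Planck constant, J*s
m_e = 9.1093837015e-31   # electron rest mass, kg
e = 1.602176634e-19      # elementary charge, C
c = 2.99792458e8         # speed of light, m/s

def electron_wavelength(volts):
    """Relativistically corrected de Broglie wavelength (m) of an
    electron accelerated through a potential of `volts` volts."""
    return h / math.sqrt(2 * m_e * e * volts
                         * (1 + e * volts / (2 * m_e * c**2)))

# Assumed accelerating voltage: 200 kV is typical for transmission
# electron microscopes; it is NOT a value quoted in the paper.
lam = electron_wavelength(200e3)
print(f"electron wavelength: {lam * 1e12:.2f} pm")        # ~2.51 pm

# Abbe diffraction limit for green (532 nm) light, roughly lambda/2
print(f"visible-light limit: {532e-9 / 2 * 1e9:.0f} nm")  # ~266 nm
```

At 200 kV the electron wavelength comes out near 2.5 picometers, roughly five orders of magnitude shorter than the optical limit.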

The experiments reported in PNAS began with a structure created by stretching DNA over a hole in a thin carbon film. Using the electrons in the microscope, the researchers cut several DNA filaments away from the carbon film, leaving a three-dimensional, free-standing structure under the 4D microscope.

Next, the scientists employed laser heat to excite oscillations in the DNA structure, which were imaged using the electron pulses as a function of time—the fourth dimension. By observing the frequency and amplitude of these oscillations, the researchers obtained a direct measure of the structure's stiffness.
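As an illustration of how a frequency measurement can yield a stiffness, one common approach is to model a filament as a doubly clamped elastic beam, whose fundamental resonance is f = (β²/2πL²)·sqrt(EI/ρA), and invert for the Young's modulus E. The sketch below is a generic version of that calculation with made-up example numbers; it is not the model or the data from the PNAS paper:

```python
import math

def youngs_modulus(f_hz, length, radius, density):
    """Invert the fundamental resonance of a doubly clamped cylindrical
    beam, f = (beta^2 / (2*pi*L^2)) * sqrt(E*I / (rho*A)), for E.
    All arguments in SI units; returns E in pascals."""
    beta = 4.7300                        # first clamped-clamped mode
    area = math.pi * radius**2           # cross-sectional area A
    inertia = math.pi * radius**4 / 4    # second moment of area I
    root = f_hz * 2 * math.pi * length**2 / beta**2  # = sqrt(E*I/(rho*A))
    return root**2 * density * area / inertia

# Made-up example numbers (NOT the paper's measurements): a 2-micron
# filament of 10 nm radius and DNA-like density, ringing at ~10 MHz.
E = youngs_modulus(f_hz=1.0e7, length=2e-6, radius=10e-9, density=1400.0)
print(f"effective Young's modulus: {E / 1e9:.1f} GPa")   # ~7 GPa
```

Because E scales as the square of the measured frequency, even a modest frequency measurement pins down the stiffness quite sharply.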

"It was surprising that we could do this with a complex network," says Zewail. "And yet by cutting and probing, we could go into a selective area of the network and find out about its behavior and properties."

Using 4D electron microscopy, Zewail's group has begun to visualize protein assemblies called amyloids, which are believed to play a role in many neurodegenerative diseases, and is continuing its investigation of the biomechanical properties of these networks. Zewail says the technique has potential for broad application not only to biological assemblies but also to the materials science of nanostructures.

Funding for the research outlined in the PNAS paper, "Biomechanics of DNA structures visualized by 4D electron microscopy," was provided by the National Science Foundation and the Air Force Office of Scientific Research. The Physical Biology Center for Ultrafast Science and Technology at Caltech is supported by the Gordon and Betty Moore Foundation.

Writer: Katie Neith

Creating New Quantum Building Blocks

Caltech researcher says diamond defects could serve as nodes of a quantum network

PASADENA, Calif.—Scientists have long dreamed of creating a quantum computer—a device rooted in the bizarre phenomena that transpire at the level of the very small, where quantum mechanics rules the scene. Such computers, it is believed, could solve in seconds certain problems that are intractable for conventional machines.

Researchers have tried using various quantum systems, such as atoms or ions, as the basic, transistor-like units in simple quantum computation devices. Now, laying the groundwork for an on-chip optical quantum network, a team of researchers, including Andrei Faraon from the California Institute of Technology (Caltech), has shown that defects in diamond can be used as quantum building blocks that interact with one another via photons, the basic units of light.

The device is simple enough—it involves a tiny ring resonator and a tunnel-like optical waveguide, both of which funnel light. Both structures, each only a few hundred nanometers wide, are etched in a diamond membrane and positioned close together atop a glass substrate. Within the resonator lies a nitrogen-vacancy center (NV center)—a defect in the structure of diamond in which a nitrogen atom replaces a carbon atom, and in which a nearby site usually occupied by another carbon atom is simply empty. Such NV centers are photoluminescent, meaning they absorb and emit photons.

"These NV centers are like the building blocks of the network, and we need to make them interact—like having an electrical current connecting one transistor to another," explains Faraon, lead author on a paper describing the work in the New Journal of Physics. "In this case, photons do that job."

In recent years, diamond has become a heavily researched material for use in quantum photonic devices in part because the diamond lattice is able to protect impurities from excessive interactions. The so-called quietness it affords enables impurities—such as NV centers—to store information unaltered for relatively long periods of time.  

To begin their experiment, the researchers first cool the device below 10 Kelvin (−441.67 degrees Fahrenheit) and then shine green laser light on the NV center, causing it to reach an excited state and then emit red light. As the red light circles within the resonator, it constructively interferes with itself, increasing its intensity. Slowly, the light then leaks into the nearby waveguide, which channels the photons out through gratings at either end, scattering the light out of the plane of the chip.
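The constructive interference described here occurs when a whole number of wavelengths fits around the ring, i.e. m·λ = n_eff·2πR for integer mode number m. Below is a minimal sketch of that resonance condition, using an assumed effective index near bulk diamond's 2.4 and an illustrative one-micron ring radius (neither value is taken from the paper), scanned around the NV center's red emission:

```python
import math

def resonant_wavelengths(radius, n_eff, wl_min, wl_max):
    """Wavelengths (m) in [wl_min, wl_max] satisfying the ring condition
    m * lambda = n_eff * 2 * pi * R for integer mode numbers m."""
    path = n_eff * 2 * math.pi * radius   # optical path around the ring
    m_lo = math.ceil(path / wl_max)       # longest wavelength -> smallest m
    m_hi = math.floor(path / wl_min)
    return [path / m for m in range(m_lo, m_hi + 1)]

# Illustrative parameters only: effective index ~2.4 (bulk diamond)
# and a 1-micron ring radius -- not the device dimensions in the paper.
for wl in resonant_wavelengths(1e-6, 2.4, 620e-9, 660e-9):
    print(f"resonance near {wl * 1e9:.1f} nm")
```

Only light near these discrete wavelengths builds up inside the ring; everything else interferes destructively and dies away.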

The emitted photons have the property of being correlated, or entangled, with the NV center from which they came. This mysterious quality of entanglement, which links two quantum states so inextricably that any information you learn about one provides information about the other, is a necessary ingredient for quantum computation. It enables large amounts of information to be stored and processed by fewer components occupying less space.
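To make that "learning about one fixes the other" point concrete, the textbook example is the two-qubit Bell state (|00> + |11>)/sqrt(2): measuring either qubit immediately determines the outcome of the other. The following sketch is a generic illustration of that state, not a model of the NV-photon system in the paper:

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2) in the basis {|00>, |01>, |10>, |11>}
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

# Measurement probabilities: only |00> and |11> ever occur, so the two
# qubits are always found in the same state -- learning one fixes the other.
for idx, p in enumerate(np.abs(psi)**2):
    print(f"P(|{idx:02b}>) = {p:.2f}")
```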

"Right now we only have one nitrogen-vacancy center that's emitting photons, but in the future we envision creating multiple NV centers that emit photons on the same chip," Faraon says. "By measuring these photons we could create entanglement among multiple NV centers on the chip."

And that's important because, in order to make a quantum computer, you would need millions—maybe billions—of these units. "As you can see, we're just working at making one or a few," Faraon says. "But there are other applications down the line that are easier to achieve." For example, a quantum network with a couple hundred units could simulate the behavior of a complex molecule—a task that conventional computers struggle with.

Going forward, Faraon plans to investigate whether other materials can behave similarly to diamond in an optical quantum network.

In addition to Faraon, the authors on the paper, "Quantum photonic devices in single-crystal diamond," are Charles Santori, Zhihong Huang, Kai-Mei Fu, Victor Acosta, David Fattal, and Raymond Beausoleil of Hewlett-Packard Laboratories, in Palo Alto, California. Fu is now an assistant professor at the University of Washington in Seattle, Washington. The work was supported by the Defense Advanced Research Projects Agency and The Regents of the University of California.  

Writer: Kimm Fesenmaier
