Caltech Physicist Named National Security Science and Engineering Faculty Fellow

PASADENA, Calif.—The Department of Defense (DoD) has named H. Jeff Kimble, William L. Valentine Professor and professor of physics at the California Institute of Technology (Caltech), one of 11 university faculty scientists and engineers in its 2010 class of National Security Science and Engineering Faculty Fellows (NSSEFF).

Each NSSEFF fellow will receive up to $4.2 million in direct research support over a period of up to five years to conduct unclassified research on topics of interest to the DoD. The grants are intended to engage the next generation of outstanding scientists and engineers in exploring the most challenging technical issues facing the DoD.

"These distinguished researchers have a demonstrated record of success in fields of strategic importance to the DoD. Their NSSEFF work will not only contribute to preparing the DoD and the nation for an uncertain future, but will also develop the necessary high quality science, technology, engineering, and mathematics talent that will be essential to the department's continued success," says the DoD's Zachary J. Lemnios, director of Defense Research and Engineering.

The fellows conduct basic research in core science and engineering disciplines that are expected to underpin future DoD technology development. Kimble's research proposal will be carried out by graduate students and postdoctoral associates in his Quantum Optics Group at Caltech. The research will build on the foundation of an existing advanced laboratory infrastructure for the manipulation of single atoms and photons.

"The receipt of the NSSEFF award is wonderful news for my research group," says Kimble. "In a time of increasingly proscriptive funding of basic research, this grant will enable exciting new opportunities for achieving strong interactions between single atoms, photons, and phonons."

Kimble's research program will attempt to harness strong interactions between light and matter to implement complex quantum networks and thereby to investigate qualitatively new phenomena in quantum information science.

More than 800 nomination letters from academic institutions led to the selection of the 21 NSSEFF semifinalists and the final 11 fellows.

Writer: Jon Weiner

Caltech Mourns the Passing of Andrew Lange

Dr. Andrew Lange, the Marvin L. Goldberger Professor of Physics at Caltech, passed away Friday, January 22, 2010.

Lange had been at Caltech since 1993. He graduated from Princeton University with his BA in 1980 and received his PhD from UC Berkeley in 1987. He first came to the Institute as a visiting associate in 1993-94, was appointed a full professor in 1994, and was named the Goldberger Professor in 2001. In 2006 he was named a senior research scientist at the Jet Propulsion Laboratory and in 2008 was appointed chair of the Division of Physics, Mathematics and Astronomy. He had recently resigned from his chairmanship of the division.

The principal focus of Lange's research was the cosmic microwave background (CMB), a gas of thermal radiation left over from the Big Bang that fills the entire universe. He developed a new generation of radio-frequency detectors and led a string of experiments that employed this novel technology to study the CMB. He is perhaps best known for co-leading the BOOMERanG experiment, the first experiment to image the CMB with sufficiently high fidelity and angular resolution to determine that the spatial geometry of the universe is flat. The data further allowed precise measurement of the age of the universe and the abundance of the dark matter known to hold galaxies together. The data also supported previous measurements suggesting that the cosmological expansion is accelerating, implying either a violation of Einstein's general relativity or that the universe is filled with "dark energy," an exotic new negative-pressure fluid. BOOMERanG also confirmed the predictions of inflation, an ambitious theory that aims to explain the very earliest fraction of a nanosecond after the Big Bang.

Lange's subsequent work improved upon these measurements and also aimed to detect, through their effect on the CMB polarization, the primordial gas of gravitational waves predicted by inflation. Lange was also one of the leaders of the recently launched Planck satellite, a collaboration between US and European scientists that aims to image the CMB with unprecedented precision.

Lange was a member of the American Academy of Arts and Sciences, the National Academy of Sciences, and the American Physical Society. Lange and Dr. Saul Perlmutter (of the Lawrence Berkeley National Laboratory) were jointly named the 2003 California Scientist of the Year for their seminal contributions to cosmology. Lange shared the 2006 Balzan Prize for Observational Astronomy and Astrophysics with Paolo de Bernardis (of the University of Rome), his BOOMERanG co-leader. The two shared the 2009 Dan David Prize with Paul Richards, a co-leader of the parallel MAXIMA experiment.

Counseling services are available 24 hours a day to Caltech students at the Counseling Center. Other members of the campus community may visit the Counseling Center website or call the Staff and Faculty Consultation Center at (626) 395-8360.

Writer: Jon Weiner

Caltech Astronomer Spots Second Smallest Exoplanet

Discovery highlights new potential for eventually finding Earth-mass planets

PASADENA, Calif.—Astronomers from the California Institute of Technology (Caltech) and other institutions, using the highly sensitive 10-meter Keck I telescope atop Hawaii's Mauna Kea, have detected an extrasolar planet with a mass just four times that of Earth. The planet, which orbits its parent star, HD 156668, about once every four days, is the second-smallest world among the more than 400 exoplanets (planets located outside our solar system) found to date. It is located approximately 80 light-years from Earth in the direction of the constellation Hercules.

The find, made possible through NASA's Eta-Earth Survey for Low-Mass Planets, was announced last week at the 215th American Astronomical Society meeting, held January 4-7, 2010, in Washington, D.C.

Dubbed HD 156668b, the planet—a so-called "super Earth" that would glow with blast-furnace-like temperatures—offers a tantalizing hint of discoveries yet to come. Astronomers hope those discoveries will include Earth-size planets located in the "habitable zone," the region around a star, roughly the distance from Earth to the sun, where conditions are potentially favorable to life.

HD 156668b was discovered with the radial velocity, or "wobble," method, which relies on Keck's High Resolution Echelle Spectrometer (HIRES) to spread light collected by the telescope into its component wavelengths, or colors, producing a spectrum. As the planet orbits the star, it causes the star to move back and forth along our line of sight, which causes the starlight to become redder and then bluer in a periodic fashion.

The color shifts give astronomers the mass of the planet and the characteristics of its orbit, such as how much time it takes to orbit the star. The majority of the exoplanets discovered have been found in this way.
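As a rough illustration of the scale involved, the wobble such a planet induces on its star follows from the standard two-body radial-velocity formula. The sketch below uses illustrative values only (a 4-Earth-mass planet on a 4-day orbit around an assumed 0.8-solar-mass star); it is not the survey team's actual analysis.

```python
import math

# Physical constants (SI)
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30     # solar mass, kg
M_EARTH = 5.972e24   # Earth mass, kg

def rv_semi_amplitude(m_planet, m_star, period_s,
                      inclination_deg=90.0, eccentricity=0.0):
    """Stellar radial-velocity semi-amplitude K (m/s), standard two-body formula."""
    sin_i = math.sin(math.radians(inclination_deg))
    return ((2 * math.pi * G / period_s) ** (1.0 / 3.0)
            * m_planet * sin_i
            / (m_star + m_planet) ** (2.0 / 3.0)
            / math.sqrt(1 - eccentricity ** 2))

# Assumed illustrative values: 4 Earth masses, 4-day period, 0.8-solar-mass star
K = rv_semi_amplitude(4 * M_EARTH, 0.8 * M_SUN, 4 * 86400)
print(f"Stellar wobble semi-amplitude: {K:.2f} m/s")  # on the order of 2 m/s
```

A wobble of a couple of meters per second, roughly walking speed, is why detections at this mass demand an extremely stable spectrometer such as HIRES.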

The discovery of low-mass planets like HD 156668b has become possible thanks to the development of techniques to watch stars wobble with increasing clarity, and of software that can pluck the signals of ever-smaller planets from amid the "noise" made by their pulsating, wobbling parent stars.

"If the stars themselves have imperfections and are unstable, their wobbling would cause jumps in velocity that could mimic or hide the existence of a planet," says John A. Johnson, assistant professor of astronomy at Caltech and codiscoverer of the new planet along with Andrew Howard and Geoff Marcy of the University of California at Berkeley, Debra Fischer of Yale University, Jason Wright of Penn State University, and the members of the California Planet Survey collaboration.

"We have been doing simulations to understand the astrophysics of these imperfections, and how to distinguish them from the signals from a planet," says Johnson. "We hope to use these simulations to design even better observing strategies and data-analysis techniques."

The discovery of a planet that is comparable in size to Earth and found within the habitable zone, however, "will require a great deal of work," he says. "If we could build the best possible radial-velocity instrument tomorrow, we might have answers in three years, and a solid census of Earthlike planets within a decade. We'll need gigantic leaps in sensitivity to get there, and we're hot on the trail."

Johnson is also currently building a new camera for the 60-inch telescope at Caltech's Palomar Observatory. The camera will allow astronomers to search for the passages—or transits—of low-mass planets like HD 156668b across the faces of their stars.

"If we catch the planet in transit, we can measure the planet's radius and density, and therefore address the question of whether the planet has a composition more like Earth, with a solid surface and thin atmosphere, or is a miniature version of Neptune, with a heavy gaseous atmosphere," he says.

The Keck I telescope is part of the Keck Observatory, a joint effort of Caltech and the University of California.

For more information about extrasolar planet discoveries, visit http://exoplanets.org.

Writer: Kathy Svitil

Caltech Physicists Propose Quantum Entanglement for Motion of Microscopic Objects

PASADENA, Calif.—Researchers at the California Institute of Technology (Caltech) have proposed a new paradigm that should allow scientists to observe quantum behavior in small mechanical systems. 

Their ideas, described in the early online issue of the Proceedings of the National Academy of Sciences, offer a new means of addressing one of the most fascinating issues in quantum mechanics: the nature of quantum superposition and entanglement in progressively larger and more complex systems. 

A quantum superposition is a state in which a particle, such as a photon or atom, exists simultaneously in two locations. Entanglement, which Albert Einstein called "spooky action at a distance," allows particles to share information even if they are physically separated.

A key challenge in observing quantum behavior in a small mechanical system is suppressing interactions between the system and its noisy environment—i.e., the surrounding material supporting the system or any other external contact. The random thermal vibrations of the system's surroundings, for example, can be transferred to the mechanical object and destroy its fragile quantum properties. To address this issue, a number of groups worldwide have begun to use cryogenic setups in which the immediate environment is cooled down to a very low temperature to reduce the magnitude of these random vibrations.

The Caltech team suggests a fundamentally different approach: using the forces imparted by intense beams of light to "levitate" the entire mechanical object, thereby freeing it from external contact and material supports. This approach, the researchers show, can dramatically reduce environmental noise, to the point where diverse manifestations of quantum behavior should be observable even when the environment is at room temperature. 

Among the scientists involved in the work are Darrick Chang, a postdoctoral scholar at Caltech's Institute for Quantum Information; Oskar Painter, associate professor of applied physics; and H. Jeff Kimble, Caltech's William L. Valentine Professor and professor of physics.

The idea of using optical forces to trap or levitate small particles is actually well established. It was pioneered by Arthur Ashkin of Bell Laboratories in the 1970s and 1980s, and has since formed the basis for scientific advances such as the development of "optical tweezers"—which are frequently used to control the motion of small biological objects—and the use of lasers to cool atoms and trap them in space. These techniques provide an extremely versatile toolbox for manipulating atoms, and have been employed to demonstrate a variety of quantum phenomena at the atomic level. 

In the new work, Chang and his colleagues demonstrate theoretically that similar success can be achieved when an individual atom is replaced by a much more massive—but still nanoscale—mechanical system. A related scheme has been presented simultaneously by a group at the Max Planck Institute of Quantum Optics in Garching, Germany [http://arxiv.org/abs/0909.1469]. 

The system proposed by the Caltech team consists of a small sphere made out of a highly transparent material such as fused silica. When the sphere comes into contact with a laser beam, optical forces naturally push the sphere toward the point where the intensity of light is greatest, trapping the sphere at that point. The sphere typically spans about 100 nm in diameter, or roughly a thousandth the width of a human hair.  Because of its small size, the sphere's remaining interactions with the environment—any that don't involve direct contact with another material, because the sphere is levitating—are sufficiently weak that quantum behavior should easily emerge.

For such behavior to appear, however, the sphere must also be placed inside an optical cavity, which is formed by two mirrors located on either side of the trapped sphere. The light that bounces back and forth between the mirrors both senses the motion of the sphere and is used to manipulate that motion at a quantum-mechanical level.

The researchers describe how this interaction can be used to remove energy from, or cool, the mechanical motion until it reaches its quantum ground state—the lowest energy allowable by quantum mechanics. A fundamental limit to this process is set by the relative strengths of the optical cooling and the rate at which the environment tends to heat (return energy to) the motion, bringing it back to the ambient temperature. 

In principle, the motion of the well-isolated sphere can be cooled starting from room temperature down to a final temperature that is ten million times lower; in that super-cooled state, the center of mass of the sphere moves by only the minimum possible amount set by intrinsic quantum fluctuations. 
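For a rough number, assuming an ambient temperature of about 300 K (an assumption; the paper's quoted figure is only the factor of ten million), that reduction corresponds to a final temperature of tens of microkelvin:

```python
ROOM_TEMP_K = 300.0      # assumed ambient temperature
COOLING_FACTOR = 1e7     # "ten million times lower," per the proposal

final_temp_k = ROOM_TEMP_K / COOLING_FACTOR
print(f"final temperature ~ {final_temp_k * 1e6:.0f} microkelvin")
```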

The researchers also propose a scheme to observe a feature known as entanglement, which lies at the heart of quantum mechanics. Two remotely located systems that are quantum entangled share correlations between them that are stronger than classically allowed. In certain circumstances, entanglement can be a very valuable resource; it forms the basis for proposals to realize improved metrology and more powerful (quantum) computers.

The proposed scheme consists of sending a pair of initially entangled beams of light—the production of which was first accomplished by Kimble's group at Caltech in 1992—into two separate cavities, each containing a levitated sphere. Through a process known as quantum-state transfer, all of the properties of the light—in particular, the entanglement and its associated correlations—can be mapped onto the motion of the two spheres. 

While the sizes of these nanomechanical objects are still very far from those we associate with everyday experience, the Caltech researchers believe that their proposal presents an exciting opportunity to realize and control quantum phenomena at unprecedented scales—in this case, for objects containing approximately 10 million atoms.
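The figure of roughly 10 million atoms follows from the sphere's quoted 100-nm size. The sketch below checks the order of magnitude; the material constants are standard handbook values for fused silica (assumptions here, not from the article).

```python
import math

# Order-of-magnitude check: atoms in a 100-nm fused-silica sphere.
DIAMETER_M = 100e-9          # sphere diameter quoted in the article
DENSITY_KG_M3 = 2200.0       # fused silica, ~2.2 g/cm^3 (assumed)
MOLAR_MASS_KG = 0.060        # SiO2, ~60 g/mol (assumed)
AVOGADRO = 6.022e23
ATOMS_PER_UNIT = 3           # one Si plus two O per SiO2 formula unit

volume = (4.0 / 3.0) * math.pi * (DIAMETER_M / 2) ** 3
mass = DENSITY_KG_M3 * volume
formula_units = mass / MOLAR_MASS_KG * AVOGADRO
atoms = formula_units * ATOMS_PER_UNIT
print(f"~{atoms:.1e} atoms")  # a few times 10^7, i.e. roughly 10 million
```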

Other researchers involved in this work include graduate student Dalziel Wilson and postdoctoral scholars Cindy Regal and Scott Papp at Caltech; Jun Ye, a fellow at JILA, a joint institute of the University of Colorado at Boulder and the National Institute of Standards and Technology; and Peter Zoller, a professor at the University of Innsbruck. The work was initiated while Ye and Zoller were visiting as Gordon and Betty Moore Distinguished Scholars at Caltech.

The work in the PNAS paper, "Cavity optomechanics using an optically levitated nanosphere," was supported by the Gordon and Betty Moore Foundation, the National Science Foundation, the Army Research Office, Northrop Grumman Space Technology, the Austrian Science Fund, and European Union Projects.

Writer: Kathy Svitil

High Energy Physicists Set New Record for Network Data Transfer

Caltech-led high-energy physicists show how long range networks can be used to support leading edge science

PASADENA, Calif.—Building on eight years of record-breaking developments, and on the restart of the Large Hadron Collider (LHC), an international team of high-energy physicists, computer scientists, and network engineers led by the California Institute of Technology (Caltech) joined forces to capture the Bandwidth Challenge award for massive data transfers during the SuperComputing 2009 (SC09) conference held in Portland, Oregon.

Caltech's partners in the project include scientists from the University of Michigan (UM), Fermilab, Brookhaven National Laboratory, CERN, the University of California, San Diego (UCSD), the University of Florida (UF), Florida International University (FIU), Brazil (Rio de Janeiro State University, UERJ, and São Paulo State University, UNESP), Korea (Kyungpook National University and KISTI), Estonia (NICPB), and Pakistan (NUST).

Caltech's exhibit at SC09 by the High Energy Physics (HEP) group and the Center for Advanced Computing Research (CACR) demonstrated applications for globally distributed data analysis for the LHC at CERN. It also demonstrated Caltech's worldwide collaboration system, EVO (Enabling Virtual Organizations; http://evo.caltech.edu), developed with UPJS in Slovakia; its global-network and grid monitoring system MonALISA (http://monalisa.caltech.edu); and its Fast Data Transfer application (http://monalisa.caltech.edu/FDT), developed in collaboration with Politehnica University of Bucharest. The CACR team also showed near-real-time simulations of earthquakes in the Southern California region, experiences in time-domain astronomy with Google Sky, and recent results in multiphysics multiscale modeling.
 
The focus of the exhibit was the HEP team's record-breaking demonstration of storage-to-storage data transfer over wide area networks from two racks of servers and a network switch-router on the exhibit floor. The high-energy physics team's demonstration, "Moving Towards Terabit/Sec Transfers of Scientific Datasets: The LHC Challenge," achieved a bidirectional peak throughput of 119 gigabits per second (Gbps) and a data flow of more than 110 Gbps that could be sustained indefinitely among clusters of servers on the show floor and at Caltech, Michigan, San Diego, Florida, Fermilab, Brookhaven, CERN, Brazil, Korea, and Estonia.
 
Following the Bandwidth Challenge, the team continued its tests and established a world-record data transfer between the Northern and Southern hemispheres, sustaining 8.26 Gbps in each direction on a 10 Gbps link connecting São Paulo and Miami.

By setting new records for sustained data transfer among storage systems over continental and transoceanic distances using simulated LHC datasets, the HEP team demonstrated its readiness to enter a new era in the use of state-of-the-art cyberinfrastructure to enable physics discoveries at the high-energy frontier. The team also demonstrated some of the groundbreaking tools and systems it has developed to enable a global collaboration of thousands of scientists, located at 350 universities and laboratories in more than 100 countries, to make the next round of physics discoveries.
  
"By sharing our methods and tools with scientists in many fields, we hope that the research community will be well-positioned to further enable their discoveries, taking full advantage of current networks, as well as next-generation networks with much greater capacity as soon as they become available," says Harvey Newman, Caltech professor of physics, head of the HEP team, colead of the U.S. LHCNet, and chair of the U.S. LHC Users Organization. "In particular, we hope that these developments will afford physicists and young students throughout the world the opportunity to participate directly in the LHC program, and potentially to make important discoveries."

The record-setting demonstrations were made possible through the use of fifteen 10 Gbps links to the SC09 booth provided by SCinet, together with National Lambda Rail (11 links including six dedicated links to Caltech) and CENIC, Internet2 (two links), ESnet, and Cisco. The Caltech HEP team used its dark-fiber connection to Los Angeles provided by Level3 and a pair of DWDM optical multiplexers provided by Ciena Corporation to light the fiber with a series of 10G wavelengths to and from the Caltech campus in Pasadena. Ciena also supported a portion of the Caltech traffic with a single serial 100G wavelength running into the SC09 conference from the Portland Level3 PoP, operating alongside other links into SC09 from Portland. Onward connections to the partner sites included links via ESnet and Internet2 to UCSD; FLR to the University of Florida as well as FIU and Brazil; MiLR to Michigan; Starlight and USLHCNet to CERN; AMPATH, together with RNP and ANSP, to Brazil via Southern Light; GLORIAD and KREONet to Korea; and Internet2 and GEANT3 to Estonia.
 
The network equipment at the Caltech booth consisted of a single, heavily populated Nexus 7000 series switch-router provided by Cisco and a large number of 10-gigabit Ethernet server-interface cards provided by Myricom. The server equipment on the show floor included five widely available Supermicro 32-core servers using quad-core Xeon processors with 12 Seagate SATA disks each, and 18 Sun Fire X4540 servers, each with 12 cores and 48 disks, provided by Sun Microsystems.

One of the features of next-generation networks supporting the largest science programs—notably the LHC experiments—is the use of dynamic circuits with bandwidth guarantees crossing multiple network domains. The Caltech team at SC09 used Internet2's recently announced ION service—developed together with ESnet and GEANT, in collaboration with US LHCNet—to create a dynamic circuit between Portland and CERN as part of the bandwidth-challenge demonstrations.

One of the key elements in this demonstration was Fast Data Transfer (FDT), an open-source Java application developed by Caltech in close collaboration with Politehnica University of Bucharest. FDT runs on all major platforms and uses the Java NIO libraries to achieve stable disk reads and writes coordinated with smooth data flow using TCP across long-range networks. The FDT application streams a large set of files across an open TCP socket, so that a large data set composed of thousands of files—as is typical in high-energy physics applications—can be sent or received at full speed, without the network transfer restarting between files. FDT can work on its own, or together with Caltech's MonALISA system, to dynamically monitor the capability of the storage systems as well as the network path in real time, and to send data out to the network at a moderated rate that achieves smooth data flow across long-range networks. 
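The core idea, streaming many files through one persistent TCP connection so the transfer never restarts between files, can be sketched in a few dozen lines. The example below is a deliberately minimal stand-in written in Python (FDT itself is a far more sophisticated Java application); the length-prefix framing is an assumption for illustration, not FDT's actual wire protocol.

```python
import os
import socket
import struct
import tempfile
import threading

# Hypothetical minimal framing (NOT FDT's real protocol): each file is sent
# as an 8-byte big-endian length header followed by its raw bytes, all over
# ONE persistent socket, so the TCP connection never restarts between files.

def send_files(sock, paths):
    for path in paths:
        with open(path, "rb") as f:
            data = f.read()
        sock.sendall(struct.pack(">Q", len(data)))  # length prefix
        sock.sendall(data)                          # file payload
    sock.sendall(struct.pack(">Q", 0))              # zero length = end of stream

def recv_exact(sock, n):
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-transfer")
        buf += chunk
    return buf

def recv_files(sock):
    files = []
    while True:
        (length,) = struct.unpack(">Q", recv_exact(sock, 8))
        if length == 0:
            return files
        files.append(recv_exact(sock, length))

# Demo over localhost: three files through a single connection.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(3):
    p = os.path.join(tmpdir, f"chunk{i}.dat")
    with open(p, "wb") as f:
        f.write(os.urandom(1024))
    paths.append(p)

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)

received = []
def serve():
    conn, _ = server.accept()
    received.extend(recv_files(conn))
    conn.close()

t = threading.Thread(target=serve)
t.start()

client = socket.socket()
client.connect(("127.0.0.1", server.getsockname()[1]))
send_files(client, paths)
client.close()
t.join()
print(f"received {len(received)} files over a single connection")
```

Keeping one socket open lets TCP's congestion window stay large across file boundaries, which is the property that matters over long-range, high-latency paths.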


Since it was first deployed at SC06, FDT has been shown to reach sustained throughputs among storage systems at 100 percent of network capacity where needed in production use, including among systems on different continents. FDT also achieved a smooth bidirectional throughput of 191 Gbps (199.90 Gbps peak) using an optical system carrying an OTU-4 wavelength over 80 km provided by Ciena last year at SC08.

Another new aspect of the HEP demonstration was large-scale data transfers among multiple file systems widely used in production by the LHC community, with several hundred terabytes per site. This included two recently installed instances of the open-source file system Hadoop, where in excess of 9.9 Gbps was read from Caltech on one 10 Gbps link, and up to 14 Gbps was read on shared ESnet and NLR links, a level just compatible with the production traffic on the same links. The high throughput was achieved through the use of a new FDT/Hadoop adaptor-layer written by NUST in collaboration with Caltech.

The SC09 demonstration also achieved its goal of clearing the way to terabit-per-second (Tbps) data transfers. The 4-way Supermicro servers at the Caltech booth—each with four 10GE Myricom interfaces—provided 8.3 Gbps of stable throughput each, reading or writing on 12 disks, using FDT. A system capable of one Tbps to or from storage could therefore be built today in just six racks at relatively low cost, while also providing 3840 processing cores and 3 petabytes of disk space, comparable to the larger LHC centers in terms of computing and storage capacity.
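The arithmetic behind that estimate can be checked directly from the quoted figures. In the sketch below, the per-server throughput, core count, and disk count come from the text, while the per-rack server count and per-disk capacity are illustrative assumptions.

```python
# Back-of-the-envelope check of the terabit/sec claim.
GBPS_PER_SERVER = 8.3      # from the text: stable throughput per 4-way server
CORES_PER_SERVER = 32      # from the text
DISKS_PER_SERVER = 12      # from the text
TB_PER_DISK = 2.0          # assumed ~2 TB per disk
SERVERS_PER_RACK = 20      # assumed rack density

servers_needed = 1000 / GBPS_PER_SERVER           # ~120 servers for 1 Tbps
racks_needed = servers_needed / SERVERS_PER_RACK  # ~6 racks
total_cores = round(servers_needed) * CORES_PER_SERVER
total_storage_pb = round(servers_needed) * DISKS_PER_SERVER * TB_PER_DISK / 1000

print(f"{servers_needed:.0f} servers, {racks_needed:.1f} racks, "
      f"{total_cores} cores, {total_storage_pb:.1f} PB")
```

The result, roughly 120 servers in six racks with 3840 cores and about 3 PB of disk, matches the figures quoted above.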

An important ongoing theme of SC09—including at the Caltech booth, where the EVOGreen initiative (www.evogreen.org) was highlighted—was the reduction of carbon footprint through the use of energy-efficient information technologies. A particular focus is the use of systems with a high ratio of computing and I/O performance to energy consumption. In the coming year, in preparation for SC10 in New Orleans, the HEP team will be looking into the design and construction of compact systems with lower power and cost that are capable of delivering data at several hundred Gbps, aiming to reach 1 Tbps by 2011 when multiple 100 Gbps links into SC11 may be available.

The two largest physics collaborations at the LHC—CMS and ATLAS, each encompassing more than 2,000 physicists, engineers, and technologists from 180 universities and laboratories—are about to embark on a new round of exploration at the frontier of high energies. When the LHC experiments begin to take collision data in a new energy range over the next few months, new ground will be broken in our understanding of the nature of matter and space-time, and in the search for new particles. In order to fully exploit the potential for scientific discoveries during the next year, more than 100 petabytes (10^17 bytes) of data will be processed, distributed, and analyzed using a global grid of 300 computing and storage facilities located at laboratories and universities around the world, rising to the exabyte range (10^18 bytes) during the following years.

The key to discovery is the analysis phase, in which individual physicists and small groups located at sites around the world repeatedly access, and sometimes extract and transport, multi-terabyte data sets on demand from petabyte data stores, in order to optimally select the rare "signals" of new physics from the potentially overwhelming "backgrounds" of already-understood particle interactions. The HEP team hopes that the demonstrations at SC09 will pave the way toward more effective distribution and use of the masses of LHC data for discoveries.

The demonstration and the developments leading up to the SC09 Bandwidth Challenge were made possible through the support of the partner network organizations mentioned, the National Science Foundation (NSF), the U.S. Department of Energy (DOE) Office of Science, and the funding agencies of the HEP team's international partners, as well as the U.S. LHC Research Program funded jointly by DOE and NSF.

Further information about the demonstration may be found at http://supercomputing.caltech.edu

Writer: Jon Weiner

F. Brock Fuller, 82

F. Brock Fuller, emeritus professor of mathematics at the California Institute of Technology (Caltech), died on November 6 at the Rafael Convalescent Hospital in San Rafael, California, four years after being diagnosed with diffuse Lewy body disease. He was 82.

Fuller received his bachelor's, master's, and PhD degrees from Princeton. He came to Caltech in 1952 as a research fellow. He became assistant professor of mathematics in 1955, associate professor in 1959, and professor in 1966. In 1994, he became professor emeritus.

Much of Fuller's work revolved around what are called "writhing numbers," and the way in which these mathematical descriptions of twisting and coiling could describe supercoiled double-stranded DNA helices. (A supercoiled DNA helix is one in which the already-twisted DNA strands twist again, either in the same direction as the original helix, or in the opposite direction.)

In the early 1980s, Fuller—who was also an audiophile—was involved in analyzing digital recording technologies as they began to reach prominence in the audio-entertainment industry. Working alongside Caltech colleagues such as Gary Lorden and James Boyk, Fuller examined music piped into Thomas Laboratory from Dabney Lounge, comparing various signals.

Fuller moved to San Rafael, in northern California, in 1996. He is survived by his wife, Alison Clark Fuller of San Rafael; his daughter, Lynn D. Fuller of San Francisco, as well as her husband, William Bivins, and their four children, Samuel, Zachary, Elizabeth, and Claire Bivins; and his sister, Cornelia Fuller of Pasadena.

Writer: Jon Weiner

LIGO Listens for Gravitational Echoes of the Birth of the Universe

Results set new limits on gravitational waves originating from the Big Bang; constrain theories about universe formation

Pasadena, Calif.—An investigation by the LIGO (Laser Interferometer Gravitational-Wave Observatory) Scientific Collaboration and the Virgo Collaboration has significantly advanced our understanding of the early evolution of the universe.

Analysis of data taken over a two-year period, from 2005 to 2007, has set the most stringent limits yet on the amount of gravitational waves that could have come from the Big Bang in the gravitational wave frequency band where LIGO can observe. In doing so, the gravitational-wave scientists have put new constraints on the details of how the universe looked in its earliest moments.

Much like it produced the cosmic microwave background, the Big Bang is believed to have created a flood of gravitational waves—ripples in the fabric of space and time—that still fill the universe and carry information about the universe as it was immediately after the Big Bang. These waves would be observed as the "stochastic background," analogous to a superposition of many waves of different sizes and directions on the surface of a pond. The amplitude of this background is directly related to the parameters that govern the behavior of the universe during the first minute after the Big Bang.

Earlier measurements of the cosmic microwave background had placed the most stringent upper limits on the stochastic gravitational-wave background at very large distance scales and low frequencies. The new measurements by LIGO directly probe the gravitational-wave background from the first minute of the universe's existence, at timescales much shorter than those accessible via the cosmic microwave background.

The research, which appears in the August 20 issue of the journal Nature, also constrains models of cosmic strings, objects that are proposed to have been left over from the beginning of the universe and subsequently stretched to enormous lengths by the universe's expansion; the strings, some cosmologists say, can form loops that produce gravitational waves as they oscillate, decay, and eventually disappear.

Gravitational waves carry with them information about their violent origins and about the nature of gravity that cannot be obtained by conventional astronomical tools. The existence of the waves was predicted by Albert Einstein in 1916 in his general theory of relativity. The LIGO and GEO instruments have been actively searching for the waves since 2002; the Virgo interferometer joined the search in 2007.

The authors of the new paper report that the stochastic background of gravitational waves has not yet been discovered. But the nondiscovery of the background described in the Nature paper already offers its own brand of insight into the universe's earliest history.

The analysis used data collected from the LIGO interferometers, a 2 km and a 4 km detector in Hanford, Washington, and a 4 km instrument in Livingston, Louisiana. Each of the L-shaped interferometers uses a laser split into two beams that travel back and forth down long interferometer arms. The two beams are used to monitor the difference between the two interferometer arm lengths.

According to the general theory of relativity, one interferometer arm is slightly stretched while the other is slightly compressed when a gravitational wave passes by.

The interferometer is constructed in such a way that it can detect a change of less than a thousandth the diameter of an atomic nucleus in the lengths of the arms relative to each other.
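
To get a feel for this sensitivity, one can translate the reported displacement into strain, the fractional change in arm length. The back-of-envelope sketch below uses rough illustrative numbers for the nucleus diameter and the arm length, not official instrument specifications:

```python
# Back-of-envelope strain sensitivity for a LIGO-class interferometer.
# All numbers are rough illustrative values, not official specs.

ARM_LENGTH_M = 4_000.0          # 4 km arm (the long Hanford/Livingston arms)
NUCLEUS_DIAMETER_M = 1e-15      # typical atomic-nucleus diameter, ~1 femtometer
MIN_DISPLACEMENT_M = NUCLEUS_DIAMETER_M / 1_000  # "a thousandth the diameter"

# Strain is the fractional change in arm length: h = dL / L
strain = MIN_DISPLACEMENT_M / ARM_LENGTH_M
print(f"detectable strain h ~ {strain:.1e}")  # on the order of 1e-22
```

The tiny result, roughly one part in 10^22, is why the instruments can begin to constrain early-universe models.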

Because of this extraordinary sensitivity, the instruments can now test some models of the evolution of the early universe that are expected to produce the stochastic background.

"Since we have not observed the stochastic background, some of these early-universe models that predict a relatively large stochastic background have been ruled out," says Vuk Mandic, assistant professor at the University of Minnesota.

"We now know a bit more about parameters that describe the evolution of the universe when it was less than one minute old," Mandic adds. "We also know that if cosmic strings or superstrings exist, their properties must conform with the measurements we made; that is, their properties, such as string tension, are more constrained than before."

This is interesting, he says, "because such strings could also be so-called fundamental strings, appearing in string-theory models. So our measurement also offers a way of probing string-theory models, which is very rare today."

"This result was one of the long-sought milestones that LIGO was designed to achieve," Mandic says. Once it goes online in 2014, Advanced LIGO, which will utilize the infrastructure of the LIGO observatories and be 10 times more sensitive than the current instrument, will allow scientists to detect cataclysmic events such as black-hole and neutron-star collisions at 10-times-greater distances.

"Advanced LIGO will go a long way in probing early universe models, cosmic-string models, and other models of the stochastic background. We can think of the current result as a hint of what is to come," he adds.

"With Advanced LIGO, a major upgrade to our instruments, we will be sensitive to sources of extragalactic gravitational waves in a volume of the universe 1,000 times larger than we can see at the present time. This will mean that our sensitivity to gravitational waves from the Big Bang will be improved by orders of magnitude," says Jay Marx of the California Institute of Technology, LIGO's executive director.

"Gravitational waves are the only way to directly probe the universe at the moment of its birth; they're absolutely unique in that regard. We simply can't get this information from any other type of astronomy. This is what makes this result in particular, and gravitational-wave astronomy in general, so exciting," says David Reitze, a professor of physics at the University of Florida and spokesperson for the LIGO Scientific Collaboration.

"The scientists of the LIGO Scientific Collaboration and the Virgo Collaboration have joined their efforts to make the best use of their instruments. Combining simultaneous data from the LIGO and Virgo interferometers gives information on gravitational-wave sources not accessible by other means. It is very suggestive that the first result of this alliance makes use of the unique feature of gravitational waves being able to probe the very early universe. This is very promising for the future," says Francesco Fidecaro, a professor of physics with the University of Pisa and the Istituto Nazionale di Fisica Nucleare, and spokesperson for the Virgo Collaboration.

Maria Alessandra Papa, senior scientist at the Max Planck Institute for Gravitational Physics and the head of the LSC overall data analysis effort adds, "Hundreds of scientists work very hard to produce fundamental results like this one: the instrument scientists who design, commission and operate the detectors, the teams who prepare the data for the astrophysical searches and the data analysts who develop and implement sensitive techniques to look for these very weak and elusive signals in the data."

The LIGO project, which is funded by the National Science Foundation (NSF), was designed and is operated by Caltech and the Massachusetts Institute of Technology for the purpose of detecting gravitational waves, and for the development of gravitational-wave observations as an astronomical tool.

Research is carried out by the LIGO Scientific Collaboration, a group of 700 scientists at universities around the United States and in 11 foreign countries. The LIGO Scientific Collaboration interferometer network includes the LIGO interferometers and the GEO600 interferometer, which is located near Hannover, Germany, and designed and operated by scientists from the Max Planck Institute for Gravitational Physics, along with partners in the United Kingdom funded by the Science and Technology Facilities Council (STFC).

The Virgo Collaboration designed and constructed the 3 km long Virgo interferometer located in Cascina, Italy, funded by the Centre National de la Recherche Scientifique (France) and by the Istituto Nazionale di Fisica Nucleare (Italy). The Virgo Collaboration consists of 200 scientists from five European countries and operates the Virgo detector. Support for the operation comes from the Dutch-French-Italian European Gravitational Observatory Consortium. The LIGO Scientific Collaboration and Virgo work together to jointly analyze data from the LIGO, Virgo, and GEO interferometers.

The next major milestone for LIGO is the Advanced LIGO Project, slated to begin operation in 2014. Advanced LIGO will incorporate advanced designs and technologies that have been developed by the LIGO Scientific Collaboration. It is supported by the NSF, with additional contributions from the U.K.'s STFC and Germany's Max Planck Society.

The paper is entitled "An Upper Limit on the Amplitude of Stochastic Gravitational-Wave Background of Cosmological Origin."

Writer: 
Kathy Svitil

Caltech Physicists Create First Nanoscale Mass Spectrometer

Device can instantly measure the mass of an individual molecule

PASADENA, Calif.-Using devices millionths of a meter in size, physicists at the California Institute of Technology (Caltech) have developed a technique to determine the mass of a single molecule, in real time.

The mass of molecules is traditionally measured using mass spectrometry, in which samples consisting of tens of thousands of molecules are ionized, to produce charged versions of the molecules, or ions. Those ions are then directed into an electric field, where their motion, which is choreographed by both their mass and their charge, allows the determination of their so-called mass-to-charge ratio. From this, their mass can ultimately be ascertained.
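
As a rough illustration of how a conventional instrument turns the mass-to-charge ratio into something measurable, the sketch below models one common implementation, a time-of-flight analyzer. The voltage and drift length are made-up illustrative values, not details from the study:

```python
import math

# Toy time-of-flight (TOF) model: one common way conventional mass
# spectrometers convert mass-to-charge ratio into a measurable arrival
# time. All numbers are illustrative.

E_CHARGE = 1.602e-19   # elementary charge, in coulombs
DALTON_KG = 1.661e-27  # 1 Da in kg

def flight_time(mass_da: float, charge_state: int,
                accel_voltage: float = 20_000.0,
                drift_length: float = 1.0) -> float:
    """Time for an ion to cross the drift tube after acceleration.

    Energy balance: z*e*V = (1/2) m v^2  =>  t = L * sqrt(m / (2 z e V)).
    """
    m = mass_da * DALTON_KG
    q = charge_state * E_CHARGE
    v = math.sqrt(2 * q * accel_voltage / m)
    return drift_length / v

# Heavier ions (at the same charge) arrive later: t scales as sqrt(m/z).
t_light = flight_time(100.0, 1)
t_heavy = flight_time(400.0, 1)
print(t_heavy / t_light)  # ~2.0, since sqrt(400/100) = 2
```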

The new technique, developed over 10 years of effort by Michael L. Roukes, a professor of physics, applied physics, and bioengineering at Caltech and codirector of Caltech's Kavli Nanoscience Institute, and his colleagues, simplifies and miniaturizes the process through the use of very tiny nanoelectromechanical system (NEMS) resonators. The bridge-like resonators, which are 2 micrometers long and 100 nanometers wide, vibrate at a high frequency and effectively serve as the "scale" of the mass spectrometer.

"The frequency at which the resonator vibrates is directly related to its mass," explains research physicist Akshay Naik, the first author of a paper about the work that appears in the latest issue of the journal Nature Nanotechnology. Changes in the vibration frequency, then, correspond to changes in mass.

"When a protein lands on the resonator, it causes a decrease in the frequency at which the resonator vibrates and the frequency shift is proportional to the mass of the protein," Naik says. 

As described in the paper, the researchers used the instrument to test a sample of the protein bovine serum albumin (BSA), which is known to have a mass of 66 kilodaltons (kDa; a dalton is a unit of mass used to describe atomic and molecular masses, with one dalton approximately equal to the mass of one hydrogen atom).

The BSA protein ions are produced in vapor form using an electrospray ionization (ESI) system. The ions are then sprayed onto the NEMS resonator, which vibrates at a frequency of 450 megahertz. "The flux of proteins reaching the NEMS is such that only one or two proteins land on the resonator per minute," Naik says.

When a BSA protein molecule is dropped onto the resonator, the resonator's vibration frequency decreases by as much as 1.2 kilohertz (kHz), a small but readily detectable change. In contrast, the beta-amylase protein molecule, which has a mass of about 200 kDa, or three times that of BSA, causes a maximum frequency shift of about 3.6 kHz.
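
A quick check with the numbers reported above shows that the two shifts are consistent with a linear mass-to-frequency-shift model. The "responsivity" estimated below is an illustrative simplification, since the actual shift also depends on where the molecule lands:

```python
# Consistency check using the figures quoted in the text. The linear
# model below is an illustrative simplification, not the paper's
# full analysis (the real shift also depends on landing position).

BSA_MASS_KDA, BSA_MAX_SHIFT_HZ = 66.0, 1_200.0
AMYLASE_MASS_KDA, AMYLASE_MAX_SHIFT_HZ = 200.0, 3_600.0

# Responsivity in Hz per kDa, estimated from each protein independently:
r_bsa = BSA_MAX_SHIFT_HZ / BSA_MASS_KDA              # ~18.2 Hz/kDa
r_amylase = AMYLASE_MAX_SHIFT_HZ / AMYLASE_MASS_KDA  # ~18.0 Hz/kDa
print(r_bsa, r_amylase)  # nearly equal: the shift scales with mass

# Inverting the model: a hypothetical molecule causing a 2.0 kHz maximum
# shift would weigh roughly 2000 / 18 ~ 110 kDa.
```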

Because the location where the protein lands on the resonator also affects the frequency shift (falling onto the center of the resonator causes a larger change than landing near the ends or the sides, for example), "we can't tell the mass with a single measurement, but needed about 500 frequency jumps in the published work," Naik says. In the future, the researchers will decouple measurements of the mass and the landing position of the molecules being sampled. This technique, which they have already prototyped, will soon enable mass spectra for complicated mixtures to be built up, molecule by molecule.
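
The position dependence can be illustrated with a toy model. Assuming the responsivity follows the fundamental mode shape of a doubly clamped beam, sin^2(pi*x/L), which is an assumption for illustration and not the paper's analysis, repeated landings of identical molecules produce a spread of frequency jumps whose maximum encodes the mass:

```python
import math
import random

# Toy model of why one landing event is ambiguous: the frequency jump
# depends on where the molecule lands. We assume a sin^2(pi*x/L)
# fundamental-mode weighting for a doubly clamped beam (illustrative
# assumption, not the published analysis).

random.seed(0)
MAX_SHIFT_HZ = 1_200.0  # center-landing shift for BSA, from the text

def jump_for_landing(position_frac: float) -> float:
    """Frequency jump for a landing at fractional position along the beam."""
    return MAX_SHIFT_HZ * math.sin(math.pi * position_frac) ** 2

# A single jump is ambiguous: the same molecule gives different shifts.
print(jump_for_landing(0.5), jump_for_landing(0.1))  # 1200.0 vs ~114.6

# ...but over many random landings the *maximum* jump approaches the
# center-landing value, which encodes the mass:
jumps = [jump_for_landing(random.random()) for _ in range(500)]
print(max(jumps))  # close to 1200 Hz
```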

Eventually, Roukes and colleagues hope to create arrays of perhaps hundreds of thousands of the NEMS mass spectrometers, working in parallel, which could determine the masses of hundreds of thousands of molecules "in an instant," Naik says.

As Roukes points out, "the next generation of instrumentation for the life sciences-especially those for systems biology, which allows us to reverse-engineer biological systems-must enable proteomic analysis with very high throughput. The potential power of our approach is that it is based on semiconductor microelectronics fabrication, which has allowed creation of perhaps mankind's most complex technology."

The paper, "Towards single-molecule nanomechanical mass spectrometry," appears in the July 4 issue of Nature Nanotechnology. The other authors of the paper are graduate student Mehmet S. Hanay and staff scientist Philip Feng, from Caltech, and Wayne K. Hiebert of the National Research Council of Canada. The work was supported by the National Institutes of Health and, indirectly, by the Defense Advanced Research Projects Agency and the Space and Naval Warfare Systems Command.

Writer: 
Kathy Svitil

Mechanics: Nano Meets Quantum

Caltech physicists devise new method to detect quantum mechanical effects in ordinary objects

PASADENA, Calif.—At the quantum level, the atoms that make up matter and the photons that make up light behave in a number of seemingly bizarre ways. Particles can exist in "superposition," in more than one state at the same time (as long as we don't look), a situation that permitted Schrödinger's famed cat to be simultaneously alive and dead; matter can be "entangled"—Albert Einstein called it "spooky action at a distance"—such that one thing influences another thing, regardless of how far apart the two are.

Previously, scientists have successfully measured entanglement and superposition in photons and in small collections of just a few atoms. But physicists have long wondered if larger collections of atoms—those that form objects with sizes closer to what we are familiar with in our day-to-day life—also exhibit quantum effects.

"Atoms and photons are intrinsically quantum mechanical, so it's no surprise if they behave in quantum mechanical ways. The question is, do these larger collections of atoms do this as well?" says Matt LaHaye, a postdoctoral research scientist working in the laboratory of Michael L. Roukes, a professor of physics, applied physics, and bioengineering at the California Institute of Technology (Caltech) and codirector of Caltech's Kavli Nanoscience Institute.

"It'd be weird to think of ordinary matter behaving in a quantum way, but there's no reason it shouldn't," says Keith Schwab, an associate professor of applied physics at Caltech, and a collaborator of Roukes and LaHaye. "If single particles are quantum mechanical, then collections of particles should also be quantum mechanical. And if that's not the case—if the quantum mechanical behavior breaks down—that means there's some kind of new physics going on that we don't understand."

The tricky part, however, is devising an experiment that can detect quantum mechanical behavior in such ordinary objects—without, for example, those effects being interfered with or even destroyed by the experiment itself.

Now, however, LaHaye, Schwab, Roukes, and their colleagues have developed a new tool that meets such fastidious demands and that can be used to search for quantum effects in an ordinary object. The researchers describe their work in the latest issue of the journal Nature.

In their experiment, the Caltech scientists used microfabrication techniques to create a very tiny nanoelectromechanical system (NEMS) resonator, a silicon-nitride beam—just 2 micrometers long, 0.2 micrometers wide, and weighing 40 billionths of a milligram—that can resonate, or flex back and forth, at a high frequency when a voltage is applied.

A small distance (300 nanometers, or 300 billionths of a meter) from the resonator, the scientists fabricated a second nanoscale device known as a single-Cooper-pair box, or superconducting "qubit"; a qubit is the basic unit of quantum information.

The superconducting qubit is essentially an island formed between two insulating barriers across which a set of paired electrons can travel. In the Caltech experiments, the qubit has only two quantized energy states: the ground state and an excited state. This energy state can be controlled by applying microwave radiation, which creates an electric field.

Because the NEMS resonator and the qubit are fabricated so closely together, their behavior is tightly linked; this allows the NEMS resonator to be used as a probe for the energy quantization of the qubit. "When the qubit is excited, the NEMS bridge vibrates at a higher frequency than it does when the qubit is in the ground state," LaHaye says.
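
The readout described above can be sketched as a toy model: if the resonator frequency takes one of two values depending on the qubit state, a simple threshold recovers that state. The specific frequencies and shift below are assumptions for illustration, not values from the experiment:

```python
# Toy sketch of inferring the qubit state from the NEMS frequency.
# The base frequency and state-dependent shift are illustrative
# assumptions, not measured values from the paper.

NEMS_BASE_HZ = 1.45e9      # assumed resonator frequency
QUBIT_SHIFT_HZ = 5_000.0   # assumed qubit-state-dependent frequency shift

def nems_frequency(qubit_excited: bool) -> float:
    """Resonator frequency: higher when the qubit is in the excited state."""
    return NEMS_BASE_HZ + (QUBIT_SHIFT_HZ if qubit_excited else 0.0)

def infer_qubit_state(measured_hz: float) -> str:
    """Classify the qubit state from a single NEMS frequency measurement."""
    threshold = NEMS_BASE_HZ + QUBIT_SHIFT_HZ / 2
    return "excited" if measured_hz > threshold else "ground"

print(infer_qubit_state(nems_frequency(True)))   # excited
print(infer_qubit_state(nems_frequency(False)))  # ground
```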

One of the most exciting aspects of this work, the scientists say, is that the same coupling should also enable measurements of the discrete energy levels of the vibrating resonator that are predicted by quantum mechanics. This will require that the present experiment be turned around (so to speak), with the qubit used to probe the NEMS resonator. This could also make possible demonstrations of nanomechanical quantum superpositions and Einstein's spooky entanglement.

"Quantum jumps are, perhaps, the archetypal signature of behavior governed by quantum effects," says Roukes. "To see these requires us to engineer a special kind of interaction between our measurement apparatus and the object being measured. Matt's results establish a practical and really intriguing way to make this happen."

The paper, "Nanomechanical measurements of a superconducting qubit," was published in the June 18 issue of Nature. In addition to LaHaye, Schwab, and Roukes, its coauthors were Junho Suh, a graduate student at Caltech, and Pierre M. Echternach of the Jet Propulsion Laboratory. The work was funded by the National Science Foundation, the Foundational Questions Institute, and Caltech's Center for the Physics of Information. 

Writer: 
Kathy Svitil

Uncertainty Principle Used to Detect Entanglement of Photon Shared Among Four Locations

PASADENA, Calif.-Scientists at the California Institute of Technology (Caltech) have developed an efficient method to detect entanglement shared among multiple parts of an optical system. They show how entanglement, in the form of beams of light simultaneously propagating along four distinct paths, can be detected with a surprisingly small number of measurements. Entanglement is an essential resource in quantum information science, which is the study of advanced computation and communication based on the laws of quantum mechanics.

In the May 8 issue of the journal Science, H. Jeff Kimble, the William L. Valentine Professor and professor of physics at Caltech, and his colleagues demonstrate for the first time that quantum uncertainty relations can be used to identify entangled states of light that are only available in the realm of quantum mechanics. Their approach builds on the famous Heisenberg uncertainty principle, which places a limit on the precision with which the momentum and position of a particle can be known simultaneously.

Entanglement, which lies at the heart of quantum physics, is a state in which the parts of a composite system are more strongly correlated than is possible for any classical counterparts, regardless of the distances separating them.

Entanglement in a system with more than two parts, or multipartite entanglement, is a critical tool for diverse applications in quantum information science, such as for quantum metrology, computation, and communication. In the future, a "quantum internet" will rely on entanglement for the teleportation of quantum states from place to place (for a recent review see H. J. Kimble, Nature 453, 1023 (2008)).

"For some time physicists have studied the entanglement of two parts-or bipartite entanglement-and techniques for classifying and detecting the entanglement between two parts of a composite system are well known," says Scott Papp, a postdoctoral scholar and one of the authors of the paper. "But that hasn't been the case for multipartite states. Since they contain more than two parts, their classification is much richer, but detecting their entanglement is extremely challenging."

In the Caltech experiment, a pulse of light was generated containing a single photon-a massless bundle, with both wave-like and particle-like properties, that is the basic unit of electromagnetic radiation. The team split the single photon to generate an entangled state of light in which the quantum amplitudes for the photon propagate among four distinct paths, all at once. This so-called W state plays an important role in quantum information science.

To enable future applications of multipartite W states, the entanglement contained in them must be detected and characterized. This task is complicated by the fact that entanglement in W states can be found not only among all the parts, but also among a subset of them.

To distinguish between these two cases in real-world experiments, coauthors Steven van Enk and Pavel Lougovski from the University of Oregon developed a novel approach to entanglement detection based on the uncertainty principle. (See also the recent theoretical article by van Enk, Lougovski, and the Caltech group, "Verifying multi-partite mode entanglement of W states" at http://xxx.lanl.gov/abs/0903.0851.)

The demonstration of the detection of entanglement in multipartite W states is the key breakthrough of the Caltech group's work.

The new approach to entanglement detection makes use of non-local measurements of a photon propagating through all four paths. The measurements indicate whether a photon is present, but give no information about which path it takes.

"The quantum uncertainty associated with these measurements has allowed us to estimate the level of correlations among the four paths through which a single photon simultaneously propagates, by comparing to the minimum uncertainty possible for any less entangled states," says Kyung Soo Choi, a Caltech graduate student and one of the authors of the paper. 

Correlations of the paths above a certain level signify entanglement among all the pathways; even partially entangled W states do not attain a similar level of correlation. A key feature of this approach is that only a relatively small number of measurements must be performed.

Due to their fundamental structure, the entanglement of W states persists even in the presence of some sources of noise. This is an important feature for real-world applications of W states in noisy environments. The Caltech experiments have directly tested this property by disturbing the underlying correlations of the entangled state. When the correlations are purposely weakened, there is a reduction in the number of paths of the optical system that are entangled. And yet, as predicted by the structure of W states, the entanglement remains amongst a subset of the paths.

"Our work introduces a new protocol for detecting an important class of entanglement with single photons," Papp explains. "It signifies the ever-increasing degree of control we have in the laboratory to study and manipulate quantum states of light and matter."

Next, the researchers plan to apply their technique to entangled states of atoms. These efforts will build upon previous advances in the Caltech Quantum Optics Group, including the mapping of photonic entanglement to and from a quantum memory (http://media.caltech.edu/press_releases/13115), and the distribution of entanglement amongst the nodes of a quantum network (http://media.caltech.edu/press_releases/12969).

The paper, "Characterization of Multipartite Entanglement for One Photon Shared Among Four Optical Modes," appears in the May 8 issue of Science. The authors are Scott B. Papp, Kyung Soo Choi (whose contributions to the work were equal to Papp's), and H. Jeff Kimble of Caltech; Hui Deng, a former Caltech postdoctoral scholar, now at the University of Michigan, Ann Arbor; and Pavel Lougovski and S. J. van Enk of the University of Oregon. Van Enk is also an associate of the Institute for Quantum Information at Caltech.

The work was funded by the Intelligence Advanced Research Projects Activity, the National Science Foundation, and Northrop Grumman Space Technology.

Writer: 
Kathy Svitil
