Rare 'Star-making Machine' Found in Distant Universe

Astronomers have uncovered an extreme stellar machine -- a galaxy in the very remote universe pumping out stars at a surprising rate of up to 4,000 per year. In comparison, our own Milky Way galaxy turns out an average of just 10 stars per year.

The discovery, made possible by several telescopes including NASA's Spitzer Space Telescope, goes against the most common theory of galaxy formation. According to the theory, called the Hierarchical Model, galaxies slowly bulk up their stars over time by absorbing tiny pieces of galaxies -- and not in one big burst as observed in the newfound "Baby Boom" galaxy.

"This galaxy is undergoing a major baby boom, producing most of its stars all at once," said Peter Capak of NASA's Spitzer Science Center at the California Institute of Technology, Pasadena. "If our human population was produced in a similar boom, then almost all of the people alive today would be the same age." Capak is lead author of a new report detailing the discovery in the July 10th issue of Astrophysical Journal Letters.

The Baby Boom galaxy, which belongs to a class of galaxies called starbursts, is the new record holder for the brightest starburst galaxy in the very distant universe, with brightness being a measure of its extreme star-formation rate. It was discovered and characterized using a suite of telescopes operating at different wavelengths. NASA's Hubble Space Telescope and Japan's Subaru Telescope, atop Mauna Kea in Hawaii, first spotted the galaxy in visible-light images, where it appeared as an inconspicuous smudge due to its great distance.

It wasn't until Spitzer and the James Clerk Maxwell Telescope, also on Mauna Kea in Hawaii, observed the galaxy at infrared and submillimeter wavelengths, respectively, that the galaxy stood out as the brightest of the bunch. This is because it has a huge number of youthful stars. When stars are born, they shine with a lot of ultraviolet light and produce a lot of dust. The dust absorbs the ultraviolet light but, like a car sitting in the sun, it warms up and re-emits light at infrared and submillimeter wavelengths, making the galaxy unusually bright to Spitzer and the James Clerk Maxwell Telescope.

To learn more about this galaxy's unique youthful glow, Capak and his team followed up with a number of telescopes. They used optical measurements from Keck to determine the exact distance to the galaxy -- a whopping 12.3 billion light-years. That's looking back to a time when the universe was 1.3 billion years old (the universe is approximately 13.7 billion years old today).

"If the universe was a human reaching retirement age, it would have been about 6 years old at the time we are seeing this galaxy," said Capak.

The astronomers made measurements at radio wavelengths with the National Science Foundation's Very Large Array in New Mexico. Together with Spitzer and James Clerk Maxwell data, these observations allowed the astronomers to calculate a star-forming rate of about 1,000 to 4,000 stars per year. At that rate, the galaxy needs only 50 million years, not very long on cosmic timescales, to grow into a galaxy equivalent to the most massive ones we see today.
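The 50-million-year growth time can be sanity-checked with simple arithmetic. The per-star mass (roughly one solar mass) and the target galaxy mass (on the order of 1e11 solar masses) are illustrative assumptions, not values given in the article:

```python
# How long does it take to build a massive galaxy at the quoted rates?
rate_low, rate_high = 1_000, 4_000   # stars per year (from the article)
target_mass = 1e11                   # solar masses for a very massive galaxy; assumed
mass_per_star = 1.0                  # solar masses per new star; assumed average

years_fast = target_mass / (rate_high * mass_per_star)   # best case
years_slow = target_mass / (rate_low * mass_per_star)    # worst case
print(f"{years_fast/1e6:.0f} to {years_slow/1e6:.0f} million years")
# Under these assumptions, the range brackets the article's ~50 Myr figure.
```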

While galaxies in our nearby universe can produce stars at similarly high rates, the farthest one known before now was about 11.7 billion light-years away, or a time when the universe was 1.9 billion years old.

"Before now, we had only seen galaxies form stars like this in the teenaged universe, but this galaxy is forming when the universe was only a child," said Capak. "The question now is whether the majority of the very most massive galaxies form very early in the universe like the Baby Boom galaxy, or whether this is an exceptional case. Answering this question will help us determine to what degree the Hierarchical Model of galaxy formation still holds true."

"The incredible star-formation activity we have observed suggests that we may be witnessing, for the first time, the formation of one of the most massive elliptical galaxies in the universe," said co-author Nick Scoville of Caltech, the principal investigator of the Cosmic Evolution Survey, also known as Cosmos. The Cosmos program is an extensive survey of a large patch of distant galaxies across the full spectrum of light.

"The immediate identification of this galaxy with its extraordinary properties would not have been possible without the full range of observations in this survey," said Scoville.

NASA's Jet Propulsion Laboratory, Pasadena, Calif., manages the Spitzer Space Telescope mission for NASA's Science Mission Directorate, Washington. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology, also in Pasadena. Caltech manages JPL for NASA. For more information about Spitzer, visit http://www.spitzer.caltech.edu/spitzer and http://www.nasa.gov/spitzer.

Kathy Svitil

LIGO Observations Probe the Dynamics of the Crab Pulsar

PASADENA, Calif.-- The search for gravitational waves has revealed new information about the core of one of the most famous objects in the sky: the Crab Pulsar in the Crab Nebula. An analysis by the international LIGO (Laser Interferometer Gravitational-Wave Observatory) Scientific Collaboration to be submitted to Astrophysical Journal Letters has shown that no more than 4 percent of the energy loss of the pulsar is caused by the emission of gravitational waves.

The Crab Nebula, located 6,500 light-years away in the constellation Taurus, was formed in a spectacular supernova explosion in 1054. According to ancient sources, including Chinese texts that referred to it as a "guest star," the explosion was visible in daylight for more than three weeks, and may briefly have been brighter than the full moon. At the heart of the nebula remains an incredibly rapidly spinning neutron star that sweeps two narrow radio beams across the Earth each time it turns. The lighthouse-like radio pulses have given the star the name "pulsar."

"The Crab Pulsar is spinning at a rate of 30 times per second. However, its rotation rate is decreasing rapidly relative to most pulsars, indicating that it is radiating energy at a prodigious rate," says Graham Woan of the University of Glasgow, who co-led the science group that used LIGO data to analyze the Crab Pulsar, along with Michael Landry of the LIGO Hanford Observatory. Pulsars are almost perfect spheres made up of neutrons and contain more mass than the sun in an object only 10 km in radius. The physical mechanisms for energy loss and the accompanying braking of the pulsar spin rate have been hypothesized to be asymmetric particle emission, magnetic dipole radiation, and gravitational-wave emission.

Gravitational waves are ripples in the fabric of space and time and are an important consequence of Einstein's general theory of relativity. A perfectly smooth neutron star will not generate gravitational waves as it spins, but the situation changes if its shape is distorted. Gravitational waves would have been detectable even if the star were deformed by only a few meters, which could arise because its semisolid crust is strained or because its enormous magnetic field distorts it. "The Crab neutron star is relatively young and therefore expected to be less symmetrical than most, which means it could generate more gravitational waves," says Graham Woan.

The scenario that gravitational waves significantly brake the Crab pulsar has been disproved by the new analysis.

Using published timing data about the pulsar rotation rate from the Jodrell Bank Observatory, LIGO scientists monitored the neutron star from November 2005 to August 2006 and looked for a synchronous gravitational-wave signal using data from the three LIGO interferometers, which were combined to create a single, highly sensitive detector.

The analysis revealed no signs of gravitational waves. But, say the scientists, this result is itself important because it provides information about the pulsar and its structure.

"We can now say something definite about the role gravitational waves play in the dynamics of the Crab Pulsar based on our observations," says David Reitze, a professor of physics at the University of Florida and spokesperson for the LIGO Scientific Collaboration. "This is the first time the spin-down limit has been broken for any pulsar, and this result is an important milestone for LIGO."

Michael Landry adds, "These results strongly imply that no more than 4 percent of the pulsar's energy loss is due to gravitational radiation. The remainder of the loss must be due to other mechanisms, such as a combination of electromagnetic radiation generated by the rapidly rotating magnetic field of the pulsar and the emission of high-velocity particles into the nebula."
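To put rough numbers on Landry's statement, the pulsar's total energy-loss rate can be estimated from its spin-down: E_dot = 4π² I f |f_dot|. The 30 rotations per second come from the article; the moment of inertia and spin-down rate used below are canonical textbook values, assumed here for illustration:

```python
import math

# Spin-down luminosity of the Crab Pulsar and the LIGO 4-percent cap.
I = 1e38            # moment of inertia, kg m^2; assumed canonical neutron-star value
f = 30.0            # spin frequency, Hz (from the article)
f_dot = 3.7e-10     # spin-down rate, Hz/s; assumed literature value

E_dot = 4 * math.pi**2 * I * f * f_dot    # total energy-loss rate, watts
gw_cap = 0.04 * E_dot                     # LIGO's upper limit on the GW share
print(f"Spin-down luminosity ~{E_dot:.1e} W; GW emission < {gw_cap:.1e} W")
```

With these inputs the total comes out near 4e31 watts, of which no more than a few times 1e30 watts can be carried away by gravitational waves.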

"LIGO has evolved over many years to its present capability to produce scientific results of real significance," says Jay Marx of the California Institute of Technology, LIGO's executive director. "The limit on the Crab Pulsar's emission of gravitational waves is but one of a number of important results obtained from LIGO's recent two-year observing period. These results only serve to further our anticipation for the spectacular science that will come from LIGO in the coming years."

"Neutron stars are very hot when they are formed in a supernova, and then they cool rapidly and form a semisolid crust. Our observation of a relatively young star like the Crab is important because it shows that this skin, if it had irregularities when it first 'froze,' has by now become quite smooth," says Bernard F. Schutz, director of the Albert Einstein Institute in Germany.

Joseph Taylor, a Nobel Prize-winning radio astronomer and professor of physics at Princeton University, says, "The physics world has been waiting eagerly for scientific results from LIGO. It is exciting that we now know something concrete about how nearly spherical a neutron star must be, and we have definite limits on the strength of its internal magnetic field."

The LIGO project, which is funded by the National Science Foundation, was designed and is operated by Caltech and the Massachusetts Institute of Technology for the purpose of detecting gravitational waves, and for the development of gravitational-wave observations as an astronomical tool.

Research is carried out by the LIGO Scientific Collaboration, a group of 600 scientists at universities around the United States and in 11 foreign countries. The LIGO Scientific Collaboration interferometer network includes the LIGO interferometers (the 2 km and 4 km detectors in Hanford, Washington, and a 4 km instrument in Livingston, Louisiana) and the GEO600 interferometer, located in Hannover, Germany, and designed and operated by scientists from the Max Planck Institute for Gravitational Physics and partners in the United Kingdom funded by the Science and Technology Facilities Council (STFC).

The next major milestone for LIGO is the Advanced LIGO Project, slated for operation in 2014. Advanced LIGO, which will utilize the infrastructure of the LIGO observatories, will be 10 times more sensitive. Advanced LIGO will incorporate advanced designs and technologies that have been developed by the LIGO Scientific Collaboration. It is supported by the NSF, with additional contributions from the U.K. STFC and the German Max Planck Gesellschaft.

The increased sensitivity will be important because it will allow scientists to detect cataclysmic events such as black-hole and neutron-star collisions at ten-times-greater distances and to search for much smaller "hills" on the Crab Pulsar.
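The link between sensitivity and detection rate is simple geometry: the detectable range grows in proportion to the sensitivity, and the volume of space surveyed grows as the cube of that range. A minimal sketch:

```python
# Why a 10x sensitivity gain matters: for sources spread roughly
# uniformly through space, the expected event rate scales with the
# surveyed volume, which goes as range cubed.
range_gain = 10                      # from the article: ten-times-greater distances
volume_gain = range_gain ** 3        # surveyed volume scales as range^3
print(f"{range_gain}x range -> {volume_gain}x volume, "
      f"so roughly {volume_gain}x the expected event rate")
```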

Kathy Svitil

Astrophysicist Wins One of First Kavli Prizes

PASADENA, Calif.--Quasars--now known to be compact halos of matter that surround the massive black holes of distant galaxies--were once thought to be stars in our own galaxy. Now, Maarten Schmidt, who showed that quasars are thousands of millions of light-years away from Earth, has been named one of the first recipients of the $1 million Kavli Prize for his contributions to the field of astrophysics.

Schmidt, the Moseley Professor of Astronomy, Emeritus, at the California Institute of Technology, is one of seven recipients of the new Kavli Prize. He shares the astrophysics award with Donald Lynden-Bell, of Cambridge University, who was also a postdoc at Caltech from 1960 to 1962.

The seven pioneering scientists are being recognized for transforming human knowledge in the fields of nanoscience, neuroscience, and astrophysics. The prize was established through a partnership between the Norwegian Academy of Science and Letters, the Kavli Foundation, and the Norwegian Ministry of Education and Research.

Schmidt and Lynden-Bell are honored for their contributions to understanding the nature of quasars. In making their award, the members of the Kavli Astrophysics Prize Committee said, "Maarten Schmidt and Donald Lynden-Bell's seminal work dramatically expanded the scale of the observable universe and led to our present view of the violent universe in which massive black holes play a key role."

In 1963, using the 200-inch Hale Telescope on Palomar Mountain, Schmidt studied the visible-light spectrum of quasar 3C273. He discovered that it had a very high redshift, which meant it was moving away from Earth at 47,000 kilometers per second. Examination of the spectrum of another quasar revealed a motion double that of 3C273. Schmidt calculated that these objects lay beyond our galaxy, and he immediately realized that they must be emitting not only far more energy than our sun, but hundreds of times more energy than the entire Milky Way galaxy, which contains roughly 100 billion stars. It was later determined that this enormous energy comes from a volume no larger than the size of our own solar system. Subsequent investigations of the evolution and distribution of quasars led Schmidt to discover that they were more abundant when the universe was younger.
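The quoted 47,000 km/s follows from 3C273's measured redshift of z ≈ 0.158 (a literature value, not stated in the article) via the low-velocity approximation v ≈ cz:

```python
# Recession velocity of 3C273 from its redshift, using v ≈ c*z
# (adequate for z << 1; higher redshifts need the relativistic formula).
c = 299_792.458      # speed of light, km/s
z = 0.158            # redshift of 3C273; assumed literature value

v = c * z            # recession velocity, km/s
print(f"v ≈ {v:,.0f} km/s")   # close to the 47,000 km/s in the article
```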

"I'm delighted with the award. It is in particular a most pleasant surprise after so many years," Schmidt says. "After all, it's been 45 years since I found the red shift in quasar 3C273."

Schmidt was the executive officer for astronomy at Caltech from 1972 to 1975, the chair of the Division of Physics, Mathematics and Astronomy for the following three years, and then served as the last director of the Hale Observatories from 1978 to 1980. Despite being named an emeritus professor 12 years ago, he has continued his research, working to find the redshift beyond which there are no quasars.

Schmidt's fellow Kavli Prize recipient in astrophysics, Lynden-Bell, is honored for his ideas that the enormous energy of quasars arises from frictional heating in a gaseous disk of material rotating around giant black holes. The prediction that quasars are found at the centers of galaxies was later confirmed by high-resolution observations with the Hubble Space Telescope.

The Kavli Prizes focus on the science of the greatest physical dimensions of space and time, the science of the smallest dimensions of systems of atoms and molecules, and the science of the most complex systems, especially living organisms. Dedicated to the advancement of science for the benefit of humanity, the Kavli Foundation supports scientific research, honors scientific achievement, and promotes public understanding of scientists and their work. Fred Kavli, a Norwegian-born physicist, business leader, inventor, and philanthropist, moved to the U.S. shortly after receiving his college degree in physics and started a company that became one of the world's largest suppliers of sensors for aeronautic, automotive, and industrial applications. He created the Kavli Foundation in 2002, and has since funded the establishment of 15 research institutes worldwide, including the Kavli Nanoscience Institute at Caltech.

This year's Kavli Prize winners are the first to receive the award in a biennial event that will be celebrated in Fred Kavli's native city, Oslo. The prizes will be presented by HRH Crown Prince Haakon at an award ceremony in Oslo Concert Hall on September 9. For more information on the prizes and recipients, please visit http://www.kavliprize.no/


Elisabeth Nadin

Stellar Death Caught in the Act

PALOMAR MOUNTAIN, Calif.--Thanks to a fortuitous observation with NASA's Swift satellite, astronomers for the first time have caught a star in the act of exploding. Astronomers have previously observed thousands of stellar explosions, known as supernovae, but they have always seen them after the fireworks were well underway.

"For years we have dreamed of seeing a star just as it was exploding, but actually finding one is a once-in-a-lifetime event," says Alicia Soderberg, a Hubble and Carnegie-Princeton Fellow at Princeton University, who is leading the group studying this explosion. "This newly born supernova is going to be the Rosetta Stone of supernova studies for years to come."

Led by Shrinivas Kulkarni, MacArthur Professor of Astronomy and Planetary Science and director of Caltech Optical Observatories, Caltech astronomers including graduate student Bradley Cenko and others undertook detailed observations with the automated Palomar 60-inch and the 200-inch telescopes. "It may well be that supernovae occur more commonly than we thought," remarked Kulkarni.

A typical supernova occurs when the core of a massive star runs out of nuclear fuel and collapses under its own gravity to form an ultradense object known as a neutron star. The newborn neutron star compresses and then rebounds, triggering a shock wave that plows through the star's gaseous outer layers and blows the star to smithereens. Astronomers thought for nearly four decades that this shock "breakout" produces bright X-ray emission lasting a few minutes.

But until this discovery, astronomers had never observed this signal. Instead, they have observed supernovae brightening days or weeks later, when the expanding shell of debris is energized by the decay of radioactive elements forged in the explosion. "Seeing the shock breakout in X-rays can give a direct view of the exploding star in the last minutes of its life and also provide a signpost to which astronomers can quickly point their telescopes to watch the explosion unfold," says Edo Berger, also a Hubble and Carnegie-Princeton Fellow.

Soderberg's discovery of the first shock breakout can be attributed to luck and Swift's unique design. On January 9, 2008, Soderberg and Berger were using Swift to observe a supernova known as SN 2007uy in the spiral galaxy NGC 2770, located 90 million light-years from Earth in the constellation Lynx. At 9:33 a.m. EST they spotted an extremely bright five-minute X-ray outburst in NGC 2770. They quickly recognized that the X-rays were coming from another location in the same galaxy.

In a paper appearing in the journal Nature on May 22, Soderberg and 38 colleagues show that the energy and pattern of the X-ray outburst is consistent with a shock wave bursting through the surface of the progenitor star. This marks the birth of the supernova now known as SN 2008D.

Although astronomers were lucky that Swift was observing NGC 2770 just at the moment when SN 2008D's shock wave was blowing up the star, Swift is well equipped to study such an event because of its multiple instruments observing in gamma rays, X-rays, and ultraviolet light. "It was a gift of nature for Swift to be observing that patch of sky when the supernova exploded. But thanks to Swift's flexibility, we have been able to trace its evolution in detail every day since," says Swift lead scientist Neil Gehrels of NASA's Goddard Space Flight Center in Greenbelt, Maryland.

Due to the significance of the X-ray outburst, Soderberg immediately mounted an international observing campaign to study SN 2008D. Observations were made with major telescopes such as the Hubble Space Telescope, the Chandra X-ray Observatory, the Very Large Array in New Mexico, the Gemini North telescope in Hawaii, the Keck I telescope in Hawaii, the 200-inch and 60-inch telescopes at the Palomar Observatory in California, and the 3.5-meter telescope at the Apache Point Observatory in New Mexico.

The combined observations helped Soderberg and her colleagues pin down the energy of the initial X-ray outburst, which will help theorists better understand supernovae. The observations also show that SN 2008D is an ordinary Type Ibc supernova, which occurs when a massive, compact star explodes. Significantly, radio and X-ray observations confirmed that the event was a supernova explosion, and not a related, rare type of stellar outburst known as a gamma-ray burst.

For an animated view of the supernova explosion, visit: http://www.yousendit.com/download/MlZmZGVTVnN5UkUwTVE9PQ

Elisabeth Nadin

Thirty-Meter Telescope Focuses on Two Candidate Sites

PASADENA, Calif.--After completing a worldwide survey of astronomical sites for the Thirty-Meter Telescope (TMT) that was unprecedented in its rigor and detail, the TMT Observatory Corporation board of directors has selected two outstanding sites, one in each hemisphere, for further consideration. Cerro Armazones lies in Chile's Atacama Desert, and Mauna Kea is on Hawai'i Island.

The TMT observatory, which will be capable of peering back in space and time to the era when the first stars and galaxies were forming and will be able to directly image planets orbiting other stars, will herald a new generation of telescopes.

To ensure that proposed TMT sites would provide the greatest advantage to the telescope's capabilities, a global satellite survey was conducted, from which a small sample of outstanding sites was chosen for further study using ground-based test equipment. This ground-based study of two sites in the northern hemisphere and three in the southern was the most comprehensive survey of its kind ever undertaken.

Atmospheric turbulence above each candidate site was continuously monitored for up to four years; wind characteristics, temperature variations, amounts of water vapor, and other meteorological data were also recorded at some of the candidate sites. Based upon this campaign, the TMT project will now further evaluate the best site in the northern hemisphere and the best site in the southern hemisphere.

"All five sites proved to be outstanding for carrying out astronomical observations," said Edward Stone, Caltech's Morrisroe Professor of Physics and vice chairman of the TMT board. "I want to congratulate the TMT project team for conducting an excellent testing program, not only for TMT but for the benefit of astronomical research in the future." In addition to the "astronomical weather" at the sites, other considerations in the final selection will include the environment, accessibility, operations costs, and complementarities with other nearby astronomy facilities.

The next step in the site analysis process is the preparation of an Environmental Impact Statement (EIS) that will thoroughly evaluate all aspects, including environmental, cultural, socio-economic, and financial, of constructing and operating the Thirty-Meter Telescope in Hawai'i. An environmental impact statement for Cerro Armazones has already been completed and submitted to the Chilean government for their review.

The community-based Mauna Kea Management Board, which oversees the management of the Mauna Kea summit in coordination with the University of Hawai'i at Hilo, concurs that the Thirty-Meter Telescope should proceed with its EIS process. Regardless of whether Mauna Kea is selected as the Thirty-Meter Telescope site, information generated from the EIS will be useful in the management of Mauna Kea.

Henry Yang, TMT board chair and chancellor of UC Santa Barbara, expressed the gratitude of the board. "The selection of these top two candidate sites is an exciting milestone in the Thirty-Meter Telescope's journey from vision to reality. We are grateful for the tireless efforts of our project team and the tremendous vision and support of the Moore Foundation and our international partners that have brought us to this point. We look forward to moving ahead rapidly and with all due diligence toward the selection of our preferred site."

The TMT is currently in the final stages of an $80 million design phase. The plan is to initiate construction in 2010, with first light in early 2018. The project is a partnership among the University of California, the California Institute of Technology, and ACURA, an organization of Canadian universities. The Gordon and Betty Moore Foundation has provided $50 million for the design phase of the project and has pledged an additional $200 million for the construction of the telescope; Caltech and the University of California will each seek to raise matching funds of $50 million to bring the construction total to $300 million.

"We look forward to the discussions with the people of Hawai'i and Chile regarding the opportunities to open a new era in astronomy in one of these two world capitals of astronomy," says Professor Ray Carlberg, the Canadian Large Optical Telescope project director and a TMT board member. "Canadian scientists have partnered in the extensive site testing carried out by TMT and we are very pleased to see that it has led to two great options for TMT."

TMT gratefully acknowledges support for design and development from the following: Gordon and Betty Moore Foundation, Canada Foundation for Innovation, Ontario Ministry of Research and Innovation, National Research Council of Canada, Natural Sciences and Engineering Research Council of Canada, British Columbia Knowledge Development Fund, Association of Universities for Research in Astronomy, and the National Science Foundation (USA). 

Elisabeth Nadin

Caltech Helps Open the Universe in "WorldWide Telescope"

PASADENA, Calif.-- Panoramic images of the sky obtained at Palomar Observatory and by the Two Micron All Sky Survey (2MASS), plus pointed observations from the Spitzer Space Telescope, form a significant part of the "World Wide Telescope" (WWT), a new product released today by Microsoft aimed at bringing exploration of the Universe and its many wonders to the general public.

WorldWide Telescope is a rich Web application that combines imagery from the best ground- and space-based observatories across the world, stitching together terabytes of high-resolution images of celestial bodies and displaying them in a way that relates to their actual relative position in the sky. Using their own computers, people from all walks of life can freely browse through the solar system, galaxy, and beyond. They can choose which telescope they want to look through, including NASA's Hubble, Chandra, and Spitzer Telescopes, to view the locations of planets in the night sky--in the past, present or future--and the universe through different wavelengths of light to reveal hidden structures in other parts of the galaxy. Taken as a whole, the application provides a top-to-bottom view of the science of astronomy.

"The progression from William and Caroline Herschel's visual catalogs in the late 1700s to digital pictures available to anyone with a home computer shows the amazing advances in astronomy over two centuries, and also the continuity of our subject," says Wallace Sargent, Ira S. Bowen Professor of Astronomy at the California Institute of Technology. Scientists at Caltech provided many of the images displayed in WWT and are working with the Microsoft team to enrich and expand the content and the educational possibilities offered by the application.

The WWT combines cosmic imagery and educational content from many sources, including major ground-based sky surveys. One of those was the survey conducted at Palomar Observatory in visible light; another was the 2MASS survey in the infrared. Both projects are managed and distributed at Caltech's Infrared Processing and Analysis Center (IPAC).

Palomar Observatory, which is operated by Caltech, has conducted a number of major sky surveys since the 1950s, initially with photographic plates, and now with modern digital detectors. The surveys are conducted using the 48-inch Samuel Oschin Telescope.

Images of the northern sky used in the WWT are based on the second major photographic Palomar Sky Survey (POSS-II), conducted in the late 1980s and early 1990s. A digital version of this survey was produced in collaboration with the Space Telescope Science Institute in Baltimore, Maryland, and processed and calibrated at Caltech under the leadership of Caltech Professor of Astronomy S. George Djorgovski. This survey has detected over 50 million galaxies and about a billion stars, as well as many other interesting objects. Additional images for the WWT were provided by the currently ongoing Palomar-Quest digital sky survey. All of the images were processed at Caltech's Center for Advanced Computing Research (CACR). "Astronomy is now a computationally intensive field. We hope to use the WWT as a gateway to learning, not just about astronomy, but also about information technology and computational thinking, which are so important for all aspects of modern scholarship and society," says Roy Williams of CACR.

Using data collected from twin 1.3-meter telescopes in Arizona and Chile over a 3.5-year period, 2MASS produced the first high-resolution digital survey of the complete infrared sky, providing the international astronomical community with an unprecedented global view of the Milky Way and nearby galaxies. 2MASS was the most thorough census ever made of the Milky Way galaxy and the nearby universe. It detected infrared wavelengths, which are longer than the red light in the rainbow of visible colors. Infrared light penetrates dust more effectively than visible light, so it is particularly useful for detecting objects obscured within the Milky Way, as well as the faint heat of very cool objects that give off very little visible light of their own.

"Humans have always been fascinated by the universe, by the starry sky," says Djorgovski. "We are hoping to help reignite that sense of wonder and exploration among students and curious people everywhere."

More information is available at the following:

The Digital Palomar Observatory Sky Survey website: http://www.astro.caltech.edu/~george/dposs/

The Palomar-Quest sky survey website: http://palquest.org/

The "Big Picture" outreach website: http://bigpicture.caltech.edu/

Palomar Observatory website: http://www.astro.caltech.edu/palomar/

The Samuel Oschin Telescope: http://www.astro.caltech.edu/palomar/sot.html

The Center for Advanced Computing Research website: http://www.cacr.caltech.edu/

Caltech's Infrared Processing and Analysis Center: http://ipac.caltech.edu

WorldWide Telescope can be accessed at http://www.worldwidetelescope.org/

Wallace Sargent's description of the POSS-II survey: http://www.astro.caltech.edu/~wws/poss2.html

Kathy Svitil

A New Take on Microbrewing

PASADENA, Calif.--Since Babylonian times, a still has provided the means to turn grain, fruit, or vegetables into an intoxicating drink. Today, a still may provide a solution to the more complex problem of how to detect diseases.

California Institute of Technology researchers have crafted the world's tiniest still, which concentrates scant amounts of molecules for easier detection. This device may help to overcome difficulties in tracking extremely low-abundance molecular biomarkers, which can indicate disease.

"Distillation has been around for millennia, and it's a well-established technology. There weren't many new avenues to develop because it's so well studied," comments David Boyd, a lecturer in mechanical engineering at Caltech and lead author of a paper describing the new approach to distillation in this month's issue of Analytical Chemistry. "But we've created a new space for distillation because you don't need to boil the fluid anymore."

Stills can separate components of a mixture as well as concentrate materials dissolved in liquid, and are used, among other things, to purify seawater, to separate crude oil, and to amplify alcohol content. Now, with nanoparticles of gold and a microbubble, Boyd and his colleagues have created a microscale still that operates at room temperature and pressure, making it potentially useful in biomedical devices.

The still is a microfluidic chip, with a microns-wide channel, thinner than a hair, etched into silicone rubber and serving as the microplumbing for tiny volumes of fluid. But unlike typical microfluidic chips, the channel is sealed by a glass slide studded with gold nanoparticles. Into the channel is introduced a microbubble wide enough to form an air gap in the fluid. Energy from a laser no more powerful than an average laser pointer heats the gold particles, which quickly transfer the heat to the liquid on one side of the bubble, turning it to vapor.

The vaporized liquid passes from the warmer to the cooler side of the bubble, where it condenses. "Only the most volatile molecules cross over the bubble, but everything else is left behind," Boyd describes. In conventional distillation, the same type of separation is achieved either by heating the entire volume of fluid to boil off individual components, or by reducing the gas pressure above the liquid to allow components to more easily escape, he explains.

With the new setup, the team discovered the same process takes place with a very slight change in fluid temperature and without reducing air pressure. They demonstrated the method with dye in ethanol and water, creating a distilled solution of concentrated dye on one side of the bubble and clear liquid on the other.

This microscopic still overcomes some major obstacles in microscience. First, it allows distillation of delicate molecules and organisms that can't survive high temperatures and a lack of dissolved gasses. Second, while nanoparticles have often been useful floating freely in fluid, this can bring unwanted side effects, remarks coauthor David Goodwin, professor of mechanical engineering and applied physics at Caltech. "It's difficult to control the concentrations of nanoparticles, they can interact with organisms or other particles in a way you don't want, and they're hard to get out once they're there," he says.

Instead, the team anchored the particles to the base of the chip, and took advantage of unique heating properties of gold in its nanoform. Just as gold particles in stained glass windows absorb green light strongly, making the windows appear red, in the still they absorb the green frequency of a cheap laser and, as Goodwin describes, "act like antennas for visible light." But a laser is only one option for powering the still; Goodwin notes that any low-power heat source, like a wire or resistor, would work.

The bubble, which is key to the novel distillation method, was also once a dreaded entity. "Typically air bubbles are a real annoyance in microfluidics. They pin the flow of fluid and are hard to get rid of," comments Boyd. "We've learned to love them." The team even managed to use bubbles to pump fluid around corners in the microchannels.

Ultimately, the scientists hope that this tiny still can serve in the detection or monitoring of biological processes. They envision a sensor, perhaps even worn as a patch, that will concentrate molecules of interest so they can be detected. Patients with diabetes, for example, could wear one to constantly monitor their blood-sugar levels. As Goodwin describes, "Distillation is hard to do on a chip, but when you put it on a chip, it becomes a biomedical monitor."

Other authors on the paper are James Adleman, a graduate student in electrical engineering, and Demitri Psaltis, Caltech's Myers Professor of Electrical Engineering. 


Elisabeth Nadin

Physicists Transcribe Entanglement into and out of a Quantum Memory

PASADENA, Calif.--Scientists at the California Institute of Technology have laid the groundwork for a crucial step in quantum information science. They show how entanglement, an essential property of quantum mechanics, can be generated between beams of light, stored in a quantum memory, and mapped back into light with the push of a button.

In the March 6 issue of the journal Nature, Caltech's Valentine Professor of Physics H. Jeff Kimble and his colleagues demonstrate for the first time an important capability required for the control of quantum information and quantum networks, namely the coherent conversion of photonic entanglement into and out of separated quantum memories.

Entanglement lies at the heart of quantum physics. It is a state in which parts of a composite system are more strongly correlated than any classical counterparts could be, regardless of the distance separating them. Entanglement is a critical resource for diverse applications in quantum information science, such as quantum metrology, computation, and communication. Quantum networks rely on entanglement for the teleportation of quantum states from place to place.

In a quest to turn these abstract ideas into real laboratory systems and to distribute entanglement to remote locations (even on a continental scale), Kimble explains that quantum physicists have studied ways to propagate photonic information into and out of quantum memory using a system called a quantum repeater, invented in 1998 by H. Briegel, J.I. Cirac, and P. Zoller at the University of Innsbruck. Until now, work in Kimble's group on the realization of a quantum repeater with atomic ensembles relied upon the probabilistic creation of entanglement. In this setting entanglement between two clouds of atoms was generated probabilistically but with an unambiguous heralding event.

While such systems hold the potential for scalable quantum networks, it has been difficult for Kimble's Quantum Optics Group to apply such schemes to certain protocols necessary for quantum networks, such as entanglement connection. Now, with the new protocol and future improvements, "We can push a button and generate entanglement," says physics graduate student Kyung Soo Choi, one of four authors of the Caltech experiment.

While entanglement has traditionally been carried out with photons in an attempt to connect two distant systems, these particles of light are difficult to store because they interact only weakly with matter when taken one by one. "A quantum memory for light is an essential ingredient for achieving scalable quantum networks with photons," Choi says. "The question now is, 'How do you change the entangled state of light into an entanglement of matter and back into light?'" This was not possible for any physical system until now.

The new work, Choi says, "is a proof-of-principle demonstration that entanglement between material systems can be generated deterministically by mapping the entanglement of light to and from two spatially separated quantum memories." The Caltech team separated the processes for generating and storing the entanglement, thereby breaking a previous inherent link between the quality and probability of state preparation. "In a general context, our work represents an important step in laboratory capabilities for the creation and manipulation of entangled states of light and matter. We hope that our results will be useful as a tool in the effort to realize quantum repeaters and thereby scalable quantum networks over long distances," remarks Kimble.

In the Caltech experiment, a single photon is first split, generating an entangled state of light with quantum amplitudes for the photon to propagate along two distinct paths, taking both at once. The Caltech team in turn transcribed, or mapped, the entanglement onto distinct atomic ensembles separated by one millimeter. To create the interface between the light and matter, the team employed laser-cooled cesium atoms whose atomic states interact with a control laser to create destructive quantum interference, making the atomic ensembles either invisible or highly opaque to the input light. The mechanism, called electromagnetically induced transparency and pioneered by S. Harris at Stanford University, manipulates the speed of light for the incoming entangled photon, and that step kicks off the entire procedure.

"We can reduce the speed of light to the speed of a train, and then in fact stop the light inside the matter by slowly turning off the control laser, where now the quantum information--the entangled state of light--is stored inside the atomic ensembles," Choi describes. "By turning on the control laser again, we can reversibly accelerate the 'stopped' light back to the speed of light and restore the quantum entanglement as propagating beams of light."

In this experiment, the photonic entanglement was mapped into the atomic ensembles in about 20 nanoseconds and then stored there for one microsecond, with storage times extendable up to 10 microseconds. The photonic entanglements of the input and output of the quantum interface were explicitly quantified, with a conversion efficiency of 20 percent. However, the researchers emphasize, real-world realization of a quantum network remains far out of reach even with these parameters and state-of-the-art quantum control. Choi comments, "Further improvements in quantum control and storage capabilities in matter-light interfaces will lead to fruitful and exciting discoveries in Quantum Information Science, including for the realization of quantum networks."

In addition to Kimble and Choi, other authors are Hui Deng, a postdoctoral scholar at the Center for the Physics of Information (whose contributions to the work equaled that of Choi's); and Julien Laurat, a former Caltech physics postdoctoral scholar who is now an associate professor at Laboratoire Kastler Brossel (Universite P. et M. Curie, Ecole Normale Superieure and CNRS) in Paris, France.



Elisabeth Nadin

High-Speed Data Transfer System Garners Outreach Award

PASADENA, Calif.--The Corporation for Education Network Initiatives in California (CENIC) has rewarded researchers at the California Institute of Technology for better connecting physicists worldwide. Lead project scientist Harvey Newman, professor of physics at Caltech, Julian Bunn of the Caltech Center for Advanced Computing Research, and their international team of researchers will receive a trophy for Innovations in Networking at a ceremony in Oakland, California, on March 11.

Each year, CENIC, which designs, implements, and operates a high-bandwidth, high-capacity Internet network specifically designed for faculty, staff, and students in California's educational and research communities, solicits nominations for its awards. The Caltech team submitted a nomination for their project, called UltraLight, based on exciting recent developments, Bunn says.

UltraLight was developed in 2004 in large part to support the decades of research that will emerge from the Large Hadron Collider (LHC) at CERN in Geneva, Switzerland. The project provides advanced global systems and networks, and this summer will start transferring data as the LHC becomes operational.

Physicists at the collider face the unprecedented challenges of handling globally distributed datasets that will likely grow to hundreds of petabytes by 2010, as well as petaflops of distributed computing for collaborative data analysis by global communities of thousands of scientists. UltraLight will aid scientists by monitoring, managing, and optimizing use of the network in real time.

UltraLight exhibited its capabilities in a showroom demonstration for CENIC during a supercomputing conference in November 2007, sustaining disk-to-disk data transfers of up to 88 gigabits per second (Gbps) between Caltech and Reno, Nevada, for more than a day. But data flows from the LHC experiments will be the first time that UltraLight will strut its stuff for scientists hungry for data.
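To put the figures in the article in perspective, a back-of-the-envelope calculation (purely illustrative arithmetic; the 88 Gbps rate is the one quoted above) shows how long a sustained link at that speed would take to move a single petabyte:

```python
# Illustrative estimate: time to move 1 petabyte at the 88 Gbps
# disk-to-disk rate UltraLight sustained in the 2007 demonstration.

SUSTAINED_RATE_BPS = 88e9   # 88 gigabits per second
PETABYTE_BITS = 1e15 * 8    # 10^15 bytes, expressed in bits

seconds = PETABYTE_BITS / SUSTAINED_RATE_BPS
hours = seconds / 3600

print(f"1 PB at 88 Gbps takes {seconds:,.0f} s (~{hours:.1f} hours)")
```

Roughly a day per petabyte, even at record speed, which is why datasets of hundreds of petabytes make intelligent network management a scientific necessity rather than a convenience.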

"The detector itself is like an onion--each layer is good at detecting different types of particles, and has electronics that read out bits and bytes that go onto an online database," Bunn explains. Those bits and bytes will then travel to storage at Tier 1 computing facilities, whence they can be analyzed at Tier 2 computing centers around the world. With UltraLight, Bunn explains, "physicists can quickly move the data out to these centers to reconstruct at home what was detected at CERN."

Another feature of the UltraLight project is the way it treats all computing resources as part of a worldwide network readily available to anyone who needs it. "To most scientists, the network is someone else's provision," Bunn says. "We want to make it easy for physicists to make their requests on the network. Our collaborators in Rio or São Paulo can now very easily request a dataset and have it delivered in a timely manner."

One of the tools developed in the UltraLight project is an interactive monitoring and control system called MonALISA (Monitoring Agents using a Large Integrated Services Architecture), which can, for example, monitor and display the activity and speed of all network links via a map that looks much like a flight path map.

The CENIC Innovations in Networking awards are split into four categories, and this year for the first time CENIC declared a tie in Experimental/Developmental Applications between UltraLight and another contender, CineGrid, which facilitates the exchange of digital media over a network. Bunn will accept the trophy and present the group's project at the CENIC 2008: Lightpath to the Stars conference in Oakland on Tuesday, March 11.

To learn more about UltraLight, visit http://www.ultralight.org
To explore the interactive system, visit http://monalisa.caltech.edu/

Elisabeth Nadin

U. S. Experiment Takes the Lead in the Competitive Race to Find Dark Matter

PASADENA, Calif.--Scientists of the Cryogenic Dark Matter Search (CDMS) experiment, including researchers from the California Institute of Technology, today announced that they have regained the lead in the worldwide race among research groups to find the particles that make up dark matter. The CDMS experiment, which is being conducted a half-mile underground in a mine in Soudan, Minnesota, again sets the world's best constraints on the properties of dark matter candidates.

Weakly interacting massive particles, or WIMPs, are leading candidates for the building blocks of dark matter, the as-yet-unknown form of matter that accounts for 85 percent of the entire mass of the universe. Hundreds of billions of WIMPs may have passed through your body as you read these sentences.

The CDMS experiment is located in the Soudan Underground Laboratory, shielded from cosmic rays and other particles that could mimic the signals expected from dark matter particles. Scientists operate the ultrasensitive CDMS detectors under clean-room conditions at a temperature of about 40 millikelvins, or 0.04 degrees Celsius above absolute zero. Physicists believe that WIMPs, if they exist, would travel right through ordinary matter, rarely leaving a trace. If WIMPs were to cross the CDMS detector, occasionally one would hit the nucleus of an atom of the element germanium in the crystal grid of the detector. Like a hammer hitting a bell, the collision would create vibrations of the grid, which scientists would be able to detect. The experiment is sensitive enough to hear WIMPs even if they hit the germanium crystal detector only twice per year.

The scientists did not observe such signals, allowing the CDMS experiment to set limits on the properties of WIMPs.

Scientists predict that WIMPs might interact with ordinary matter at rates similar to those of low-energy neutrinos, elusive subatomic particles discovered in 1956. But to account for all of the dark matter in the universe and the gravitational pull it produces, WIMPs must have masses about a billion times larger than those of neutrinos. The CDMS collaboration found that if WIMPs have 100 times the mass of protons (about 100 GeV/c^2) they collide with one kilogram of germanium less than a few times per year; otherwise, the CDMS experiment would have detected them.
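The mass comparison above can be checked with a quick unit conversion. The constants below are standard physical values (1 GeV/c^2 in kilograms, and the proton rest mass); the script is only a sanity check on the article's round numbers, not part of the CDMS analysis:

```python
# Sanity check: a 100 GeV/c^2 WIMP versus the proton mass.

GEV_PER_C2_IN_KG = 1.78266e-27   # 1 GeV/c^2 expressed in kilograms
PROTON_MASS_GEV = 0.93827        # proton rest mass in GeV/c^2

wimp_mass_kg = 100 * GEV_PER_C2_IN_KG
ratio = 100 / PROTON_MASS_GEV    # WIMP-to-proton mass ratio

print(f"100 GeV/c^2 = {wimp_mass_kg:.2e} kg, "
      f"about {ratio:.0f} times the proton mass")
```

The exact ratio comes out near 107, which the article rounds to "100 times the mass of protons."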

"With our new result we are leapfrogging the competition," says CDMS cospokesperson Blas Cabrera, of Stanford University. The Department of Energy's Fermi National Accelerator Laboratory hosts the project management for the CDMS experiment. "We have achieved the world's most stringent limits on how often dark matter particles interact with ordinary matter and how heavy they are, in particular in the theoretically favored mass range of more than 40 times the proton mass."

"The CDMS experiment is unique in bringing so many different disciplines to bear on the search for dark matter, from astro- and particle physics in the expected WIMP signature to low-temperature and condensed-matter physics in the operation of our novel detectors," says Sunil Golwala, assistant professor of physics at Caltech. "Our work continues Caltech's long-standing role in the dark matter story, ranging from the first evidence for dark matter obtained by Fritz Zwicky in 1933 to the detailed maps of dark matter made recently by Caltech astronomy colleagues Nick Scoville, Richard Ellis, and Richard Massey."

"Observations made with telescopes have repeatedly shown that dark matter exists. It is the stuff that holds together all cosmic structures, including our own Milky Way. The observation of WIMPs would finally reveal the underlying nature of this dark matter, which plays such a crucial role in the formation of galaxies and the evolution of our universe," says Joseph Dehmer, director of the Division of Physics for the National Science Foundation.

The discovery of WIMPs would require extensions to the theoretical framework known as the standard model of particles and their forces. The CDMS result, presented to the scientific community at the Eighth UCLA Dark Matter and Dark Energy symposium on February 22, tests the viability of new theoretical concepts that have been proposed.

"Our results constrain theoretical models such as supersymmetry and models based on extra dimensions of space-time, which predict the existence of WIMPs," says CDMS project manager Dan Bauer, of DOE's Fermilab. "For WIMP masses expected from these theories, we are again the most sensitive in the world, retaking the lead from the Xenon 10 experiment at the Italian Gran Sasso laboratory. We will gain another factor of three in sensitivity by continuing to take more data with our detector in the Soudan laboratory until the end of 2008."

A new phase of the CDMS experiment with 25 kilograms of germanium is planned for the Sudbury Neutrino Observatory's underground laboratory facility in Canada. "The 25-kilogram experiment has clear discovery potential," says Fermilab director Pier Oddone. "It covers a lot of the territory predicted by supersymmetric theories."

The CDMS collaboration includes more than 50 scientists from 15 institutions and receives funding from the U.S. Department of Energy, the National Science Foundation, foreign funding agencies in Canada and Switzerland, and member institutions.

In addition to participating in CDMS, Golwala's dark matter group at Caltech, comprising physics graduate students Zeeshan Ahmed and David Moore and postdoctoral fellow in experimental physics Walt Ogburn, is developing a new kind of WIMP detector based on the microwave kinetic inductance sensors developed by Professor of Physics Jonas Zmuidzinas, with funding from a grant by the Gordon and Betty Moore Foundation.

Additional information:

CDMS home page: http://cdms.berkeley.edu/index.html

Fermilab CDMS press page: http://www.fnal.gov/pub/presspass/press_releases/CDMS_Photos2008/index.html

Institutions participating in CDMS:

Brown University; California Institute of Technology; Case Western Reserve University; Fermi National Accelerator Laboratory; Lawrence Berkeley National Laboratory; Massachusetts Institute of Technology; Queens University; Santa Clara University; Stanford University; Syracuse University; University of California, Berkeley; University of California, Santa Barbara; University of Colorado Denver; University of Florida; University of Minnesota; University of Zurich

Kathy Svitil