Andromeda's Stellar Halo Shows Galaxy's Origin to Be Similar to That of Milky Way

PASADENA, Calif.—For the last decade, astronomers have thought that the Andromeda galaxy, our nearest galactic neighbor, was rather different from the Milky Way. But a group of researchers has determined that the two galaxies are probably quite similar in the way they evolved, at least over their first several billion years.

In an upcoming issue of the Astrophysical Journal, Scott Chapman of the California Institute of Technology, Rodrigo Ibata of the Observatoire de Strasbourg, and their colleagues report that their detailed studies of the motions and chemical compositions of nearly 10,000 stars in Andromeda show that the galaxy's stellar halo is "metal-poor." In astronomical parlance, this means that the stars lying in the outer bounds of the galaxy are largely deficient in the elements heavier than hydrogen and helium.

This is surprising, says Chapman, because one of the key differences thought to exist between Andromeda and the Milky Way was that the former's stellar halo was metal-rich and the latter's was metal-poor. If both galaxies are metal-poor, then they must have had very similar evolutions.

"Probably, both galaxies got started within a half billion years of the Big Bang, and over the next three to four billion years, both were building up in the same way by protogalactic fragments containing smaller groups of stars falling into the two dark-matter haloes," Chapman explains.

While no one yet knows what dark matter is made of, its existence is well established because of the mass that must exist in galaxies for their stars to orbit the galactic centers the way they do. Current theories of galactic evolution, in fact, assume that dark-matter wells acted as a sort of "seed" for today's galaxies, with the dark matter pulling in smaller groups of stars as they passed nearby. What's more, galaxies like Andromeda and the Milky Way have each probably gobbled up about 200 smaller galaxies and protogalactic fragments over the last 12 billion years.

Chapman and his colleagues arrived at the conclusion about the metal-poor Andromeda halo by obtaining careful measurements of the speed at which individual stars are coming directly toward or moving directly away from Earth. This measure is called the radial velocity, and can be determined very accurately with the spectrographs of major instruments such as the 10-meter Keck-II telescope, which was used in the study.
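The radial-velocity measurement rests on the Doppler shift of known spectral lines. As an illustration (the particular line and numbers here are hypothetical examples, not values from the study), the non-relativistic Doppler formula recovers a star's line-of-sight speed from the shift of a single absorption line:

```python
# Sketch: radial velocity from the Doppler shift of one spectral line.
# The Ca II line at rest wavelength 8542.09 angstroms is an illustrative
# choice; it is a feature commonly used for stellar radial velocities.
C_KM_S = 299_792.458  # speed of light in km/s

def radial_velocity(lambda_observed, lambda_rest):
    """Non-relativistic Doppler formula: v = c * (obs - rest) / rest.
    Negative v means the source is approaching Earth (blueshift)."""
    return C_KM_S * (lambda_observed - lambda_rest) / lambda_rest

# A line blueshifted by about 8.6 angstroms corresponds to roughly
# -300 km/s, comparable to Andromeda's overall approach speed.
v = radial_velocity(8533.5, 8542.09)
```

In practice a spectrograph such as the one on Keck-II records many lines at once and the shifts are fitted together, but the principle is the same.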

Of the approximately 10,000 Andromeda stars for which the researchers have obtained radial velocities, about 1,000 turned out to be stars in the giant stellar halo that extends outward by more than 500,000 light-years. These stars, because of their lack of metals, are thought to have formed quite early, at a time when the massive dark-matter halo had captured its first protogalactic fragments.

The stars that dominate closer to the center of the galaxy, by contrast, are those that formed and merged later, and contain heavier elements due to stellar evolution processes.

In addition to being metal-poor, the stars of the halo follow random orbits and are not in rotation. By contrast, the stars of Andromeda's visible disk are rotating at speeds upwards of 200 kilometers per second.

According to Ibata, the study could lead to new insights on the nature of dark matter. "This is the first time we've been able to obtain a panoramic view of the motions of stars in the halo of a galaxy," says Ibata. "These stars allow us to weigh the dark matter, and determine how it decreases with distance."

In addition to Chapman and Ibata, the other authors are Geraint Lewis of the University of Sydney; Annette Ferguson of the University of Edinburgh; Mike Irwin of the Institute of Astronomy in Cambridge, England; Alan McConnachie of the University of Victoria; and Nial Tanvir of the University of Hertfordshire.



Robert Tindol

Dust Found in Earth Sediment Traced to Breakup of the Asteroid Veritas 8.2 Million Years Ago

PASADENA, Calif.—In a new study that provides a novel way of looking at our solar system's past, a group of planetary scientists and geochemists announce that they have found evidence on Earth of an asteroid breakup or collision that occurred 8.2 million years ago.

Reporting in the January 19 issue of the journal Nature, scientists from the California Institute of Technology, the Southwest Research Institute (SwRI), and Charles University in the Czech Republic show that core samples from oceanic sediment are consistent with computer simulations of the breakup of a 100-mile-wide body in the asteroid belt between Mars and Jupiter. The larger fragments of this body still orbit within the asteroid belt, and their inferred parent has long been known as the asteroid "Veritas."

Ken Farley of Caltech discovered a spike in a rare isotope known as helium 3 that began 8.2 million years ago and gradually decreased over the next 1.5 million years. This suggests that Earth was dusted with material from an extraterrestrial source.

"The helium 3 spike found in these sediments is the smoking gun that something quite dramatic happened to the interplanetary dust population 8.2 million years ago," says Farley, the Keck Foundation Professor of Geochemistry at Caltech and chair of the Division of Geological and Planetary Sciences. "It's one of the biggest dust events of the last 80 million years."

Interplanetary dust is composed of bits of rock from a few to several hundred microns in diameter produced by asteroid collisions or ejected from comets. Interplanetary dust migrates toward the sun, and en route some of this dust is captured by the Earth's gravitational field and deposited on its surface.

Presently, more than 20,000 tons of this material accumulates on Earth each year, but the accretion rate should fluctuate with the level of asteroid collisions and changes in the number of active comets. By looking at ancient sediments that include both interplanetary dust and ordinary terrestrial sediment, the researchers for the first time have been able to detect major dust-producing solar system events of the past.

Because interplanetary dust particles are so small and rare in sediment (significantly less than a part per million), they are difficult to detect using direct measurements. However, these particles are extremely rich in helium 3 in comparison with terrestrial materials. Over the last decade, Farley has measured helium 3 concentrations in sediments formed over the last 80 million years to create a record of the interplanetary dust flux.

To ensure that the peak was not a fluke present at only one site on the seafloor, Farley studied two different localities: one in the Indian Ocean and one in the Atlantic. The event is recorded clearly at both sites.

To find the source of these particles, William F. Bottke and David Nesvorny of the SwRI Space Studies Department in Boulder, Colorado, along with David Vokrouhlicky of Charles University, studied clusters of asteroid orbits that are likely the consequence of ancient asteroidal collisions.

"While asteroids are constantly crashing into one another in the main asteroid belt," says Bottke, "only once in a great while does an extremely large one shatter."

The scientists identified one cluster of asteroid fragments whose size, age, and remarkably similar orbits made it a likely candidate for the Earth-dusting event. Tracking the orbits of the cluster backwards in time using computer models, they found that, 8.2 million years ago, all of its fragments shared the same orbital orientation in space. This event defines when the 100-mile-wide asteroid called Veritas was blown apart by impact and coincides with the spike in the interplanetary seafloor sediments Farley had found.

"The Veritas disruption was extraordinary," says Nesvorny. "It was the largest asteroid collision to take place in the last 100 million years."

As a final check, the SwRI-Czech team used computer simulations to follow the evolution of dust particles produced by the 100-mile-wide Veritas breakup event. Their work shows that the Veritas event could produce the spike in extraterrestrial dust raining on the Earth 8.2 million years ago as well as a gradual decline in the dust flux.

"The match between our model results and the helium 3 deposits is very compelling," Vokrouhlicky says. "It makes us wonder whether other helium 3 peaks in oceanic cores can also be traced back to asteroid breakups."

This research was funded by NASA's Planetary Geology & Geophysics program and received additional financial support from the Czech Republic's grant agency and the National Science Foundation's COBASE program. The Nature paper is titled "A late Miocene dust shower from the breakup of an asteroid in the main belt."



Robert Tindol

Astrophysical Device Will Sniff Out Terrorism

PASADENA, Calif.—Astrophysicists spend most of their time looking for objects in the sky, but 9/11 changed the direction of Ryan McLean's work.

Right after the terrorist attacks, the Caltech staff scientist began applying his knowledge about detectors that study galaxies to the design of new sensors for detecting radioactive materials near possible terrorist targets. A few months ago, the U.S. Department of Homeland Security awarded McLean the first phase of a $2.2 million contract to develop a radiation-detection module.

"Before 9/11, I had a safe feeling that life was great," says McLean, who came to Caltech in 1999 to work for Professor of Physics Christopher Martin, developing projects in which rockets were launched with instruments that, during their five minutes above the atmosphere, observed the dust and hot gases in the Milky Way. "But I have two young kids, and now I realize that things may not be so stable."

The first part of McLean's project is to create a specialized chip that turns a semiconducting crystal into a detector that can find a radiation source up to 100 meters away and tell whether it's harmful radiation from a dirty bomb, or harmless radiation from, say, a truckload of fertilizer. In the second phase, which could begin by the middle of 2006, he'll build a workable device.

The problem with current detectors is that they are often set off by essentially benign materials. They also tend to be large pieces of equipment located only at the nation's entry points, such as ports.

McLean wants to make detectors that will ignore natural radiation sources like fertilizer and that will also be small and mobile, so that security officers can take them anywhere and target any ship, truck, or building.

McLean, who has also contributed to a project at the Lawrence Livermore National Laboratory (LLNL) to build a radiation detector the size of a cell phone, plans to use a sensor made of cadmium zinc telluride, which has been used in telescopes to detect gamma rays and X-rays. The advantage of these crystals is that they work at room temperature, unlike other sensors that work only at very low temperatures.

To accomplish this, McLean teamed with the X-ray/gamma-ray group at Caltech's Space Radiation Laboratory (SRL), which is led by Professor of Physics Fiona Harrison. The SRL has been developing cadmium zinc telluride gamma-ray sensors, as well as custom, low-noise, low-power electronic chips for X-ray and gamma-ray instruments, for more than 10 years. While SRL's efforts have largely focused on developing these sensors for space missions, after 9/11 SRL teamed with LLNL to develop a chip for a handheld radiation monitor for Homeland Security.

Surprisingly, looking for radiation on the ground is not much different from searching for it in space. "What we are doing with Ryan is taking the best of what we developed for the previous Homeland Security device, and combining it with the best of what we developed for our space instruments," says senior SRL engineer Rick Cook. Everything SRL has learned about the pros and cons of the cadmium zinc telluride itself will also be key to making this project a success.

McLean says that he does not expect the project to put Caltech into the antiterrorism radiation-detection business. If his device shows promise, the technology could be licensed to a company that would manufacture a range of detection products at relatively low cost, making widespread use feasible.

"The idea is that if you could have lots of small detectors, you might have a better chance of detecting harmful nuclear material than if you're stationed only at central locations, like bridges and ports," he says.

Given government officials' warnings that it is only a matter of time before the next terrorist attack in the United States, McLean says that there is a lot of pressure to complete the work quickly. "It helps push the project along."

Contact: Mike Rogers (626) 395-6083


Quasar Study Provides Insights into Composition of the Stars That Ended the "Dark Ages"

WASHINGTON, D.C.—A team of astronomers has uncovered new evidence about the stars whose formation ended the cosmic "Dark Ages" a few hundred million years after the Big Bang.

In a presentation today at the annual winter meeting of the American Astronomical Society (AAS), California Institute of Technology graduate student George Becker is scheduled to discuss his team's investigation of several faraway quasars and the gas between the quasars and Earth. The paper on which his lecture is based will be published in the Astrophysical Journal in March.

One quasar in the study seems to reveal numerous patches of "neutral" gas, made up of atoms in which the nucleus and electrons cling together, floating in space when the universe was only about 10 percent of its present age. This gas is thought to have existed in significant quantities only within a certain time frame in the early universe. Prior to the Dark Ages, all material would have been too hot for atomic nuclei to combine with their electrons; afterward, the light from newly formed stars would have reached the atoms and stripped off the electrons.

"There should have been a period when most of the atoms in the universe were neutral," Becker explains. "This would have continued until stars and galaxies began forming."

In other words, the universe went from a very hot, very dense state following the Big Bang, where all atomic nuclei and electrons were too energetic to combine, to a less dense and cooler phase (albeit a dark one) where the nuclei and the electrons were cool enough to hold onto each other and form neutral atoms, to today's universe, where the great majority of atoms are ionized by energetic particles of light.

Wallace Sargent, who coined the term Dark Ages in 1985 and who is Becker's supervising professor, adds that analyzing the quasars to learn about the early universe is akin to looking at a lighthouse in order to study the air between you and it. During the Dark Ages, neutral atoms filling the universe would have acted like a fog, blocking out the light from distant objects. To end the Dark Ages, enough stars and galaxies needed to form to burn this "cosmic fog" away.

"We may have detected the last wisps of the fog," explains Sargent, who is Bowen Professor of Astronomy at Caltech.

The uniqueness of the new study is the finding that the chemical elements of the cool, un-ionized gas seem to have come from relatively ordinary stars. The researchers think this is so because the elements they detect in the gas (oxygen, carbon, and silicon) are in proportions that suggest the material came from Type II supernovae.

These particular explosions are caused when massive stars collapse and then rebound to form a gigantic explosion. The stars needed to create these explosions can be more than ten times the mass of the sun, yet they are common over almost the entire history of the universe.

However, astronomers believe that the very first stars in the universe would have been much more massive, up to hundreds of times the mass of the sun, and would have left behind a very different chemical signature.

"If the first stars in the universe were indeed very massive stars," Becker explains, "then their chemical signature was overwhelmed by smaller, more typical stars very soon after."

Becker and his colleagues believe they are seeing material from stars that was blown into space by supernova explosions and mixed with the pristine gas produced by the Big Bang. Specifically, they are looking at the spectra of the light from quasars as it is absorbed during its journey through the mixed-up gas.

The quasars in this particular study are from the Sloan Digital Sky Survey, an ongoing mapping project that seeks, in part, to determine the distances of 100,000 quasars. The researchers focused on nine of the most distant quasars known, with redshifts greater than 5, meaning that the light we see from these objects would have been emitted when the universe was at most 1.2 billion years old.

Of the nine, three are far enough away that they may have been at the edge of the dark period. Those three have redshifts greater than 6, meaning that the universe was less than 1 billion years old when they emitted the light we observe. By comparison, the present age of the universe is believed to be about 13.7 billion years.
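The quoted ages follow from integrating the expansion history of the universe backward from the observed redshift. A minimal sketch, assuming a flat Lambda-CDM cosmology with H0 = 70 km/s/Mpc, Omega_m = 0.3, and Omega_Lambda = 0.7 (these parameter values are assumptions for illustration, not figures from the article):

```python
import math

# Assumed cosmology (not from the article): flat LambdaCDM,
# H0 = 70 km/s/Mpc, Omega_m = 0.3, Omega_Lambda = 0.7.
H0_KM_S_MPC = 70.0
OMEGA_M, OMEGA_L = 0.3, 0.7
KM_PER_MPC = 3.0857e19     # kilometers per megaparsec
SEC_PER_GYR = 3.1557e16    # seconds per gigayear

def age_at_redshift(z, steps=200_000, z_max=5000.0):
    """Age of the universe at redshift z, in Gyr, from
    t(z) = integral from z to infinity of dz' / ((1+z') H(z')),
    evaluated numerically in u = ln(1+z') with the midpoint rule."""
    h0_per_sec = H0_KM_S_MPC / KM_PER_MPC
    u_lo, u_hi = math.log(1.0 + z), math.log(1.0 + z_max)
    du = (u_hi - u_lo) / steps
    total = 0.0
    for i in range(steps):
        zp = math.exp(u_lo + (i + 0.5) * du) - 1.0
        e_of_z = math.sqrt(OMEGA_M * (1.0 + zp) ** 3 + OMEGA_L)
        total += du / e_of_z          # dz'/((1+z') E(z')) = du/E
    return total / h0_per_sec / SEC_PER_GYR
```

With these parameters, `age_at_redshift(6)` comes out a bit under 1 billion years and `age_at_redshift(5)` near 1.2 billion years, consistent with the figures quoted above.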

Becker says that the study in part promises a new tool to investigate the nature of stars in the early universe. "Now that we've seen these systems, it's reasonable to ask if their composition reflects the output of those first very massive stars, or whether the mix of chemicals is what you would expect from more ordinary stars that ended in Type II supernovae.

"It turns out that the latter is the case," Becker says. "The chemical composition appears to be very ordinary."

Thus, the study provides a new window into possible transitions in the early universe, Sargent adds. "The relative abundance of these elements gives us in principle a way of finding out what the first stars were.

"This gives us insight into what kind of stars ended the Dark Ages."

Observations for this study were performed using the 10-meter (400-inch) Keck I Telescope on Mauna Kea, Hawaii. In addition to Becker and Sargent, the other authors are Michael Rauch of the Carnegie Observatories and Robert A. Simcoe of the MIT Center for Space Research.

This work was supported by the National Science Foundation.

Robert Tindol

Kuiper Belt Moons Are Starting to Seem Typical

WASHINGTON, D.C.—In the not-too-distant past, the planet Pluto was thought to be an odd bird in the outer reaches of the solar system because it has a moon, Charon, that was formed much like Earth's own moon was formed. But Pluto is getting a lot of company these days. Of the four largest objects in the Kuiper belt, three have one or more moons.

"We're now beginning to realize that Pluto is one of a small family of similar objects, nearly all of which have moons in orbit around them," says Antonin Bouchez, a California Institute of Technology astronomer.

Bouchez discussed his work on the Kuiper belt Tuesday, January 10, at the winter meeting of the American Astronomical Society (AAS).

Bouchez says that the puzzle for planetary scientists is that, as a whole, the hundreds of objects now known to inhabit the Kuiper belt beyond the orbit of Neptune have only about an 11 percent chance of possessing their own satellites. But three of the four largest objects now known in the region have satellites, which means that different processes are at work for the large and small bodies.
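One way to see why the contrast is striking (a back-of-the-envelope calculation for illustration, not from the article): if each of the four largest bodies independently had the same 11 percent chance of a satellite as the general population, then three or more of them having moons would be a well-under-one-percent coincidence.

```python
from math import comb

# If each large Kuiper belt object independently had the same 11 percent
# chance of a satellite as the overall population, how often would at
# least 3 of the top 4 have one? (Illustrative calculation only.)
def prob_at_least(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

p_chance = prob_at_least(3, 4, 0.11)   # comes out well under 1 percent
```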

Experts have been fairly confident for a decade or more that Pluto's moon Charon was formed as the result of an impact, but the planet seemed unique in this respect. According to computer models, Pluto was hit by an object roughly one-half its own size, vaporizing some of the planet's material. A large piece, however, was cleaved off nearly intact, forming Charon.

Earth's moon is thought to have been formed in a similar way, though our moon most likely formed out of a hot disk of material left in orbit after such a violent impact.

Just in the last year, astronomers have discovered two additional moons of Pluto, but the consensus is still that the huge Charon was formed by a glancing blow from another body, and that all three known satellites (as well as anything else not yet spotted from Earth) were built up from the debris.

As for the other Kuiper belt objects, experts at first thought that the bodies acquired their moons only occasionally by snagging them through gravitational capture. For the smaller bodies, the 11 percent figure would be about right.

But the bigger bodies are another story. The biggest of all-and still awaiting designation as the tenth planet-is currently nicknamed "Xena." Discovered by Caltech's Professor of Planetary Science Mike Brown and his associates, Chad Trujillo of the Gemini Observatory and David Rabinowitz of Yale University, Xena is 25 percent larger than Pluto and is known to have at least one moon.

The second-largest Kuiper belt object is Pluto, which has three moons and counting. The third-largest is nicknamed "Santa" because of the time of its discovery by the Mike Brown team, and is known to have two moons.

"Santa is an odd one," says Bouchez. "You normally would expect moons to form in the same plane because they would have accreted from a disk of material in orbit around the main body.

"But Santa's moons are 40 degrees apart. We can't explain it yet."

The fourth-largest Kuiper belt object is nicknamed "Easterbunny"-again, because of the time the Brown team discovered it-and is not yet known to have a moon. But in April, Bouchez and Brown will again be looking at Easterbunny with the adaptive-optics rig on one of the 10-meter Keck telescopes, and a moon might very well turn up.

The search for new planets and other bodies in the Kuiper belt is funded by NASA. For more information on the program, see the Samuel Oschin Telescope's website at

For more information on Mike Brown's research, see

For more information on the Keck laser-guide-star adaptive optics system, see



Robert Tindol

Experimental Economists Find Brain Regions That Govern Fear of the Economic Unknown

PASADENA, Calif.—Do you have second thoughts when ordering a strange-sounding dish at an exotic restaurant? Afraid you'll get fricasseed eye of newt, or something even worse? If you do, it's because certain neurons in the brain are saying that the potential reward for the risk is unknown. These regions of the brain have now been pinpointed by experimental economists at the California Institute of Technology and the University of Iowa College of Medicine.

In the December 9 issue of the journal Science, Caltech's Axline Professor of Business Economics Colin Camerer and his colleagues report on a series of experiments involving Caltech student volunteers and patients with specific types of brain damage at the University of Iowa. The object of the experiments was to see how the brain responded to degrees of economic uncertainty by having the test subjects make wagers while being scanned by a functional magnetic resonance imager (fMRI).

The results show that there is a definite difference in the brain when the wagers add a degree of ambiguity to the risk. In cases where the game involves a simple wager in which the chance of getting a payoff is very clearly known, the dorsal striatum tends to light up. But in a nearly identical game in which the chances of winning are unknown, the more emotional parts of the brain known as the amygdala and orbitofrontal cortex (OFC) are involved.

According to Camerer, this is a clear advancement in understanding the neural basis of economic decision making. Much is already known about how people deal with risk from the standpoint of social sciences and behavioral ecology, but greater understanding of how the brain structures are involved provides new insights on how certain behaviors are connected.

"The amygdala has been hypothesized as a generalized vigilance module in the brain," he explains. "We know, for example, that anyone with damage to the amygdala cannot pick up certain facial cues that normally allow humans to know whether they should trust someone else."

Problems with the amygdala are also known to be associated with autism, a brain disorder that causes sufferers to have trouble recognizing emotions in other people's faces. One of the authors of the paper, Ralph Adolphs, the Bren Professor of Psychology and Neuroscience at Caltech, has done extensive work in this area.

As for the OFC, the structure is associated with the integration of emotional and cognitive input. Therefore, the OFC and amygdala presumably work together when a person is confronted with a wager for which the odds are unknown: the amygdala sends a "caution" message, and the OFC processes the message.

The researchers set up the experiments so that the "risk" games and "ambiguity" games looked similar, to control for activity in the visual system so they could focus only on differences in decision making. In the "risk" games, each test subject was provided an opportunity to either choose a certain amount, like $3, or else choose a card that could be either red or blue. If the card was red, the test subject got $10, but if it came up blue, the test subject got nothing for that particular card.

In the risk games, each test subject was informed that the chance of drawing a red card was 50 percent, because there were 10 of each color among the total of 20 cards. Subjects made a series of 24 choices, with different sums of money at risk and different numbers of cards. In the ambiguity games, however, each test subject was told that the deck contained 20 cards, but was told nothing about how many were red and how many were blue.

Based on past experiments in which this type of behavior had been observed, the researchers expected that the Caltech subjects with no brain damage would be more likely to draw cards in the risk game than in the ambiguity game, because people dislike betting when they do not know the odds. In the ambiguity game, these subjects were more likely to take the sure amounts, which meant that their fear cost them money in expected-value terms.
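The expected-value arithmetic behind the wager is simple. A minimal sketch (the $3 sure amount and $10 payoff are the figures quoted above; treating the unknown deck symmetrically is an illustrative assumption, not part of the study's design):

```python
# Risk game: 10 red cards out of 20, so the chance of red is known to be 0.5.
# Drawing pays $10 on red and $0 on blue; the alternative is a sure $3.
def expected_value(p_red, payoff=10.0):
    """Expected dollar value of drawing one card."""
    return p_red * payoff

ev_draw = expected_value(0.5)   # risk game: known 50 percent chance
sure_amount = 3.0

# Drawing maximizes expected value: $5 beats the sure $3. In the
# ambiguity game the deck's makeup is unknown, but with no information
# favoring either color, a symmetric guess still puts p_red at 0.5 --
# so refusing to draw out of ambiguity aversion gives up the same $2
# of expected value.
```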

The patients at the University of Iowa Medical School, on the other hand, who had lesions to the OFC, played the game entirely differently. On average, these subjects with damage to the OFC were much more tolerant of risk and ambiguity.

Camerer says that the result with the brain-damaged test subjects fits well with the observation that many such patients have suffered in their personal lives because of reckless financial decisions.

The research also addressed how the intensity of the response in the brain correlates with the degree of risk. The Caltech students showed more intense activity in the amygdala and OFC when the chance of winning was ambiguous, whereas patients with damage to those areas showed no such difference.

In sum, the results provide an important neurological understanding of how we humans handle risk in the real world, Camerer says.

"If you think about it, how often do you know the probability of success? Probably, the situation we modeled with the risk game is more the exception than the rule," he says. "In most situations, I think you are confronted with a risky choice in which you have little idea of the chances of different payoffs."

Does the study have any applications for society? Camerer says that our knowing what is happening at the most microscopic level in the neurons of the brain could lead to better understanding of bigger social effects. For example, a fear of the economic unknown will also create a strong preference for the familiar. In every country in the world, investors hold too many stocks they are familiar with, from their own countries, and do not diversify their stock holdings enough by buying ambiguous foreign stocks. The opposite of fear of the economic unknown may be driving entrepreneurs, who often thrive under uncertainty.

"It could be that aversion to ambiguity is like a primitive freezing response that we've had for millions of years," Camerer says. "In this case, it would be an economic freezing response."

The study is titled "Neural Systems Responding to Degrees of Uncertainty in Human Decision Making."

In addition to Camerer and Adolphs, the other authors are Ming Hsu and Meghana Bhatt, both graduate students in economics at Caltech; and Daniel Tranel of the University of Iowa College of Medicine.

Robert Tindol

Physicists Achieve Quantum Entanglement Between Remote Ensembles of Atoms

PASADENA, Calif.—Physicists have managed to "entangle" the physical state of a group of atoms with that of another group of atoms across the room. This research represents an important advance relevant to the foundations of quantum mechanics and to quantum information science, including the possibility of scalable quantum networks (i.e., a quantum Internet) in the future.

Reporting in the December 8 issue of the journal Nature, California Institute of Technology physicist H. Jeff Kimble and his colleagues announce the first realization of entanglement for one "spin excitation" stored jointly between two samples of atoms. In the Caltech experiment, the atomic ensembles are located in a pair of apparatuses 2.8 meters apart, with each ensemble composed of about 100,000 individual atoms.

The entanglement generated by the Caltech researchers consisted of a quantum state for which, when one quantum spin (i.e., one quantum bit) flipped for the atoms at the site L of one ensemble, invariably none flipped at the site R of the other ensemble, and when one spin flipped at R, invariably none flipped at L. Yet, remarkably, because of the entanglement, both possibilities existed simultaneously.

According to Kimble, who is the Valentine Professor and professor of physics at Caltech, this research significantly extends laboratory capabilities for entanglement generation, with entangled "quantum bits" of matter now stored with a separation several thousand times greater than was heretofore possible.

Moreover, the experiment provides the first example of an entangled state stored in a quantum memory that can be transferred from the memory to another physical system (in this case, from matter to light). Since the work of Schrödinger and Einstein in the 1930s, entanglement has remained one of the most profound aspects and persistent mysteries of quantum theory. Entanglement leads to strong correlations between the various components of a physical system, even if those components are very far apart. Such correlations cannot be explained by classical physics and have been the subject of active experimental investigation for more than 40 years, including pioneering demonstrations that used entangled states of photons, carried out by John Clauser (son of Caltech's Millikan Professor of Engineering, Emeritus, Francis Clauser).

In more recent times, entangled quantum states have emerged as a critical resource for enabling tasks in information science that are otherwise impossible in the classical realm of conventional information processing and distribution. Some tasks in quantum information science (for instance, the implementation of scalable quantum networks) require that entangled states be stored in massive particles, which was first accomplished for trapped ions separated by a few hundred micrometers in experiments at the National Institute of Standards and Technology in Boulder, Colorado, in 1998.

In the Caltech experiment, the entanglement involves "collective atomic spin excitations." To generate such excitations, an ensemble of cold atoms initially all in level "a" of two possible ground levels is addressed with a suitable "writing" laser pulse. For weak excitation with the write laser, one atom in the sample is sometimes transferred to ground level "b," thereby emitting a photon.

Because of the impossibility of determining which particular atom emitted the photon, detection of this first write photon projects the ensemble of atoms into a state with a single collective spin excitation distributed over all the atoms. The presence (one atom in state b) or absence (all atoms in state a) of this symmetrized spin excitation behaves as a single quantum bit.
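In standard notation (a textbook form, not quoted from the paper), detecting the write photon projects the N atoms into the symmetric single-excitation state, with the one atom in level b shared coherently among all of them:

```latex
% Symmetric collective spin excitation of N atoms: exactly one atom is in
% ground level b, but the superposition runs over every choice of which one.
\[
  \lvert W \rangle
    = \frac{1}{\sqrt{N}} \sum_{j=1}^{N}
      \lvert a_1 \, a_2 \cdots b_j \cdots a_N \rangle
\]
```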

To generate entanglement between spatially separated ensembles at sites L and R, the write fields emitted at both locations are combined in a fashion that erases any information about their origin. Under this condition, if a photon is detected, it is impossible in principle to determine from which ensemble, L or R, it came, so both possibilities must be included in the subsequent description of the quantum state of the ensembles.

The resulting quantum state is an entangled state with "1" stored in the L ensemble and "0" in the R ensemble, and vice versa. That is, there exist simultaneously the complementary possibilities for one spin excitation to be present in level b at site L ("1") and all atoms in the ground level a at site R ("0"), as well as for no spin excitation to be present in level b at site L ("0") and one excitation to be present at site R ("1").
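Writing |0̄⟩ for "all atoms in ground level a" and |1̄⟩ for "one shared excitation in level b" at a given site, the superposition just described takes the standard form (a sketch in conventional notation; the relative phase η is set by the measurement geometry and is not specified in the release):

```latex
|\psi\rangle_{LR} = \frac{1}{\sqrt{2}}
  \left( |\bar{1}\rangle_L \, |\bar{0}\rangle_R
       + e^{i\eta} \, |\bar{0}\rangle_L \, |\bar{1}\rangle_R \right)
```

Neither term alone describes the system: the single excitation is genuinely shared between the two remote ensembles, which is what makes the state entangled.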

This entangled state can be stored in the atoms for a programmable time, and then transferred into propagating light fields, which had not been possible before now. The Caltech researchers devised a method to determine unambiguously the presence of entanglement for the propagating light fields, and hence for the atomic ensembles.

The Caltech experiment confirms for the first time experimentally that entanglement between two independent, remote, massive quantum objects can be created by quantum interference in the detection of a photon emitted by one of the objects.

In addition to Kimble, the other authors are Chin-Wen Chou, a graduate student in physics; Hugues de Riedmatten, Daniel Felinto, and Sergey Polyakov, all postdoctoral scholars in Kimble's group; and Steven J. van Enk of Bell Labs, Lucent Technologies.

Robert Tindol

World Network Speed Record Shattered for Third Consecutive Year

Caltech, SLAC, Fermilab, CERN, Michigan, Florida, Brookhaven, Vanderbilt and Partners in the UK, Brazil, Korea and Japan Set 131.6 Gigabit Per Second Mark During the SuperComputing 2005 Bandwidth Challenge

SEATTLE, Wash.—An international team of scientists and engineers for the third consecutive year has smashed the network speed record, moving data along at an average rate of 100 gigabits per second (Gbps) for several hours at a time. A rate of 100 Gbps is sufficient for transmitting five feature-length DVD movies on the Internet from one location to another in a single second.

The winning "High-Energy Physics" team is made up of physicists, computer scientists, and network engineers led by the California Institute of Technology, the Stanford Linear Accelerator Center (SLAC), Fermilab, CERN, and the University of Michigan, together with partners at the University of Florida, Vanderbilt, and Brookhaven National Lab, as well as international participants from the UK (University of Manchester and UKLight), Brazil (Rio de Janeiro State University, UERJ, and the state universities of São Paulo, USP and UNESP), Korea (Kyungpook National University and KISTI), and Japan (the KEK laboratory in Tsukuba). They joined forces to set a new world record for data transfer, capturing first prize at the Supercomputing 2005 (SC|05) Bandwidth Challenge (BWC).

The HEP team's demonstration of "Distributed TeraByte Particle Physics Data Sample Analysis" achieved a peak throughput of 151 Gbps and an official mark of 131.6 Gbps measured by the BWC judges on 17 of the 22 optical fiber links used by the team, beating their previous mark for peak throughput of 101 Gbps by 50 percent. In addition to the impressive transfer rate for DVD movies, the new record data transfer speed is also equivalent to serving 10,000 MPEG2 HDTV movies simultaneously in real time, or transmitting all of the printed content of the Library of Congress in 10 minutes.
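As a back-of-envelope check of the Library of Congress comparison (assuming the commonly cited estimate of roughly 10 terabytes for the Library's printed holdings, which is our assumption, not a figure from the release):

```python
# How much data does the official 131.6 Gbps mark move in ten minutes?
rate_bps = 131.6e9                           # official BWC mark, bits per second
seconds = 10 * 60                            # ten minutes
terabytes = rate_bps * seconds / 8 / 1e12    # bits -> bytes -> terabytes

# terabytes == 9.87, consistent with the ~10 TB often estimated
# for the printed content of the Library of Congress
print(f"{terabytes:.2f} TB in 10 minutes")
```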

The team sustained average data rates above the 100 Gbps level for several hours for the first time, and transferred a total of 475 terabytes of physics data among the team's sites throughout the U.S. and overseas within 24 hours. The extraordinary data transport rates were made possible in part through the use of the FAST TCP protocol developed by Associate Professor of Computer Science and Electrical Engineering Steven Low and his Caltech Netlab team, as well as new data transport applications developed at SLAC and Fermilab and an optimized Linux kernel developed at Michigan.
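FAST TCP, unlike conventional loss-based TCP, sets its congestion window from measured queueing delay. A minimal sketch of the published per-RTT window update rule (from Jin, Wei, and Low's FAST TCP work), applied to a hypothetical single-flow, single-link model; the parameter values and the toy link model are our assumptions, not details from the demonstration:

```python
def fast_window_update(w, base_rtt, rtt, alpha=200.0, gamma=0.5):
    """One per-RTT FAST TCP window update:
    w <- min(2w, (1 - gamma)*w + gamma*(base_rtt/rtt * w + alpha)),
    where alpha is the target number of packets buffered in the network."""
    return min(2.0 * w,
               (1.0 - gamma) * w + gamma * (base_rtt / rtt * w + alpha))

def simulate(base_rtt=0.010, capacity=100_000.0, alpha=200.0, steps=200):
    """Toy single-flow link model: queueing delay = backlog / capacity,
    with capacity in packets per second. Returns the final window (packets)."""
    w = 10.0
    for _ in range(steps):
        backlog = max(0.0, w - capacity * base_rtt)  # packets queued
        rtt = base_rtt + backlog / capacity          # propagation + queueing
        w = fast_window_update(w, base_rtt, rtt, alpha)
    return w

# At equilibrium FAST keeps about alpha packets queued, so the window
# settles near capacity * base_rtt + alpha = 1000 + 200 = 1200 packets.
final_w = simulate()
```

Because the controller reacts to delay rather than waiting for packet loss, it can hold a stable, large window over long round-trip paths, which is what makes it suited to the transcontinental links used in the record.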

Professor of Physics Harvey Newman of Caltech, head of the HEP team and US CMS Collaboration Board Chair, who originated the LHC Data Grid Hierarchy concept, said, "This demonstration allowed us to preview the globally distributed Grid system of more than 100 laboratory and university-based computing facilities that is now being developed in the U.S., Latin America, and Europe in preparation for the next generation of high-energy physics experiments at CERN's Large Hadron Collider (LHC) that will begin operation in 2007.

"We used a realistic mixture of streams, including the organized transfer of multiterabyte datasets among the laboratory centers at CERN, Fermilab, SLAC, and KEK, plus numerous other flows of physics data to and from university-based centers represented by Caltech, Michigan, Florida, Rio de Janeiro and São Paulo in Brazil, and Korea, to effectively use the remainder of the network capacity.

"The analysis of this data will allow physicists at CERN to search for the Higgs particles thought to be responsible for mass in the universe, supersymmetry, and other fundamentally new phenomena bearing on the nature of matter and space-time, in an energy range made accessible by the LHC for the first time."

The largest physics collaborations at the LHC, CMS and ATLAS, each encompass more than 2,000 physicists and engineers from 160 universities and laboratories. In order to fully exploit the potential for scientific discoveries, the many petabytes of data produced by the experiments will be processed, distributed, and analyzed using a global Grid. The key to discovery is the analysis phase, where individual physicists and small groups repeatedly access, and sometimes extract and transport, terabyte-scale data samples on demand, in order to optimally select the rare "signals" of new physics from potentially overwhelming "backgrounds" of already-understood particle interactions. This data will amount to many tens of petabytes in the early years of LHC operation, rising to the exabyte range within the coming decade.

Matt Crawford, head of the Fermilab network team at SC|05, said, "The realism of this year's demonstration represents a major step in our ability to show that the unprecedented systems required to support the next round of high-energy physics discoveries are indeed practical. Our data sources in the bandwidth challenge were some of our mainstream production storage systems and file servers, which are now helping to drive the searches for new physics at the high-energy frontier at Fermilab's Tevatron, as well as the explorations of the far reaches of the universe by the Sloan Digital Sky Survey."

Les Cottrell, leader of the SLAC team and assistant director of scientific computing and computing services, said, "Some of the pleasant surprises at this year's challenge were the advances in throughput we achieved using real applications to transport physics data, including bbcp and xrootd developed at SLAC. The performance of bbcp used together with Caltech's FAST protocol and an optimized Linux kernel developed at Michigan, as well as our xrootd system, were particularly striking. We were able to match the performance of the artificial data transfer tools we used to reach the peak rates in past years."

Future optical networks incorporating multiple 10 Gbps links are the foundation of the Grid system that will drive the scientific discoveries. A "hybrid" network integrating both traditional switching and routing of packets and dynamically constructed optical paths to support the largest data flows is a central part of the near-term future vision that the scientific community has adopted to meet the challenges of data-intensive science in many fields. By demonstrating that many 10 Gbps wavelengths can be used efficiently over continental and transoceanic distances (often in both directions simultaneously), the high-energy physics team showed that this vision of a worldwide dynamic Grid supporting many terabyte and larger data transactions is practical.

Shawn McKee, associate research scientist in the University of Michigan Department of Physics and leader of the UltraLight Network technical group, said, "This achievement is an impressive example of what a focused network effort can accomplish. It is an important step towards the goal of delivering a highly capable end-to-end network-aware system and architecture that meet the needs of next-generation e-science."

The team hopes this new demonstration will encourage scientists and engineers in many sectors of society to develop and plan to deploy a new generation of revolutionary Internet applications. Multigigabit end-to-end network performance will empower scientists to form "virtual organizations" on a planetary scale, sharing their collective computing and data resources in a flexible way. In particular, this is vital for projects on the frontiers of science and engineering in "data intensive" fields such as particle physics, astronomy, bioinformatics, global climate modeling, geosciences, fusion, and neutron science.

The new bandwidth record was achieved through extensive use of the SCInet network infrastructure at SC|05. The team used fifteen 10 Gbps links to Cisco Systems Catalyst 6500 Series Switches at the Caltech Center for Advanced Computing Research (CACR) booth, and seven 10 Gbps links to a Catalyst 6500 Series Switch at the SLAC/Fermilab booth, together with computing clusters provided by Hewlett Packard, Sun Microsystems, and IBM, and a large number of 10 gigabit Ethernet server interfaces: more than 80 provided by Neterion and 14 by Chelsio.

The external network connections to Los Angeles, Sunnyvale, the Starlight facility in Chicago, and Florida included the Cisco Research, Internet2/HOPI, UltraScience Net and ESnet wavelengths carried by National LambdaRail (NLR); Internet2's Abilene backbone; the three wavelengths of TeraGrid; an ESnet link provided by Qwest; the Pacific Wave link; and Canada's CANARIE network. International connections included the US LHCNet links (provisioned by Global Crossing and Colt) between Chicago, New York, and CERN, the CHEPREO/WHREN link (provisioned by LANautilus) between Miami and São Paulo, the UKLight link, the Gloriad link to Korea, and the JGN2 link to Japan.

Regional connections included six 10 Gbps wavelengths provided with the help of CIENA to Fermilab; two 10 Gbps wavelengths to the Caltech campus provided by Cisco Systems' research waves across NLR and California's CENIC network; two 10 Gbps wavelengths to SLAC provided by ESnet and UltraScienceNet; three wavelengths between Starlight and the University of Michigan over Michigan Lambda Rail (MiLR); and wavelengths to Jacksonville and Miami across Florida Lambda Rail (FLR). During the test, several of the network links were shown to operate at full capacity for sustained periods.

While the SC|05 demonstration required a major effort by the teams involved and their sponsors, in partnership with major research and education network organizations in the U.S., Europe, Latin America, and Pacific Asia, it is expected that networking on this scale in support of the largest science projects (such as the LHC) will be commonplace within the next three to five years. The demonstration also appeared to stress the network and server systems used, so the team is continuing its test program to put the technologies and methods used at SC|05 into production use, with the goal of attaining the necessary level of reliability in time for the start of the LHC research program.

As part of the SC|05 demonstrations, a distributed analysis of simulated LHC physics data was done using the Grid-enabled Analysis Environment (GAE) developed at Caltech for the LHC and many other major particle physics experiments, as part of the Particle Physics Data Grid (PPDG), GriPhyN/iVDGL, Open Science Grid, and DISUN projects. This involved transferring data to CERN, Florida, Fermilab, Caltech, and Brazil for processing by clusters of computers, and finally aggregating the results back to the show floor to create a dynamic visual display of quantities of interest to the physicists. In another part of the demonstration, file servers at the SLAC/FNAL booth and in Manchester also were used for disk-to-disk transfers between Seattle and the UK.

The team used Caltech's MonALISA (MONitoring Agents using a Large Integrated Services Architecture) system to monitor and display the real-time data for all the network links used in the demonstration. It simultaneously monitored more than 14,000 grid nodes in 200 computing clusters. MonALISA is a highly scalable set of autonomous, self-describing, agent-based subsystems that are able to collaborate and cooperate in performing a wide range of monitoring tasks for networks and Grid systems, as well as the scientific applications themselves.

The network has been deployed through exceptional support by Cisco Systems, Hewlett Packard, Neterion, Chelsio, Sun Microsystems, IBM, and Boston Ltd., as well as the network engineering staffs of National LambdaRail, Internet2's Abilene Network, ESnet, TeraGrid, CENIC, MiLR, FLR, Pacific Wave, AMPATH, RNP and ANSP/FAPESP in Brazil, KISTI in Korea, UKLight in the UK, JGN2 in Japan, and the Starlight international peering point in Chicago.

The demonstration and the developments leading up to it were made possible through the strong support of the U.S. Department of Energy Office of Science and the National Science Foundation, in cooperation with the funding agencies of the international partners.

Further information about the demonstration may be found at: and

About Caltech: With an outstanding faculty, including five Nobel laureates, and such off-campus facilities as the Jet Propulsion Laboratory, Palomar Observatory, and the W. M. Keck Observatory, the California Institute of Technology is one of the world's major research centers. The Institute also conducts instruction in science and engineering for a student body of approximately 900 undergraduates and 1,000 graduate students who maintain a high level of scholarship and intellectual achievement. Caltech's 124-acre campus is situated in Pasadena, California, a city of 135,000 at the foot of the San Gabriel Mountains, approximately 30 miles inland from the Pacific Ocean and 10 miles northeast of the Los Angeles Civic Center. Caltech is an independent, privately supported university, and is not affiliated with either the University of California system or the California State Polytechnic universities.

About SLAC: The Stanford Linear Accelerator Center (SLAC) is one of the world's leading research laboratories. Its mission is to design, construct, and operate state-of-the-art electron accelerators and related experimental facilities for use in high-energy physics and synchrotron radiation research. In the course of doing so, it has established the largest known database in the world, which grows at 1 terabyte per day. That, and its central role in the world of high-energy physics collaboration, places SLAC at the forefront of the international drive to optimize the worldwide, high-speed transfer of bulk data.

About CACR: Caltech's Center for Advanced Computing Research (CACR) performs research and development on leading edge networking and computing systems, and methods for computational science and engineering. Some current efforts at CACR include the National Virtual Observatory, ASC Center for Simulation of Dynamic Response of Materials, Particle Physics Data Grid, GriPhyN, Computational Infrastructure for Geophysics, Cascade High Productivity Computing System, and the TeraGrid.

About Netlab: Netlab is the Networking Laboratory at Caltech led by Steven Low, where FAST TCP has been developed. The group does research in the control and optimization of protocols and networks, and designs, analyzes, implements, and experiments with new algorithms and systems.

About the University of Michigan: With its size, complexity, and academic strength, the breadth of its scholarly resources, and the quality of its faculty and students, the University of Michigan is one of America's great public universities and one of the world's premier research institutions. The university was founded in 1817 and has a total enrollment of 54,300 on all campuses. The main campus is in Ann Arbor, Michigan, and has 39,533 students (fall 2004). With over 600 degree programs and $739M in FY05 research funding, the university is one of the leaders in innovation and research. For more information, see

About the University of Florida: The University of Florida (UF), located in Gainesville, is a major public, comprehensive, land-grant, research university. The state's oldest, largest, and most comprehensive university, UF is among the nation's most academically diverse public universities. It has a long history of established programs in international education, research, and service and has a student population of approximately 49,000. UF is the lead institution for the GriPhyN and iVDGL projects and is a Tier-2 facility for the CMS experiment. For more information, see

About Fermilab: Fermi National Accelerator Laboratory (Fermilab) is a national laboratory funded by the Office of Science of the U.S. Department of Energy, operated by Universities Research Association, Inc. Experiments at Fermilab's Tevatron, the world's highest-energy particle accelerator, generate petabyte-scale data per year, and involve large, international collaborations with requirements for high-volume data movement to their home institutions. The laboratory actively works to remain on the leading edge of advanced wide-area network technology in support of its collaborations.

About CERN: CERN, the European Organization for Nuclear Research, has its headquarters in Geneva. At present, its member states are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, and the United Kingdom. Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission, and UNESCO have observer status. For more information, see

About StarLight: StarLight is an advanced optical infrastructure and proving ground for network services optimized for high-performance applications. Operational since summer 2001, StarLight is a 1 GE and 10 GE switch/router facility for high-performance access to participating networks, and also offers true optical switching for wavelengths. StarLight is being developed by the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC), the International Center for Advanced Internet Research (iCAIR) at Northwestern University, and the Mathematics and Computer Science Division at Argonne National Laboratory, in partnership with Canada's CANARIE and the Netherlands' SURFnet. STAR TAP and StarLight are made possible by major funding from the U.S. National Science Foundation to UIC. StarLight is a service mark of the Board of Trustees of the University of Illinois. See

About the University of Manchester: The University of Manchester has been created by combining the strengths of UMIST (founded in 1824) and the Victoria University of Manchester (founded in 1851) to form the largest single-site university in the UK, with 34,000 students. On Friday, October 22, 2004, it received its Royal Charter from Queen Elizabeth II, with an unprecedented £300M capital investment program. Twenty-three Nobel Prize winners have studied at Manchester, continuing a proud tradition of innovation and excellence. Rutherford conducted the research that led to the splitting of the atom there, and the world's first stored-program electronic digital computer successfully executed its first program there in June 1948. The Schools of Physics, Computational Science, Computer Science and the Network Group, together with the E-Science North West Centre research facility, are very active in developing a wide range of e-science projects and Grid technologies. See

About UERJ (Rio de Janeiro): Founded in 1950, the Rio de Janeiro State University (UERJ) ranks among the ten largest universities in Brazil, with more than 23,000 students. UERJ's five campuses are home to 22 libraries, 412 classrooms, 50 lecture halls and auditoriums, and 205 laboratories. UERJ is responsible for important public welfare and health projects through its centers of medical excellence, the Pedro Ernesto University Hospital (HUPE) and the Piquet Carneiro Day-care Policlinic Centre, and it is committed to the preservation of the environment. The UERJ High Energy Physics group includes 15 faculty, postdoctoral, and visiting PhD physicists, and 12 PhD and master's students, working on experiments at Fermilab (D0) and CERN (CMS). The group has constructed a Tier2 center to enable it to take part in the Grid-based data analysis planned for the LHC, and has originated the concept of a Brazilian "HEP Grid," working in cooperation with USP and several other universities in Rio and São Paulo.

About UNESP (São Paulo): Created in 1976 with the administrative union of several isolated institutes of higher education in the state of São Paulo, the São Paulo State University, UNESP, has campuses in 24 different cities in the State of São Paulo. The university has 25,000 undergraduate students and almost 10,000 graduate students. Since 1999 the university has had a group participating in the DZero Collaboration of Fermilab, which is operating the São Paulo Regional Analysis Center (SPRACE). See

About USP (São Paulo): The University of São Paulo, USP, is the largest institution of higher education and research in Brazil, and the third largest in Latin America. The university has most of its 35 units located on the campus in the state capital. It has around 40,000 undergraduate students and around 25,000 graduate students. It is responsible for almost 25 percent of all Brazilian papers and publications indexed on the Institute for Scientific Information (ISI). The SPRACE cluster is located at the Physics Institute. See

About Kyungpook National University (Daegu): Kyungpook National University is one of the leading universities in Korea, especially in physics and information science. The university has 13 colleges and nine graduate schools with 24,000 students. It houses the Center for High-Energy Physics (CHEP) in which most Korean high-energy physicists participate. CHEP was approved as one of the designated Excellent Research Centers supported by the Korean Ministry of Science.

About Vanderbilt: One of America's top 20 universities, Vanderbilt University is a private research university of 6,319 undergraduates and 4,566 graduate and professional students. The university comprises 10 schools, a public policy institute, a distinguished medical center, and the Freedom Forum First Amendment Center. Located a mile and a half southwest of downtown Nashville, the campus is in a park-like setting. Buildings on the original campus date to its founding in 1873, and the Peabody section of campus has been registered as a National Historic Landmark since 1966. Vanderbilt ranks 24th in the value of federal research grants awarded to faculty members, according to the National Science Foundation.

About the Particle Physics Data Grid (PPDG): The Particle Physics Data Grid is developing and deploying production Grid systems that vertically integrate experiment-specific applications, Grid technologies, Grid and facility computation, and storage resources to form effective end-to-end capabilities. PPDG is a collaboration of computer scientists with a strong record in Grid technology and physicists with leading roles in the software and network infrastructures for major high-energy and nuclear experiments. PPDG's goals and plans are guided by the immediate and medium-term needs of the physics experiments and by the research and development agenda of the computer science groups.

About GriPhyN and iVDGL: GriPhyN and iVDGL are developing and deploying Grid infrastructure for several frontier experiments in physics and astronomy. These experiments together will utilize petaflops of CPU power and generate hundreds of petabytes of data that must be archived, processed, and analyzed by thousands of researchers at laboratories, universities, and small colleges and institutes spread around the world. The scale and complexity of this "petascale" science drive GriPhyN's research program to develop Grid-based architectures, using "virtual data" as a unifying concept. iVDGL is deploying a Grid laboratory where these technologies can be tested at large scale and where advanced technologies can be implemented for extended studies by a variety of disciplines.

About CHEPREO: Florida International University (FIU), in collaboration with partners at Florida State University, the University of Florida, and the California Institute of Technology, has been awarded an NSF grant to create and operate an interregional Grid-enabled Center for High-Energy Physics Research and Educational Outreach (CHEPREO) at FIU. CHEPREO encompasses an integrated program of collaborative physics research on CMS, network infrastructure development, and educational outreach at one of the largest minority universities in the US. The center is funded by four NSF directorates: Mathematical and Physical Sciences, Scientific Computing Infrastructure, Elementary, Secondary and Informal Education, and International Programs.

About the Open Science Grid: The OSG makes innovative science possible by bringing multidisciplinary collaborations together with the latest advances in distributed computing technologies. This shared cyberinfrastructure, built by research groups from U.S. universities and national laboratories, receives support from the National Science Foundation and the U.S. Department of Energy's Office of Science. For more information about the OSG, visit

About Internet2®: Led by more than 200 U.S. universities working with industry and government, Internet2 develops and deploys advanced network applications and technologies for research and higher education, accelerating the creation of tomorrow's Internet. Internet2 recreates the partnerships among academia, industry, and government that helped foster today's Internet in its infancy. For more information, visit:

About the Abilene Network: Abilene, developed in partnership with Qwest Communications, Juniper Networks, Nortel Networks and Indiana University, provides nationwide high-performance networking capabilities for more than 225 universities and research facilities in all 50 states, the District of Columbia, and Puerto Rico. For more information on Abilene, see

About the TeraGrid: The TeraGrid, funded by the National Science Foundation, is a multiyear effort to build a distributed national cyberinfrastructure. TeraGrid entered full production mode in October 2004, providing a coordinated set of services for the nation's science and engineering community. TeraGrid's unified user support infrastructure and software environment allow users to access storage and information resources as well as over a dozen major computing systems at nine partner sites via a single allocation, either as stand-alone resources or as components of a distributed application using Grid software capabilities. Over 40 teraflops of computing power, 1.5 petabytes of online storage, and multiple visualization, data collection, and instrument resources are integrated at the nine TeraGrid partner sites. Coordinated by the University of Chicago and Argonne National Laboratory, the TeraGrid partners include the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign (UIUC), San Diego Supercomputer Center (SDSC) at the University of California, San Diego (UCSD), the Center for Advanced Computing Research (CACR) at the California Institute of Technology (Caltech), the Pittsburgh Supercomputing Center (PSC), Oak Ridge National Laboratory, Indiana University, Purdue University, and the Texas Advanced Computing Center (TACC) at the University of Texas-Austin.

About National LambdaRail: National LambdaRail (NLR) is a major initiative of U.S. research universities and private sector technology companies to provide a national-scale infrastructure for research and experimentation in networking technologies and applications. NLR puts the control, the power, and the promise of experimental network infrastructure in the hands of the nation's scientists and researchers. Visit for more information.

About CENIC: CENIC is a not-for-profit corporation serving the California Institute of Technology, California State University, Stanford University, University of California, University of Southern California, California Community Colleges, and the statewide K-12 school system. CENIC's mission is to facilitate and coordinate the development, deployment, and operation of a set of robust multi-tiered advanced network services for this research and education community.

About ESnet: The Energy Sciences Network (ESnet) is a high-speed network serving thousands of Department of Energy scientists and collaborators worldwide. A pioneer in providing high-bandwidth, reliable connections, ESnet enables researchers at national laboratories, universities, and other institutions to communicate with each other using the collaborative capabilities needed to address some of the world's most important scientific challenges. Managed and operated by the ESnet staff at Lawrence Berkeley National Laboratory, ESnet provides direct high-bandwidth connections to all major DOE sites, multiple cross connections with Internet2/Abilene, connections to Europe via GEANT and to Japan via SuperSINET, and fast interconnections to more than 100 other networks. Funded principally by DOE's Office of Science, ESnet services allow scientists to make effective use of unique DOE research facilities and computing resources, independent of time and geographic location.

About Qwest: Qwest Communications International Inc. (NYSE: Q) is a leading provider of voice, video, and data services. With more than 40,000 employees, Qwest is committed to the "Spirit of Service" and to providing world-class services that exceed customers' expectations for quality, value, and reliability. For more information, please visit the Qwest Web site at

About UKLight: The UKLight facility was set up in 2003 with a grant of £6.5M from HEFCE (the Higher Education Funding Council for England) to provide an international experimental testbed for optical networking and support projects working on developments towards optical networks and the applications that will use them. UKLight will bring together leading-edge applications, Internet engineering for the future, and optical communications engineering, and enable UK researchers to join the growing international consortium that currently spans Europe and North America. A "Point of Access" (PoA) in London provides international connectivity with 10 Gbit network connections to peer facilities in Chicago (StarLight) and Amsterdam (NetherLight). UK research groups gain access to the facility via extensions to the 10 Gbit SuperJANET development network, and a national dark fiber facility is under development for use by the photonics research community. Management of the UKLight facility is being undertaken by UKERNA on behalf of the Joint Information Systems Committee (JISC).

About AMPATH: Florida International University's Center for Internet Augmented Research and Assessment (CIARA) has developed an international, high-performance research connection point in Miami, Florida, called AMPATH (AMericasPATH). AMPATH's goal is to enable wide-bandwidth digital communications between U.S. and international research and education networks, as well as between a variety of U.S. research programs in the region. AMPATH in Miami acts as a major international exchange point (IXP) for the research and education networks in South America, Central America, Mexico, and the Caribbean. The AMPATH IXP is home for the WHREN-LILA high-performance network link connecting Latin America to the U.S., funded by the NSF (award #0441095) and the Academic Network of São Paulo (award #2003/13708-0).

About the Academic Network of São Paulo (ANSP): ANSP unites São Paulo's University networks with Scientific and Technological Research Centers in São Paulo, and is managed by the State of São Paulo Research Foundation (FAPESP). The ANSP Network is another example of international collaboration and exploration. Through its connection to WHREN-LILA, all of the institutions connected to ANSP will be involved in research with U.S. universities and research centers, offering significant contributions and the potential to develop new applications and services. This connectivity with WHREN-LILA and ANSP will allow researchers to enhance the quality of current data, inevitably increasing the quality of new scientific development. See

About RNP: RNP, the National Education and Research Network of Brazil, is a not-for-profit company that promotes the innovative use of advanced networking with the joint support of the Ministry of Science and Technology and the Ministry of Education. In the early 1990s, RNP was responsible for the introduction and adoption of Internet technology in Brazil. Today, RNP operates a nationally deployed multigigabit network used for collaboration and communication in research and education throughout the country, reaching all 26 states and the Federal District, and provides both commodity and advanced research Internet connectivity to more than 300 universities, research centers, and technical schools. See

About KISTI: KISTI (Korea Institute of Science and Technology Information), which plays the pivotal role in establishing the national science and technology knowledge information infrastructure, was founded through the merger of the Korea Institute of Industry and Technology Information (KINITI) and the Korea Research and Development Information Center (KORDIC) in January 2001. KISTI operates under the supervision of the Office of the Prime Minister and will play a leading role in building the nationwide infrastructure for knowledge and information by linking the high-performance research network with its supercomputers.

About Hewlett Packard: HP is a technology solutions provider to consumers, businesses, and institutions globally. The company's offerings span IT infrastructure, global services, business and home computing, and imaging and printing. More information about HP (NYSE, Nasdaq: HPQ) is available at

About Sun Microsystems: Since its inception in 1982, a singular vision-"The Network Is The Computer(TM)"-has propelled Sun Microsystems, Inc. (Nasdaq: SUNW) to its position as a leading provider of industrial-strength hardware, software, and services that make the Net work. Sun can be found in more than 100 countries and on the World Wide Web at

About IBM: IBM is the world's largest information technology company, with 80 years of leadership in helping businesses innovate. Drawing on resources from across IBM and key business partners, IBM offers a wide range of services, solutions, and technologies that enable customers, large and small, to take full advantage of the new era of e-business. For more information about IBM, visit

About Boston Limited: With over 12 years of experience, Boston Limited is a UK-based specialist in high-end workstation, server, and storage hardware. Boston's solutions bring the latest innovations to market, such as PCI-Express, DDR II, and Infiniband technologies. As the pan-European distributor for Supermicro, Boston Limited works very closely with key manufacturing partners, as well as strategic clients within the academic and commercial sectors, to provide cost-effective solutions with exceptional performance.

About Neterion, Inc.: Founded in 2001, Neterion Inc. has locations in Cupertino, California, and Ottawa, Canada. Neterion delivers 10 Gigabit Ethernet hardware and software solutions that solve customers' high-end networking problems. The Xframe(r) line of products is based on Neterion-developed technologies that deliver new levels of performance, availability and reliability in the datacenter. Xframe, Xframe II, and Xframe E include full IPv4 and IPv6 support and comprehensive stateless offloads that preserve the integrity of current TCP/IP implementations without "breaking the stack." Xframe drivers are available for all major operating systems, including Microsoft Windows, Linux, Hewlett-Packard's HP-UX, IBM's AIX, Sun's Solaris and SGI's Irix. Neterion has raised over $42M in funding, with its latest C round taking place in June 2004. Formerly known as S2io, the company changed its name to Neterion in January 2005. Further information on the company can be found at

About Chelsio Communications: Chelsio Communications is leading the convergence of networking, storage, and clustering interconnects with its robust, high-performance, and proven protocol acceleration technology. Featuring a highly scalable and programmable architecture, Chelsio is shipping 10-Gigabit Ethernet adapter cards with protocol offload, delivering the low latency and superior throughput required for high-performance computing applications. For more information, visit the company online at

About the National Science Foundation: The NSF is an independent federal agency created by Congress in 1950 "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense...." With an annual budget of about $5.5 billion, it is the funding source for approximately 20 percent of all federally supported basic research conducted by America's colleges and universities. In many fields such as mathematics, computer science, and the social sciences, NSF is the major source of federal backing.

About the DOE Office of Science: DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the nation, and ensures U.S. world leadership across a broad range of scientific disciplines. The Office of Science also manages 10 world-class national laboratories with unmatched capabilities for solving complex interdisciplinary problems, and it builds and operates some of the nation's most advanced R&D user facilities, located at national laboratories and universities. These facilities are used by more than 19,000 researchers from universities, other government agencies, and private industry each year.




Robert Tindol

Caltech Researchers Achieve First Electrowetting of Carbon Nanotubes

PASADENA, Calif.—If you can imagine the straw in your soda can being a million times smaller and made of carbon, you pretty much have a mental picture of a carbon nanotube. Scientists have been making them at will for years, but have never gotten the nanotubes to suck up liquid metal to form tiny wires. In fact, conventional wisdom and hundreds of refereed papers say that such is not even possible.

Now, with the aid of an 1875 study of mercury's electrical properties, researchers from the California Institute of Technology have succeeded in forcing liquid mercury into carbon nanotubes. Their technique could have important applications, including nanolithography, the production of nanowires with unique quantum properties, nano-sized plumbing for the transport of extremely small fluid quantities, and electronic circuitry many times smaller than the smallest in existence today.

Reporting in the December 2 issue of the journal Science, Caltech assistant professor of chemistry Patrick Collier and associate professor of chemical engineering Konstantinos Giapis describe their success in electrowetting carbon nanotubes. By "electrowetting" they mean that the voltage applied to a nanotube immersed in mercury causes the liquid metal to rise into the nanotube by capillary action and cling to the surface of its inner wall.

Besides its potential for fundamental research and commercial applications, Giapis says that the result is an opportunity to set the record straight. "We have found that when measuring the properties of carbon nanotubes in contact with liquid metals, researchers need to take into account that the application of a voltage can result in electrically activated wetting of the nanotube.

"Ever since carbon nanotubes were discovered in 1991, people have envisioned using them as molds to make nanowires or as nanochannels for flowing liquids. The hope was to have the nanotubes act like molecular straws," says Giapis.

However, researchers never got liquid metal to flow into the straws, and eventually dismissed the possibility that metal could even do so because of surface tension. Mercury was considered totally unpromising because, as anyone who has played with liquid mercury in chemistry class knows, a glob will roll around a desktop without wetting anything it touches.

"The consensus was that the surface tension of metals was just too high to wet the walls of the nanotubes," adds Collier, the co-lead author of the paper. This is not to say that researchers have never been able to force anything into a nanotube: in fact, they have, albeit by using more complex and less controllable ways that have always led to the formation of discontinuous wires.

Collier and Giapis enter the picture because they had been experimenting with coating nanotubes with an insulator in order to create tiny probes for future medical and industrial applications. In attaching nanotubes to gold-coated atomic force microscope tips to form nanoprobes, they discovered that the setup provided a novel way of making liquid mercury rise in the tubes by capillary action.

Casting far beyond the nanotube research papers of the last decade, the researchers found an 1875 study by Nobel Prize-winning physicist Gabriel Lippmann that described in detail how the surface tension of mercury is altered by the application of an electrical potential. Lippmann's 1875 paper provided the starting point for Collier and Giapis to begin their electrowetting experiments.
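The physics Lippmann described can be sketched in a quick back-of-the-envelope calculation. The snippet below is illustrative only, not the authors' analysis: it uses the standard Young-Lippmann relation, with an assumed zero-voltage contact angle for mercury on carbon and an assumed interfacial capacitance (both hypothetical round numbers), to estimate the voltage at which mercury's effective contact angle drops below 90 degrees, the classical condition for capillary rise into a tube.

```python
import math

# Illustrative numbers only -- not taken from the Science paper.
GAMMA_HG = 0.485              # surface tension of mercury, N/m, at room temperature
THETA_0 = math.radians(140)   # assumed zero-voltage contact angle on carbon (non-wetting)
C_AREA = 0.1                  # assumed effective capacitance per unit area, F/m^2

def contact_angle(volts):
    """Young-Lippmann equation: charge stored at the liquid/solid
    interface lowers the effective contact angle as voltage rises."""
    cos_theta = math.cos(THETA_0) + C_AREA * volts**2 / (2 * GAMMA_HG)
    cos_theta = max(-1.0, min(1.0, cos_theta))  # clamp to a physical value
    return math.degrees(math.acos(cos_theta))

def wetting_threshold():
    """Voltage at which the contact angle falls to 90 degrees,
    the point where capillary rise into the tube becomes possible."""
    return math.sqrt(2 * GAMMA_HG * (0.0 - math.cos(THETA_0)) / C_AREA)
```

With these assumed parameters the threshold lands at a few volts, which conveys the key point of the experiment: a modest applied voltage is enough to flip mercury from non-wetting to wetting, and removing the voltage restores the original surface tension, which is consistent with the mercury escaping once the voltage is switched off.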

After mercury entered the nanotubes with the application of a voltage, the researchers further discovered that the mercury rapidly escaped from the nanotubes immediately after the voltage was turned off. "This effect made it very difficult to provide hard proof that electrowetting occurred," Collier said. In the end, persistence and hard work paid off as the results in the Science paper demonstrate.

Giapis and Collier think that they will be able to drive various other metals into the nanotubes by employing the process at higher temperature. They hope to be able to freeze the metal nanowires in the nanotubes so that they remain intact when the voltage is turned off.

"We can pump mercury at this point, but it's possible that you could also pump nonmetallic liquids," Giapis says. "So we now have a way of pumping fluids controllably that could lead to nanofluidic devices. We envision making nano-inkjet printers that will use metal ink to print text and circuitry with nanometer precision. These devices could be scaled up to operate in a massively parallel manner. "

The paper is titled "Electrowetting in Carbon Nanotubes." In addition to Collier and Giapis, the other authors are Jinyu Chen, a postdoctoral scholar in chemistry, and Aleksandr Kutana, a postdoctoral scholar in chemical engineering.

Robert Tindol

Deciphering the Mystery of Bee Flight

PASADENA, Calif.- One of the most elusive questions in science has finally been answered: How do bees fly?

Although the issue is not as profound as how the universe began or what kick-started life on earth, the physics of bee flight has perplexed scientists for more than 70 years. In 1934, in fact, French entomologist Antoine Magnan and his assistant André Sainte-Laguë calculated that bee flight was aerodynamically impossible. The haphazard flapping of their wings simply shouldn't keep the hefty bugs aloft.

And yet, bees most certainly fly, and the dichotomy between prediction and reality has been used for decades to needle scientists and engineers about their inability to explain complex biological processes.

Now Michael H. Dickinson, the Esther M. and Abe M. Zarem Professor of Bioengineering, his postdoctoral student Douglas L. Altshuler, and their colleagues at Caltech and the University of Nevada, Las Vegas, have figured out honeybee flight using a combination of high-speed digital photography, to snap freeze-frame images of bees in motion, and a giant robotic mock-up of a bee wing. The results of their analysis appear in the November 28 issue of the Proceedings of the National Academy of Sciences.

"We're no longer allowed to use this story about not understanding bee flight as an example of where science has failed, because it is just not true," Dickinson says.

The secret of honeybee flight, the researchers say, is the unconventional combination of short, choppy wing strokes, a rapid rotation of the wing as it flops over and reverses direction, and a very fast wing-beat frequency.

"These animals are exploiting some of the most exotic flight mechanisms that are available to insects," says Dickinson.

Their furious flapping speed is surprising, Dickinson says, because "generally the smaller the insect the faster it flaps. This is because aerodynamic performance decreases with size, and so to compensate small animals have to flap their wings faster. Mosquitoes flap at a frequency of over 400 beats per second. Birds are more of a whump, because they beat their wings so slowly."

Being relatively large insects, bees would be expected to beat their wings rather slowly, and to sweep them across the same wide arc as other flying bugs (whose wings cover nearly half a circle). They do neither. Their wings beat over a short arc of about 90 degrees, but ridiculously fast, at around 230 beats per second. Fruit flies, in comparison, are 80 times smaller than honeybees, but flap their wings only 200 times a second.
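The stroke numbers above can be put side by side in a quick calculation. The sketch below is illustrative only: the stroke amplitudes and wing-beat frequencies come from the article, but the wing lengths are assumed round figures, not measurements from the study.

```python
import math

def mean_tip_speed(amplitude_deg, freq_hz, wing_length_m):
    """Mean wingtip speed: the tip sweeps the stroke arc twice
    (downstroke and upstroke) in each wingbeat cycle."""
    return 2 * math.radians(amplitude_deg) * freq_hz * wing_length_m

# Honeybee: ~90-degree stroke at ~230 Hz (from the article); ~1 cm wing assumed.
bee_tip_speed = mean_tip_speed(90, 230, 0.010)

# Fruit fly: near-half-circle (~165-degree) stroke at ~200 Hz; ~2.5 mm wing assumed.
fly_tip_speed = mean_tip_speed(165, 200, 0.0025)
```

Even with the short 90-degree arc, the honeybee's larger wing and high frequency give it a wingtip speed several times the fruit fly's, which is why the short-and-fast stroke still generates enough lift for so heavy an insect.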

When bees want to generate more power--for example, when they are carting around a load of nectar or pollen--they increase the arc of their wing strokes, but keep flapping at the same rate. That is also odd, Dickinson says, because "it would be much more aerodynamically efficient if they regulated not how far they flap their wings but how fast."

Honeybees' peculiar strategy may have to do with the design of their flight muscles.

"Bees have evolved flight muscles that are physiologically very different from those of other insects. One consequence is that the wings have to operate fast and at a constant frequency or the muscle doesn't generate enough power," Dickinson says.

"This is one of those cases where you can make a mistake by looking at an animal and assuming that it is perfectly adapted. An alternate hypothesis is that bee ancestors inherited this kind of muscle and now present-day bees must live with its peculiarities," Dickinson says.

How honeybees make the best of it may help engineers in the design of flying insect-sized robots: "You can't shrink a 747 wing down to this size and expect it to work, because the aerodynamics are different," he says. "But the way in which bee wings generate forces is directly applicable to these devices."


Contact: Kathy Svitil (626) 395-8022

Visit the Caltech Media Relations Web site at: