Radio Stars

Caltech's newest astronomy professor searches for cosmic radio waves

Growing up in rural northwest Ireland, beyond the reach of city lights, Gregg Hallinan fell in love with the night sky. "When you didn't have bad weather, and you didn't have clouds, the skies were nothing short of spectacular," he says. "From a young age, I was obsessed with astronomy—it's all I cared for. My parents got me a telescope when I was seven or eight, and from then on, that was it."

Now, Hallinan has brought his celestial obsession to Caltech as a new assistant professor of astronomy. He explores magnetic activity around stars, planets, and those in-between objects called brown dwarfs, which are balls of gas that just aren't big enough for nuclear fusion to ignite and turn them into stars. "I consider myself incredibly lucky that I can take my passion and hobby and have it as a career," he says.

Magnetic fields play important roles throughout the universe. In our own neighborhood, the sun's magnetic field causes all the features we see on its surface, like sunspots, solar flares, and long arcs of plasma called prominences. Earth's magnetic field forms a huge bubble that shields the planet from solar wind, which contains energetic particles that can strip the Earth of its ozone layer, our protection from harmful ultraviolet radiation. On Earth and other planets, like Jupiter, magnetic fields accelerate charged particles and slam them into the magnetic poles, creating light shows we know as the northern and southern lights.

Around stars, planets, and other astronomical objects, the accelerating particles also produce radio signals that travel across space. Hallinan's most surprising and important discovery yet was one he made as a graduate student at the National University of Ireland, Galway, when he found that brown dwarfs generate regular pulses of radio emissions—a feature more commonly associated with planets like Jupiter.

After receiving his PhD in 2008, he stayed at Galway as a postdoc for two years to follow up on his thesis work. He then spent five months at the National Radio Astronomy Observatory in Socorro, New Mexico, and a year at the University of California, Berkeley, before coming to Caltech.

Studying the fleeting blips of radio emission that magnetic fields produce has made Hallinan increasingly interested in a broader class of phenomena called radio transients. All kinds of cosmic events can generate these variable radio signals, such as exploding stars, mysterious blasts called gamma-ray bursts, and stars being ripped apart when they venture too close to a black hole.

One of the most exciting sources of radio transients is planets around other stars, Hallinan says, and hunting for radio-emitting planets will be a major focus of his research. Radio pulses from an exoplanet would indicate the presence of a magnetic field, and since Earth's protective magnetic field may have been crucial for allowing life to evolve, radio activity from an exoplanet could be a signature of a habitable planet. "Looking very long term, when we're characterizing habitable planets, magnetic fields could be important constraints for trying to figure out if there's life on those planets," Hallinan says. But so far, he adds, no one has detected radio emission from any exoplanet. "I'm trying to be one of the pioneers in trying to detect that radiation."

"The radio transient sky is virtually unexplored," he says. "We know there's stuff happening out there, but we haven't yet got the technology to systematically search for those transients." But Hallinan is working to change that, helping to lead exactly such a search for radio emission from exoplanets and other kinds of radio transients. In particular, he's enlisting the most powerful radio telescope in the world to help with the search: the Jansky Very Large Array in New Mexico. He's also working to bring a radio-transient monitoring project to Caltech's Owens Valley Radio Observatory, east of the Sierra Nevada. And with other new radio telescopes coming on line—such as the Low Frequency Array in Europe and the Long Wavelength Array in New Mexico—discoveries are on their way, Hallinan says. That especially includes exoplanets. "We're very hopeful that we'll find radio emission from other planets in the next few years."

Caltech is already the home of the Palomar Transient Factory, a project led by professor of astronomy and planetary science Shri Kulkarni that surveys the skies for flashes of light in visible wavelengths, instead of radio. "Caltech is pretty much unparalleled in the study of transient science," Hallinan says, and a radio-transient project will expand on Caltech's expertise. "The most exciting thing about radio transient work is that we don't know what's out there. You're at the cutting edge."

Aside from astronomy, he's a big fan of mixed martial arts; back in Ireland, he was a karate instructor. But these days, Hallinan is focused on the cosmos, a passion first kindled on those clear winter nights above the Irish countryside.  

Marcus Woo
Exclude from News Hub: 
News Type: 
Research News

Caltech's George Helou Honored by Home Country of Lebanon

George Helou, senior research associate in physics at Caltech, has received numerous honors over the past year from his home country of Lebanon in recognition of his work in astronomy. "It is gratifying to receive these accolades from my country of origin, as an indication of the value they attach to science and education," says Helou, who is also executive director of the Infrared Processing and Analysis Center (IPAC), deputy director of the Spitzer Science Center, and director of the Herschel Science Center. His recent honors include election to the Lebanese Academy of Sciences.

Katie Neith

Michael Aschbacher Wins Wolf Prize in Mathematics

Michael Aschbacher, the Shaler Arthur Hanisch Professor of Mathematics, will share the 2012 Wolf Prize in mathematics. The award recognizes his role in classifying types of mathematical objects called finite simple groups. According to the prize citation, "His impact on the theory of finite groups is extraordinary in its breadth, depth, and beauty."

"The classification of finite simple groups is one of the crowning achievements of modern mathematics," says Hirosi Ooguri, the Fred Kavli Professor of Theoretical Physics and Mathematics at Caltech. "It's wonderful that Michael is recognized as the principal architect of this work."

Aschbacher will share the prize, which includes $100,000, with Luis Caffarelli at the University of Texas, Austin, who was recognized for work on partial differential equations. They will receive the award from Israeli President Shimon Peres at a ceremony on May 13 at the Knesset in Jerusalem.

"Receiving an award such as the Wolf Prize is of course personally very satisfying," Aschbacher says. "The finite simple groups are the building blocks of finite group theory, playing a role somewhat analogous to that of prime numbers in arithmetic. As a result, the classification theorem is not only a beautiful and natural result, but it's also very useful."

The Wolf Prize has been awarded annually since 1978 in the fields of agriculture, chemistry, mathematics, medicine, physics, and the arts. Among this year's winners is singer Placido Domingo. Past winners include Stephen Hawking in physics, and violinist Isaac Stern and architect Frank Gehry in the arts. Previous winners from Caltech include Harry Gray, Ahmed Zewail, and Rudy Marcus in chemistry; and Alexander Varshavsky and the late Seymour Benzer, Edward Lewis, and Roger Sperry in medicine.

Aschbacher has recently garnered several awards for his work on finite simple groups. He was awarded the 2012 Leroy P. Steele Prize for Mathematical Exposition, and last year, he won the Rolf Schock Prize from the Royal Swedish Academy of Sciences. He also received the Cole Prize in Algebra and is a member of the National Academy of Sciences and the American Academy of Arts and Sciences.

Marcus Woo

AAS Honors John Johnson for Observational Astronomical Research

John Johnson, assistant professor of astronomy at Caltech, has been named the recipient of the American Astronomical Society's 2012 Newton Lacy Pierce Prize, which is awarded for outstanding achievement in observational astronomical research based on measurements of radiation from an astronomical object.

According to the award citation, Johnson was selected "for major contributions to understanding fundamental relationships between extrasolar planets and their parent stars."

Johnson's primary research focus is on the detection and characterization of exoplanets. Recently, he led a team of astronomers that discovered the three smallest confirmed planets ever detected outside our solar system.

"I'm proud because this reflects well on my entire team here at Caltech, as well as my collaborators at other institutions," said Johnson. "We put in a lot of hard work and we strive to produce the best, highest impact, and most trustworthy exoplanet science out there. This award is a nice validation that we're on the right track."

As the recipient of the Pierce Prize, Johnson will receive a cash award and has been invited to give the plenary talk at the AAS meeting in Long Beach next year.

"I can remember sitting in a plenary talk as a grad student thinking about how nice it would be to one day do research worthy of such an audience," says Johnson. "It looks like I've arrived."


Allison Benter

Astronomers Release Unprecedented Data Set on Celestial Objects that Brighten and Dim

PASADENA, Calif.—Astronomers from the California Institute of Technology (Caltech) and the University of Arizona have released the largest data set ever collected that documents the brightening and dimming of stars and other celestial objects—two hundred million in total.

The night sky is filled with objects like asteroids that dash across the sky and others—like exploding stars and variable stars—that flash, dim, and brighten. Studying such phenomena can help astronomers better understand the evolution of stars, massive black holes in the centers of galaxies, and the structure of the Milky Way. These types of objects were also essential for the recent discovery of dark energy—the mysterious energy that dominates the expansion of the universe—which earned last year's Nobel Prize.

Using the Catalina Real-Time Transient Survey (CRTS), a project led by Caltech, the astronomers systematically scanned the heavens for these dynamic objects, producing an unprecedented data set that will allow scientists worldwide to pursue new research.

"Exploring variable objects and transient phenomena like stellar explosions is one of the most vibrant and growing research areas in astrophysics," says S. George Djorgovski, professor of astronomy at Caltech and principal investigator on the CRTS. "In many cases, this yields unique information needed to understand these objects."

The new data set is based on observations taken with the 0.7-meter telescope on Mt. Bigelow in Arizona. The observations were part of the Catalina Sky Survey (CSS), a search for Near-Earth Objects (NEOs)—asteroids that may pose a threat to Earth—conducted by astronomers at the University of Arizona. By repeatedly taking pictures of large swaths of the sky and comparing these images to previous ones, the CRTS is able to monitor the brightness of about half a billion objects, allowing it to search for those that dramatically brighten or dim. In this way, the CRTS team identified tens of thousands of variables, maximizing the science that can be gleaned from the original data.
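The core of that comparison step, flagging objects whose measured brightness varies by far more than their usual scatter, can be sketched in a few lines. This is an illustrative toy, not the actual CRTS pipeline; the object names, magnitudes, and the 2-magnitude cut below are all made up for the example:

```python
def flag_transients(histories, threshold=2.0):
    """Return IDs of objects whose brightness history spans more than
    `threshold` magnitudes (a hypothetical cut, chosen for illustration)."""
    flagged = []
    for obj_id, mags in histories.items():
        # Magnitudes: smaller numbers mean brighter objects.
        if max(mags) - min(mags) > threshold:
            flagged.append(obj_id)
    return flagged

# Two made-up brightness histories, in magnitudes.
histories = {
    "steady_star": [15.02, 15.01, 14.99, 15.00],   # barely varies
    "dwarf_nova":  [17.8, 17.9, 13.1, 14.0],        # sudden brightening
}
print(flag_transients(histories))  # -> ['dwarf_nova']
```

A real survey pipeline works on images rather than ready-made light curves, but the decision at the end is the same: compare each object's new brightness against its history and keep the outliers.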

The new data set contains the so-called brightness histories of a total of two hundred million stars and other objects, incorporating over 20 billion independent measurements. "This set of objects is an order of magnitude larger than the largest previously available data sets of their kind," says Andrew Drake, a staff scientist at Caltech and lead author on a poster to be presented at the meeting of the American Astronomical Society in Austin on January 12. "It will enable many interesting studies by the entire astronomical community."

One of the unique features of the survey, Drake says, is that it emphasizes an open-data philosophy. "We discover transient events and publish them electronically in real time, so that anyone can follow them and make additional discoveries," he explains.

"It is a good example of scientific-data sharing and reuse," Djorgovski says. "We hope to set an example of how data-intensive science should be done in the 21st century."

The data set includes over a thousand exploding stars called supernovae, including many unusual and novel types; hundreds of so-called cataclysmic variables, which are pairs of stars in which one spills matter onto the other, a white dwarf; tens of thousands of other variable stars; and dwarf novae, which are binary stars that dramatically change in brightness.

"We take hundreds of images every night from each of our telescopes as we search for hazardous asteroids," adds Edward Beshore, principal investigator of the University of Arizona's asteroid-hunting CSS. "As far back as 2005, we were asking if this data could be useful to the community of astronomers. We are delighted that we could forge this partnership. In my estimation, it has been a great success and is a superb example of finding ways to get greater value from taxpayers' investments in basic science."

The team says they soon plan to release additional data taken with a 1.5-meter telescope on Mt. Lemmon in Arizona and a 0.5-meter telescope in Siding Spring in Australia.

In addition to Djorgovski, Drake, and Beshore, the team includes staff scientist Ashish Mahabal, computational scientist Matthew Graham, postdoctoral scholar Ciro Donalek, and research scientist Roy Williams from Caltech. Researchers from other institutions include Steve Larson, Andrea Boattini, Alex Gibbs, Al Grauer, Rik Hill, and Richard Kowalski from the University of Arizona; Mauricio Catelan from Universidad Católica in Chile; Eric Christensen from the Gemini Observatory in Hawaii; and Jose Prieto from Princeton University. The Caltech research is supported by the National Science Foundation. The work done at the University of Arizona is supported by NASA.

Marcus Woo

Aschbacher Receives Steele Prize

Michael Aschbacher, the Shaler Arthur Hanisch Professor of Mathematics, has been awarded the 2012 Leroy P. Steele Prize for Mathematical Exposition by the American Mathematical Society (AMS). Aschbacher, along with coauthors Richard Lyons of Rutgers University, Steve Smith of the University of Illinois at Chicago, and Ronald Solomon of Ohio State University, was recognized for a paper on the classification of certain types of groups, which are fundamental mathematical objects.

"AMS prizes are big honors, and we are proud that Michael has gotten this prize," says Barry Simon, the IBM Professor of Mathematics and Theoretical Physics. Over the last few decades, Aschbacher has played a leading role in the classification of so-called finite simple groups—an achievement that, Simon says, is one of the major mathematical accomplishments of the last 50 years, earning Aschbacher the Rolf Schock Prize from the Royal Swedish Academy of Sciences last year. "The mathematical proof involved in that work is so complicated that even experts in the specialty haven't absorbed it all," Simon says. "What Aschbacher and his coauthors got the Steele Prize for is an exposition for professional mathematicians that's one part of a wider program. Making this material accessible to mathematicians in different fields of mathematics is an important accomplishment."

In addition to the Schock Prize, Aschbacher has also received the Cole Prize in Algebra from AMS, and he is a member of the National Academy of Sciences and the American Academy of Arts and Sciences. The Steele Prize was awarded for the paper titled "The classification of finite simple groups: groups of characteristic 2 type," published in Mathematical Surveys and Monographs, Vol. 172.

Marcus Woo

The "Supernova of a Generation" Shows Its Stuff

Astronomers determine how the brightest and closest stellar explosion in 25 years blew up

PASADENA, Calif.—It was the brightest and closest stellar explosion seen from Earth in 25 years, dazzling professional and backyard astronomers alike. Now, thanks to this rare discovery—which some have called the "supernova of a generation"—astronomers have the most detailed picture yet of how this kind of explosion happens. Known as a Type Ia supernova, this type of blast is an essential tool that allows scientists to measure the expansion of the universe and understand the very nature of the cosmos.

"What caused these explosions has divided the astronomical community deeply," says Shri Kulkarni, the John D. and Catherine T. MacArthur Professor of Astronomy and Planetary Science. But this new supernova—dubbed SN2011fe—can help astronomers solve this longstanding mystery. "SN2011fe is like the Rosetta Stone of Type Ia supernovae," says Kulkarni, who is also the principal investigator on the Palomar Transient Factory (PTF). Led by the California Institute of Technology (Caltech), the PTF is designed to survey the skies for transient flashes of light that last for a few days or months, such as those emitted by exploding stars.

On August 24, the PTF team discovered the supernova in one of the arms of the Pinwheel Galaxy (also called M101), 21 million light years away. They caught the supernova just 11 hours after it exploded.

"Never before have we seen a stellar thermonuclear explosion so soon after it happened," says Lars Bildsten, professor of theoretical astrophysics at the Kavli Institute for Theoretical Physics at UC Santa Barbara, and member of the PTF team, which described its supernova findings in the December 15 issue of the journal Nature.

The PTF team uses an automated system to search for supernovae, and because they were able to point their telescopes at SN2011fe so quickly after its detonation, the astronomers were able to put together a blow-by-blow analysis of the explosion, determining that the supernova involves a dense, Earth-sized object called a white dwarf and, most likely, a main-sequence star (a star in the main stage of its life).

Scientists have long suspected that Type Ia supernovae involve a binary system of two stars in orbit around each other, with one of those stars being a white dwarf. The white dwarf, which is made out of carbon and oxygen, explodes when matter from its companion star spills over onto its surface. But no one is sure what kind of star the companion is. Scientists have suggested that it's another white dwarf, a main-sequence star, a helium star, or a star in a late life stage that's puffed up into a red giant.

Still, because the explosion always involves a white dwarf, its overall brightness and behavior is relatively predictable, making it a useful tool for measuring distances. Since all Type Ia supernovae produce about the same amount of light, those that appear dimmer must be farther away. In this way, by measuring the brightness of supernovae, astronomers can use them as cosmic meter sticks to determine the size of the universe—and how fast it's expanding. In fact, the work that earned the 2011 Nobel Prize in physics—the discovery that the expansion of the universe is speeding up—was based on observations using Type Ia supernovae.
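The "cosmic meter stick" logic above is usually written as the distance modulus, m - M = 5 log10(d / 10 pc), where m is the apparent (observed) magnitude, M the absolute magnitude, and d the distance. A minimal sketch, assuming a peak absolute magnitude of about -19.3 for a Type Ia supernova (a commonly quoted approximate value, not a figure from this article):

```python
import math

def distance_parsecs(apparent_mag, absolute_mag=-19.3):
    """Invert the distance modulus m - M = 5*log10(d / 10 pc).
    absolute_mag = -19.3 is an assumed approximate peak brightness
    for Type Ia supernovae, used here only for illustration."""
    return 10 ** ((apparent_mag - absolute_mag + 5) / 5)

# A dimmer (numerically larger-magnitude) supernova comes out farther away:
d_near = distance_parsecs(10.0)   # roughly 7 million parsecs
d_far = distance_parsecs(15.0)    # roughly 70 million parsecs
print(d_far / d_near)             # 5 magnitudes dimmer = 10x the distance
```

Inverting the relation shows directly why dimmer supernovae must be farther away: every five magnitudes of dimming corresponds to a factor of ten in distance.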

"This discovery is exciting because the supernova's infancy and proximity allows us to directly see what the progenitor system is," explains Mansi Kasliwal, an astronomer at the Carnegie Institution for Science who is a recent Caltech doctoral graduate and a coauthor on the paper. "We have expected for a while that a Type Ia supernova involves a carbon-oxygen white dwarf, but now we have direct evidence."

In the case of SN2011fe, the researchers were also able to deduce, by process of elimination, that the companion star is most likely a main-sequence star. How do they know?

If the companion were a red giant, the explosion of the white dwarf would send a shock wave through the red giant, heating it. This scenario would have generated several tens of times more light than the astronomers observed. Additionally, it happens that the Hubble Space Telescope took images of the location where SN2011fe lived before it blew up. When the researchers looked at the data, they found no evidence of red giants or helium stars.

If the companion were another white dwarf, the interactions between the companion and the explosion would produce light in the optical and ultraviolet wavelengths. Since none of this sort of radiation was seen coming from SN2011fe, it is less likely that the companion was a white dwarf.

These results—which they describe in a companion paper in the same issue of Nature—along with X-ray and radio observations that also fail to see any evidence for red giants or helium stars, rule those out as the companion. Caltech postdoc Assaf Horesh is the lead author on the paper describing the X-ray and radio data, which will be published in The Astrophysical Journal.

The astronomers have also observed, in unprecedented detail, the material that's blown off during the explosion. In particular, the team detected oxygen hurtling out from the supernova at speeds of over 20,000 kilometers per second—the first time anyone has seen high-speed oxygen coming from a Type Ia supernova, according to the researchers. "These observations probe the thin, outermost layers of the explosion," Bildsten says. "These are the parts that are moving the fastest, for which we have never been able to see this mix of atomic elements."

Not only was the supernova detected quickly, but the data processing—performed by researchers led by Peter Nugent, staff scientist at Lawrence Berkeley National Laboratory—was also done within hours. The machine-learning algorithms developed by Joshua Bloom, an associate professor at UC Berkeley, also helped make the fast find possible. And because the astronomers caught the blast so soon after it ignited, and because it's so close, the researchers say SN2011fe will become one of the best-studied supernovae ever.

"The rapid discovery and classification of SN2011fe—all on the same night—is a testament to the great teamwork between all the researchers from over a half a dozen institutions," Kulkarni says. "The future looks very bright. Soon we should be finding supernovae at an even younger age and thereby better understand how these explosions happen."

Nugent is the lead author on the Nature paper, which is titled, "Supernova 2011fe from an exploding carbon-oxygen white dwarf star." The lead author on the companion paper, "Exclusion of a luminous red giant as a companion star to the progenitor of supernova SN 2011fe," is Weidong Li of UC Berkeley. The Astrophysical Journal paper is titled, "Early radio and X-ray observations of the youngest nearby type Ia supernova PTF11kly (SN 2011fe)."

The Palomar Transient Factory (PTF) uses the 48-inch Oschin Schmidt telescope and the 60-inch telescope of the Palomar Observatory of Caltech for its observations and is a collaboration between Caltech, Columbia University, Las Cumbres Observatory Global Telescope, Lawrence Berkeley National Laboratory, Oxford University, UC Berkeley, and the Weizmann Institute of Science.

NOTE: Weidong Li died on December 12, just before the publication of these papers.

Marcus Woo

More Clues in the Hunt for the Higgs

Physicists unveil the largest amount of data ever presented for the Higgs search

PASADENA, Calif.—Physicists have announced that the Large Hadron Collider (LHC) has produced yet more tantalizing hints for the existence of the Higgs boson. At the European Center for Nuclear Research (CERN) in Geneva, the international team of thousands of scientists—including many from the California Institute of Technology (Caltech)—unveiled for the first time all the data taken over the last year from the two main detectors at the LHC: the Compact Muon Solenoid (CMS) and ATLAS (A Toroidal LHC ApparatuS). The results represent the largest amount of data ever presented for the Higgs search.

The Higgs boson is a hypothesized particle that endows every other particle with mass, and is the presumed last piece of the so-called Standard Model, the theory that describes how every particle interacts. According to physicists, the discovery of the Higgs boson, in whatever form it may take, is crucial for understanding the fundamental laws of physics.  

The team says they have seen what they call "excess events"—a slight surplus of particle-collision events over what would be expected if the Higgs didn't exist. This suggests that the particle might have a mass between 115 and 127 gigaelectron volts (GeV, a unit of mass; in comparison, the mass of a proton is about 1 GeV). While the physicists can't yet claim discovery of the elusive particle, they are closer than ever, having ruled out a very large range of the Higgs's possible masses with great certainty.

But if physicists do not find the Higgs in the remaining mass range between 115 and 127 GeV, then it means the particle—if it exists at all—is of a more exotic form, requiring new theories of physics. Although remarkably successful, the Standard Model is incomplete, and an exotic form of the Higgs could help point the way toward a more complete theory—which physicists say is an exciting challenge.

"This is the beginning of a game-changing time for particle physics," says Harvey Newman, professor of physics. Along with professor of physics Maria Spiropulu, Newman leads Caltech's group that works on the CMS detector.

The LHC searches for the Higgs boson by slamming together protons at near-light speeds, producing new particles like the Higgs in the process. The problem is that the particle is exceptionally short-lived, decaying into other smaller particles within a tiny fraction of a second of its birth. To find the Higgs, physicists have to pick through the remains of each proton collision and reconstruct what happened.

If the collisions successfully produce a Higgs, then the particle can decay in several ways, depending on its mass. The Caltech team helped analyze three of these decays, called channels, in which the Higgs either decays into two photons, a pair of particles called W bosons, or another pair called Z bosons.

Graduate students Yong Yang, Yousi Ma, Jan Veverka, and Vladlen Timciuc are all studying the photon-photon channel, searching for the Higgs as well as possible signs of new physics phenomena. Caltech Tolman Postdoctoral Scholar Emanuele Di Marco helped lead the analysis of the W-boson channel for the CMS team. Spiropulu, graduate student Chris Rogan, and other colleagues have also done preparatory studies for the Z-boson channel, which they published in the Physical Review in 2010.

"The impressive data produced by the CMS is the result of the experiment's ability to cleanly identify and precisely measure the energies of photons, electrons, and positrons," says Adi Bornheim, a Caltech staff scientist who heads the electromagnetic calorimeter (ECAL) detector group at CMS. Composed of 76,000 crystal detectors and weighing in at more than 90 tons, the calorimeter measures the energy of the electrons and photons produced by LHC collisions with high resolution.

"For the last 17 years, our group has led the way in constructing and calibrating the calorimeter, which has to be extremely precise for these demanding studies," adds Marat Gataullin, an assistant scientist at Caltech.

The collision experiments in 2010 and 2011 at the LHC operated only at half their designed energy levels, the researchers say. Still, the experiments exceeded design specifications for the proton beam's focus and intensity, producing about one quadrillion (a million billion) proton collisions and resulting in millions of gigabytes of data. Even though the latest experiments were more complicated than ever, the scientists in the ATLAS and CMS teams say they were able to analyze the data in record time, using new technology pioneered and developed by Newman's group at Caltech.

In order to confirm once and for all whether the Higgs exists as physicists understand it—or if they'll need to come up with new theories—the LHC will need to be cranked up to collide protons with more energy. "We now need more data—three or four times the data that we expect the LHC to deliver in 2012," Newman says. The LHC is currently operating at around seven teraelectron volts (a TeV is a thousand times larger than a GeV), and scientists are considering boosting the energy in the next year, which will help in the search. The LHC is designed to smash protons using energies as high as 14 TeV. "We foresee reaching 14 TeV by 2015, boosting the intensity by ten times starting in around 2022, and cranking up the energy to 33 TeV starting about 20 years from now," he adds. "This will open a vast new realm for exploration, and will surely revolutionize our understanding of the nature of matter and forces at the most basic level."

The Caltech CMS group, which includes eight graduate students and several postdocs, engineers, and technical staff, is working on many other projects in addition to the Higgs search, such as exploring supersymmetry (a theory that says every particle has a "supersymmetric" partner), searching for other exotic, theoretical particles, and developing new kinds of particle detectors.

"We're grateful for the achievements of the LHC team and our colleagues with CMS," Spiropulu says. "We are working hard on the final stage of improving the experiments and on publishing the results—both about the Higgs and possible new, exciting theories of physics—in the coming weeks and months."

For more information, visit the Caltech CMS website, the CMS public site, and the American CMS site.

Marcus Woo

High-Energy Physicists Set Record for Network Data Transfer

With a sustained data rate of 186 gigabits per second, high-energy physicists demonstrate the efficient use of long-range networks to support cutting-edge science

PASADENA, Calif.—Researchers have set a new world record for data transfer, helping to usher in the next generation of high-speed network technology. At the SuperComputing 2011 (SC11) conference in Seattle during mid-November, the international team transferred data in opposite directions at a combined rate of 186 gigabits per second (Gbps) in a wide-area network circuit. The rate is equivalent to moving two million gigabytes per day, fast enough to transfer nearly 100,000 full Blu-ray disks—each with a complete movie and all the extras—in a day.
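The unit conversions behind that comparison are easy to check. A quick sketch, assuming 8 bits per byte, 10^9 bytes per gigabyte, and a 25 GB single-layer Blu-ray disk (all assumptions of this sketch, not figures stated in the article):

```python
RATE_GBPS = 186            # combined two-way rate, in gigabits per second
SECONDS_PER_DAY = 86_400
BLU_RAY_GB = 25            # assumed single-layer disk capacity, in GB

# Convert gigabits/second to gigabytes/day.
bytes_per_day = RATE_GBPS * 1e9 / 8 * SECONDS_PER_DAY
gigabytes_per_day = bytes_per_day / 1e9
print(f"{gigabytes_per_day:,.0f} GB per day")

# How many assumed 25 GB disks that volume would fill.
print(f"{gigabytes_per_day / BLU_RAY_GB:,.0f} disks per day")
```

With these round numbers the rate works out to about two million gigabytes and roughly 80,000 single-layer disks per day, the same order of magnitude as the article's figures.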

The team of high-energy physicists, computer scientists, and network engineers was led by the California Institute of Technology (Caltech), the University of Victoria, the University of Michigan, the European Center for Nuclear Research (CERN), Florida International University, and other partners.

According to the researchers, the achievement will help establish new ways to transport the increasingly large quantities of data that traverse continents and oceans via global networks of optical fibers. These new methods are needed for the next generation of network technology—which allows transfer rates of 40 and 100 Gbps—that will be built in the next couple of years.

"Our group and its partners are showing how massive amounts of data will be handled and transported in the future," says Harvey Newman, professor of physics and head of the high-energy physics (HEP) team. "Having these tools in our hands allows us to engage in realizable visions others do not have. We can see a clear path to a future others cannot yet imagine with any confidence."

Using a 100-Gbps circuit set up by Canada's Advanced Research and Innovation Network (CANARIE) and BCNET, a non-profit, shared IT services organization, the team was able to reach transfer rates of 98 Gbps between the University of Victoria Computing Centre located in Victoria, British Columbia, and the Washington State Convention Center in Seattle. With a simultaneous data rate of 88 Gbps in the opposite direction, the team reached a sustained two-way data rate of 186 Gbps between two data centers, breaking the team's previous peak-rate record of 119 Gbps set in 2009.

In addition, partners from the University of Florida, the University of California at San Diego, Vanderbilt University, Brazil (Rio de Janeiro State University and the São Paulo State University), and Korea (Kyungpook National University and the Korean Institute for Science and Technology Information) helped with a larger demonstration, transferring massive amounts of data between the Caltech booth at the SC11 conference and other locations within the United States, as well as in Brazil and Korea.

The fast transfer rate is also crucial for dealing with the tremendous amounts of data coming from the Large Hadron Collider (LHC) at CERN, the particle accelerator that physicists hope will help them discover new particles and better understand the nature of matter, space, and time, solving some of the biggest mysteries of the universe. More than 100 petabytes (more than four million Blu-ray disks) of data have been processed, distributed, and analyzed using a global grid of 300 computing and storage facilities located at laboratories and universities around the world, and the data volume is expected to rise a thousand-fold as physicists crank up the collision rates and energies at the LHC.
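The LHC data figures scale the same way. A short check, again assuming decimal units (1 PB = 10^6 GB) and a 25 GB single-layer disk, neither of which comes from the article:

```python
# Scale check on the LHC data-volume figures.

PETABYTES_PROCESSED = 100
BLU_RAY_GB = 25            # assumed single-layer disk capacity

disks = PETABYTES_PROCESSED * 1e6 / BLU_RAY_GB
print(f"{disks:,.0f} disks")        # → 4,000,000 ("more than four million")

# The projected thousand-fold rise in data volume, in exabytes:
future_pb = PETABYTES_PROCESSED * 1000
print(f"{future_pb / 1000:,.0f} exabytes")   # → 100 exabytes
```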

"Enabling scientists anywhere in the world to work on the LHC data is a key objective, bringing the best minds together to work on the mysteries of the universe," says David Foster, the deputy IT department head at CERN.

"The 100-Gbps demonstration at SC11 is pushing the limits of network technology by showing that it is possible to transfer petascale particle physics data in a matter of hours to anywhere around the world," adds Randall Sobie, a research scientist at the Institute of Particle Physics in Canada and team member.

The key to discovery, the researchers say, is in picking out the rare signals that may indicate new physics discoveries from a sea of potentially overwhelming background noise caused by already understood particle interactions. To do this, individual physicists and small groups located around the world must repeatedly access—and sometimes extract and transport—multiterabyte data sets on demand from petabyte data stores. That's equivalent to grabbing hundreds of Blu-ray movies all at once from a pool of hundreds of thousands. The HEP team hopes that the demonstrations at SC11 will pave the way toward more effective distribution and use of the masses of LHC data for future discoveries.

"By sharing our methods and tools with scientists in many fields, we hope that the research community will be well positioned to further enable their discoveries, taking full advantage of 100 Gbps networks as they become available," Newman says. "In particular, we hope that these developments will afford physicists and young students the opportunity to participate directly in the LHC's next round of discoveries as they emerge."

More information about the demonstration, including a video, is available online.

This work was supported by the U.S. Department of Energy Office of Science and the National Science Foundation, in cooperation with the funding agencies of the international partners. Equipment and support were also provided by the team's industry partners: CIENA, Brocade, Mellanox, Dell and Force10 (now Dell/Force10), and Supermicro.

Harvey B. Newman, Professor of Physics
(626) 395-6656

Marcus Woo

Snowflake Science

Caltech Physicist Explains Why Snowflakes Are So Thin and Flat

We've all heard that no two snowflakes are alike. Caltech professor of physics Kenneth Libbrecht will tell you that this has to do with the ever-changing conditions in the clouds where snow crystals form. Now Libbrecht, widely known as the snowflake guru, has shed some light on a grand puzzle in snowflake science: why the canonical, six-armed "stellar" snowflakes wind up so thin and flat.

Few people pay close attention to the form that snow crystals—a.k.a. snowflakes—take as they fall from the sky. But in the late 1990s, Libbrecht's interest in the tiny white doilies was piqued. The physicist, who until then had worked to better understand the sun and to detect cosmic gravitational waves, happened across an article describing one of many common snowflake structures—a capped column, which looks something like an icy thread bobbin under the microscope. Such a snowflake starts out, as all do, as a hexagonal crystal of ice. As it grows, accumulating water molecules from the air, it forms a tiny column. Then it encounters conditions elsewhere in the cloud that promote the growth of platelike structures, so it ends up with platelike caps at both ends of the column.

"I read about capped columns, and I just thought, 'I grew up in snow country. How come I've never seen one of these?'" Libbrecht says. The next time he went home to North Dakota, he grabbed a magnifying glass and headed outside. "I saw capped columns. I saw all these different snowflakes," he says. "It's very easy. It's just that I had never looked."

Since then, he has published seven books of snowflake photographs, including a field guide for other eager snowflake watchers. And his library of snowflake images boasts more than 10,000 photographs. But Libbrecht is a physicist, so beyond capturing stunning pictures, he wanted to understand the molecular dynamics that dictate how ice crystals grow. For that, he's developed methods for growing and analyzing snowflakes in the lab.

Now Libbrecht believes he's on his way to explaining one of the major outstanding questions of snowflake science—a question at the heart of his original interest in capped columns all those years ago. Scientists have known for more than 75 years that at conditions typically found in snowflake-producing clouds, ice crystals follow a standard pattern of growth: near -2°C, they grow into thin, platelike forms; near -5°C, they create slender columns and needles; near -15°C, they become really thin plates; and at temperatures below -30°C, they're back to columns. But no one has been able to explain why such relatively small changes in temperature yield such dramatic changes in snowflake structure.
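The habit regimes just described can be summarized as a simple lookup. This toy function encodes the temperatures named in the text; the boundary values between regimes are illustrative guesses, not measured transition points:

```python
# Toy lookup of the snow-crystal growth habits described above.
# Regime boundaries (-3.5, -10, -22) are assumed midpoints between the
# temperatures named in the article, not experimental values.

def crystal_habit(temp_c: float) -> str:
    """Return the dominant growth habit for a cloud temperature in Celsius."""
    if temp_c > -3.5:
        return "thin plates"          # near -2 C
    if temp_c > -10:
        return "columns and needles"  # near -5 C
    if temp_c > -22:
        return "very thin plates"     # near -15 C
    return "columns"                  # below about -30 C

print(crystal_habit(-2))    # → thin plates
print(crystal_habit(-15))   # → very thin plates
```

The puzzle the article describes is exactly why a table like this exists at all: why a few degrees flips the habit so completely.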

Libbrecht started his observations with the thinnest, largest platelike snowflakes, which form around -15°C in high humidity. Some of these snowflakes are about as sharp as the edge of a razor blade. "What I found in my experiments," Libbrecht says, "is a growth instability, or sharpening effect." He noticed that as a snow crystal develops at -15°C, the top edge starts to develop a little bump of a ledge, which gets sharp at the tip. Basically, the corners stick out a bit farther toward the moist air, so they grow faster. And a cycle begins: "As soon as the ledge gets a little bit sharper, then it grows faster, and if it grows faster, then it gets sharper still, creating a positive feedback effect," Libbrecht says. "In the atmosphere, it would just get bigger and bigger and thinner and thinner, and eventually you'd get a really nice, beautiful snowflake."
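The feedback cycle described above can be sketched as a minimal iteration: growth rate is taken to be proportional to the edge's sharpness, and growth in turn increases the sharpness. The gain constant and units are arbitrary; this illustrates only the runaway character of the instability, not Libbrecht's actual model:

```python
# Minimal sketch of the positive-feedback "sharpening" instability.
# Assumption (not from the article): growth per step is proportional
# to current sharpness, with an arbitrary gain of 0.5.

def sharpen(sharpness: float, steps: int, gain: float = 0.5) -> float:
    for _ in range(steps):
        growth = gain * sharpness   # a sharper edge grows faster...
        sharpness += growth         # ...and faster growth sharpens it further
    return sharpness

# A small initial bump amplifies by 1.5x per step, i.e. exponentially.
print(round(sharpen(1.0, 10), 2))   # → 57.67  (1.5 ** 10)
```

Any multiplicative feedback like this grows exponentially, which is why a barely perceptible ledge can run away into a razor-thin plate.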

If this sharpening effect occurs at other temperatures, which is likely, then it explains how small changes in temperature can yield such wildly varying snowflake structures. "The sharpening effect can yield thin plates or slender columns, just by changing directions," Libbrecht says. "That's a big piece of the puzzle, because now you don't have to make these enormous changes to get different structures. You just have to explain why the instability tips to produce plates at some temperatures, and tips to make columns at other temperatures. The flip-flopping of the sharpening effect nicely explains how the ice growth rates can change by a factor of 1000 when the temperature changes by just a few degrees."

Libbrecht can't yet fully explain the underlying molecular mechanisms that produce the sharpening effect or exactly why different temperatures lead to sharpening on different faces of growing snow crystals. "But," he says, "this is a real advance in snowflake science. Now you can explain why the plates are so thin and the columns are so tall."

Kimm Fesenmaier