Peering Into the Heart of a Supernova

Caltech simulation points out how to detect a rapidly spinning stellar core

PASADENA, Calif.—Each century, about two massive stars in our own galaxy explode, producing magnificent supernovae. These stellar explosions send fundamental, uncharged particles called neutrinos streaming our way and generate ripples called gravitational waves in the fabric of space-time. Scientists are waiting for the neutrinos and gravitational waves from about 1,000 supernovae that have already exploded at distant locations in the Milky Way to reach us. Here on Earth, large, sensitive neutrino and gravitational-wave detectors can pick up these signals, which will provide information about what happens in the core of a collapsing massive star just before it explodes.

To make sense of those data, however, scientists will need to know in advance how to interpret the information the detectors collect. To that end, researchers at the California Institute of Technology (Caltech) have found via computer simulation what they believe will be an unmistakable signature of one feature of such an event: if the interior of the dying star is spinning rapidly just before it explodes, the emitted neutrino and gravitational-wave signals will oscillate together at the same frequency.

"We saw this correlation in the results from our simulations and were completely surprised," says Christian Ott, an assistant professor of theoretical astrophysics at Caltech and the lead author on a paper describing the correlation, which appears in the current issue of the journal Physical Review D. "In the gravitational-wave signal alone, you get this oscillation even at slow rotation. But if the star is very rapidly spinning, you see the oscillation in the neutrinos and in the gravitational waves, which very clearly proves that the star was spinning quickly—that's your smoking-gun evidence."

Scientists do not yet know all the details that lead a massive star—one that is at least 10 times as massive as the Sun—to become a supernova. What they do know (which was first hypothesized by Caltech astronomer Fritz Zwicky and his colleague Walter Baade in 1934) is that when such a star runs out of fuel, it can no longer support itself against gravity's pull, and the star begins to collapse in upon itself, forming what is called a proto-neutron star. They also now know that another force, called the strong nuclear force, takes over and leads to the formation of a shock wave that begins to tear the stellar core apart. But this shock wave is not energetic enough to completely explode the star; it stalls part way through its destructive work.

There needs to be some mechanism—what scientists refer to as the "supernova mechanism"—that completes the explosion. But what could revive the shock? Current theory suggests several possibilities. Neutrinos could do the trick if they were absorbed just below the shock, re-energizing it. The proto-neutron star could also rotate rapidly enough, like a dynamo, to produce a magnetic field that could force the star's material into an energetic outflow, called a jet, through its poles, thereby reviving the shock and leading to explosion. It could also be a combination of these or other effects. The new correlation Ott's team has identified provides a way of determining whether the core's spin rate played a role in creating any detected supernova.

It would be difficult to glean such information from telescope observations, for example, because those provide information only about the surface of the star, not its interior. Neutrinos and gravitational waves, on the other hand, are emitted from inside the stellar core and barely interact with other matter as they race through space at or near the speed of light. That means they carry unaltered information about the core with them.

The ability neutrinos have to pass through matter, interacting only ever so weakly, also makes them notoriously difficult to detect. Nonetheless, neutrinos have been detected: twenty neutrinos from Supernova 1987A in the Large Magellanic Cloud were picked up in February 1987. If a supernova went off in the Milky Way, it is estimated that current neutrino detectors would pick up about 10,000 neutrinos. In addition, scientists and engineers now have detectors—such as the Laser Interferometer Gravitational-Wave Observatory, or LIGO, a collaborative project supported by the National Science Foundation and managed by Caltech and MIT—in place to detect and measure gravitational waves for the first time.

Ott's team happened across the correlation between the neutrino signal and the gravitational-wave signal when looking at data from a recent simulation. Previous simulations focusing on the gravitational-wave signal had not included the effect of neutrinos after the formation of a proto-neutron star. This time around, they wanted to look into that effect. 

"To our big surprise, it wasn't that the gravitational-wave signal changed significantly," Ott says. "The big new discovery was that the neutrino signal has these oscillations that are correlated with the gravitational-wave signal." The correlation was seen when the proto-neutron star reached high rotational velocities—spinning about 400 times per second.

Future simulation studies will look in a more fine-grained way at the range of rotation rates over which the correlated oscillations between the neutrino signal and the gravitational-wave signal occur. Hannah Klion, a Caltech undergraduate student who recently completed her freshman year, will conduct that research this summer as a Summer Undergraduate Research Fellowship (SURF) student in Ott's group. When the next nearby supernova occurs, the results could help scientists elucidate what happens in the moments right before a collapsed stellar core explodes.

In addition to Ott, other Caltech authors on the paper, "Correlated Gravitational Wave and Neutrino Signals from General-Relativistic Rapidly Rotating Iron Core Collapse," are Ernazar Abdikamalov, Evan O'Connor, Christian Reisswig, Roland Haas, and Peter Kalmus. Steve Drasco of the California Polytechnic State University in San Luis Obispo, Adam Burrows of Princeton University, and Erik Schnetter of the Perimeter Institute for Theoretical Physics in Ontario, Canada, are also coauthors. Ott is an Alfred P. Sloan Research Fellow.

Most of the computations were completed on the Zwicky Cluster in the Caltech Center for Advanced Computing Research. Ott built the cluster with a grant from the National Science Foundation. It is supported by the Sherman Fairchild Foundation.

Writer: Kimm Fesenmaier

Physicists Discover a New Particle that May Be the Higgs Boson

International consortium of scientists includes a large Caltech contingent

PASADENA, Calif.—Physicists at the Large Hadron Collider (LHC) in Geneva, Switzerland, have discovered a new particle that may be the long-sought Higgs boson, the fundamental particle that is thought to endow elementary particles with mass. 

"This is a momentous time in the history of particle physics and in scientific exploration—the implications are profound," says Harvey Newman, professor of physics at the California Institute of Technology (Caltech). "This is experimental science at its best."

The international team of scientists and engineers—which includes a large contingent from Caltech, led by Newman and Maria Spiropulu, professor of physics—says it needs more data to determine for certain if the particle they've discovered is indeed the Higgs boson predicted by the Standard Model, the theory that describes how all particles interact. The results so far, however, show that it has many of the properties expected for such a particle. 

"One of the most exciting aspects of this observation is that the road remains open for a vast range of 'lookalike' alternatives, where any deviation from the Standard Model would point the way to the existence of other new particles or forces of nature," Newman says.

Regardless of the exact identity of the new particle, CERN's scientists say, the highly anticipated discovery heralds a new era in physics.

The physicists revealed their latest results at a seminar at the European Center for Nuclear Research (CERN) in Geneva, which was shared with the world via a live webcast and the EVO (Enabling Virtual Organizations) system developed at Caltech.

The discovery itself is based on the analysis of an unprecedented amount of data that was collected by the two main detectors at the LHC—the Compact Muon Solenoid (CMS) and A Toroidal LHC Apparatus (ATLAS). The data was collected with the LHC running at 7 teraelectron volts (TeV, a unit of energy) in 2011 and 8 TeV in 2012. The Caltech team is part of the CMS collaboration.

Using the CMS detector, the physicists identified signals that point to a new particle with a mass of 125 gigaelectron volts (GeV, a unit of mass; in comparison, the mass of a proton is about 1 GeV). The team running the ATLAS detector, which searches for the Higgs using a different method, reported similar results.

"This is an incredible, exciting moment," says Spiropulu. "Even these early results give us important hints as to how mass in the universe came to be. Together with hundreds of our colleagues, Caltech scientists have worked for decades to reach this point: building multiple generations of experiments; designing and building detectors to precisely measure photons, electrons, and muons, which are keys to the discovery; and inventing worldwide systems that empower thousands of physicists throughout the world to collaborate day and night, share and analyze the data, and develop the new techniques leading to this great result."

To search for the Higgs, physicists have had to comb through the remains of trillions of particle collisions produced by the LHC, which accelerates protons around a ring almost five miles wide to nearly the speed of light. As the protons careen toward each other, a small fraction of them collide, creating other particles such as the Higgs. It is estimated that it takes one billion collisions to make just one Higgs boson.

The Higgs is fleeting, however, and quickly decays into lighter particles, which are the traces that allow the Higgs to be detected and analyzed. The Higgs is predicted to decay in several different ways, or channels, each resulting in different particles. CMS focuses mainly on the decay channels that result in two photons or two other particles called Z bosons. It was by measuring and analyzing these and other decay particles that the physicists were able to discover the potential Higgs.

When all the decays are taken into account in combination, the scientists announced, the data for a signal corresponding to a Standard Model Higgs boson have a statistical significance of five sigmas. This means the signal is highly unlikely to be a statistical fluke caused by some rare background fluctuation. Only when data are significant to five sigmas are physicists confident enough to declare a discovery.
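For readers curious about the arithmetic behind "five sigma": the one-sided tail of a normal distribution gives the probability that pure background fluctuations alone would produce a signal at least that large. A minimal sketch in Python (these are standard statistical thresholds, not figures taken from the CMS analysis itself):

```python
import math

def sigma_to_pvalue(n_sigma: float) -> float:
    """One-sided Gaussian tail probability at n_sigma standard deviations."""
    return 0.5 * math.erfc(n_sigma / math.sqrt(2))

# Three sigma (December's excess): roughly 1 chance in 740 of a background fluke.
print(sigma_to_pvalue(3))  # ~1.35e-3
# Five sigma (the discovery threshold): roughly 1 chance in 3.5 million.
print(sigma_to_pvalue(5))  # ~2.87e-7
```

This is why the three-sigma excess seen in December generated excitement but no discovery claim, while the five-sigma result did.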

Last December, evidence seen in the data from CMS generated plenty of excitement as a result of an excess of events—a slight surplus in particle collision events over what would have been expected if the Higgs does not exist—with a statistical significance of just three sigmas.

The Higgs boson is the last remaining fundamental particle predicted by the Standard Model yet to be detected, and the hope of detecting it was one of the chief reasons for building the LHC. The LHC accelerator and the CMS and ATLAS experiments are arguably the largest and most complex scientific instruments ever built.

Despite its many successes, the Standard Model is incomplete—it does not incorporate gravity or dark matter, for example. One of the goals of physicists, then, is to develop more complete theories that better explain the composition of the universe and what happened during the first moments after the Big Bang. Discovering the Higgs boson—or a particle like it—is essential for developing these new theories.

The measurements of the new particle, the physicists say, are so far consistent—within statistical uncertainty—with the Higgs boson as predicted by the Standard Model. Still, they need more data to know for sure whether it is the predicted Higgs or something stranger, a Higgs lookalike—a prospect that many theorists have long been anticipating.  

By the end of 2012, the Caltech researchers say, the CMS collaboration expects to more than double its current total amount of data. With more data and analysis, the scientists might then be able to confirm whether the particle they announced is indeed the Higgs—and whether it is the Standard Model Higgs or a more exotic version.

Writer: Marcus Woo

Caltech at the LHC

Maria Spiropulu and Harvey Newman, both professors of physics at Caltech, lead the Caltech team of 40 physicists, students, and engineers that is part of the Compact Muon Solenoid (CMS) collaboration at the Large Hadron Collider (LHC) in Geneva, Switzerland.

Spiropulu is an expert on devising ways to discover exotic phenomena beyond the Standard Model—the theory that describes how all particles interact—such as theories of supersymmetry that predict particles of dark matter. Newman did much of the design and development work on the crystal detectors that are now used in CMS. He also conceived and developed the worldwide grid of networks and data centers that stores and processes the flood of data coming from the LHC. With the LHC generating gigabytes of data per second, no single site can hold all the information, so the data is handled in a distributed fashion at hundreds of sites throughout the world, including Caltech's Center for Advanced Computing Research, the first university-based center for LHC data analysis.

Newman's team also runs the transatlantic network that links the LHC to the United States, allowing data to flow between Europe and North America. His team, together with Steven Low, professor of computer science and electrical engineering at Caltech, developed state-of-the-art applications for transferring data over long distances, enabling terabytes of data to stream between sites at speeds of up to 100 gigabits per second. Newman and Caltech engineer Philippe Galvez also developed a system called Enabling Virtual Organizations, an internet-based tool that helps physicists and scientists from other fields communicate and collaborate from anywhere in the world.

According to Newman and Spiropulu, the Caltech team consists of experts in everything from the detector and data analysis to how new phenomena might manifest at the LHC. Because the group is involved in so many aspects of CMS, Caltech is making a particularly significant contribution, Spiropulu says.

"Caltech is a premier institute of technology," Newman adds. "We continue to use our skills and vision to develop and deploy unique technologies with global impact even as we empower the scientists in our field." 

The Higgs boson is predicted to decay in different ways. The Caltech team is analyzing the data from three of these so-called decay channels: the photon-photon channel, in which the Higgs decays into two photons; the ZZ channel, in which it decays into two particles called Z bosons; and the WW channel, in which the Higgs decays into two particles called W bosons.

Graduate students Yong Yang, Jan Veverka, and Vladlen Timciuc are all studying the photon-photon channel, searching for the Standard Model Higgs and excluding possible imposters. "We make sure CMS precisely measures the energies of photons, electrons, and positrons," says Adi Bornheim, a Caltech staff scientist who led the electromagnetic calorimeter (ECAL) detector group at CMS and currently participates in tests of the signal robustness at 125 gigaelectron volts (GeV, a unit of mass) in the photon-photon channel. Composed of 76,000 crystal detectors and weighing in at more than 90 tons, the calorimeter measures the energy of the electrons and photons produced by LHC collisions with high precision.

Since 1994, Newman and Ren-Yuan Zhu, Caltech's manager of ECAL—along with a team of experts on crystal calorimetry and lasers—have been making sure the detector is calibrated to provide the exquisitely precise measurements required for discovery by the CMS detector. Marat Gataullin, an assistant scientist at Caltech and leader of the calibration team at CMS, says, "It would have been impossible to find the new particle in this channel without the 17-year investment in this special detector of the Caltech CMS group."

Caltech Tolman Fellow Artur Apresyan joined the search in the photon-photon channel recently, along with graduate student Cristian Pena, who recently arrived at CERN (the European Organization for Nuclear Research, the site of the LHC) to carry out the calibration of the ECAL for the remainder of the extended 2012 run. "This is intense," Pena says. "I want to get my hands into the work and contribute immediately."

Another Caltech Tolman Fellow, Emanuele Di Marco, helped lead the analysis of Higgs decays in the WW channel for the CMS team. He will present the results on behalf of CMS at the ICHEP 2012 conference, which will be held from July 4 through 11 in Melbourne, Australia. "As the impact of the discovery sinks in, I realize how much work remains to build the full picture," Di Marco says. "Squeezing the signal out of the WW channel is challenging."   

In addition to the WW channel, Di Marco and newly arrived Caltech Millikan Fellow Si Xie have also worked on the ZZ channel. "We are doing our best to advance CMS's ability to measure this channel," Xie says. "There are already outstanding analysis techniques in CMS and we want to perfect them."

Newman, Spiropulu, Xie, Di Marco, and graduate students Chris Rogan and Yi Chen—in collaboration with colleagues from CERN, Fermilab, and Johns Hopkins University—will continue the preparatory studies for the ZZ channel; they published the first of those studies in the journal Physical Review D in 2010. They developed a kind of Higgs look-alike program that is being applied to the data to pick out the newly discovered particle.

"We will pin down the properties of this new particle," says Rogan, who was honored by Forbes Magazine as one of this year's top 30 leaders in science and technology under 30 years old. "We can learn more—we must use the knowledge from this discovery to inform and help direct our searches for supersymmetry or other exotic forms of new physics as well as experimental searches for dark matter. The implications for many other subfields are very significant." For instance, Javier Mauricio Duarte, a first-year graduate student, will be using these findings in his work on supersymmetry and models of dark matter.

As the LHC continues to ramp up its energy, physicists hope that even more discoveries are on the way. "The LHC is pretty close to my dream experiment, and the LHC at twice the energy will be even better!" said graduate student Alexander Mott in an interview with Scientific American.

Undergraduates are also a critical part of the team. In the last two years, there have been a total of 24 students from the Summer Undergraduate Research Fellowships (SURF) and Minority Undergraduate Research Fellowships (MURF) programs, as well as from programs at CERN. "Caltech students can really 'do things' from an early stage in their careers—at a level one rarely sees elsewhere," Newman says.

This summer, the group includes seven undergraduates (SURFs as well as Rose Hill and Musk Foundation summer fellows) and a couple of CERN summer students. Lisa Lee, a junior at Caltech who just arrived at CERN, sums up the experience so far with a sense of wonder. "What are the chances," she says, "that you take on a summer undergraduate research project in an experiment that makes the most significant discovery in recent history right at the moment you start?"  

Writer: Marcus Woo

X-ray Telescope Takes First Image

NASA's NuSTAR space telescope has taken its first image, snapping a shot of the high-energy X rays from a black hole in the constellation Cygnus. NuSTAR—short for Nuclear Spectroscopic Telescope Array—was launched on June 13, and is the first telescope that can focus high-energy X rays. It will explore black holes, the dense remnants of dead stars, energetic cosmic explosions, and even our very own sun.

"Today, we obtained the first-ever focused images of the high-energy X-ray universe," says Fiona Harrison, a professor of physics and astronomy at Caltech and the mission's principal investigator, in a NASA press release. "It's like putting on a new pair of glasses and seeing aspects of the world around us clearly for the first time."

Writer: Marcus Woo

Put a Seismometer in Your Living Room

Back in the 1960s, Charlie Richter (PhD '28) installed a seismometer in his living room. It was bigger than his TV set, and it didn't go with the sofa, but it saved him a lot of late-night drives into the Seismo Lab and was a great conversation piece. Now, if you live in the Pasadena area, you can have one, too. Professor of geophysics Robert Clayton will send a wallet-sized seismometer to the first 1,000 volunteers with an Internet connection and a spare USB port. There is one small catch: you have to promise to leave your computer on 24/7.

Even as you impress your friends by creating your own mini-quakes ("we encourage people to test the unit by jumping up and down," Clayton says with a grin), you'll be contributing to a serious civic project—creating block-by-block earthquake-intensity maps that can guide ambulances and fire trucks to hard-hit areas almost while the ground is still rolling. Ordinary seismic data won't do; as we learned in the 1994 Northridge earthquake, heavy damage can strike far from the epicenter.

Crowd-sourced seismology dates to the turn of the millennium, when David Wald (PhD '93), Vincent Quitoriano (BS '99), and Humboldt State's Lori Dengler created an online questionnaire called "Did You Feel It?" The site went viral with the magnitude 7.1 Hector Mine quake, and now even a shaker in the 4 to 5 range will generate some 50,000 responses. "The first thing people do—we hope!—is take cover," Clayton says. "But the second thing they do, apparently, is log on and file a report. So we want to put instruments where the people are."

A collaboration between Caltech's seismology, earthquake engineering, and computer science departments, the Community Seismic Network was inspired by what Clayton calls "the densest urban array ever"—a multimillion-dollar array of 5,206 sensors, or better than one per block, that were buried under the grassy verges between the sidewalks and the streets of Long Beach and operated from January through June of 2011. The sensors were used by the Signal Hill Petroleum Company to map its holdings by analyzing how shock waves reflected off the rock formations deep underground. The preferred oil-field survey tool is a stick of dynamite, but working in a city called for a subtler approach—sort of. "They had to pull a parade permit every day," says Clayton. A train of 30-ton vibrator trucks would drive up and down the boulevards, rattling doors and windows, preceded by a police escort to clear the way.

The survey crew worked weekdays from 9:00 to 4:00, but the seismometers ran 24/7. The array straddled the Newport-Inglewood fault zone, the source of the disastrous 1933 quake that killed 115 people. "The earth is constantly chattering away down there," says Clayton, and the network caught hundreds of after-hours tremors, including a magnitude 2.4 under nearby Carson. A color-coded video of this hiccup shows the seismic waves propagating eastward block by block, station by station. The waves distort as they encounter the fault, whose segments reveal themselves as fleeting white scars in the sea of red and blue.

The cross-sectional renderings are even more revealing. Grad student Asaf Inbal projected the waveforms from an unfelt microquake back to their source in the fault plane. The resulting animation "looks like a summer lightning storm," Clayton says, but in this case, the towering cumulonimbus clouds are actually micro-slip zones, and the thunderbolts are transitory sub-centimeter displacements striking as deep as 20 kilometers underground.

Such detailed data will allow seismologists to answer "the big question," says Clayton: "How does an earthquake nucleate? Most activity simply dies out over very small distances. What lets something suddenly take off?"

Which brings us back to Pasadena. The Southern California Seismic Network, run jointly by Caltech and the U.S. Geological Survey, extends from San Luis Obispo to the Mexican border. It includes about 40 stations in the Greater Los Angeles region, or about one every 10 kilometers; each state-of-the-art installation costs $100,000. The Community Seismic Network uses mass-produced accelerometer chips like the ones in your Wii, each costing about a hundred bucks. High-end they're not. In fact, Clayton says, "no single sensor would have impressively seen any one event. But collectively, when you have 100 of them, you see it."
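Clayton's point about strength in numbers can be illustrated with a toy simulation (the signal and noise levels below are made up for illustration, not the actual chip specifications): averaging N independent noisy sensors shrinks the noise by roughly the square root of N, so a tremor invisible to any single hundred-dollar chip stands out once 100 of them are combined.

```python
import random
import statistics

random.seed(42)

def reading(signal: float, noise_sd: float = 1.0) -> float:
    """One noisy accelerometer sample: true signal plus Gaussian sensor noise."""
    return signal + random.gauss(0.0, noise_sd)

tremor = 0.2  # hypothetical ground acceleration, well below one sensor's noise floor

single = [reading(tremor) for _ in range(2000)]
network = [statistics.mean(reading(tremor) for _ in range(100)) for _ in range(2000)]

# One sensor: noise (sd ~1.0) swamps the 0.2 signal.
# A 100-sensor average: noise drops to ~0.1, and the tremor stands out clearly.
print(statistics.stdev(single), statistics.stdev(network))
```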

Given enough sensors, you can assess seismic hazards block by block, and in some cases floor by floor. Caltech's Millikan Library, for example, has sensors on each of its nine floors, plus one on the roof and one in the basement. Undergrad Hong Sheng and senior research fellow Monica Kohler (PhD '95) used that data to create a 3-D rendering of the building's response to the magnitude-4.2 Newhall quake of September 1, 2011. Watching the library's energetic hula is rather unsettling, especially if you hang out on Millikan's middle floors; fortunately, the scale is greatly exaggerated and the actual sway is less than a tenth of a millimeter. Similar animations could alert inspectors to hidden damage after a large earthquake, Clayton says. Tall buildings sway at frequencies determined by the details of their construction, so "if this frequency drops during an event and doesn't bounce back once the shaking stops, it could be a sign of broken welds. We saw this in the Northridge earthquake."
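The damage check Clayton describes amounts to tracking a building's dominant sway frequency over time. A rough sketch of the idea (synthetic data and made-up frequencies, not Millikan's actual response), using a naive discrete Fourier transform to pick out the strongest frequency in an accelerometer record:

```python
import math

def dominant_frequency(samples, sample_rate):
    """Frequency (Hz) of the largest-magnitude DFT bin, ignoring the DC term."""
    n = len(samples)
    best_k, best_mag = 1, -1.0
    for k in range(1, n // 2):
        re = sum(s * math.cos(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        im = sum(s * math.sin(2 * math.pi * k * t / n) for t, s in enumerate(samples))
        mag = math.hypot(re, im)
        if mag > best_mag:
            best_k, best_mag = k, mag
    return best_k * sample_rate / n

rate = 50  # samples per second
# Hypothetical sway records: 1.2 Hz before a quake, 1.0 Hz after -- a drop that
# does not bounce back once the shaking stops could signal broken welds.
before = [math.sin(2 * math.pi * 1.2 * t / rate) for t in range(500)]
after = [math.sin(2 * math.pi * 1.0 * t / rate) for t in range(500)]
print(dominant_frequency(before, rate), dominant_frequency(after, rate))  # 1.2 1.0
```

In practice one would use a fast Fourier transform and real sensor streams, but the principle is the same: a persistent downward shift in the peak is the red flag.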

Little by little, Clayton hopes to grow the Community Seismic Network to 15,000 sensors spanning the Los Angeles basin, from the Hollywood Hills to Lake Forest in southern Orange County. But for now he just wants to fill the area bounded by the San Gabriel Mountains and the 605, 10, and 2 freeways. Be the first one on your block—sign up today!

The Community Seismic Network is funded by the Gordon and Betty Moore Foundation and by Caltech trustee Ted Jenkins (BS '65, MS '66).

Writer: Doug Smith

NuSTAR Space Telescope Blasts Off

Caltech-led mission will explore the X-ray universe

This morning, NASA's NuSTAR telescope was launched into the low-Earth orbit from which it will begin exploring the high-energy X-ray universe to uncover the secrets of black holes, the dense remnants of dead stars, energetic cosmic explosions, and even our very own sun.   

The space telescope—the most powerful high-energy X-ray telescope ever developed—rode toward its destination inside the nose of a Pegasus rocket strapped onto the belly of a "Stargazer" L-1011 aircraft. Around 9:00 a.m. (PDT), the plane—which had earlier taken off from the Kwajalein Atoll in the western Pacific—dropped the rocket from an altitude of 39,000 feet. The rocket was in free fall for about five seconds before the first of its three stages ignited, blasting NuSTAR into orbit around the equator.

"NuSTAR will open a whole new window on the universe by being the very first telescope to focus high-energy X rays," says Fiona Harrison, professor of physics and astronomy at Caltech and the principal investigator of the NuSTAR mission. The telescope is 100 times more sensitive than any previous high-energy X-ray telescope, and it will make images that are 10 times sharper than any that have been taken before at these energies, she says. 

NuSTAR—short for "Nuclear Spectroscopic Telescope Array"—can make sensitive observations of cosmic phenomena at higher frequencies than other X-ray telescopes now in orbit, including NASA's Chandra X-ray Observatory and the European Space Agency's XMM-Newton observatory. X rays are at the high-frequency end of the electromagnetic spectrum, and can be thought of as light particles called photons that carry a lot of energy. Chandra and XMM-Newton can detect X rays with energies up to about 10,000 electron volts (eV); NuSTAR will be able to see photons with energies up to 78,000 eV. By comparison, the energy of visible light is just a few eV, while the X rays used to check for broken bones have energies on the order of tens of thousands of eV.
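To put those photon energies in physical terms, energy and wavelength are related by E = hc/λ, with hc ≈ 1239.84 eV·nm. A quick back-of-the-envelope conversion (the formula is standard physics; the specific energies are the ones quoted above):

```python
HC_EV_NM = 1239.84  # Planck's constant times the speed of light, in eV·nm

def wavelength_nm(energy_ev: float) -> float:
    """Wavelength in nanometers of a photon with the given energy in eV."""
    return HC_EV_NM / energy_ev

# Visible light (~2 eV): hundreds of nanometers.
# Chandra's ~10,000 eV limit: ~0.12 nm, about the size of an atom.
# NuSTAR's 78,000 eV limit: ~0.016 nm, far smaller than an atom.
for e in (2, 10_000, 78_000):
    print(e, wavelength_nm(e))
```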

High-energy X rays can penetrate skin and muscle because they carry so much energy. But that also means that they are hard to reflect. The mirrors in optical telescopes cannot be used to reflect and focus X rays. Instead, X-ray telescopes can only reflect incoming X rays at very shallow angles. These photons travel on paths that are almost parallel to the reflective surface, like rocks skipping on a pond. To reflect enough X rays for the detectors to observe, telescopes must use nested cylindrical shells that focus the photons onto a detector. Chandra has four such nested shells; XMM-Newton has 58. NuSTAR, in contrast, has 133, providing unprecedented sensitivity and resolution.

Each of NuSTAR's nested shells is coated with about two hundred thin reflective layers, some merely a few atoms thick, composed of either high-density materials, such as tungsten and platinum, or low-density materials like silicon and carbon. By alternating the two types of layers, NuSTAR's designers have produced a crystal-like structure that reflects high-energy X rays. Harrison and her group at Caltech started developing this technology more than 15 years ago and first tested it in a balloon experiment called the High-Energy Focusing Telescope (HEFT) in 2005. 

[Image: Fiona Harrison, professor of physics and astronomy at Caltech and principal investigator of the NuSTAR mission. Credit: Lance Hayashida]

A telescope focuses light by bending it so that it converges onto one spot—an eyepiece or a detector. But because X rays can only be reflected at such shallow angles, they do not converge very strongly. As a result, the distance between an X-ray telescope's mirrors and the detector must be especially long for the X rays to focus. Chandra is 45 feet long and XMM-Newton is about 30 feet long—as big as buses. NuSTAR—funded under NASA's Explorers program, which emphasizes smaller, cheaper missions that do science as efficiently as possible—has a deployable mast that allows it to squeeze inside the Pegasus rocket's roughly seven-foot-long payload compartment. About a week after launch, the mast will unfold and stretch to more than 30 feet. 

This new technology, Harrison explains, "will allow NuSTAR to study some of the hottest, densest, and most energetic phenomena in the universe." Black holes are a key target of the telescope. Just 20 years ago, she says, black holes were thought to be rare and exotic, but "today we know that every galaxy has a massive black hole in its heart." Our own Milky Way galaxy, with a black hole four million times as massive as our sun, is no exception. Gas and dust block most of our view of the galactic center, but by observing in high-energy X rays, NuSTAR can peer directly into the heart of the Milky Way. 

Disks of gas and dust surround many of the supermassive black holes of other galaxies. As the material spirals into these black holes, which are millions to billions of times as massive as the sun, the regions closest to the black hole radiate prodigious amounts of high-energy X rays, which are visible even if the black hole is hidden behind dust and gas. NuSTAR will therefore allow astronomers not only to conduct a census of the black holes in the cosmic neighborhood but also to study the extreme environments around them. Astronomers will even be able to measure how fast black holes spin, which is important for understanding how they form and what role they play in the history and evolution of their host galaxies.

Astronomers will also point NuSTAR at supernova remnants, the hot embers left over from exploded stars. After a star burns through all of its fuel, it blows up, blasting material out into space. In that explosion, new elements are formed (in fact, many of the heavier elements on Earth were forged long ago in stars and supernovae). Some newborn atoms are radioactive, and NuSTAR will be able to detect this radioactivity, allowing astronomers to probe what happens during the fiery death of a star. 

The telescope will also devote some time to observing our own star, the sun. The sun's outer atmosphere, called the corona, burns at millions of degrees. Some scientists speculate that nanoflares—smaller versions of the solar flares that occasionally erupt from the solar surface—keep the corona hot. NuSTAR may be able to see nanoflares for the first time. "In a few hours of observations, NuSTAR will answer this longstanding question that solar physicists have been scratching their heads about for years," says Daniel Stern of NASA's Jet Propulsion Laboratory, NuSTAR's project scientist.

In July, NuSTAR will start taking data, revealing a whole new X-ray universe—shining, shimmering, and splendid—to scientists. "We expect amazing discoveries from it," Stern says.

The NuSTAR mission is led by Caltech and managed by JPL for NASA.

Writer: Marcus Woo

Caltech Graduate Student Wins DOE Fellowship for Computational Science

Caltech graduate student Melissa Yeung has been selected as one of 21 students nationally to receive a Department of Energy (DOE) Computational Science Graduate Fellowship this year. The honor covers up to four years of support for graduate studies in fields that focus on the use of high-performance computing technology to solve complex problems in science and engineering.

The DOE Computational Science Graduate Fellowships are jointly funded by the Office of Science and the National Nuclear Security Administration's Office of Defense Programs. In addition to covering tuition and providing an annual stipend, the award gives students an opportunity to spend a summer at a DOE national laboratory doing computational research in a field unrelated to their thesis work. 

As part of Caltech's Applied Geometry Lab, Yeung studies an area of mathematics known as discrete differential geometry, which has diverse applications in such fields as engineering, computer animation, product design, and medicine. As a second-year graduate student in mathematics, Yeung says she feels incredibly fortunate to be at Caltech, where she has been given the freedom to work across divisions: although she is enrolled in the Division of Physics, Mathematics and Astronomy, she is advised by Mathieu Desbrun, professor of computing and mathematical sciences in the Division of Engineering and Applied Science.

"'What do you get when a mathematician and a computer scientist meet?' is usually the beginning of a bad joke. At Caltech, it's instead the start of a great collaboration," says Desbrun, who is also the director of Computing and Mathematical Sciences and Information Science and Technology. "Working with Melissa offers new perspectives that computer scientists like myself are not frequently presented with." 

Yeung is the second student from Caltech's Applied Geometry Lab to receive a DOE Computational Science Graduate Fellowship in recent years. In 2010, Evan Gawlik, who completed his senior thesis with Desbrun, also received the honor.

Yeung traces the beginnings of her interest in computational geometry to a lecture that she happened to pop into at a conference a few months before she came to Caltech. In that lecture, she heard about how computational topology could be used to analyze complex datasets, identifying underlying shape characteristics that bring new understanding to a problem.

Shortly after arriving at Caltech, while flipping through the course catalog, Yeung came across a course offered by Desbrun on discrete differential geometry and started sitting in on the lectures. And that was it: she was hooked. "I think math is beautiful, but at times it seems that the more math I do, the more disconnected from the world I feel. It has been really gratifying for me to see all of this math that I thought was so incredibly abstract so immediately applicable."

Writer: Kimm Fesenmaier

Physicists Close in on a Rare Particle-Decay Process

Underground Experiment May Unlock Mysteries of the Neutrino

PASADENA, Calif.—In the biggest result of its kind in more than ten years, physicists have made the most sensitive measurements yet in a decades-long hunt for a hypothetical and rare process involving the radioactive decay of atomic nuclei.

If discovered, the researchers say, this process could have profound implications for how scientists understand the fundamental laws of physics and help solve some of the universe's biggest mysteries—including why there is more matter than antimatter and, therefore, why regular matter like planets, stars, and humans exists at all.

The experiment, the Enriched Xenon Observatory 200 (EXO-200), is an international collaboration that includes the California Institute of Technology (Caltech) and is led by Stanford University and the SLAC National Accelerator Laboratory, a U.S. Department of Energy (DOE) National Laboratory.

The EXO-200 experiment has placed the most stringent constraints yet on the nature of a so-called neutrinoless double beta decay. In doing so, physicists have narrowed down the range of possible masses for the neutrino, a tiny uncharged particle that rarely interacts with anything, passing right through rock, people, and entire planets as it zips along at nearly the speed of light.

The collaboration, consisting of 80 researchers, has submitted a paper describing the results to the journal Physical Review Letters.

In a normal double beta decay, which was first observed in 1986, two neutrons in an unstable atomic nucleus turn into two protons; two electrons and two antineutrinos—the antimatter counterparts of neutrinos—are emitted in the process.

But physicists have suggested that two neutrons could also decay into two protons by emitting two electrons without producing any antineutrinos. "People have been looking for this process for a very long time," says Petr Vogel, senior research associate in physics, emeritus, at Caltech and a member of the EXO-200 team. "It would be a very fundamental discovery if someone actually observes it."

A neutrino is inevitably produced in a single beta decay. For a double beta decay to be neutrinoless, then, the two neutrinos that would otherwise be emitted must somehow cancel each other out. For that to happen, physicists say, a neutrino must be its own antiparticle, allowing one of the two neutrinos to act as an antineutrino and annihilate the other. That a neutrino can be its own antiparticle is not predicted by the Standard Model—the remarkably successful theory that describes how all elementary particles behave and interact.

If this neutrinoless process does indeed exist, physicists would be forced to revise the Standard Model.

The process also has implications for cosmology and the origin of matter, Vogel says. Right after the Big Bang, the universe had the same amount of matter as antimatter. Somehow, however, that balance was tipped, producing a slight surplus in matter that eventually led to the existence of all of the matter in the universe. The fact that the neutrino can be its own antiparticle might have played a key role in tipping that balance.

In the EXO-200 experiment, physicists monitor a copper cylinder filled with 200 kilograms of liquid xenon-136, an unstable isotope that, theoretically, can undergo neutrinoless double beta decay. Very sensitive detectors line the wall at both ends of the cylinder. To shield it from cosmic rays and other background radiation that may contaminate the signal of such a decay, the apparatus is buried deep underground in the DOE's Waste Isolation Pilot Plant in Carlsbad, New Mexico, where low-level radioactive waste is stored. The physicists then wait to see a signal.

The process, however, is very rare. In a normal double beta decay, half of a given sample would decay after 10²¹ years—a half-life roughly 100 billion times longer than the time that has elapsed since the Big Bang.

One of the goals of the experiment is to measure the half-life of the neutrinoless process (if it is discovered). In these first results, no signal for a neutrinoless double beta decay was detected in almost seven months of data—and that non-detection allowed the researchers to rule out possible values for the half-life of the neutrinoless process. Indeed, seven months of finding nothing means that the half-life cannot be shorter than 1.6 × 10²⁵ years—a quadrillion times longer than the age of the universe. With that limit on the half-life in hand, physicists can constrain the mass of the neutrino—another longstanding mystery. The new data suggest that a neutrino cannot be more massive than about 0.140 to 0.380 electron volts (eV, a unit of mass commonly used in particle physics); an electron, by contrast, is about 500,000 eV, or about 9 × 10⁻³¹ kilograms.
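The scales quoted above can be checked with a quick back-of-the-envelope calculation. This sketch uses rounded physical constants; the half-life limit and the electron's mass in eV are taken from the text:

```python
# Checking the scales quoted in the text with rounded constants.
AGE_OF_UNIVERSE_YR = 1.38e10      # years since the Big Bang
half_life_limit_yr = 1.6e25       # EXO-200 lower bound on the half-life

ratio = half_life_limit_yr / AGE_OF_UNIVERSE_YR
print(f"{ratio:.1e}")             # about 1e15, i.e. a quadrillion

# Converting the electron's rest mass from eV to kilograms
# via E = mc^2, with 1 eV = 1.602e-19 joules.
EV_TO_J = 1.602e-19
C = 2.998e8                       # speed of light, m/s
electron_mass_kg = 511_000 * EV_TO_J / C**2
print(f"{electron_mass_kg:.1e} kg")  # about 9e-31 kg
```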

More than ten years ago, the collaboration behind the Heidelberg-Moscow Double Beta Decay Experiment controversially claimed to have discovered neutrinoless double beta decay using germanium-76 isotopes. But now, the EXO-200 researchers say, their new data makes it highly unlikely that those earlier results were valid.

The EXO-200 experiment, which started taking data last year, will continue its quest for the next several years.

The EXO collaboration involves scientists from SLAC, Stanford, the University of Alabama, Universität Bern, Caltech, Carleton University, Colorado State University, University of Illinois Urbana-Champaign, Indiana University, UC Irvine, Institute for Theoretical and Experimental Physics (Moscow), Laurentian University, the University of Maryland, the University of Massachusetts–Amherst, the University of Seoul, and the Technische Universität München. This research was supported by the DOE and the National Science Foundation in the United States, the Natural Sciences and Engineering Research Council in Canada, the Swiss National Science Foundation, and the Russian Foundation for Basic Research. This research used resources of the National Energy Research Scientific Computing Center (NERSC).

Writer: Marcus Woo

Notes from the Back Row: "An Explosion of Explosions"

"I'm looking for things that go bump, burp, or boom in the sky," says astronomer Shrinivas "Shri" Kulkarni, noting that "there is one supernova somewhere in the universe every second." A supernova is the brilliant new object we see when a star explodes, and if that star happens to go off in the nighttime skies over California, the odds are pretty good he'll find it. In his Watson Lecture given on April 25, 2012, Kulkarni, Caltech's John D. and Catherine T. MacArthur Professor of Astronomy and Planetary Science and the director of the Caltech Optical Observatories, described how Caltech's fully automated Palomar Transient Factory—Kulkarni calls it "Transients 'R' Us"—is revolutionizing how we explore the changing sky.

Supernovae fade into oblivion within days, or at best a few months. Catching them while they're bright is a matter of looking in the right direction at the right time—which is what a vast network of telescopes, many of them in the backyards of dedicated amateur astronomers, is doing right now. But the Transient Factory, says Kulkarni, was the first to create a sort of chain in which a discovery by the dedicated wide-field survey telescope automatically directs a larger telescope to zoom in for a closer look that same night; if the second telescope is intrigued by what it sees, the humans are notified so that even bigger telescopes can be brought to bear. "The hardware is, in fact, the smallest part of the cost," Kulkarni says. "The most expensive component is grayware." Now, with the software pipeline up and running, it's time to "go out and look for weird things," says Kulkarni, whose trophies over the years include the discovery of the first millisecond pulsar and the first brown dwarf. The weird things the Factory is finding will make a nice addition to the collection—watch the talk to get a glimpse of them.

"An Explosion of Explosions" is available for download in HD from Caltech on iTunesU. (Episode 11)

Writer: Doug Smith

Overactive Black Holes Shut Down Star Formation

A team of astronomers has found that the most active galactic nuclei—enormous black holes that are violently devouring gas and dust at the centers of galaxies—may prevent new stars from forming. The team, which includes several researchers from Caltech, reported its findings in the May 10 issue of the journal Nature.

Supermassive black holes—which are more than a million times as massive as the sun—are found at the cores of nearly all large galaxies. As gas and dust fall into the behemoth, the material heats up and produces tremendous amounts of energy, radiating X rays. Previous observations have suggested that black holes with higher activity—and therefore brighter X-ray radiation—trigger more stars to form in their galaxies.

But using the Herschel Space Observatory, the astronomers have now discovered that if the black hole is too active, it appears to shut down star formation. What may be happening, the astronomers say, is that, at first, the infalling gas and dust can clump together and make new stars. But if you feed the black hole too much, the material gets so hot that the resulting radiation blasts the surrounding gas away. Without enough raw materials, new stars can no longer form.

While astronomers previously have seen black holes suppress star formation in a few individual galaxies, these new results are based on observations of 65 galaxies, says Joaquin Vieira, a postdoctoral scholar in physics at Caltech and coauthor of the paper. Instead of just studying a specific case, the researchers looked at the aggregate properties of a large sample. "This is an important and unexpected result that could have only been shown with a large and cohesive survey," adds James Bock, a senior faculty associate at Caltech and cocoordinator of the Herschel survey.


Writer: Marcus Woo
