International Team of Scientists Establishes New Internet Land-Speed Benchmark

PASADENA, Calif.—Scientists at the California Institute of Technology (Caltech) and the European Organization for Nuclear Research (CERN), along with colleagues at AMD, Cisco, Microsoft Research, Newisys, and S2io, have set a new Internet2 land-speed record. The team transferred 859 gigabytes of data in less than 17 minutes at a rate of 6.63 gigabits per second between the CERN facility in Geneva, Switzerland, and Caltech in Pasadena, California, a distance of more than 15,766 kilometers. The speed is equivalent to transferring a full-length DVD movie in just four seconds.

The technology used in setting this record included S2io's Xframe® 10 GbE server adapter, Cisco 7600 Series Routers, Newisys 4300 servers utilizing AMD Opteron processors, Itanium servers, and the 64-bit version of Windows Server 2003.

The performance is also remarkable because it is the first record to break the 100 petabit meter per second mark. One petabit is 1,000,000,000,000,000 bits.
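The record metric is easy to verify from the figures quoted above. A minimal check in Python (treating a gigabyte as 10^9 bytes):

```python
# Check the Internet2 land-speed record metric quoted above.
rate_bps = 6.63e9        # 6.63 gigabits per second
distance_m = 15_766e3    # 15,766 kilometers, Geneva to Pasadena

petabit_meters = rate_bps * distance_m / 1e15   # one petabit = 1e15 bits
print(f"{petabit_meters:.1f} petabit meters per second")
# -> 104.5, comfortably past the 100 mark

# Sanity check on the transfer itself: 859 gigabytes at 6.63 Gb/s.
seconds = 859e9 * 8 / rate_bps
print(f"{seconds / 60:.1f} minutes")            # -> roughly 17 minutes
```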

This latest record by Caltech and CERN is a further step in an ongoing research-and-development program to create high-speed global networks as the foundation of next-generation data-intensive grids.

Multi-gigabit-per-second IPv4 and IPv6 end-to-end network performance will lead to new research and business models. People will be able to form "virtual organizations" of planetary scale, sharing their collective computing and data resources in a flexible way. This capability is particularly vital for projects at the frontiers of science and engineering, in fields such as particle physics, astronomy, bioinformatics, global climate modeling, and seismology.

Harvey Newman, professor of physics at Caltech, said, "This is a major milestone towards our dynamic vision of globally distributed analysis in data-intensive, next-generation high-energy physics (HEP) experiments. Terabyte-scale data transfers on demand, by hundreds of small groups and thousands of scientists and students spread around the world, is a basic element of this vision; one that our recent records show is realistic." Olivier Martin, head of external networking at CERN and manager of the DataTAG project, said, "As of 2007, when the Large Hadron Collider, currently being built at CERN, is switched on, this huge facility will produce some 15 petabytes of data a year, which will be stored and analyzed on a global grid of computer centers. This new record is a major step on the way to providing the sort of networking solutions that can deal with this much data."

The team used the optical networking capabilities of LHCnet, DataTAG, and StarLight, and gratefully acknowledges support from the DataTAG project sponsored by the European Commission (EU Grant IST-2001-32459), the DOE Office of Science, High Energy and Nuclear Physics Division (DOE Grants DE-FG03-92-ER40701 and DE-FC02-01ER25459), and the National Science Foundation (Grants ANI 9730202, ANI-0230967, and PHY-0122557).

About Caltech:

With an outstanding faculty, including three Nobel laureates, and such off-campus facilities as Palomar Observatory, and the W. M. Keck Observatory, the California Institute of Technology is one of the world's major research centers. The Institute also conducts instruction in science and engineering for a student body of approximately 900 undergraduates and 1,000 graduate students who maintain a high level of scholarship and intellectual achievement. Caltech's 124-acre campus is situated in Pasadena, California, a city of 135,000 at the foot of the San Gabriel Mountains, about 10 miles northeast of the Los Angeles Civic Center. Caltech is an independent, privately supported university. More information is available at http://www.caltech.edu.

About CERN:

CERN, the European Organization for Nuclear Research, has its headquarters in Geneva, Switzerland. At present, its member states are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, and the United Kingdom. Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission, and UNESCO have observer status. For more information, see http://www.cern.ch.

About the European Union DataTAG project:

DataTAG is a project co-funded by the European Union, the U.S. Department of Energy, and the National Science Foundation. It is led by CERN together with four other partners. The project brings together the following leading European research agencies: Italy's Istituto Nazionale di Fisica Nucleare (INFN), France's Institut National de Recherche en Informatique et en Automatique (INRIA), the UK's Particle Physics and Astronomy Research Council (PPARC), and the Netherlands' University of Amsterdam (UvA). The DataTAG project is very closely associated with the European Union DataGrid project, the largest grid project in Europe, also led by CERN. For more information, see http://www.datatag.org.

Writer: 
Robert Tindol

Gamma-ray burst of December 3 was a new type of cosmic explosion

PASADENA, Calif.—Astronomers have identified a new class of cosmic explosions that are more powerful than supernovae but considerably weaker than most gamma-ray bursts. The discovery strongly suggests a continuum between the two previously known classes of explosions.

In this week's issue of Nature, astronomers from the Space Research Institute of the Russian Academy of Sciences and the California Institute of Technology announce in two related papers the discovery of the explosion, which was first detected on December 3, 2003, by the European-Russian Integral satellite and then observed in detail at ground-based radio and optical observatories. The burst, known by its birthdate, GRB031203, appeared in the constellation Puppis and is about 1.6 billion light-years away.

Although the burst was the closest gamma-ray burst to Earth ever studied (all the others have been several billion light-years away), researchers noticed that the explosion was extremely faint--releasing only about one-thousandth the gamma rays of a typical gamma-ray burst. However, the burst was also much brighter than a supernova explosion, which led to the conclusion that a new type of explosion had been found.

Both supernovae and the rare but brilliant gamma-ray bursts are cosmic explosions marking the deaths of massive stars. Astronomers have long wondered what causes the seemingly dramatic differences between these events. The question of how stars die is currently a major focus of stellar research, and is particularly directed toward the energetic explosions that destroy a star in one cataclysmic event.

Stars are powered by the fusion ("burning") of hydrogen in their interiors. When the fuel in the interior is exhausted, the cores of massive stars collapse to compact objects--typically a neutron star and occasionally a black hole. The energy released by the collapse blows off the outer layers, the visible manifestation of which is a supernova. In this process, new elements are added to the inventory of matter in the universe.

However, this energy may be insufficient to power the supernova explosion. One theory is that additional energy is generated by matter falling onto the newly formed black hole. Many astronomers believe that this is what powers the luminous gamma-ray bursts.

But the connection between such extreme events and the more common supernovae is not yet clear, and if they are indeed closely related, then there should be a continuum of cosmic explosions, ranging in energy from that of "ordinary" supernovae to that of gamma-ray bursts.

In 1998, astronomers discovered an extremely faint gamma-ray burst, GRB 980425, coincident with a nearby luminous supernova. The supernova, SN 1998bw, also showed evidence for an underlying engine, albeit a very weak one. The question that arose was whether the event, GRB 980425/SN 1998bw, was a "freak" explosion or whether it was indicative of a larger population of low-powered cosmic explosions with characteristics in between the cosmological gamma-ray bursts and typical supernovae.

"I knew this was an exciting find because even though this was the nearest gamma-ray burst to date, the gamma-ray energy measured by Integral is one thousand times fainter than typical cosmological gamma-ray bursts," says Sergey Sazonov of the Space Research Institute, the first author of one of the two Nature papers.

The event was studied in further detail by the Chandra X-Ray Observatory and the Very Large Array, a radio telescope facility located in New Mexico.

"I was stunned that my observations from the Very Large Array showed that this event confirmed the existence of a new class of bursts," says graduate student Alicia Soderberg, who is the principal author of the other Nature paper. "It was like hitting the jackpot."

There are several exciting implications of this discovery, including the possible existence of a significant new population of low-luminosity gamma-ray bursts lurking within the nearby universe, said Shrinivas Kulkarni, who is the MacArthur Professor of Astronomy and Planetary Science at Caltech and Soderberg's faculty adviser.

"This is an intriguing discovery," says Kulkarni. "I expect a treasure trove of such events to be identified by NASA's Swift mission scheduled to be launched this fall from Cape Canaveral. I am convinced that further discoveries and studies of this new class of hybrid events will forward our understanding of the death of massive stars."


Writer: 
RT

Physicists Successful in Trapping Ultracold Neutrons at Los Alamos National Laboratory

PASADENA, Calif.—Free neutrons are usually pretty speedy customers, buzzing along at a significant fraction of the speed of light. But physicists have created a new process to slow neutrons down to about 15 miles per hour—the pace of a world-class mile runner—which could lead to breakthroughs in understanding the physical universe at its most fundamental level.

According to Brad Filippone, a physics professor at the California Institute of Technology, he and a group of colleagues from Caltech and several other institutions recently succeeded in collecting record-breaking numbers of ultracold neutrons at the Los Alamos Neutron Science Center. The new technique resulted in about 140 neutrons per cubic centimeter, and the number could be five times higher with additional tweaking of the apparatus.

"Our principal interest is in making precision measurements of fundamental neutron properties," says Filippone, explaining that a neutron has a half-life of only 15 minutes. In other words, if a thousand neutrons are trapped, five hundred will have broken down after 15 minutes into a proton, electron, and antineutrino.

Neutrons normally exist in nature in a much more stable state within the nuclei of atoms, joining the positively charged protons to make up most of the atom's mass. Neutrons become quite unstable if they are stripped from the nucleus, but the very fact that they decay so quickly can make them useful for various experiments.

The traditional way physicists obtained free neutrons was to slow them down as they emerged from a nuclear reactor, letting them bounce around in a material to shed energy. This procedure worked fine for slowing neutrons to a few thousand feet per second, but that's still pretty fast. The new technique at Los Alamos National Laboratory involves a second stage of slowdown that is impractical near a nuclear reactor but works well at an accelerator, where the event producing the neutrons is abrupt rather than ongoing. The process begins with smashing protons from the accelerator into a solid material such as tungsten, which knocks neutrons out of the target nuclei.

The neutrons are then slowed down as they bounce around in a nearby plastic material, and then some of them are slowed much further if they happen to enter a birthday-cake-sized block of solid deuterium (or "heavy hydrogen") that has been cooled down to a temperature a few degrees above absolute zero.

When the neutrons enter the crystal latticework of the deuterium block, they can lose virtually all their energy and emerge from the block at speeds so slow that they can no longer zip right through the walls of the apparatus. The trapped ultracold neutrons bounce along the nickel walls of the apparatus and eventually emerge, where they can be collected for use in a separate experiment.

According to Filippone, the extremely slow speeds of the neutrons are important in studying their decays at a minute level of detail. The fundamental theory of particle physics known as the Standard Model predicts a specific pattern in the neutron's decay, but if the ultracold neutron experiments were to reveal slightly different behavior, then physicists would have evidence of a new type of physics, such as supersymmetry.

Future experiments could also exploit an inherent quantum limit of the ultracold neutrons, which can bounce no lower than about 15 microns on a flat surface--about a fifth the width of a human hair. With a cleverly designed experiment, Filippone says, this limit could lead to better knowledge of gravitational interactions at very small distances.

The next step for the experimenters is to return to Los Alamos in October, when they will use the ultracold neutrons to study the neutrons themselves. The research was supported by about $1 million in funding from Caltech and the National Science Foundation.
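To see why 15 miles per hour counts as "ultracold," convert the speed quoted in the story into an energy and an equivalent temperature (the constants are standard values):

```python
# Kinetic energy and equivalent temperature of a 15 mph neutron.
M_NEUTRON = 1.675e-27    # kg
K_BOLTZMANN = 1.381e-23  # J/K
EV = 1.602e-19           # joules per electron-volt

v = 15 * 0.44704         # 15 mph in meters per second, about 6.7 m/s
energy_j = 0.5 * M_NEUTRON * v**2

print(f"kinetic energy: {energy_j / EV * 1e9:.0f} neV")                 # ~235 neV
print(f"equivalent temperature: {energy_j / K_BOLTZMANN * 1e3:.1f} mK")
# A few millikelvin: cold enough that the neutrons reflect from
# material walls instead of passing straight through them.
```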

Writer: 
RT

From Cosmos to Climate, Six Caltech Professors Awarded Sloan Research Fellowships

PASADENA, Calif.— Six members of the Caltech faculty have received Alfred P. Sloan Research Fellowships for 2004.

The Caltech recipients in the field of mathematics are Nathan Dunfield and Vadim Kaloshin, both associate professors of mathematics. In physics, Sloan Fellowships were awarded to Andrew Blain, assistant professor of astronomy; Sunil Golwala, assistant professor of physics; Re'em Sari, associate professor of astrophysics and planetary science; and Tapio Schneider, assistant professor of environmental science and engineering.

Each Sloan Fellow receives a grant of $40,000 for a two-year period. The grants of unrestricted funds are awarded to young researchers in the fields of physics, chemistry, computer science, mathematics, neuroscience, computational and evolutionary molecular biology, and economics. The grants are given to pursue diverse fields of inquiry and research, and to allow young scientists the freedom to establish their own independent research projects at a pivotal stage in their careers. The Sloan Fellows are selected on the basis of "their exceptional promise to contribute to the advancement of knowledge."

From over 500 nominees, a total of 116 young scientists and economists from 51 different colleges and universities in the United States and Canada, including Caltech's six, were selected to receive a Sloan Research Fellowship.

Twenty-eight previous Sloan Fellows have gone on to win Nobel Prizes.

The Alfred P. Sloan Research Fellowship program was established in 1955 by Alfred P. Sloan, Jr., who was the chief executive officer of General Motors for 23 years. Its objective is to encourage research by young scholars at a time in their careers when other support may be difficult to obtain. It is the oldest program of the Alfred P. Sloan Foundation and one of the oldest fellowship programs in the country.

Nathan Dunfield conducts research in topology, the study of the properties of geometric structures that are unchanged by stretching and bending. His focus is on three-dimensional spaces and their connections to the symmetries of rigid geometric objects, especially certain types of non-Euclidean geometry, and he also uses computer experiments to probe some of the central questions in the field. Dunfield will use his Sloan Fellowship to further his research in this area.

Vadim Kaloshin is an expert in chaos theory and "strange attractors." He is especially interested in mathematical equations known as Hamiltonian systems and how they apply to stability. His work could lead to a better understanding of how chaotic systems behave. Kaloshin will use his Sloan Fellowship to continue investigation in these fields.

Andrew Blain probes the origin of galaxies by observing them at great distances in the process of formation. He concentrates on the signatures that can be seen in the short-wavelength radio and long-wavelength infrared spectrum, where the gas and soot-like dust particles between the stars emit energy they absorb from the youngest and most luminous parts of galaxies. Most studies of the process are still carried out using the direct light from stars at shorter optical wavelengths, but the complementary information from longer wavelengths is essential to build up a more complete picture. The Sloan Foundation Fellowship will be used to link together these two techniques by investigating differences between the way distant galaxies found at each wavelength are distributed in space.

Sunil Golwala's research focuses on understanding dark matter and dark energy, components that dominate the universe but whose identity and nature are unknown. Golwala is interested in the development and use of particle detectors for observing the direct scattering of "Weakly Interacting Massive Particles," one of the leading candidates for dark matter. His work also involves the observation of varying aspects of the cosmic microwave background that inform us about the nature of dark energy via its effect on the growth of galaxy clusters and its clustering effects on super-horizon scales. Golwala will utilize his Sloan Fellowship in pursuit of this endeavor to better understand the universe.

Re'em Sari intends to utilize his Sloan Fellowship to examine the origin of planet formation, a first step in a long journey to look for life around other stars. Some of the fundamental questions he will investigate are: How do planets form? What are the necessary initial conditions for planet formation? What factors determined the number of planets in our solar system? How many planets like Earth do we expect to find around other stars? Are there binary giant planets? Sari will apply his fellowship to further understanding the "grand scheme of planetary systems."

Tapio Schneider works on understanding climate and the dynamical processes in the atmosphere that determine basic climatic features such as the pole-to-equator temperature gradient and the distribution of water vapor. Developing mathematical models of the large-scale (1000 km) turbulent transport of heat, mass, and water vapor is one central aspect of this research. The Sloan Fellowship will provide computing equipment and support to expand these studies on climate.

Contact: Deborah Williams-Hedges (626) 395-3227 debwms@caltech.edu

Visit the Caltech Media Relations Web site at: http://pr.caltech.edu/media

###

Writer: 
DWH

Caltech, Cornell announce new $2-million study for building giant submillimeter telescope

PASADENA, Calif.—The California Institute of Technology and Cornell University are in the planning stages for a new 25-meter telescope to be built in Chile. The submillimeter telescope will cost an estimated $60 million and will have nearly twice the diameter of the largest submillimeter telescope currently in existence.

The first step of the plan, which is being announced today jointly by Caltech and Cornell, commits the two institutions to a $2-million study, says Jonas Zmuidzinas, a physics professor at Caltech who is leading the Institute's part of the collaboration. The telescope is projected for a 2012 completion date on a high site in the Atacama Desert of northern Chile, and will significantly ramp up Caltech's research in submillimeter astronomy.

Scientists from Cornell, Caltech, and Caltech's Jet Propulsion Laboratory (JPL) will be participating in the telescope study, including Caltech faculty members Andrew Blain, Sunil Golwala, Andrew Lange, Tom Phillips, Anthony Readhead, Anneila Sargent, and others.

"We are very much looking forward to working with our Cornell colleagues on this project," says Zmuidzinas.

At Cornell, the participants will include professors Riccardo Giovanelli, Terry Herter, Gordon Stacey, and Bob Brown.

Submillimeter wavelength astronomy allows the study of a number of astrophysical phenomena that do not emit much visible or infrared light. The new telescope will observe stars and planets forming from swirling disks of gas and dust, will make measurements to determine the composition of the molecular clouds from which the stars are born, and could even discover large numbers of galaxies undergoing huge bursts of star formation in the very distant universe.

Also, the 25-meter telescope could be used to study the origin of large-scale structure in the universe.

"So far, we have gotten just a small taste of what there is to learn at submillimeter wavelengths," says Zmuidzinas. "This telescope will be a huge step forward for the field."

The new telescope is poised to take advantage of the rapid development of sensitive superconducting detectors, an area in which Zmuidzinas and his Caltech/JPL colleagues have been making important contributions. The new superconducting detectors enable large submillimeter cameras to be built, which will produce very sensitive panoramic images of the submillimeter sky.

The 25-meter telescope is a natural progression in Caltech and JPL's longstanding interest in submillimeter astronomy. Caltech already operates the Caltech Submillimeter Observatory (CSO), a 10.4-meter telescope constructed and operated with funding from the National Science Foundation, with Tom Phillips serving as director. The telescope is fitted with sensitive submillimeter detectors and cameras, many of which were developed in collaboration with JPL, making it ideal for seeking out and observing the diffuse gases and their constituent molecules, crucial to understanding star formation.

The advantages of the new telescope will be fourfold. First, due to the larger size of its mirror and its more accurate surface, the 25-meter telescope should provide six to 12 times the light-gathering ability of the CSO, depending on the exact wavelength (the short calculation below makes the scaling concrete). Second, the larger diameter and better surface will result in much sharper images of the sky. Third, the large new cameras will provide huge advantages over those currently available.

Finally, the 16,500-foot elevation of the Atacama Desert site will provide an especially dry sky. Submillimeter wavelengths (as short as two-tenths of a millimeter) are strongly absorbed by water vapor in the atmosphere, so for maximum effectiveness a submillimeter telescope must be located at a very high, very dry site--the higher the better--or, best of all, in space.
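The light-gathering figure in the first advantage follows directly from the ratio of mirror areas; a quick check using the diameters quoted in this story:

```python
# Collecting area scales as the diameter squared.
d_new, d_cso = 25.0, 10.4   # meters
print(f"area ratio: {(d_new / d_cso) ** 2:.1f}x")   # ~5.8x
# The quoted "six to 12 times" exceeds the bare area ratio at some
# wavelengths because the more accurate surface also wastes less light.
```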

However, while the idea of a large (10-meter) submillimeter telescope in space is being considered by NASA and JPL, it is still more than a decade away. Meanwhile, existing space telescopes such as the Hubble and the Spitzer work at shorter wavelengths, in the visible and infrared, respectively.

In 2007, the European Space Agency plans to launch the 3.5-meter Herschel Space Observatory, which will be the first general-purpose submillimeter observatory in space. NASA is participating in this project, and scientists at JPL and Caltech are providing detectors and components for the science instruments.

"It is a very exciting time for submillimeter astronomy," says Zmuidzinas. "We are making rapid progress on all fronts--in detectors, instruments, and new facilities--and this is leading to important scientific discoveries." 

Writer: 
Robert Tindol

Researchers Using Hubble and Keck Telescopes Find Farthest Known Galaxy in the Universe

PASADENA, California--The farthest known object in the universe may have been discovered by a team of astrophysicists using the Keck and Hubble telescopes. The object, a galaxy behind the Abell 2218 cluster, may be so far from Earth that its light would have left when the universe was just 750 million years old.

The discovery demonstrates again that the technique known as gravitational lensing is a powerful tool for better understanding the origin of the universe. Via further applications of this remarkable technique, astrophysicists may be able to better understand the mystery of how the so-called "Dark Ages" came to an end.

According to California Institute of Technology astronomer Jean-Paul Kneib, the lead author of an article reporting the discovery in a forthcoming issue of the Astrophysical Journal, the galaxy is most likely the first detected at a redshift close to 7.0, meaning that it is rushing away from Earth at an extremely high speed due to the expansion of the universe. The distance is so great that the galaxy's ultraviolet light has been stretched to the point of being observed at infrared wavelengths.

The team first detected the new galaxy in a long exposure of the Abell 2218 cluster taken with the Hubble Space Telescope's Advanced Camera for Surveys. Analysis of a sequence of Hubble images indicates a redshift of at least 6.6, but additional work with the Keck Observatory's 10-meter telescopes suggests that the astronomers have found an object whose redshift is close to 7.0.

Redshift is a measure of the factor by which the wavelength of light is stretched by the expansion of the universe. The greater the shift, the more distant the object and the earlier it is being seen in cosmic history.
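A worked example: redshift stretches every wavelength by a factor of (1 + z). The sketch below applies this to the hydrogen Lyman-alpha line, chosen purely for illustration:

```python
# Wavelength stretch at redshift z: lambda_observed = (1 + z) * lambda_emitted
z = 7.0
lyman_alpha_nm = 121.6                    # rest-frame ultraviolet line

print(f"observed at {(1 + z) * lyman_alpha_nm:.0f} nm")   # ~973 nm
# Stretched by a factor of eight, the galaxy's ultraviolet light lands
# in the near-infrared, which is why it had to be hunted there.
```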

"As we were searching for distant galaxies magnified by Abell 2218, we detected a pair of strikingly similar images whose arrangement and color indicated a very distant object," said Kneib. "The existence of two images of the same object indicated that the phenomenon of gravitational lensing was at work."

The key to the new discovery is the effect the Abell 2218 cluster's gigantic mass has on light passing by it. As a consequence of Einstein's theory of relativity, light is bent and can be focused in a predictable way due to the warpage of space-time near massive objects. In this case the phenomenon actually magnifies and produces multiple images of the same source. The new source in Abell 2218 is magnified by a factor of 25.

The usefulness of gravitational lensing in cosmology was first pointed out in 1937 by the Caltech astronomer Fritz Zwicky, who even suggested it could be used to discover distant galaxies that would otherwise be too faint to be seen.

"The galaxy we have discovered is extremely faint, and verifying its distance has been an extraordinarily challenging adventure," Kneib added. "Without the magnification of 25 afforded by the foreground cluster, this early object could simply not have been identified or studied in any detail with presently available telescopes. Indeed, even with aid of the cosmic lens, our study has only been possible by pushing our current observatories to the limits of their capabilities."

Using the unique combination of the high resolution of Hubble and the magnification of the cosmic lens, the researchers estimate that the galaxy is small--perhaps measuring only 2,000 light-years across--but forming stars at an extremely high rate.

An intriguing property of the new galaxy is the apparent lack of the typically bright hydrogen emission seen in many distant objects. Also, its intense ultraviolet signal is much stronger than that seen in later star-forming galaxies, suggesting that the galaxy may be composed primarily of massive stars.

"The unusual properties of this distant source are very tantalizing because, if verified by further study, they could represent those expected for young stellar systems that ended the dark ages," said Richard Ellis, Steele Family Professor of Astronomy, and a coauthor of the article.

The term "Dark Ages" was coined by the British astronomer Sir Martin Rees to signify the period in cosmic history when hydrogen atoms first formed but stars had not yet had the opportunity to condense and ignite. Nobody is quite clear how long this phase lasted, and the detailed study of the cosmic sources that brought this period to an end is a major goal of modern cosmology.

The team plans to continue the search for additional extremely distant galaxies by looking through other cosmic lenses in the sky.

"Estimating the abundance and characteristic properties of sources at early times is particularly important in understanding how the Dark Ages came to an end," said Mike Santos, a former Caltech graduate student involved in the discovery and now a postdoctoral researcher at the Institute of Astronomy in Cambridge, England. "We are eager to learn more by finding further examples, although it will no doubt be challenging."

The Caltech team reporting on the discovery consists of Kneib, Ellis, Santos, and Johan Richard. Kneib and Richard are also affiliated with the Observatoire Midi-Pyrenees of Toulouse, France. Santos is also at the Institute of Astronomy, in Cambridge.

The research was funded in part by NASA.

The W. M. Keck Observatory is managed by the California Association for Research in Astronomy, a scientific partnership between the California Institute of Technology, the University of California, and NASA. For more information, visit the observatory online at www.keckobservatory.org.

Writer: 
RT

Astronomers measure distance to star celebrated in ancient literature and legend

PASADENA—The cluster of stars known as the Pleiades is one of the most recognizable objects in the night sky, and for millennia it has been celebrated in literature and legend. Now, a group of astronomers has obtained a highly accurate distance to one of the stars of the Pleiades, known since antiquity as Atlas. The new results will be useful in the longstanding effort to improve the cosmic distance scale, as well as in research on the stellar life cycle.

In the January 22 issue of the journal Nature, astronomers from the California Institute of Technology and the Jet Propulsion Laboratory report the best-ever distance to the double star Atlas. The star, along with "wife" Pleione and their daughters, the "seven sisters," are the principal stars of the Pleiades visible to the unaided eye, although the cluster actually contains thousands of stars. Atlas, according to the team's decade of careful interferometric measurements, is somewhere between 434 and 446 light-years from Earth.

The range of distance to the Pleiades cluster may seem somewhat imprecise, but in fact it is accurate by astronomical standards. The traditional method of measuring distance is to note the precise position of a star and then measure its slight change in position when Earth has moved to the other side of the sun. The same approach works for finding distances on Earth: if you carefully record the position of a tree an unknown distance away, step a known distance to one side, and measure how far the tree has apparently "moved," you can calculate the actual distance to the tree using trigonometry.
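The same trigonometry in a few lines. The tree numbers are invented for illustration; the stellar part uses the standard convention that a star with a parallax of one arcsecond lies at one parsec (about 3.26 light-years):

```python
import math

# Tree example: known sideways baseline + measured angular shift -> distance.
baseline_m = 10.0    # step 10 meters to the side (made-up figure)
shift_deg = 0.5      # apparent shift of the tree (made-up figure)
print(f"tree: {baseline_m / math.tan(math.radians(shift_deg)):.0f} m away")

# Stellar case: distance in parsecs = 1 / parallax in arcseconds.
atlas_pc = 440 / 3.26                     # mid-range of 434-446 light-years
parallax_mas = 1e3 / atlas_pc
print(f"Atlas: ~{atlas_pc:.0f} pc, parallax ~{parallax_mas:.1f} mas")
# ~135 pc means a parallax of only ~7.4 milliarcseconds -- the kind of
# subtlety the next paragraph describes.
```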

However, this procedure gives only a rough estimate of the distance to even the nearest stars, because the distances involved are gigantic and the changes in stellar position correspondingly subtle. The team's new measurement also settles a controversy that arose when the European satellite Hipparcos reported a distance to the Pleiades so much smaller than expected that the findings contradicted theoretical models of the life cycles of stars.

The contradiction comes down to the relationship between luminosity and distance. A 100-watt light bulb one mile away looks exactly as bright as a 25-watt light bulb half a mile away, so to figure out the wattage of a distant light bulb, we have to know how far away it is. Similarly, to figure out the "wattage" (luminosity) of observed stars, we have to measure how far away they are. Theoretical models of the internal structure and nuclear reactions of stars of known mass also predict their luminosities, so theory and measurement can be compared.
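The bulb comparison checks out under the inverse-square law; a minimal verification:

```python
import math

def apparent_brightness(watts: float, distance: float) -> float:
    """Flux from an isotropic source: L / (4 * pi * d^2)."""
    return watts / (4 * math.pi * distance**2)

# 100 W at one mile versus 25 W at half a mile (any unit works).
print(apparent_brightness(100, 1.0))   # 7.96
print(apparent_brightness(25, 0.5))    # 7.96 -- identical
# Halving the distance quadruples the flux, so a quarter of the
# wattage looks exactly as bright.
```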

However, the Hipparcos data implied a distance smaller than that assumed from the theoretical models, suggesting either that the Hipparcos distance measurements themselves were off, or else that there was something wrong with the models of the life cycles of stars. The new results show that the Hipparcos data was in error, and that the models of stellar evolution are indeed sound.

The new results come from careful observation of the orbit of Atlas and its companion--a binary relationship that wasn't conclusively demonstrated until 1974 and certainly was unknown to ancient watchers of the sky. Using data from the Mt. Wilson stellar interferometer (located next to the historic Mt. Wilson Observatory in the San Gabriel range) and the Palomar Testbed Interferometer at Caltech's Palomar Observatory in San Diego County, the team determined a precise orbit of the binary.

Interferometry is an advanced technique that allows, among other things, the "splitting" of two bodies that are so far away that they normally appear as a single blur, even in the biggest telescopes. Knowing the orbital period and combining it with orbital mechanics allowed the team to infer the physical separation of the two bodies, and with this information, to calculate the distance of the binary from Earth.

"For many months I had a hard time believing our distance estimate was 10 percent larger than that published by the Hipparcos team," said the lead author, Xiao Pei Pan of JPL. "Finally, after intensive rechecking, I became confident of our result."
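The logic of the orbital method described above can be sketched with hypothetical numbers; everything below is illustrative only, not the fitted parameters from the Nature paper. Kepler's third law turns a period and a total mass into a physical orbit size, and comparing that size with the measured angular size gives the distance:

```python
# Illustrative only -- not the measured parameters of Atlas.
period_yr = 0.8           # hypothetical orbital period, years
total_mass_msun = 10.0    # hypothetical combined mass, solar masses

# Kepler's third law in solar units: a^3 = M * P^2, with a in AU.
a_au = (total_mass_msun * period_yr**2) ** (1 / 3)

# An interferometer measures the orbit's angular size; since 1 AU at a
# distance of 1 parsec subtends 1 arcsecond, d(pc) = a(AU) / theta(").
angular_size_arcsec = 13.0e-3             # hypothetical, 13 milliarcseconds
print(f"a = {a_au:.2f} AU -> d = {a_au / angular_size_arcsec:.0f} pc")
```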

Coauthor Shrinivas Kulkarni, MacArthur Professor of Astronomy and Planetary Science at Caltech, said, "Our distance estimate shows that all is well in the heavens. Stellar models used by astronomers are vindicated by our value."

"Interferometry is a young technique in astronomy, and our result paves the way for wonderful returns from the Keck Interferometer and the anticipated Space Interferometry Mission, which is expected to be launched in 2009," said coauthor Michael Shao of JPL, the principal scientist for both the Keck Interferometer and the Space Interferometry Mission.

The Palomar Testbed Interferometer was designed and built by a team of researchers from JPL led by Shao and JPL engineer Mark Colavita. Funded by NASA, the interferometer is located at the Palomar Observatory near the historic 200-inch Hale Telescope. The device served as an engineering testbed for the interferometer that now links the 10-meter Keck Telescopes atop Mauna Kea in Hawaii.


Writer: 
Robert Tindol

The coming global peak in oil production is a grave concern, according to new book

PASADENA, Calif.—Ancient Persians tipped their fire arrows with it, and Native Americans doctored their ailments with it. Any way you look at petroleum, the stuff has been around for a long time. The problem is, it's not going to be around much longer--or at least not in the quantities necessary to keep our Hummers humming.

To address the choices society will soon face in the inevitable peaking of worldwide oil production, California Institute of Technology physics professor David Goodstein has written a new book titled Out of Gas: The End of the Age of Oil. Goodstein argues that global production will peak sooner than most people think, possibly in this decade--a view held by a number of geologists--and that the peak itself will be the beginning of serious and widespread social and economic consequences.

"Some say that the world has enough oil to last for another forty years or more, but that view is almost surely mistaken," writes Goodstein, whose past forays into the world of science communication have included his award-winning PBS series The Mechanical Universe, as well as the best-selling book Feynman's Lost Lecture.

Goodstein writes that the worldwide peak will almost surely be highly disruptive, if not catastrophic, judging from the difficult American experience of the early 1970s, when U.S. production reached its own peak. Since then, U.S. production has been on a downslope that will continue until the tap runs dry.

But even the 1970s' experience would be nothing compared to a worldwide peak, Goodstein explains. Indeed, the country then experienced serious gas shortages and price increases, exacerbated in no small part by the Arab oil embargo. But frustration and exasperation aside, there was oil to buy on the global market if one could locate a willing seller. By contrast, the global peak will mean that prices will thereafter rise steadily and the resource will become increasingly hard to obtain.

Goodstein says that best- and worst-case scenarios are fairly easy to envision. At worst, after the so-called Hubbert's peak (named after M. King Hubbert, the Texas geophysicist who was nearly laughed out of the industry in the 1950s for even suggesting that a U.S. production peak was possible), all efforts to deal with the problem on an emergency basis will fail. The result will be inflation and depression that will probably lead indirectly to a decrease in the global population. Even the lucky survivors will find the climate a bit much to take, because billions of people will undoubtedly rely on coal for warmth, cooking, and basic industry, thereby spewing a far greater quantity of greenhouse gases into the air than is currently released.

"The change in the greenhouse effect that results eventually tips Earth's climate into a new state hostile to life. End of story. In this instance, worst case really means worst case."

The best-case scenario, Goodstein believes, is that the first warning that Hubbert's peak has occurred will result in a quick and stone-sober global wake-up call. Given sufficient political will, the transportation system will be transformed to rely at least temporarily on an alternative fuel such as methane. Then, more long-term solutions to the crisis will be put in place--presumably nuclear energy and solar energy for stationary power needs, and hydrogen or advanced batteries for transportation.

The preceding is the case that Goodstein makes in the first section of the book. The next section is devoted to a nontechnical explanation of the facts of energy production. Goodstein, who has taught thermodynamics to a generation of Caltech students, is particularly accomplished in conveying the basic scientific information in an easily understandable way. In fact, he often does so with wit, explaining in a brief footnote on the naming of subatomic particles, for example, that the familiar "-on" ending of particles, such as "electrons," "mesons," and "photons," may also suggest an individual quantum of humanity known as the "person."

The remainder of the book is devoted to suggested technological fixes. None of the replacement technologies are as simple and cheap as our current luxury of going to the corner gas station and filling up the tank for the equivalent of a half-hour's wages, but Goodstein warns that the situation is grave, and that things will change very soon.

"The crisis will occur, and it will be painful," he writes in conclusion. "Civilization as we know it will come to an end sometime in this century unless we can find a way to live without fossil fuels."

Goodstein dedicates the book "to our children and grandchildren, who will not inherit the riches that we inherited."

The book, published by W.W. Norton & Company, is now available.

Writer: 
Robert Tindol

Moore Foundation Awards Additional $17.5 Million for Thirty-Meter Telescope Plans

PASADENA—The Gordon and Betty Moore Foundation awarded $17.5 million to the University of California for collaboration with the California Institute of Technology on a project intended to build the world's most powerful telescope. Coupled with an award by the foundation to Caltech for the same amount, a total of $35 million is now available for the two institutions to collaborate on this visionary project to build the Thirty Meter Telescope (TMT). Their next step will be to work together to formulate detailed design plans for the telescope.

A 30-meter-diameter optical and infrared telescope, complete with adaptive optics, would result in images more than 12 times sharper than those of the Hubble Space Telescope. The TMT will have nine times the light-gathering ability of one of the 10-meter Keck Telescopes, which are currently the largest in the world. With such a telescope, astrophysicists will be able to study the earliest galaxies and the details of their formation as well as pinpoint the processes that lead to young planetary systems around nearby stars.
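Both headline numbers follow from the aperture diameters alone, assuming diffraction-limited performance at the same wavelength:

```python
# Resolution scales as lambda / D; collecting area scales as D**2.
d_tmt, d_keck, d_hubble = 30.0, 10.0, 2.4   # meters

print(f"{d_tmt / d_hubble:.1f}x sharper than Hubble")          # 12.5x
print(f"{(d_tmt / d_keck) ** 2:.0f}x one Keck's light grasp")  # 9x
# These match the "more than 12 times sharper" and "nine times the
# light-gathering ability" figures quoted above.
```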

"We are very pleased that the Gordon and Betty Moore Foundation has recognized the strengths of the University of California and Caltech to carry out such an important project," said UC President Robert C. Dynes. "The giant telescope will help our astronomy faculty stay at the very forefront of that dynamic field of science."

"The University of California and Caltech will work in close and constant collaboration to achieve the goals of the design effort," said Joseph Miller, director of UC Observatories/Lick Observatory, headquartered at UC Santa Cruz. "We've also entered into collaborations with the Association of Universities for Research in Astronomy and the Association of Canadian Universities for Research in Astronomy, both of whom are in the process of seeking major funding."

According to Richard Ellis, director of Caltech Optical Observatories and Steele Professor of Astronomy at Caltech, the Gordon and Betty Moore Foundation's award will provide the crucial funding needed to address the major areas of risk in this large project.

"This next phase is of central importance, because in the course of carrying it out, we will establish the fundamental technologies and methods necessary for the building of the telescope," Ellis said.

Miller and Ellis agree that the TMT is a natural project for UC and Caltech to undertake jointly, given their decades of experience as collaborators in constructing, operating, and conducting science with the world's largest telescopes at the Keck Observatory. The TMT design is a natural evolution of the Keck Telescope design, and many of the same UC and Caltech scientists involved in the creation of the Keck Observatory are deeply involved in the TMT project.

Following the Gordon and Betty Moore Foundation-funded design study, the final phase of the project, not yet funded, will be construction of the observatory at an as-yet-undetermined site. The end of this phase would mark the beginning of regular astronomical observations, perhaps by 2012.

The Gordon and Betty Moore Foundation was established in November 2000 by Intel co-founder Gordon Moore and his wife, Betty. The foundation funds outcome-based projects that will measurably improve the quality of life by creating positive outcomes for future generations. Grantmaking is concentrated in initiatives that support the foundation's principal areas of concern: environmental conservation, science, higher education, and the San Francisco Bay Area.

###

MEDIA CONTACTS:

Jill Perry, Media Relations Director, Caltech (626) 395-3226, jperry@caltech.edu

Tim Stephens, Science Writer, University Relations/Public Information Office, University of California, Santa Cruz (831) 459-4352, stephens@ucsc.edu

Writer: 
JP

Caltech, SLAC, and LANL Set New Network Performance Marks

PHOENIX, Ariz.--Teams of physicists, computer scientists, and network engineers from Caltech, SLAC, LANL, CERN, Manchester, and Amsterdam joined forces at the Supercomputing 2003 (SC2003) Bandwidth Challenge and captured the Sustained Bandwidth Award for their demonstration of "Distributed particle physics analysis using ultra-high speed TCP on the Grid," with a record bandwidth mark of 23.2 gigabits per second (or 23.2 billion bits per second).

The demonstration served to preview future Grid systems on a global scale, where communities of hundreds to thousands of scientists around the world would be able to access, process, and analyze terabyte-sized data samples, drawn from data stores thousands of times larger. A new generation of Grid systems is being developed in the United States and Europe to meet these challenges, and to support the next generation of high-energy physics experiments that are now under construction at the CERN laboratory in Geneva.

The currently operating high-energy physics experiments at SLAC (Palo Alto, California), Fermilab (Batavia, Illinois), and BNL (Upton, New York) are facing qualitatively similar challenges.

During the Bandwidth Challenge, the teams used all three of the 10 gigabit/sec wide-area network links provided by Level 3 Communications and Nortel, connecting the SC2003 site to Los Angeles, and from there to the Abilene backbone of Internet2, the TeraGrid, and to Palo Alto using a link provided by CENIC and National LambdaRail. The bandwidth mark achieved was more than 500,000 times faster than an Internet user with a typical modem connection (43 kilobits per second). The amount of TCP data transferred during the 48-minute-long demonstration was over 6.6 terabytes (or 6.6 trillion bytes). Typical single-stream host-to-host TCP data rates achieved were 3.5 to 5 gigabits per second, approaching the single-stream bandwidth records set last month by Caltech and CERN.
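The comparisons in this paragraph are straightforward to reproduce:

```python
# Reproduce the comparisons quoted above.
peak_bps = 23.2e9    # record bandwidth mark
modem_bps = 43e3     # typical dial-up modem
print(f"{peak_bps / modem_bps:,.0f}x a modem")   # ~539,535x, i.e. >500,000

# 6.6 terabytes moved over the 48-minute demonstration:
avg_bps = 6.6e12 * 8 / (48 * 60)
print(f"average: {avg_bps / 1e9:.1f} Gb/s")      # ~18.3 Gb/s sustained
```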

The data, generated from servers at the Caltech Center for Advanced Computing Research (CACR), SLAC, and LANL booths on the SC2003 showroom floor at Phoenix, a cluster at the StarLight facility in Chicago as well as the TeraGrid node at Caltech, was sent to sites in four countries (USA, Switzerland, Netherlands, and Japan) on three continents. Participating sites in the winning effort were the Caltech/DataTAG and Amsterdam/SURFnet PoPs at Chicago (hosted by StarLight), the Caltech PoP at Los Angeles (hosted by CENIC), the SLAC PoP at Palo Alto, the CERN and the DataTAG backbone in Geneva, the University of Amsterdam and SURFnet in Amsterdam, the AMPATH PoP at Florida International University in Miami, and the KEK Laboratory in Tokyo. Support was provided by DOE, NSF, PPARC, Cisco Systems, Level 3, Nortel, Hewlett-Packard, Intel, and Foundry Networks.

The team showed the ability to use both dedicated and shared IP backbones efficiently. Peak traffic on the Los Angeles-Phoenix circuit, which was dedicated to this experiment, reached almost 10 gigabits per second, utilizing more than 99 percent of the capacity. On the shared Abilene and TeraGrid circuits, the experiment shared fairly with other traffic while using over 85 percent of the available bandwidth. Snapshots of the maximum link utilizations during the demonstration showed 8.7 gigabits per second on the Abilene link and 9.6 gigabits per second on the TeraGrid link.

This performance would not have been achieved without new TCP implementations, because the widely deployed TCP Reno protocol performs poorly at gigabit-per-second speeds. The primary TCP algorithm used was the new FAST TCP stack developed at the Caltech Netlab. Additional streams were generated using HS-TCP, implemented at Manchester, and Scalable TCP.
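Reno's trouble can be quantified with the well-known Mathis et al. approximation, throughput ≈ (MSS/RTT) · (1.22/√p) for packet-loss rate p, together with Reno's recovery of one segment per round trip after a loss. The path parameters below are plausible stand-ins for a transcontinental link, not measurements from the demonstration:

```python
import math

# Why TCP Reno struggles at gigabits per second over long distances.
MSS_BITS = 1460 * 8    # standard Ethernet-sized segment
RTT_S = 0.18           # transatlantic-scale round trip (illustrative)
TARGET_BPS = 5e9       # single-stream rate like those reported above

# Mathis et al.: throughput ~ (MSS / RTT) * (1.22 / sqrt(p)), so the
# loss rate that still sustains the target is:
p = (1.22 * MSS_BITS / (RTT_S * TARGET_BPS)) ** 2
print(f"tolerable loss rate: {p:.1e}")   # ~2.5e-10, one loss per ~4e9 packets

# After a single loss, Reno halves its window, then regrows it by one
# segment per round trip:
cwnd_segments = TARGET_BPS * RTT_S / MSS_BITS
print(f"recovery time: {cwnd_segments / 2 * RTT_S / 3600:.1f} hours")
```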

Harvey Newman, professor of physics at Caltech, said: "This was a milestone in our development of wide-area networks and of global data-intensive systems for science. Within the past year we have learned how to use shared networks up to the 10 gigabit-per-second range effectively. In the next round we will combine these developments with the dynamic building of optical paths across countries and oceans. This paves the way for more flexible, efficient sharing of data by scientists in many countries, and could be a key factor enabling the next round of physics discoveries at the high-energy frontier. There are also profound implications for integrating information sharing and on-demand audiovisual collaboration in our daily lives, with a scale and quality previously unimaginable."

Les Cottrell, assistant director of SLAC's computer services, said: "This demonstrates that commonly available standard commercial hardware and software, from vendors like Cisco, can effectively and fairly use and fill up today's high-speed Internet backbones, and sustain TCP flows of many gigabits per second on both dedicated and shared intracountry and transcontinental networks. As 10 gigabit-per-second Ethernet equipment follows the price reduction curve experienced by earlier lower-speed standards, this will enable the next generation of high-speed networking and will catalyze new data-intensive applications in fields such as high-energy physics, astronomy, global weather, bioinformatics, seismology, medicine, disaster recovery, and media distribution."

Wu-chun (Wu) Feng, team leader of research and development in Advanced Network Technology in the Advanced Computing Laboratory at LANL, noted: "The SC2003 Bandwidth Challenge provided an ideal venue to demonstrate how a multi-institutional and multi-vendor team can quickly come together to achieve a feat that would otherwise be unimaginable today. Through the collaborative efforts of Caltech, SLAC, LANL, CERN, Manchester, and Amsterdam, we have once again pushed the envelope of high-performance networking. Moore's law move over!"

"Cisco was very pleased to help support the SC2003 show infrastructure, SCINET," said Bob Aiken, director of engineering for academic research and technology initiatives at Cisco. "In addition, we also had the opportunity to work directly with the high-energy physics (HEP) research community at SLAC and Caltech in the United States, SURFnet in the Netherlands, CERN in Geneva, and KEK in Japan, to once again establish a new record for advanced network infrastructure performance.

"In addition to supporting network research on the scaling of TCP, Cisco also provided a wide variety of solutions, including Cisco Systems ONS 15540, Cisco ONS 15808, Cisco Catalyst 6500 Series, Cisco 7600 Series, and Cisco 12400 Series at the HEP sites in order for them to attain their goal. The Cisco next-generation 10 GE line cards deployed at SC2003 were part of the interconnect between the HEP sites of Caltech, SLAC, CERN, KEK/Japan, SURFnet, StarLight, and the CENIC network."

"Level 3 was pleased to support the SC2003 conference again this year," said Paul Fernes, director of business development for Level 3. "We've provided network services for this event for the past three years because we view the conference as a leading indicator of the next generation of scientific applications that distinguished researchers from all over the world are working diligently to unleash. Level 3 will continue to serve the advanced networking needs of the research and academic community, as we believe that we have a technologically superior broadband infrastructure that can help enable new scientific applications that are poised to significantly contribute to societies around the globe."

Cees de Laat, associate professor at the University of Amsterdam and organizer of the Global Lambda Integrated Facility (GLIF) Forum, added: "This world-scale experiment combined leading researchers, advanced optical networks, and network research sites to achieve this outstanding result. We were able to glimpse a yet-to-be explored network paradigm, where both shared and dedicated paths are exploited to map the data flows of big science onto a hybrid network infrastructure in the most cost-effective way. We need to develop a new knowledge base to use wavelength-based networks and Grids effectively, and projects such as UltraLight, TransLight, NetherLight, and UKLight, in which the team members are involved, have a central role to play in reaching this goal."

###

About Caltech: With an outstanding faculty, including four Nobel laureates, and such off-campus facilities as the Jet Propulsion Laboratory, Palomar Observatory, and the W. M. Keck Observatory, the California Institute of Technology is one of the world's major research centers. The Institute also conducts instruction in science and engineering for a student body of approximately 900 undergraduates and 1,000 graduate students who maintain a high level of scholarship and intellectual achievement. Caltech's 124-acre campus is situated in Pasadena, California, a city of 135,000 at the foot of the San Gabriel Mountains, approximately 30 miles inland from the Pacific Ocean and 10 miles northeast of the Los Angeles Civic Center. Caltech is an independent, privately supported university, and is not affiliated with either the University of California system or the California State Polytechnic universities. http://www.caltech.edu

About SLAC: The Stanford Linear Accelerator Center (SLAC) is one of the world's leading research laboratories. Its mission is to design, construct, and operate state-of-the-art electron accelerators and related experimental facilities for use in high-energy physics and synchrotron radiation research. In the course of doing so, it has established the largest known database in the world, which grows at 1 terabyte per day. That, and its central role in the world of high-energy physics collaboration, places SLAC at the forefront of the international drive to optimize the worldwide, high-speed transfer of bulk data. http://www.slac.stanford.edu/

About LANL: Los Alamos National Laboratory is operated by the University of California for the National Nuclear Security Administration of the U.S. Department of Energy and works in partnership with NNSA's Sandia and Lawrence Livermore National Laboratories to support NNSA in its mission. Los Alamos enhances global security by ensuring the safety and reliability of the U.S. nuclear weapons stockpile, developing technical solutions to reduce the threat of weapons of mass destruction, and solving problems related to energy, environment, infrastructure, health, and national security concerns. http://www.lanl.gov/

About Netlab: Netlab is the Networking Laboratory at Caltech led by Professor Steven Low, where FAST TCP has been developed. The group does research in the control and optimization of protocols and networks, and designs, analyzes, implements, and experiments with new algorithms and systems. http://netlab.caltech.edu/FAST/

About CERN: CERN, the European Organization for Nuclear Research, has its headquarters in Geneva. At present, its member states are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, and the United Kingdom. Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission, and UNESCO have observer status. For more information, see http://www.cern.ch.

About DataTAG: The European DataTAG is a project co-funded by the European Union, the U.S. Department of Energy through Caltech, and the National Science Foundation. It is led by CERN together with four other partners. The project brings together the following leading European research agencies: Italy's Istituto Nazionale di Fisica Nucleare (INFN), France's Institut National de Recherche en Informatique et en Automatique (INRIA), the U.K.'s Particle Physics and Astronomy Research Council (PPARC), and the Netherlands' University of Amsterdam (UvA). The DataTAG project is very closely associated with the European Union DataGrid project, the largest Grid project in Europe, also led by CERN. For more information, see http://www.datatag.org.

About StarLight: StarLight is an advanced optical infrastructure and proving ground for network services optimized for high-performance applications. Operational since summer 2001, StarLight is a 1 GE and 10 GE switch/router facility for high-performance access to participating networks and also offers true optical switching for wavelengths. StarLight is being developed by the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC), the International Center for Advanced Internet Research (iCAIR) at Northwestern University, and the Mathematics and Computer Science Division at Argonne National Laboratory, in partnership with Canada's CANARIE and the Netherlands' SURFnet. STAR TAP and StarLight are made possible by major funding from the U.S. National Science Foundation to UIC. StarLight is a service mark of the Board of Trustees of the University of Illinois. See www.startap.net/starlight.

About the University of Manchester: The University of Manchester, located in the United Kingdom, was first granted a Royal Charter in April 1880 as the Victoria University and became the first of the U.K.'s great civic universities. As a full-range university it now has more than 70 departments involved in teaching and research, with more than 2,000 academic staff. There are more than 18,000 full-time students, including 2,500 international students from over 120 countries, studying for undergraduate and postgraduate degrees. The University of Manchester has a proud tradition of innovation and excellence that continues today; some of the key scientific developments of the century have taken place here. In Manchester, Rutherford conducted the research that led to the splitting of the atom, and the world's first stored-program electronic digital computer, built by Freddie Williams and Tom Kilburn, successfully executed its first program in June 1948. The departments of Physics, Computational Science, and Computer Science and the Network Group, together with the E-Science North West Centre research facility, are very active in developing a wide range of e-science projects and Grid technologies. See www.man.ac.uk.

About National LambdaRail: National LambdaRail (NLR) is a major initiative of U.S. research universities and private sector technology companies to provide a national scale infrastructure for research and experimentation in networking technologies and applications. NLR puts the control, the power, and the promise of experimental network infrastructure in the hands of the nation's scientists and researchers. Visit http://www.nationallambdarail.org for more information.

About CENIC: CENIC is a not-for-profit corporation serving California Institute of Technology, California State University, Stanford University, University of California, University of Southern California, California Community Colleges, and the statewide K-12 school system. CENIC's mission is to facilitate and coordinate the development, deployment, and operation of a set of robust multi-tiered advanced network services for this research and education community. http://www.cenic.org

About University of Amsterdam: The Advanced Internet Research group of the University of Amsterdam's Faculty of Science researches new architectures and protocols for the Internet. It actively participates in worldwide standardization organizations such as the Internet Engineering Task Force and the Global Grid Forum. The group conducts experiments with extremely high-speed network infrastructures and carries out groundbreaking research in security, authorization, authentication, and accounting for grid environments. It is also developing a virtual laboratory based on grid technology for e-science applications. For more information, see http://www.science.uva.nl/research/air.

Writer: 
Robert Tindol
