Thirty Meter Telescope (TMT) Passes Conceptual Design Review

PASADENA, Calif.—The detailed design for the Thirty Meter Telescope (TMT) developed by a U.S.-Canadian team is capable of delivering on the full promise of its enormous light-collecting area, according to the findings of an independent panel of experts.

With the TMT, astronomers will be able to analyze the light from the first stars born after the Big Bang, directly observe the formation and evolution of galaxies, see planets around nearby stars, and make observations that test fundamental laws of physics.

"The successful completion of our conceptual design review means that the TMT has a strong science vision, good technical requirements, a thoroughly reviewed design, and a powerful team to carry our work forward," says Project Manager Gary Sanders.

Now in detailed design, the TMT is a concept for the world's largest telescope. It consists of a primary mirror with 738 individual 1.2-meter segments that span 30 meters in total, three times the effective diameter of the current largest telescopes. All of the segments will be under exquisite computer control so that they work together as a single mirror.

The review panel evaluated all aspects of the project, including its optical design, the telescope structure, science instrumentation, site testing, management and cost estimate procedures. The panel was positive on nearly all fronts and praised in particular the adaptive optics technology being planned for the giant telescope.

Adaptive optics will allow the TMT to reach the "diffraction limit," giving it resolution comparable to that of a telescope in space. TMT project engineers are integrating this system with the designs for the eight science instruments under detailed study, so the power of the adaptive optics (AO) system should be available at the beginning of the telescope's science operations in 2016, the external panel reported following the May 8-11 conceptual design review.

The baseline adaptive optics system for TMT involves nine laser beams that are launched from a small telescope at the peak of the structure that supports the telescope's secondary mirror. These laser beams reflect off a layer of sodium gas high in Earth's upper atmosphere to provide artificial points of light analogous to distant stars. These point-like laser reflections drift and wobble just like the star light, giving the AO system reference points to use anywhere in the sky as it compensates for distortions of the star light by Earth's ever-changing atmosphere. This technology has been pioneered at the Lick Observatory, the Gemini Observatory 8-meter telescopes and the Keck Observatory 10-meter telescopes.

TMT is also studying the potential for an adaptive secondary mirror for the telescope. This would involve covering the bottom of a flexible glass surface as large as the primary mirror of many current telescopes (a concave surface 3.6 meters in diameter) with hundreds of tiny pistons that push and pull the surface of the mirror in minute increments. A computer controls these movements many times per second as it works to adjust the mirror so it has the exact opposite shape of the distortions in the incoming starlight.
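The correction principle behind both AO schemes fits in a few lines of code. The toy loop below is a minimal sketch assuming a simple integrator controller: it drives a set of actuator commands toward the exact opposite of a measured distortion. The array sizes and gain are illustrative assumptions, not TMT design parameters.

```python
import numpy as np

# Toy adaptive-optics loop: push the mirror toward the opposite shape of the
# measured wavefront distortion. Sizes and gain are illustrative assumptions.
rng = np.random.default_rng(0)
turbulence = rng.normal(0.0, 1.0, 100)  # frozen distortion seen by 100 actuators (toy)
mirror = np.zeros(100)                  # commands to the tiny pistons
gain = 0.5                              # integrator loop gain

for _ in range(20):                     # a real system repeats this many times per second
    residual = turbulence + mirror      # what the wavefront sensor would measure
    mirror -= gain * residual           # nudge the mirror toward the opposite shape

print(f"residual distortion after correction: {np.std(turbulence + mirror):.1e}")
```

Each pass shrinks the residual by a factor of (1 - gain), which is why the loop must cycle faster than the atmosphere changes.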

Much of the TMT's scientific work will be done in the infrared, where the diffraction limit is easier to attain, young stars and galaxies are to be found, and the opportunities for new discoveries are abundant.

The eight scientific instruments in detailed design for the TMT are huge in comparison to current-generation astronomical instruments, and correspondingly more complex. Each instrument is the size of a school bus or larger, and they rest on two platforms on either side of the telescope that are each the size of a basketball court. The biggest technical challenge among the instruments is the Planetary Formation Instrument, which employs "extreme" adaptive optics in an effort to directly image other planets, the board found.

The technical requirements for the telescope, its structure, and its control system are clear and appropriate for this stage of the project, the review board concluded.

"The panel's report is glowing in its praise and confident that TMT is on track," says Richard Ellis, the Steele Family Professor of Astronomy at the California Institute of Technology, one of the partners in the project. "We'll decide in mid-2008 where to build the telescope and then start construction in early 2009."

The TMT is a collaboration between the California Institute of Technology, the University of California, the Association of Universities for Research in Astronomy, Inc. (AURA), and the Association of Canadian Universities for Research in Astronomy (ACURA), with significant work being done by industry and by university teams studying instrument designs.

Canadians welcome the external panel's endorsement of the depth and quality of the TMT design work. "We look forward to supplying the enclosure, telescope structure and adaptive optics system in time for first science," says Ray Carlberg of the University of Toronto, the Canadian project director for ACURA, an association of 24 Canadian universities in partnership with the National Research Council of Canada.

The design and development phase of the TMT project has a budget of $64 million, including $35 million in private sector contributions from the Gordon and Betty Moore Foundation. The conceptual design review board found that the project is estimating the cost of the TMT using up-to-date industry standards. A formal cost review of the project is scheduled for September 2006.

The TMT project is studying five sites in Chile, Hawaii, and Mexico as possible locations for the telescope. The project office is currently based in Pasadena, CA, where the conceptual design review was held.

Edward Stone, chair of the TMT Board of Directors and former director of NASA's Jet Propulsion Laboratory, is available to answer media questions about the conceptual design review and the status of this exciting project.

For more information on the project, see www.tmt.org.

The TMT is designed to meet the scientific goals of the Giant Segmented Mirror Telescope concept, which was the highest-priority ground-based project in the most recent astronomy decadal survey conducted by the National Academy of Sciences, published in 2000.

 

Writer: Robert Tindol

Palomar Observes Broken Comet

PALOMAR MOUNTAIN, Calif.—Astronomers have recently been enjoying front-row seats to a spectacular cometary show. Comet 73P/Schwassmann-Wachmann 3 is in the act of splitting apart as it passes close to Earth. The breakup is providing a firsthand look at the death of a comet.

Eran Ofek of the California Institute of Technology and Bidushi Bhattacharya of Caltech's Spitzer Science Center have been observing the comet's tragic tale with the Palomar Observatory's 200-inch Hale Telescope. Their view is helping them and other scientists learn the secrets of comets and why they break up.

The comet was discovered by Arnold Schwassmann and Arno Arthur Wachmann 76 years ago, and it broke into four fragments just a decade ago. It has since split further into dozens, if not hundreds, of pieces.

"We've learned that Schwassmann-Wachmann 3 presents a very dynamic system, with many smaller fragments than previously thought," says Bhattacharya. In all, 16 new fragments were discovered as a part of the Palomar observations.

A sequence of images showing the piece of the comet known as fragment R has been assembled into a movie. Because the comet was moving across the sky at a different rate than the stellar background, and the telescope tracked the comet's motion rather than that of the stars, the distant stars and galaxies appear to streak across the images, while fragment R and many smaller fragments of the comet appear as nearly stationary objects in the movie.

"Seeing the many fragments was both an amazing and sobering experience," says a sleepy Eran Ofek, who has been working non-stop to produce these images and a movie of the comet's fragments.

The images used to produce the movie were taken over a period of about an hour and a half when the comet was approximately 17 million kilometers (10.6 million miles) from Earth. Astronomically speaking, the comet is making a close approach to Earth this month, giving astronomers their front-row seat to the comet's breakup. The closest approach of any fragment occurs on May 12, when it will pass just 5.5 million miles from Earth, more than 20 times the distance to the moon. There is no chance that the comet will hit Earth.

"It is very impressive that a telescope built more than 50 years ago continues to contribute to forefront astrophysics, often working in tandem with the latest space missions and biggest ground-based facilities," remarks Shri Kulkarni, MacArthur Professor of Astronomy and Planetary Science and director of the Caltech Optical Observatories.

The Palomar observations were coordinated with observations acquired through the Spitzer Space Telescope, which imaged the comet's fragments in the infrared. The infrared images, combined with the visible-light images obtained using the Hale Telescope, will give astronomers a more complete understanding of the comet's breakup.

Additional support for the observations and data analysis came from Caltech postdoc Arne Rau and grad student Alicia Soderberg.

Images of the comet and a time-lapse movie can be found at:

http://www.astro.caltech.edu/palomar/images/73p/

Contact:

Scott Kardel, Palomar Public Affairs Director, (760) 742-2111, wsk@astro.caltech.edu

Writer: RT

CARMA Radio Telescope Array in the Inyo Mountains Dedicated May 5

PASADENA, Calif.—The official dedication of the Combined Array for Research in Millimeter-Wave Astronomy (CARMA) facility was held Friday, May 5, at Cedar Flat in the Inyo Mountains near Bishop.

CARMA is a joint venture of the California Institute of Technology, the University of California at Berkeley, the University of Illinois at Urbana-Champaign, and the University of Maryland. The project has involved moving the six existing 10-meter telescopes at Caltech's Owens Valley Radio Observatory (OVRO) millimeter-wave array, along with the nine 6-meter telescopes at the Berkeley-Illinois-Maryland Association (BIMA) array, to the new Cedar Flat location, about 13 miles east of Big Pine by mountain route.

According to Anneila Sargent, CARMA director, the new facility will give radio astronomers an extremely clear view of the universe due to the dry air and 7,300-foot altitude of the Cedar Flat site.

Innovative technology and better atmospheric transmission make CARMA a much more powerful instrument than merely the sum of the previous arrays, said Sargent, who is Rosen Professor of Astronomy at Caltech. The facility will be used to observe molecular gas and dust in planets, star-forming clouds, planet-forming disks around other stars, nearby galaxies, and galaxies so distant that they must have formed soon after the Big Bang.

"These measurements will enable studies that address directly some of the most important questions in astrophysics today," Sargent said. "These include how the modern universe and the first stars and galaxies formed and evolved, how stars and planetary systems like our own are formed, and what the chemistry of the interstellar gas can tell us about the origins of life."

The new array is operated by the CARMA Association, which comprises the four partner universities. The association will coordinate the separate activities of its members through a board of representatives that includes senior administrators from each partner university and the CARMA science steering committee, made up of scientists from Caltech and from BIMA.

As a multi-university facility, CARMA also has a major educational mission. Innovative astronomy and technical development programs will ensure that the next generation of radio astronomers and instrumentalists will receive hands-on training while conducting frontline research. The National Science Foundation has supported both the OVRO and BIMA arrays since their inception, and will continue to support CARMA operations. Construction costs for the new combined array are being divided equally among the NSF, Caltech, and BIMA, and astronomers around the world will have access to the facility.

Sargent says that funding from the Gordon and Betty Moore Foundation and the Norris Foundation has also been crucial. "We're especially grateful for their getting us started."

Writer: Robert Tindol

Two from Caltech Faculty Elected to the American Academy of Arts and Sciences

PASADENA, Calif.—Two faculty members at the California Institute of Technology are among this year's newly elected fellows of the American Academy of Arts and Sciences. They join 173 other Americans and 20 foreign honorees as the 2006 class of fellows of the prestigious institution that was cofounded in 1780 by John Adams.

This year's new Caltech inductees are Anneila Sargent, the Rosen Professor of Astronomy and director of the Combined Array for Research in Millimeter-Wave Astronomy (CARMA), and Henry Lester, the Bren Professor of Biology. Their election brings the total number of fellows from Caltech to 83.

Sargent and Lester join an illustrious list of fellows, both past and present. Other inductees in the 2006 class include former presidents George H. W. Bush and William Jefferson Clinton; Supreme Court Chief Justice John Roberts; Nobel Prize-winning biochemist and Rockefeller University President Sir Paul Nurse; the chairman and vice chairman of the 9/11 commission, Thomas Kean and Lee Hamilton; film director Martin Scorsese; choreographer Meredith Monk; conductor Michael Tilson Thomas; and New York Stock Exchange chairman Marshall Carter. Past fellows have included George Washington, Benjamin Franklin, Ralph Waldo Emerson, Albert Einstein, and Winston Churchill.

Sargent, a native of Scotland, is an authority on star formation. Most recently she has been investigating the way in which stars like the sun are created and evolve to become planetary systems. She uses various radio and submillimeter telescopes to search for and study other potential planetary systems.

Her interests range from the earliest stages of star formation, when dense cores in interstellar clouds collapse to form stars, to the epochs when individual planets may be born. This field has garnered considerable interest within the scientific community, as well as from the news media and the general public, because of the possibility of locating other worlds beyond the solar system.

She is a former president of the American Astronomical Society, incoming chair of the National Research Council's board of physics and astronomy, cochair of the 1996 "Search for Origins" workshop sponsored by the White House Office of Science and Technology Policy, a former chair of NASA's space science advisory committee, and a member of the 2000 National Research Council's survey committee on astronomy and astrophysics.

Her major honors include the 2002 University of Edinburgh Alumnus of the Year award and the 1998 NASA Public Service Medal.

Lester is a New York City native who has been a Caltech faculty member since 1973. His lab is currently involved in several avenues of research, but he is probably best known for his research on the neuroscience of nicotine addiction. A recipient of research funding from the California-based Tobacco-Related Disease Research Program (TRDRP) and the National Institutes of Health, Lester has published numerous papers showing the underlying mechanisms of nicotine addiction.

In 2004, he and collaborators from Caltech and other institutions announced their discovery that activating the receptor known as alpha4, which is involved in the release of the neurotransmitter dopamine, is sufficient for reward behavior, sensitization, and tolerance to repeated doses of nicotine. The discovery was important, experts said, because knowing precisely the cells and cell receptors that are involved could provide useful targets for addiction therapies.

Lester, Caltech chemist Dennis Dougherty, and a group from the University of Cambridge last year announced their success in finding the "switch" part of receptors like those for nicotine and serotonin.

His other current research interests include ion channels, synaptic transmission, light-flash physiology, and signal transduction. Within the past year he has also published papers on the creation of mouse models for epilepsy, tardive dyskinesia, Alzheimer's disease, and Parkinson's disease.

The academy is an independent policy research center that focuses on complex and emerging problems such as scientific issues, global security, social policy, the humanities and culture, and education.

The new fellows and foreign honorary members will be formally recognized at the annual induction ceremony on October 7 at the academy's headquarters in Cambridge, Massachusetts.

Writer: Robert Tindol

Caltech Physicists and International MINOS Team Discover New Details of Why Neutrinos Disappear

PASADENA, Calif.—Physicists from the California Institute of Technology and an international collaboration of scientists at the Department of Energy's Fermi National Accelerator Laboratory have observed the disappearance of muon neutrinos traveling from the lab's site in Illinois to a particle detector in Minnesota. The observation is consistent with an effect known as neutrino oscillation, in which neutrinos change from one kind to another.

The Main Injector Neutrino Oscillation Search (MINOS) experiment at Fermilab's site in Batavia, Illinois, revealed a value of delta m^2 = 0.0031 eV^2, a quantity that plays a crucial role in neutrino oscillations and hence the role of neutrinos in the evolution of the universe.

The MINOS detector concept and design originated with Caltech physicist Doug Michael. Caltech physicists also built half of the massive set of scintillator planes for the five-kiloton far detector. Michael also led the formulation of, and pushed forward, the program to increase the intensity of the proton beams that serve as the source of the neutrinos used by MINOS, an effort that led to the present results.

"Only a year ago we launched the MINOS experiment," said Fermilab director Pier Oddone. "It is great to see that the experiment is already producing important results, shedding new light on the mysteries of the neutrino."

Nature provides for three types of neutrinos, yet scientists know very little about these "ghost particles," which can traverse the entire Earth without interacting with matter. But the abundance of neutrinos in the universe, produced by stars and nuclear processes, may explain how galaxies formed and why antimatter has disappeared. Ultimately, these elusive particles may explain the origin of the neutrons, protons and electrons that make up all the matter in the world around us.

"Using a man-made beam of neutrinos, MINOS is a great tool to study the properties of neutrinos in a laboratory-controlled environment," said Stanford University professor Stan Wojcicki, spokesperson of the experiment. "Our first result corroborates earlier observations of muon neutrino disappearance, made by the Japanese Super-Kamiokande and K2K experiments. Over the next few years, we will collect about 15 times more data, yielding more results with higher precision, paving the way to better understanding this phenomenon. Our current results already rival the Super-Kamiokande and K2K results in precision."

Neutrinos are hard to detect, and most of the neutrinos traveling the 450 miles from Fermilab to Soudan, Minnesota (straight through the earth, no tunnel needed) leave no signal in the MINOS detector. If neutrinos had no mass, the particles would not change as they traverse the earth, and the MINOS detector in Soudan would have recorded 177 +/- 11 muon neutrinos. Instead, the MINOS collaboration found only 92 muon neutrino events, a clear observation of muon neutrino disappearance and hence of neutrino mass.
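The size and energy dependence of such a deficit follow from the standard two-flavor survival probability, sketched below. Only the delta m^2 value comes from the measurement reported here; the maximal mixing angle, the roughly 735-kilometer (450-mile) baseline, and the sample energies are assumptions for illustration, not numbers from the MINOS fit.

```python
import numpy as np

# P(nu_mu -> nu_mu) = 1 - sin^2(2*theta) * sin^2(1.27 * dm2[eV^2] * L[km] / E[GeV])
# Standard two-flavor formula; mixing angle and baseline are assumed values.
def survival(E_GeV, dm2=0.0031, sin2_2theta=1.0, L_km=735.0):
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2 * L_km / E_GeV) ** 2

for E in [1.0, 2.0, 3.0, 5.0, 10.0]:
    print(f"E = {E:4.1f} GeV: P(nu_mu survives) = {survival(E):.2f}")
```

The measured counts, binned in neutrino energy, are then compared against this curve.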

The deficit as a function of energy is consistent with the hypothesis of neutrino oscillations, and yields a value of delta m^2, the difference between the squares of the masses of two different types of neutrinos, equal to 0.0031 eV^2 +/- 0.0006 eV^2 (statistical uncertainty) +/- 0.0001 eV^2 (systematic uncertainty). In this scenario, muon neutrinos can transform into electron neutrinos or tau neutrinos, but alternative models, such as neutrino decay and extra dimensions, are not yet excluded. It will take the recording of much more data by the MINOS collaboration to test more precisely the exact nature of the disappearance process.

Details of the current MINOS results were presented by David Petyt of the University of Minnesota at a special seminar at Fermilab on March 30. On Friday, March 31, the MINOS collaboration commemorated Michael, who was the MINOS co-spokesperson, at a memorial service at Fermilab. Michael died on December 25, 2005, at the age of 45 after a yearlong battle with cancer.

The MINOS experiment includes about 150 scientists, engineers, technical specialists, and students from 32 institutions in six countries, including Brazil, France, Greece, Russia, the United Kingdom, and the United States. The institutions include universities as well as national laboratories. The U.S. Department of Energy provides the major share of the funding, with additional funding from the U.S. National Science Foundation and from the United Kingdom's Particle Physics and Astronomy Research Council (PPARC).

"The MINOS experiment is a hugely important step in our quest to understand neutrinos-we have created neutrinos in the controlled environment of an accelerator and watched how they behave over very long distances," said Professor Keith Mason, CEO of PPARC. "This has told us that they are not totally massless as was once thought, and opens the way for a detailed study of their properties. U.K. scientists have taken key roles in developing the experiment and in exploiting the data from it, the results of which will shape the future of this branch of physics." The Fermilab side of the MINOS experiment consists of a beam line in a 4,000-foot-long tunnel pointing from Fermilab to Soudan. The tunnel holds the carbon target and beam focusing elements that generate the neutrinos from protons accelerated by Fermilab's main injector accelerator. A neutrino detector, the MINOS "near detector" located 350 feet below the surface of the Fermilab site, measures the composition and intensity of the neutrino beam as it leaves the lab. The Soudan side of the experiment features a huge 6,000-ton particle detector that measures the properties of the neutrinos after their 450-mile trip to northern Minnesota. The cavern housing the detector is located half a mile underground in a former iron mine.

The MINOS neutrino experiment follows a long tradition of studying neutrino properties originated at Caltech by physics professor (and former LIGO laboratory director) Barry Barish. Earlier measurements by the Monopole Astrophysics and Cosmic Ray Observatory (MACRO) experiment at the Gran Sasso laboratory in Italy, led by Barish, also showed evidence for the oscillation of neutrinos produced by the interactions of cosmic rays in the atmosphere.

The MINOS result also complements results from the K2K long-baseline neutrino experiment in Japan. In 1999-2001 and 2003-2004, the K2K experiment sent neutrinos from an accelerator at the KEK laboratory in Tsukuba to a particle detector in Kamioka, a distance of about 150 miles. Compared to K2K, the MINOS experiment uses a distance three times longer, and the intensity and energy of the MINOS neutrino beam are higher than those of the K2K beam. These advantages have enabled the MINOS experiment to observe in less than one year about three times more neutrinos than the K2K experiment did in about four years.

"It is a great gift for me to hear this news," said Yoji Totsuka, former spokesperson of the Super-Kamiokande experiment and now director general of KEK. "Neutrino oscillation was first established in 1998, with cosmic-ray data taken by Super-Kamiokande. The phenomenon was then corroborated by the K2K experiment with a neutrino beam from KEK. Now MINOS gives firm results in a totally independent experiment. I really congratulate their great effort to obtain the first result in such a short timescale."

According to Harvey Newman, a professor of physics at Caltech who now leads the campus MINOS group, the team's contributions range from the original detector hardware to the physics analyses now under way.

"Our Caltech group, then led by Michael, also had a key role in the research and development of the scintillators and optical fibers that led to MINOS having enough light to measure the muons that signal neutrino events.

"We are also working on the analysis of electron-neutrino events that could lead to a determination of the subdominant mixing between the first and third neutrino flavors, which is one of the next major steps in understanding the mysterious nature of neutrinos and their flavor-mixings. We are also leading the analysis of antineutrinos in the data, and the prospects for MINOS to determine the mixing of antineutrinos, where comparison of neutrinos and antineutrinos will test one of the most fundamental symmetries of nature (known as CPT).

"We are leading much of the R&D for the next generation, 25-kiloton detector called NOvA. Building on our experience in MINOS, we have designed the basic 50-foot-long liquid scintillator cell which contains a single 0.8 mm optical fiber to collect the light (there will be approximately 600,000 cells). We will measure and optimize the design this year in Lauritsen [a physics building on the Caltech campus], in time for the start of NOvA construction that will be completed by approximately 2010. We've also started prototype work on a future generation of megaton-scale detectors for neutrino and ultrahigh-energy cosmic rays. This has generated a lot of interest among Caltech undergraduates, who are now starting to contribute to these developments in the lab."

###

 

More information on the MINOS experiment: http://www-numi.fnal.gov/

List of institutions collaborating on MINOS: http://www-numi.fnal.gov/collab/institut.html

The members of the Caltech MINOS group: Caius Howcroft, Harvey Newman, Juan "Pedro" Ochoa, Charles Peck, Jason Trevor, and Hai Zheng.

Writer: Robert Tindol

Astronomers Discover a River of Stars Streaming Across the Northern Sky

PASADENA, Calif.—Astronomers have discovered a narrow stream of stars extending at least 45 degrees across the northern sky. The stream is about 76,000 light-years distant from Earth and forms a giant arc over the disk of the Milky Way galaxy.

In the March issue of the Astrophysical Journal Letters, Carl Grillmair, an associate research scientist at the California Institute of Technology's Spitzer Science Center, and Roberta Johnson, a graduate student at California State University Long Beach, report on the discovery.

"We were blown away by just how long this thing is," says Grillmair. "As one end of the stream clears the horizon this evening, the other will already be halfway up the sky."

The stream begins just south of the bowl of the Big Dipper and continues in an almost straight line to a point about 12 degrees east of the bright star Arcturus in the constellation Bootes. The stream emanates from a cluster of about 50,000 stars known as NGC 5466.

The newly discovered stream extends both ahead and behind NGC 5466 in its orbit around the galaxy. This is due to a process called tidal stripping, which results when the force of the Milky Way's gravity is markedly different from one side of the cluster to the other. This tends to stretch the cluster, which is normally almost spherical, along a line pointing towards the galactic center.

At some point, particularly when its orbit takes it close to the galactic center, the cluster can no longer hang onto its most outlying stars, and these stars drift off into orbits of their own. The lost stars that find themselves between the cluster and the galactic center begin to move slowly ahead of the cluster in its orbit, while the stars that drift outwards, away from the galactic center, fall slowly behind.

Ocean tides are caused by exactly the same phenomenon, though in this case it's the difference in the moon's gravity from one side of Earth to the other that stretches the oceans. If the gravity at the surface of Earth were very much weaker, then the oceans would be pulled from the planet, just like the stars in NGC 5466's stream.
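In quantitative terms, the differential pull described here is, to leading order, the standard tidal-acceleration expression (a textbook formula, not one from the article):

$$a_{\mathrm{tidal}} \approx \frac{2\,G\,M\,r}{d^{3}},$$

where $M$ is the mass responsible for the tide (the Milky Way interior to the cluster's orbit, or the moon in the ocean analogy), $d$ is the distance to that mass, and $r$ is the radius of the cluster (or of Earth). The steep $1/d^{3}$ dependence is why stripping is strongest when the cluster's orbit carries it close to the galactic center.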

Despite its size, the stream has never previously been seen because it is so completely overwhelmed by the vast sea of foreground stars that make up the disk of the Milky Way. Grillmair and Johnson found the stream by examining the colors and brightnesses of more than nine million stars in the Sloan Digital Sky Survey public database.

"It turns out that, because they were all born at the same time and are situated at roughly the same distance, the stars in globular clusters have a fairly unique signature when you look at how their colors and brightnesses are distributed," says Grillmair.

Using a technique called matched filtering, Grillmair and Johnson assigned to each star a probability that it might once have belonged to NGC 5466. By looking at the distribution of these probabilities across the sky, "the stream just sort of reached out and smacked us.

"The new stream may be even longer than we know, as we are limited at the southern end by the extent of the currently available data," he adds. "Larger surveys in the future should be able to extend the known length of the stream substantially, possibly even right around the whole sky."

The stars that make up the stream are much too faint to be seen by the unaided human eye. Owing to the vast distances involved, they are about three million times fainter than even the faintest stars that we can see on a clear night.

Grillmair says that such discoveries are important for our understanding of what makes up the Milky Way galaxy. Like earthbound rivers, such tidal streams can tell us which way is "down," how steep the slope is, and where the mountains and valleys are located.

By measuring the positions and velocities of the stars in these streams, astronomers hope to determine how much "dark matter" the Milky Way contains, and whether the dark matter is distributed smoothly, or in enormous orbiting chunks.

Writer: Robert Tindol

Quasar Study Provides Insights into Composition of the Stars That Ended the "Dark Ages"

WASHINGTON, D.C.—A team of astronomers has uncovered new evidence about the stars whose formation ended the cosmic "Dark Ages" a few hundred million years after the Big Bang.

In a presentation today at the annual winter meeting of the American Astronomical Society (AAS), California Institute of Technology graduate student George Becker is scheduled to discuss his team's investigation of several faraway quasars and the gas between the quasars and Earth. The paper on which his lecture is based will be published in the Astrophysical Journal in March.

One quasar in the study seems to reveal numerous patches of "neutral" gas, made up of atoms in which the nucleus and electrons cling together, floating in space when the universe was only about 10 percent of its present age. This gas is thought to have existed in significant quantities only within a certain time frame in the early universe. Prior to the Dark Ages, all material would have been too hot for atomic nuclei to combine with their electrons; afterward, the light from newly formed stars would have reached the atoms and stripped off the electrons.

"There should have been a period when most of the atoms in the universe were neutral," Becker explains. "This would have continued until stars and galaxies began forming."

In other words, the universe went from a very hot, very dense state following the Big Bang, where all atomic nuclei and electrons were too energetic to combine; to a less dense and cooler phase (albeit a dark one), where the nuclei and the electrons were cool enough to hold onto each other and form neutral atoms; to today's universe, where the great majority of atoms are ionized by energetic particles of light.

Wallace Sargent, who coined the term Dark Ages in 1985 and who is Becker's supervising professor, adds that analyzing the quasars to learn about the early universe is akin to looking at a lighthouse in order to study the air between you and it. During the Dark Ages, neutral atoms filling the universe would have acted like a fog, blocking out the light from distant objects. To end the Dark Ages, enough stars and galaxies needed to form to burn this "cosmic fog" away.

"We may have detected the last wisps of the fog," explains Sargent, who is Bowen Professor of Astronomy at Caltech.

What makes the new study unique is the finding that the chemical elements of the cool, un-ionized gas seem to have come from relatively ordinary stars. The researchers think this is so because the elements they detect in the gas (oxygen, carbon, and silicon) are in proportions that suggest the materials came from Type II supernovae.

These explosions occur when massive stars collapse and then rebound in a gigantic blast. The stars needed to create these explosions can be more than ten times the mass of the sun, yet they are common over almost the entire history of the universe.

However, astronomers believe that the very first stars in the universe would have been much more massive, up to hundreds of times the mass of the sun, and would have left behind a very different chemical signature.

"If the first stars in the universe were indeed very massive stars," Becker explains, "then their chemical signature was overwhelmed by smaller, more typical stars very soon after."

Becker and his colleagues believe they are seeing material from stars that was blown into space by supernova explosions and mixed with the pristine gas produced by the Big Bang. Specifically, they are looking at the spectra of the light from quasars as it is absorbed during its journey through the mixed-up gas.

The quasars in this particular study are from the Sloan Digital Sky Survey, an ongoing mapping project that seeks, in part, to determine the distances of 100,000 quasars. The researchers focused on nine of the most distant quasars known, with redshifts greater than 5, meaning that the light we see from these objects would have been emitted when the universe was at most 1.2 billion years old.

Of the nine, three are far enough away that they may have been at the edge of the dark period. Those three have redshifts greater than 6, meaning that the universe was less than 1 billion years old when they emitted the light we observe. By comparison, the present age of the universe is believed to be about 13.7 billion years.
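The quoted ages follow from standard cosmology. As a sketch, the age of a flat Lambda-CDM universe at redshift z can be computed by direct integration; the parameter values below (H0 = 70 km/s/Mpc, Omega_m = 0.3, Omega_Lambda = 0.7) are typical assumptions, not numbers from the paper.

```python
import numpy as np
from scipy.integrate import quad

H0 = 70.0              # Hubble constant, km/s/Mpc (assumed)
Om, OL = 0.3, 0.7      # matter and dark-energy density parameters (assumed)
H0_Gyr = H0 / 977.8    # the same constant expressed in 1/Gyr

def age_gyr(z):
    """Cosmic age at redshift z: integrate dz' / [(1 + z') H(z')] from z to infinity."""
    integrand = lambda zp: 1.0 / ((1 + zp) * H0_Gyr * np.sqrt(Om * (1 + zp)**3 + OL))
    return quad(integrand, z, np.inf)[0]

print(f"z = 6: {age_gyr(6.0):.2f} Gyr")   # just under 1 billion years
print(f"z = 5: {age_gyr(5.0):.2f} Gyr")   # about 1.2 billion years
print(f"z = 0: {age_gyr(0.0):.2f} Gyr")   # about 13.5 billion years today
```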

Becker says that the study in part promises a new tool to investigate the nature of stars in the early universe. "Now that we've seen these systems, it's reasonable to ask if their composition reflects the output of those first very massive stars, or whether the mix of chemicals is what you would expect from more ordinary stars that ended in Type II supernovae.

"It turns out that the latter is the case," Becker says. "The chemical composition appears to be very ordinary."

Thus, the study provides a new window into possible transitions in the early universe, Sargent adds. "The relative abundance of these elements gives us in principle a way of finding out what the first stars were.

"This gives us insight into what kind of stars ended the Dark Ages."

Observations for this study were performed using the 10-meter (400-inch) Keck I Telescope on Mauna Kea, Hawaii. In addition to Becker and Sargent, the other authors are Michael Rauch of the Carnegie Observatories and Robert A. Simcoe of the MIT Center for Space Research.

This work was supported by the National Science Foundation.

Writer: Robert Tindol

Snowflake Physicist's Photographs to Be Featured on 2006 Postage Stamps

PASADENA, Calif.—Postage rates may keep going up, but when it comes to natural beauty and scientific wonder, one particular issue of stamps is going to be hard to lick.

Beginning next October, the U.S. Postal Service will issue a set of four commemorative stamps featuring images of snowflakes furnished by that hotbed of snowflake research, the California Institute of Technology. The holiday snowflakes stamp set will display photographs taken by Caltech physics professor Kenneth Libbrecht.

For several years Libbrecht has been investigating the basic physics of how patterns are created during crystal growth and other simple physical processes. He has delved particularly deeply into a case study of the formation of snowflakes. His research is aimed at better understanding how structures arise in material systems, but it is also visually compelling and, from the start, has been a hit with the public.

"My snowflake website, www.snowcrystals.com, is getting about two million hits a year," says Libbrecht, "of course, with a big peak during the winter months."

Libbrecht attributes the site's popularity to its discussion of some very accessible science. "Snowflake patterns are well known, the snowflakes fall right out of the sky, and you don't necessarily need a science background to appreciate the science behind how these ice structures form. It's an especially good introduction to science for younger kids," he says.

Libbrecht began his research by growing synthetic snowflakes in his lab, where they can be created and studied under well-controlled conditions. Precision micro-photography was necessary for this work, and over several years Libbrecht developed some specialized techniques for capturing images of snow crystals. Starting in 2001, he expanded his range to photographing natural snowflakes as well. "A few years ago I mounted my microscope in a suitcase, so I now can take it out into the field," says Libbrecht. "Sometimes I arrange trips to visit colleagues in the frozen north, and other times I arrange extended ski vacations with my family. The most difficult part these days is getting this complex-looking instrument through airport security."

Libbrecht's camera rig is essentially a microscope with a camera attached. The entire apparatus was built on campus and designed specifically for snowflake photography. "Snowflakes are made of ice, which is mostly completely clear, so lighting is an important consideration in this whole business," he says. "I use different types of colored lights shining through the crystals, so the ice structures act like complex lenses to refract the light in different ways. The better the lighting, the more interesting is the final photograph."

The structures of snowflakes are ephemeral, so speed is needed to get good photographs. Within minutes after falling, a snowflake will begin to degrade as its sharper features evaporate away. The complex structures are created as the crystals grow, and when they stop growing, the crystals soon become rounded and more blocky in appearance.

"When photographing in the field, I first let the crystals fall onto a piece of cardboard," says Libbrecht. "Then I find one I like, pick it up using a small paintbrush, and place it on a microscope slide. I then put it under the microscope, adjust the focus and lighting, and take the shot. You need to search through a lot of snowflakes to find the most beautiful specimens."

Libbrecht finds that observing natural snowflakes in the field is an important part of his research, and nicely complements his laboratory work. "I've learned a great deal about crystal growth by studying ice, and have gotten many insights from looking at natural crystals. Nature provides a wonderful variety of snow crystal types to look at, and the crystals that fall great distances are larger than what we can easily grow in the lab."

So where does one find really nice snowflakes? Certainly not in Pasadena, where Caltech is located, but Libbrecht says that certain snowy places are better than others. The snowflakes chosen for the stamps were photographed in Fairbanks, Alaska, in the Upper Peninsula of Michigan, and in Libbrecht's favorite spot, Cochrane, Northern Ontario. "Northern Ontario provides some really excellent specimens to photograph," says Libbrecht. "The temperature is cold, but not too cold, and the weather brings light snow frequently.

"Fairbanks sometimes offers some unusual crystal types, because it's so cold. Warmer climates, for example, in New York State and the vicinity, tend to produce less spectacular crystals." As for the nitty-gritty of snowflake research, probably the question Libbrecht is asked the most is whether the old story about no two snowflakes being exactly alike is really true.

"The answer is basically yes, because there is such an incredibly large number of possible ways to make a complex snowflake," he says. "In many cases, there are very clear differences between snow crystals, but of course there are many similar crystals as well. In the lab we often produce very simple, hexagonal crystals, and these all look very similar."

Libbrecht can grow many different snowflake forms at will in his lab, but says there are still many subtle mysteries in crystal growth that are of interest to physicists who are trying to understand and control the formation of various materials. A real-world application of research on crystals is the growth of semiconductors for our electronic gadgets. These semiconductors are made possible in part by painstakingly controlling how certain substances condense into solid structures.

Lest anyone think that Libbrecht limits his life as a physicist to snowflakes, he is also involved in the Laser Interferometer Gravitational-Wave Observatory (LIGO), an NSF-funded project that seeks to confirm the existence of gravitational waves from exotic cosmic sources such as colliding black holes.

In LIGO, Libbrecht has lots of professional company; in fact, the field was essentially founded by Albert Einstein, who first predicted the existence of gravitational waves as a consequence of general relativity. Kip Thorne and Ron Drever at Caltech, along with Rai Weiss at MIT, were instrumental in initiating the LIGO project in the 1980s.

But in snowflake research, Libbrecht is pretty much a one-man show. And he says there's something about the exclusivity that he likes.

"It suits some of my hermit-like tendencies," comments Libbrecht. "As Daniel Boone once said, if you can smell the smoke of another person's fire, then it's time to move on. My research on snow crystal growth is the one thing I do that simply wouldn't get done otherwise."

Additional information about the 2006 stamps is available at http://www.usps.com/communications/news/stamps/2005/sr05_054.htm

Writer: Robert Tindol

Physicists Achieve Quantum Entanglement Between Remote Ensembles of Atoms

PASADENA, Calif.—Physicists have managed to "entangle" the physical state of a group of atoms with that of another group of atoms across the room. This research represents an important advance relevant to the foundations of quantum mechanics and to quantum information science, including the possibility of scalable quantum networks (i.e., a quantum Internet) in the future.

Reporting in the December 8 issue of the journal Nature, California Institute of Technology physicist H. Jeff Kimble and his colleagues announce the first realization of entanglement for one "spin excitation" stored jointly between two samples of atoms. In the Caltech experiment, the atomic ensembles are located in a pair of apparatuses 2.8 meters apart, with each ensemble composed of about 100,000 individual atoms.

The entanglement generated by the Caltech researchers consisted of a quantum state for which, when one quantum spin (i.e., one quantum bit) flipped for the atoms at the site L of one ensemble, invariably none flipped at the site R of the other ensemble, and when one spin flipped at R, invariably none flipped at L. Yet, remarkably, because of the entanglement, both possibilities existed simultaneously.

According to Kimble, who is the Valentine Professor and professor of physics at Caltech, this research significantly extends laboratory capabilities for entanglement generation, with now-entangled "quantum bits" of matter stored with separation several thousand times greater than was heretofore possible.

Moreover, the experiment provides the first example of an entangled state stored in a quantum memory that can be transferred from the memory to another physical system (in this case, from matter to light).

Since the work of Schrödinger and Einstein in the 1930s, entanglement has remained one of the most profound aspects and persistent mysteries of quantum theory. Entanglement leads to strong correlations between the various components of a physical system, even if those components are very far apart. Such correlations cannot be explained by classical physics and have been the subject of active experimental investigation for more than 40 years, including pioneering demonstrations that used entangled states of photons, carried out by John Clauser (son of Caltech's Millikan Professor of Engineering, Emeritus, Francis Clauser).

In more recent times, entangled quantum states have emerged as a critical resource for enabling tasks in information science that are otherwise impossible in the classical realm of conventional information processing and distribution. Some tasks in quantum information science (for instance, the implementation of scalable quantum networks) require that entangled states be stored in massive particles, which was first accomplished for trapped ions separated by a few hundred micrometers in experiments at the National Institute of Standards and Technology in Boulder, Colorado, in 1998.

In the Caltech experiment, the entanglement involves "collective atomic spin excitations." To generate such excitations, an ensemble of cold atoms initially all in level "a" of two possible ground levels is addressed with a suitable "writing" laser pulse. For weak excitation with the write laser, one atom in the sample is sometimes transferred to ground level "b," thereby emitting a photon.

Because of the impossibility of determining which particular atom emitted the photon, detection of this first write photon projects the ensemble of atoms into a state with a single collective spin excitation distributed over all the atoms. The presence (one atom in state b) or absence (all atoms in state a) of this symmetrized spin excitation behaves as a single quantum bit.

To generate entanglement between spatially separated ensembles at sites L and R, the write fields emitted at both locations are combined together in a fashion that erases any information about their origin. Under this condition, if a photon is detected, it is impossible in principle to determine from which ensemble, L or R, it came, so both possibilities must be included in the subsequent description of the quantum state of the ensembles.

The resulting quantum state is an entangled state with "1" stored in the L ensemble and "0" in the R ensemble, and vice versa. That is, there exist simultaneously the complementary possibilities for one spin excitation to be present in level b at site L ("1") and all atoms in the ground level a at site R ("0"), as well as for no spin excitations to be present in level b at site L ("0") and one excitation to be present at site R ("1").
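In conventional bra-ket notation, the state just described is, up to an overall phase, the standard single-excitation entangled state (a generic rendering, not notation taken from the paper):

$$|\Psi\rangle_{LR} = \frac{1}{\sqrt{2}}\left(\,|1\rangle_L\,|0\rangle_R + e^{i\varphi}\,|0\rangle_L\,|1\rangle_R\,\right),$$

where $|1\rangle$ denotes one collective spin excitation in an ensemble, $|0\rangle$ denotes none, and the relative phase $\varphi$ is set by the optical paths used to combine the write fields.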

This entangled state can be stored in the atoms for a programmable time, and then transferred into propagating light fields, which had not been possible before now. The Caltech researchers devised a method to determine unambiguously the presence of entanglement for the propagating light fields, and hence for the atomic ensembles.

The Caltech experiment confirms for the first time experimentally that entanglement between two independent, remote, massive quantum objects can be created by quantum interference in the detection of a photon emitted by one of the objects.

In addition to Kimble, the other authors are Chin-Wen Chou, a graduate student in physics; Hugues de Riedmatten, Daniel Felinto, and Sergey Polyakov, all postdoctoral scholars in Kimble's group; and Steven J. van Enk of Bell Labs, Lucent Technologies.

Writer: Robert Tindol

World Network Speed Record Shattered for Third Consecutive Year

Caltech, SLAC, Fermilab, CERN, Michigan, Florida, Brookhaven, Vanderbilt and Partners in the UK, Brazil, Korea and Japan Set 131.6 Gigabit Per Second Mark During the SuperComputing 2005 Bandwidth Challenge

SEATTLE, Wash.—An international team of scientists and engineers for the third consecutive year has smashed the network speed record, moving data along at an average rate of 100 gigabits per second (Gbps) for several hours at a time. A rate of 100 Gbps is sufficient for transmitting five feature-length DVD movies on the Internet from one location to another in a single second.

The winning "High-Energy Physics" team is made up of physicists, computer scientists, and network engineers led by the California Institute of Technology, the Stanford Linear Accelerator Center (SLAC), Fermilab, CERN, and the University of Michigan and partners at the University of Florida, Vanderbilt, and the Brookhaven National Lab, as well as international participants from the UK (University of Manchester and UKLight), Brazil (Rio de Janeiro State University, UERJ, and the State Universities of São Paulo, USP and UNESP), Korea (Kyungpook National University, KISTI) and Japan (the KEK Laboratory in Tsukuba), who joined forces to set a new world record for data transfer, capturing first prize at the Supercomputing 2005 (SC|05) Bandwidth Challenge (BWC).

The HEP team's demonstration of "Distributed TeraByte Particle Physics Data Sample Analysis" achieved a peak throughput of 151 Gbps and an official mark of 131.6 Gbps measured by the BWC judges on 17 of the 22 optical fiber links used by the team, beating their previous mark for peak throughput of 101 Gbps by 50 percent. In addition to the impressive transfer rate for DVD movies, the new record data transfer speed is also equivalent to serving 10,000 MPEG2 HDTV movies simultaneously in real time, or transmitting all of the printed content of the Library of Congress in 10 minutes.

The team sustained average data rates above the 100 Gbps level for several hours for the first time, and transferred a total of 475 terabytes of physics data among the team's sites throughout the U.S. and overseas within 24 hours. The extraordinary data transport rates were made possible in part through the use of the FAST TCP protocol developed by Associate Professor of Computer Science and Electrical Engineering Steven Low and his Caltech Netlab team, as well as new data transport applications developed at SLAC and Fermilab and an optimized Linux kernel developed at Michigan.
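The headline comparisons reduce to simple arithmetic, sketched below; the file sizes are order-of-magnitude stand-ins, not official figures.

```python
# Transfer time = bits / rate. Sizes here are assumed, order-of-magnitude figures.
def transfer_seconds(size_bytes, rate_gbps):
    return size_bytes * 8 / (rate_gbps * 1e9)

DVD = 4.7e9  # single-layer DVD capacity in bytes (assumed)
print(f"one DVD at the official mark: {transfer_seconds(DVD, 131.6):.2f} s")
print(f"475 TB of physics data:       {transfer_seconds(475e12, 131.6) / 3600:.1f} h")
```

At the official 131.6 Gbps mark, moving the demonstration's 475 terabytes corresponds to roughly eight hours of continuous transfer, consistent with rates above 100 Gbps being sustained for hours at a time.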

Professor of Physics Harvey Newman of Caltech, head of the HEP team and US CMS Collaboration Board Chair, who originated the LHC Data Grid Hierarchy concept, said, "This demonstration allowed us to preview the globally distributed Grid system of more than 100 laboratory and university-based computing facilities that is now being developed in the U.S., Latin America, and Europe in preparation for the next generation of high-energy physics experiments at CERN's Large Hadron Collider (LHC) that will begin operation in 2007.

"We used a realistic mixture of streams, including the organized transfer of multiterabyte datasets among the laboratory centers at CERN, Fermilab, SLAC, and KEK, plus numerous other flows of physics data to and from university-based centers represented by Caltech, Michigan, Florida, Rio de Janeiro and São Paulo in Brazil, and Korea, to effectively use the remainder of the network capacity.

"The analysis of this data will allow physicists at CERN to search for the Higgs particles thought to be responsible for mass in the universe, supersymmetry, and other fundamentally new phenomena bearing on the nature of matter and space-time, in an energy range made accessible by the LHC for the first time."

The largest physics collaborations at the LHC, CMS and ATLAS, each encompass more than 2,000 physicists and engineers from 160 universities and laboratories. In order to fully exploit the potential for scientific discoveries, the many petabytes of data produced by the experiments will be processed, distributed, and analyzed using a global Grid. The key to discovery is the analysis phase, where individual physicists and small groups repeatedly access, and sometimes extract and transport, terabyte-scale data samples on demand, in order to optimally select the rare "signals" of new physics from potentially overwhelming "backgrounds" of already-understood particle interactions. This data will amount to many tens of petabytes in the early years of LHC operation, rising to the exabyte range within the coming decade.

Matt Crawford, head of the Fermilab network team at SC|05, said, "The realism of this year's demonstration represents a major step in our ability to show that the unprecedented systems required to support the next round of high-energy physics discoveries are indeed practical. Our data sources in the bandwidth challenge were some of our mainstream production storage systems and file servers, which are now helping to drive the searches for new physics at the high-energy frontier at Fermilab's Tevatron, as well as the explorations of the far reaches of the universe by the Sloan Digital Sky Survey."

Les Cottrell, leader of the SLAC team and assistant director of scientific computing and computing services, said, "Some of the pleasant surprises at this year's challenge were the advances in throughput we achieved using real applications to transport physics data, including bbcp and xrootd developed at SLAC. The performance of bbcp used together with Caltech's FAST protocol and an optimized Linux kernel developed at Michigan, as well as our xrootd system, were particularly striking. We were able to match the performance of the artificial data transfer tools we used to reach the peak rates in past years."

Future optical networks incorporating multiple 10 Gbps links are the foundation of the Grid system that will drive the scientific discoveries. A "hybrid" network integrating both traditional switching and routing of packets and dynamically constructed optical paths to support the largest data flows is a central part of the near-term future vision that the scientific community has adopted to meet the challenges of data-intensive science in many fields. By demonstrating that many 10 Gbps wavelengths can be used efficiently over continental and transoceanic distances (often in both directions simultaneously), the high-energy physics team showed that this vision of a worldwide dynamic Grid supporting many terabyte and larger data transactions is practical.

Shawn McKee, associate research scientist in the University of Michigan Department of Physics and leader of the UltraLight Network technical group, said, "This achievement is an impressive example of what a focused network effort can accomplish. It is an important step towards the goal of delivering a highly capable end-to-end network-aware system and architecture that meet the needs of next-generation e-science."

The team hopes this new demonstration will encourage scientists and engineers in many sectors of society to develop and plan to deploy a new generation of revolutionary Internet applications. Multigigabit end-to-end network performance will empower scientists to form "virtual organizations" on a planetary scale, sharing their collective computing and data resources in a flexible way. In particular, this is vital for projects on the frontiers of science and engineering in "data intensive" fields such as particle physics, astronomy, bioinformatics, global climate modeling, geosciences, fusion, and neutron science.

The new bandwidth record was achieved through extensive use of the SCInet network infrastructure at SC|05. The team used fifteen 10 Gbps links to Cisco Systems Catalyst 6500 Series Switches at the Caltech Center for Advanced Computing Research (CACR) booth, and seven 10 Gbps links to a Catalyst 6500 Series Switch at the SLAC/Fermilab booth, together with computing clusters provided by Hewlett Packard, Sun Microsystems, and IBM, and a large number of 10 gigabit Ethernet server interfaces (more than 80 provided by Neterion, and 14 by Chelsio).

The external network connections to Los Angeles, Sunnyvale, the StarLight facility in Chicago, and Florida included the Cisco Research, Internet2/HOPI, UltraScience Net, and ESnet wavelengths carried by National LambdaRail (NLR); Internet2's Abilene backbone; the three wavelengths of TeraGrid; an ESnet link provided by Qwest; the Pacific Wave link; and Canada's CANARIE network. International connections included the US LHCNet links (provisioned by Global Crossing and Colt) between Chicago, New York, and CERN, the CHEPREO/WHREN link (provisioned by LANautilus) between Miami and São Paulo, the UKLight link, the GLORIAD link to Korea, and the JGN2 link to Japan.

Regional connections included six 10 Gbps wavelengths provided with the help of CIENA to Fermilab; two 10 Gbps wavelengths to the Caltech campus provided by Cisco Systems' research waves across NLR and California's CENIC network; two 10 Gbps wavelengths to SLAC provided by ESnet and UltraScience Net; three wavelengths between StarLight and the University of Michigan over Michigan Lambda Rail (MiLR); and wavelengths to Jacksonville and Miami across Florida Lambda Rail (FLR). During the test, several of the network links were shown to operate at full capacity for sustained periods.

While the SC|05 demonstration required a major effort by the teams involved and their sponsors, in partnership with major research and education network organizations in the U.S., Europe, Latin America, and Pacific Asia, it is expected that networking on this scale in support of the largest science projects (such as the LHC) will be commonplace within the next three to five years. The demonstration also stressed the network and server systems used to their limits, so the team is continuing its test program to put the technologies and methods used at SC|05 into production use, with the goal of attaining the necessary level of reliability in time for the start of the LHC research program.

As part of the SC|05 demonstrations, a distributed analysis of simulated LHC physics data was carried out using the Grid-enabled Analysis Environment (GAE) developed at Caltech for the LHC and many other major particle physics experiments, as part of the Particle Physics Data Grid (PPDG), GriPhyN/iVDGL, Open Science Grid, and DISUN projects. This involved transferring data to CERN, Florida, Fermilab, Caltech, and Brazil for processing by clusters of computers, and finally aggregating the results back to the show floor to create a dynamic visual display of quantities of interest to the physicists. In another part of the demonstration, file servers at the SLAC/FNAL booth and in Manchester were also used for disk-to-disk transfers between Seattle and the UK.
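
(The flow just described is a classic scatter/gather pattern: partition the data, process it in parallel at several sites, then merge the partial results into a single display. The Python sketch below illustrates that pattern with invented workloads and a toy histogram merge; it is not the GAE's actual interface, and the site list is taken only from the names mentioned above.)

```python
# Toy sketch of the scatter/gather pattern described above: ship work to
# several "sites", process in parallel, then merge the per-site results
# into one summary. The analysis itself is an invented placeholder.
from concurrent.futures import ThreadPoolExecutor
from collections import Counter
import random

SITES = ["CERN", "Florida", "Fermilab", "Caltech", "Brazil"]  # as in the demo

def analyze_at_site(site: str, n_events: int) -> Counter:
    """Stand-in for remote processing: bin simulated 'event' values."""
    rng = random.Random(sum(map(ord, site)))  # deterministic per-site seed
    values = (rng.gauss(100.0, 15.0) for _ in range(n_events))
    return Counter(int(v) // 10 * 10 for v in values)  # 10-unit bins

def scatter_gather(events_per_site: int) -> Counter:
    with ThreadPoolExecutor(max_workers=len(SITES)) as pool:
        partials = pool.map(analyze_at_site, SITES, [events_per_site] * len(SITES))
        total = Counter()
        for p in partials:
            total.update(p)  # aggregation step: merge per-site histograms
    return total

if __name__ == "__main__":
    histogram = scatter_gather(10_000)
    for bin_low in sorted(histogram):
        print(f"{bin_low:4d}-{bin_low + 9:<4d} {histogram[bin_low]}")
```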

The team used Caltech's MonALISA (MONitoring Agents using a Large Integrated Services Architecture) system to monitor and display the real-time data for all the network links used in the demonstration. It simultaneously monitored more than 14,000 grid nodes in 200 computing clusters. MonALISA (http://monalisa.caltech.edu) is a highly scalable set of autonomous self-describing agent-based subsystems that are able to collaborate and cooperate in performing a wide range of monitoring tasks for networks and Grid systems, as well as the scientific applications themselves.
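
(MonALISA's design is agent-based: each autonomous agent describes itself, runs its own collection schedule, and publishes measurements to any interested subscriber. The minimal Python sketch below illustrates that pattern with hypothetical names and readings; it does not use MonALISA's real interfaces.)

```python
# Minimal sketch of the agent-based monitoring pattern MonALISA embodies:
# self-describing agents collect metrics on a schedule and publish them
# to subscribers. Illustration of the pattern only, not MonALISA's API.
import threading
import time
from typing import Callable, Dict, List

class MonitoringAgent:
    def __init__(self, name: str, period_s: float):
        self.name = name                  # self-description: who am I
        self.period_s = period_s
        self.probes: Dict[str, Callable[[], float]] = {}
        self.subscribers: List[Callable[[str, str, float], None]] = []
        self._stop = threading.Event()

    def register_probe(self, metric: str, fn: Callable[[], float]) -> None:
        self.probes[metric] = fn          # e.g. link utilization, CPU load

    def subscribe(self, callback: Callable[[str, str, float], None]) -> None:
        self.subscribers.append(callback) # e.g. a repository or GUI client

    def run(self) -> None:
        while not self._stop.is_set():
            for metric, fn in self.probes.items():
                value = fn()
                for cb in self.subscribers:
                    cb(self.name, metric, value)
            self._stop.wait(self.period_s)

    def stop(self) -> None:
        self._stop.set()

if __name__ == "__main__":
    agent = MonitoringAgent("booth-switch-1", period_s=1.0)   # hypothetical node
    agent.register_probe("link_gbps", lambda: 9.7)            # placeholder reading
    agent.subscribe(lambda who, metric, v: print(f"{who} {metric}={v}"))
    t = threading.Thread(target=agent.run, daemon=True)
    t.start()
    time.sleep(3)
    agent.stop()
```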

The network has been deployed through exceptional support by Cisco Systems, Hewlett Packard, Neterion, Chelsio, Sun Microsystems, IBM, and Boston Ltd., as well as the network engineering staffs of National LambdaRail, Internet2's Abilene Network, ESnet, TeraGrid, CENIC, MiLR, FLR, Pacific Wave, AMPATH, RNP, and ANSP/FAPESP in Brazil, KISTI in Korea, UKLight in the UK, JGN2 in Japan, and the StarLight international peering point in Chicago.

The demonstration and the developments leading up to it were made possible through the strong support of the U.S. Department of Energy Office of Science and the National Science Foundation, in cooperation with the funding agencies of the international partners.

Further information about the demonstration may be found at:
http://ultralight.caltech.edu/web-site/sc05
http://www-iepm.slac.stanford.edu/monitoring/bulk/sc2005/hiperf.html
http://supercomputing.fnal.gov/
http://monalisa.caltech.edu:8080/Slides/SC2005BWC/SC2005_BWCTalk11705.ppt
http://scinet.supercomp.org/2005/bwc/results/summary.html

About Caltech: With an outstanding faculty, including five Nobel laureates, and such off-campus facilities as the Jet Propulsion Laboratory, Palomar Observatory, and the W. M. Keck Observatory, the California Institute of Technology is one of the world's major research centers. The Institute also conducts instruction in science and engineering for a student body of approximately 900 undergraduates and 1,000 graduate students who maintain a high level of scholarship and intellectual achievement. Caltech's 124-acre campus is situated in Pasadena, California, a city of 135,000 at the foot of the San Gabriel Mountains, approximately 30 miles inland from the Pacific Ocean and 10 miles northeast of the Los Angeles Civic Center. Caltech is an independent, privately supported university, and is not affiliated with either the University of California system or the California State Polytechnic universities. http://www.caltech.edu

About SLAC: The Stanford Linear Accelerator Center (SLAC) is one of the world's leading research laboratories. Its mission is to design, construct, and operate state-of-the-art electron accelerators and related experimental facilities for use in high-energy physics and synchrotron radiation research. In the course of doing so, it has established the largest known database in the world, which grows at 1 terabyte per day. That, and its central role in the world of high-energy physics collaboration, places SLAC at the forefront of the international drive to optimize the worldwide, high-speed transfer of bulk data. http://www.slac.stanford.edu/

About CACR: Caltech's Center for Advanced Computing Research (CACR) performs research and development on leading edge networking and computing systems, and methods for computational science and engineering. Some current efforts at CACR include the National Virtual Observatory, ASC Center for Simulation of Dynamic Response of Materials, Particle Physics Data Grid, GriPhyN, Computational Infrastructure for Geophysics, Cascade High Productivity Computing System, and the TeraGrid. http://www.cacr.caltech.edu/

About Netlab: Netlab is the Networking Laboratory at Caltech, led by Steven Low, where FAST TCP was developed. The group does research in the control and optimization of protocols and networks, and designs, analyzes, implements, and experiments with new algorithms and systems. http://netlab.caltech.edu/FAST/

About the University of Michigan: The University of Michigan, with its size, complexity, and academic strength, the breadth of its scholarly resources, and the quality of its faculty and students, is one of America's great public universities and one of the world's premier research institutions. The university was founded in 1817 and has a total enrollment of 54,300 on all campuses. The main campus is in Ann Arbor, Michigan, and has 39,533 students (fall 2004). With over 600 degree programs and $739M in FY05 research funding, the university is one of the leaders in innovation and research. For more information, see http://www.umich.edu.

About the University of Florida: The University of Florida (UF), located in Gainesville, is a major public, comprehensive, land-grant, research university. The state's oldest, largest, and most comprehensive university, UF is among the nation's most academically diverse public universities. It has a long history of established programs in international education, research, and service and has a student population of approximately 49,000. UF is the lead institution for the GriPhyN and iVDGL projects and is a Tier-2 facility for the CMS experiment. For more information, see http://www.ufl.edu.

About Fermilab: Fermi National Accelerator Laboratory (Fermilab) is a national laboratory funded by the Office of Science of the U.S. Department of Energy, operated by Universities Research Association, Inc. Experiments at Fermilab's Tevatron, the world's highest-energy particle accelerator, generate petabytes of data per year, and involve large, international collaborations with requirements for high-volume data movement to their home institutions. The laboratory actively works to remain on the leading edge of advanced wide-area network technology in support of its collaborations.

About CERN: CERN, the European Organization for Nuclear Research, has its headquarters in Geneva. At present, its member states are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, and the United Kingdom. Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission, and UNESCO have observer status. For more information, see http://www.cern.ch.

About StarLight: StarLight is an advanced optical infrastructure and proving ground for network services optimized for high-performance applications. Operational since summer 2001, StarLight is a 1 GE and 10 GE switch/router facility for high-performance access to participating networks, and also offers true optical switching for wavelengths. StarLight is being developed by the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC), the International Center for Advanced Internet Research (iCAIR) at Northwestern University, and the Mathematics and Computer Science Division at Argonne National Laboratory, in partnership with Canada's CANARIE and the Netherlands' SURFnet. STAR TAP and StarLight are made possible by major funding from the U.S. National Science Foundation to UIC. StarLight is a service mark of the Board of Trustees of the University of Illinois. See www.startap.net/starlight.

About the University of Manchester: The University of Manchester was created by combining the strengths of UMIST (founded in 1824) and the Victoria University of Manchester (founded in 1851) to form the largest single-site university in the UK, with 34,000 students. On Friday, October 22, 2004, it received its Royal Charter from Queen Elizabeth II and embarked on an unprecedented £300M capital investment program. Twenty-three Nobel Prize winners have studied at Manchester, continuing a proud tradition of innovation and excellence. Rutherford conducted the research that led to the splitting of the atom there, and the world's first stored-program electronic digital computer successfully executed its first program there in June 1948. The Schools of Physics, Computational Science, and Computer Science, the Network Group, and the E-Science North West Centre research facility are very active in developing a wide range of e-science projects and Grid technologies. See www.manchester.ac.uk.

About UERJ (Rio de Janeiro): Founded in 1950, the Rio de Janeiro State University (UERJ; http://www.uerj.br) ranks among the ten largest universities in Brazil, with more than 23,000 students. UERJ's five campuses are home to 22 libraries, 412 classrooms, 50 lecture halls and auditoriums, and 205 laboratories. UERJ is responsible for important public welfare and health projects through its centers of medical excellence, the Pedro Ernesto University Hospital (HUPE) and the Piquet Carneiro Day-care Policlinic Centre, and it is committed to the preservation of the environment. The UERJ High Energy Physics group includes 15 faculty, postdoctoral, and visiting PhD physicists, and 12 PhD and master's students, working on experiments at Fermilab (D0) and CERN (CMS). The group has constructed a Tier-2 center to enable it to take part in the Grid-based data analysis planned for the LHC, and has originated the concept of a Brazilian "HEP Grid," working in cooperation with USP and several other universities in Rio and São Paulo.

About UNESP (São Paulo): Created in 1976 through the administrative union of several isolated institutes of higher education in the State of São Paulo, the São Paulo State University, UNESP, has campuses in 24 different cities in the State of São Paulo. The university has 25,000 undergraduate students and almost 10,000 graduate students. Since 1999 the university has had a group participating in the DZero Collaboration at Fermilab, and it operates the São Paulo Regional Analysis Center (SPRACE). See http://www.unesp.br.

About USP (São Paulo): The University of São Paulo, USP, is the largest institution of higher education and research in Brazil, and the third largest in Latin America. The university has most of its 35 units located on the campus in the state capital. It has around 40,000 undergraduate students and around 25,000 graduate students. It is responsible for almost 25 percent of all Brazilian papers and publications indexed by the Institute for Scientific Information (ISI). The SPRACE cluster is located at the Physics Institute. See http://www.usp.br.

About Kyungpook National University (Daegu): Kyungpook National University is one of the leading universities in Korea, especially in physics and information science. The university has 13 colleges and nine graduate schools with 24,000 students. It houses the Center for High-Energy Physics (CHEP) in which most Korean high-energy physicists participate. CHEP (chep.knu.ac.kr) was approved as one of the designated Excellent Research Centers supported by the Korean Ministry of Science.

About Vanderbilt: One of America's top 20 universities, Vanderbilt University is a private research university of 6,319 undergraduates and 4,566 graduate and professional students. The university comprises 10 schools, a public policy institute, a distinguished medical center, and the Freedom Forum First Amendment Center. Located a mile and a half southwest of downtown Nashville, the campus is in a park-like setting. Buildings on the original campus date to its founding in 1873, and the Peabody section of campus has been registered as a National Historic Landmark since 1966. Vanderbilt ranks 24th in the value of federal research grants awarded to faculty members, according to the National Science Foundation.

About the Particle Physics Data Grid (PPDG): The Particle Physics Data Grid (see www.ppdg.net) is developing and deploying production Grid systems that vertically integrate experiment-specific applications, Grid technologies, Grid and facility computation, and storage resources to form effective end-to-end capabilities. PPDG is a collaboration of computer scientists with a strong record in Grid technology and physicists with leading roles in the software and network infrastructures for major high-energy and nuclear experiments. PPDG's goals and plans are guided by the immediate and medium-term needs of the physics experiments and by the research and development agenda of the computer science groups.

About GriPhyN and iVDGL: GriPhyN (www.griphyn.org) and iVDGL (www.ivdgl.org) are developing and deploying Grid infrastructure for several frontier experiments in physics and astronomy. These experiments together will utilize petaflops of CPU power and generate hundreds of petabytes of data that must be archived, processed, and analyzed by thousands of researchers at laboratories, universities, and small colleges and institutes spread around the world. The scale and complexity of this "petascale" science drive GriPhyN's research program to develop Grid-based architectures, using "virtual data" as a unifying concept. iVDGL is deploying a Grid laboratory where these technologies can be tested at large scale and where advanced technologies can be implemented for extended studies by a variety of disciplines.

About CHEPREO: Florida International University (FIU), in collaboration with partners at Florida State University, the University of Florida, and the California Institute of Technology, has been awarded an NSF grant to create and operate an interregional Grid-enabled Center for High-Energy Physics Research and Educational Outreach (CHEPREO; www.chepreo.org) at FIU. CHEPREO encompasses an integrated program of collaborative physics research on CMS, network infrastructure development, and educational outreach at one of the largest minority universities in the US. The center is funded by four NSF directorates: Mathematical and Physical Sciences, Scientific Computing Infrastructure, Elementary, Secondary and Informal Education, and International Programs.

About the Open Science Grid: The OSG makes innovative science possible by bringing multidisciplinary collaborations together with the latest advances in distributed computing technologies. This shared cyberinfrastructure, built by research groups from U.S. universities and national laboratories, receives support from the National Science Foundation and the U.S. Department of Energy's Office of Science. For more information about the OSG, visit www.opensciencegrid.org.

About Internet2®: Led by more than 200 U.S. universities working with industry and government, Internet2 develops and deploys advanced network applications and technologies for research and higher education, accelerating the creation of tomorrow's Internet. Internet2 recreates the partnerships among academia, industry, and government that helped foster today's Internet in its infancy. For more information, visit: www.internet2.edu.

About the Abilene Network: Abilene, developed in partnership with Qwest Communications, Juniper Networks, Nortel Networks and Indiana University, provides nationwide high-performance networking capabilities for more than 225 universities and research facilities in all 50 states, the District of Columbia, and Puerto Rico. For more information on Abilene, see http://abilene.internet2.edu/.

About the TeraGrid: The TeraGrid, funded by the National Science Foundation, is a multiyear effort to build a distributed national cyberinfrastructure. TeraGrid entered full production mode in October 2004, providing a coordinated set of services for the nation's science and engineering community. TeraGrid's unified user support infrastructure and software environment allow users to access storage and information resources as well as over a dozen major computing systems at nine partner sites via a single allocation, either as stand-alone resources or as components of a distributed application using Grid software capabilities. Over 40 teraflops of computing power, 1.5 petabytes of online storage, and multiple visualization, data collection, and instrument resources are integrated at the nine TeraGrid partner sites. Coordinated by the University of Chicago and Argonne National Laboratory, the TeraGrid partners include the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign (UIUC), San Diego Supercomputer Center (SDSC) at the University of California, San Diego (UCSD), the Center for Advanced Computing Research (CACR) at the California Institute of Technology (Caltech), the Pittsburgh Supercomputing Center (PSC), Oak Ridge National Laboratory, Indiana University, Purdue University, and the Texas Advanced Computing Center (TACC) at the University of Texas-Austin.

About National LambdaRail: National LambdaRail (NLR) is a major initiative of U.S. research universities and private sector technology companies to provide a national-scale infrastructure for research and experimentation in networking technologies and applications. NLR puts the control, the power, and the promise of experimental network infrastructure in the hands of the nation's scientists and researchers. Visit http://www.nlr.net for more information.

About CENIC: CENIC (www.cenic.org) is a not-for-profit corporation serving the California Institute of Technology, California State University, Stanford University, University of California, University of Southern California, California Community Colleges, and the statewide K-12 school system. CENIC's mission is to facilitate and coordinate the development, deployment, and operation of a set of robust multi-tiered advanced network services for this research and education community.

About ESnet: The Energy Sciences Network (ESnet; www.es.net) is a high-speed network serving thousands of Department of Energy scientists and collaborators worldwide. A pioneer in providing high-bandwidth, reliable connections, ESnet enables researchers at national laboratories, universities, and other institutions to communicate with each other using the collaborative capabilities needed to address some of the world's most important scientific challenges. Managed and operated by the ESnet staff at Lawrence Berkeley National Laboratory, ESnet provides direct high-bandwidth connections to all major DOE sites, multiple cross connections with Internet2/Abilene, connections to Europe via GEANT and to Japan via SuperSINET, and fast interconnections to more than 100 other networks. Funded principally by DOE's Office of Science, ESnet services allow scientists to make effective use of unique DOE research facilities and computing resources, independent of time and geographic location.

About Qwest: Qwest Communications International Inc. (NYSE: Q) is a leading provider of voice, video, and data services. With more than 40,000 employees, Qwest is committed to the "Spirit of Service" and to providing world-class services that exceed customers' expectations for quality, value, and reliability. For more information, please visit the Qwest Web site at www.qwest.com.

About UKLight: The UKLight facility (www.uklight.ac.uk) was set up in 2003 with a grant of £6.5M from HEFCE (the Higher Education Funding Council for England) to provide an international experimental testbed for optical networking and support projects working on developments towards optical networks and the applications that will use them. UKLight will bring together leading-edge applications, Internet engineering for the future, and optical communications engineering, and enable UK researchers to join the growing international consortium that currently spans Europe and North America. A "Point of Access" (PoA) in London provides international connectivity with 10 Gbit network connections to peer facilities in Chicago (StarLight) and Amsterdam (NetherLight). UK research groups gain access to the facility via extensions to the 10 Gbit SuperJANET development network, and a national dark fiber facility is under development for use by the photonics research community. Management of the UKLight facility is being undertaken by UKERNA on behalf of the Joint Information Systems Committee (JISC).

About AMPATH: Florida International University's Center for Internet Augmented Research and Assessment (CIARA) has developed an international, high-performance research connection point in Miami, Florida, called AMPATH (AMericasPATH; www.ampath.fiu.edu). AMPATH's goal is to enable wide-bandwidth digital communications between U.S. and international research and education networks, as well as between a variety of U.S. research programs in the region. AMPATH in Miami acts as a major international exchange point (IXP) for the research and education networks in South America, Central America, Mexico, and the Caribbean. The AMPATH IXP is home for the WHREN-LILA high-performance network link connecting Latin America to the U.S., funded by the NSF (award #0441095) and the Academic Network of São Paulo (award #2003/13708-0).

About the Academic Network of São Paulo (ANSP): ANSP unites São Paulo's university networks with scientific and technological research centers in São Paulo, and is managed by the State of São Paulo Research Foundation (FAPESP). Through its connection to WHREN-LILA, all of the institutions connected to ANSP can take part in research with U.S. universities and research centers, contributing to and benefiting from new applications and services. This connectivity is expected to improve the quality and timeliness of the data researchers can share, and with it the quality of new scientific work. See http://www.ansp.br.

About RNP: RNP, the National Education and Research Network of Brazil, is a not-for-profit company that promotes the innovative use of advanced networking with the joint support of the Ministry of Science and Technology and the Ministry of Education. In the early 1990s, RNP was responsible for the introduction and adoption of Internet technology in Brazil. Today, RNP operates a nationally deployed multigigabit network used for collaboration and communication in research and education throughout the country, reaching all 26 states and the Federal District, and provides both commodity and advanced research Internet connectivity to more than 300 universities, research centers, and technical schools. See http://www.rnp.br.

About KISTI: KISTI (Korea Institute of Science and Technology Information) was founded in January 2001 through the merger of the Korea Institute of Industry and Technology Information (KINITI) and the Korea Research and Development Information Center (KORDIC), and has been assigned the pivotal role in establishing the national science and technology knowledge-information infrastructure. KISTI operates under the supervision of the Office of the Prime Minister and will play a leading role in building the nationwide infrastructure for knowledge and information by linking the high-performance research network with its supercomputers.

About Hewlett Packard: HP is a technology solutions provider to consumers, businesses, and institutions globally. The company's offerings span IT infrastructure, global services, business and home computing, and imaging and printing. More information about HP (NYSE, Nasdaq: HPQ) is available at www.hp.com.

About Sun Microsystems: Since its inception in 1982, a singular vision, "The Network Is The Computer(TM)," has propelled Sun Microsystems, Inc. (Nasdaq: SUNW) to its position as a leading provider of industrial-strength hardware, software, and services that make the Net work. Sun can be found in more than 100 countries and on the World Wide Web at http://sun.com.

About IBM: IBM is the world's largest information technology company, with 80 years of leadership in helping businesses innovate. Drawing on resources from across IBM and key business partners, IBM offers a wide range of services, solutions, and technologies that enable customers, large and small, to take full advantage of the new era of e-business. For more information about IBM, visit www.ibm.com.

About Boston Limited: With over 12 years of experience, Boston Limited (www.boston.co.uk) is a UK-based specialist in high-end workstation, server, and storage hardware. Boston's solutions bring the latest innovations to market, such as PCI-Express, DDR II, and InfiniBand technologies. As the pan-European distributor for Supermicro, Boston Limited works very closely with key manufacturing partners, as well as strategic clients within the academic and commercial sectors, to provide cost-effective solutions with exceptional performance.

About Neterion, Inc.: Founded in 2001, Neterion Inc. has locations in Cupertino, California, and Ottawa, Canada. Neterion delivers 10 Gigabit Ethernet hardware and software solutions that solve customers' high-end networking problems. The Xframe® line of products is based on Neterion-developed technologies that deliver new levels of performance, availability, and reliability in the datacenter. Xframe, Xframe II, and Xframe E include full IPv4 and IPv6 support and comprehensive stateless offloads that preserve the integrity of current TCP/IP implementations without "breaking the stack." Xframe drivers are available for all major operating systems, including Microsoft Windows, Linux, Hewlett-Packard's HP-UX, IBM's AIX, Sun's Solaris, and SGI's IRIX. Neterion has raised over $42M in funding, with its latest C round taking place in June 2004. Formerly known as S2io, the company changed its name to Neterion in January 2005. Further information on the company can be found at http://www.neterion.com/.

About Chelsio Communications: Chelsio Communications is leading the convergence of networking, storage, and clustering interconnects with its robust, high-performance, and proven protocol acceleration technology. Featuring a highly scalable and programmable architecture, Chelsio is shipping 10-Gigabit Ethernet adapter cards with protocol offload, delivering the low latency and superior throughput required for high-performance computing applications. For more information, visit the company online at www.chelsio.com.

About the National Science Foundation: The NSF is an independent federal agency created by Congress in 1950 "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense...." With an annual budget of about $5.5 billion, it is the funding source for approximately 20 percent of all federally supported basic research conducted by America's colleges and universities. In many fields such as mathematics, computer science, and the social sciences, NSF is the major source of federal backing.

About the DOE Office of Science: DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the nation, and ensures U.S. world leadership across a broad range of scientific disciplines. The Office of Science also manages 10 world-class national laboratories with unmatched capabilities for solving complex interdisciplinary problems, and it builds and operates some of the nation's most advanced R&D user facilities, located at national laboratories and universities. These facilities are used by more than 19,000 researchers from universities, other government agencies, and private industry each year.

Writer: Robert Tindol