Snowflake Physicist's Photographs to Be Featured on 2006 Postage Stamps

PASADENA, Calif.—Postage rates may keep going up, but when it comes to natural beauty and scientific wonder, one particular issue of stamps is going to be hard to lick.

Beginning next October, the U.S. Postal Service will issue a set of four commemorative stamps featuring images of snowflakes furnished by that hotbed of snowflake research, the California Institute of Technology. The holiday snowflakes stamp set will display photographs taken by Caltech physics professor Kenneth Libbrecht.

For several years Libbrecht has been investigating the basic physics of how patterns are created during crystal growth and other simple physical processes. He has delved particularly deeply into a case study of the formation of snowflakes. His research is aimed at better understanding how structures arise in material systems, but it is also visually compelling and, from the start, has been a hit with the public.

"My snowflake website, www.snowcrystals.com, is getting about two million hits a year," says Libbrecht, "of course, with a big peak during the winter months."

Libbrecht attributes the site's popularity to its discussion of some very accessible science. "Snowflake patterns are well known, the snowflakes fall right out of the sky, and you don't necessarily need a science background to appreciate the science behind how these ice structures form. It's an especially good introduction to science for younger kids," he says.

Libbrecht began his research by growing synthetic snowflakes in his lab, where they can be created and studied under well-controlled conditions. Precision micro-photography was necessary for this work, and over several years Libbrecht developed some specialized techniques for capturing images of snow crystals. Starting in 2001, he expanded his range to photographing natural snowflakes as well. "A few years ago I mounted my microscope in a suitcase, so I now can take it out into the field," says Libbrecht. "Sometimes I arrange trips to visit colleagues in the frozen north, and other times I arrange extended ski vacations with my family. The most difficult part these days is getting this complex-looking instrument through airport security."

Libbrecht's camera rig is essentially a microscope with a camera attached. The entire apparatus was built on campus and designed specifically for snowflake photography. "Snowflakes are made of ice, which is mostly completely clear, so lighting is an important consideration in this whole business," he says. "I use different types of colored lights shining through the crystals, so the ice structures act like complex lenses to refract the light in different ways. The better the lighting, the more interesting is the final photograph."

The structures of snowflakes are ephemeral, so speed is needed to get good photographs. Within minutes after falling, a snowflake will begin to degrade as its sharper features evaporate away. The complex structures are created as the crystals grow, and when they stop growing, the crystals soon become rounded and more blocky in appearance.

"When photographing in the field, I first let the crystals fall onto a piece of cardboard," says Libbrecht. "Then I find one I like, pick it up using a small paintbrush, and place it on a microscope slide. I then put it under the microscope, adjust the focus and lighting, and take the shot. You need to search through a lot of snowflakes to find the most beautiful specimens."

Libbrecht finds that observing natural snowflakes in the field is an important part of his research, and nicely complements his laboratory work. "I've learned a great deal about crystal growth by studying ice, and have gotten many insights from looking at natural crystals. Nature provides a wonderful variety of snow crystal types to look at, and the crystals that fall great distances are larger than what we can easily grow in the lab."

So where does one find really nice snowflakes? Certainly not in Pasadena, where Caltech is located, but Libbrecht says that certain snowy places are better than others. The snowflakes chosen for the stamps were photographed in Fairbanks, Alaska; in the Upper Peninsula of Michigan; and in Libbrecht's favorite spot, Cochrane, in northern Ontario. "Northern Ontario provides some really excellent specimens to photograph," says Libbrecht. "The temperature is cold, but not too cold, and the weather brings light snow frequently.

"Fairbanks sometimes offers some unusual crystal types, because it's so cold. Warmer climates, for example, in New York State and the vicinity, tend to produce less spectacular crystals."

As for the nitty-gritty of snowflake research, probably the question Libbrecht is asked the most is whether the old story about no two snowflakes being exactly alike is really true.

"The answer is basically yes, because there is such an incredibly large number of possible ways to make a complex snowflake," he says. "In many cases, there are very clear differences between snow crystals, but of course there are many similar crystals as well. In the lab we often produce very simple, hexagonal crystals, and these all look very similar."
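The scale of that combinatorial argument can be sketched with a toy calculation. The feature and variant counts below are illustrative assumptions, not figures from Libbrecht's work:

```python
# Illustrative estimate of why complex snowflakes are effectively unique.
# Assume a complex crystal has ~100 independently formed features, each of
# which could plausibly grow in any of 3 distinguishable ways.
# (Both numbers are assumptions chosen for illustration.)
features = 100
variants_per_feature = 3

possible_crystals = variants_per_feature ** features
print(f"~10^{len(str(possible_crystals)) - 1} possible configurations")
```

Even with these modest assumptions the count of possible configurations dwarfs the number of snowflakes that have ever fallen on Earth, which is why two complex crystals are never expected to match exactly.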

Libbrecht can grow many different snowflake forms at will in his lab, but says there are still many subtle mysteries in crystal growth that are of interest to physicists who are trying to understand and control the formation of various materials. A real-world application of research on crystals is the growth of semiconductors for our electronic gadgets. These semiconductors are made possible in part by painstakingly controlling how certain substances condense into solid structures.

Lest anyone think that Libbrecht limits his life as a physicist to snowflakes, he is also involved in the Laser Interferometer Gravitational-Wave Observatory (LIGO), an NSF-funded project that seeks to confirm the existence of gravitational waves from exotic cosmic sources such as colliding black holes.

In LIGO, Libbrecht has lots of professional company; in fact, the field was essentially founded by Albert Einstein, who first predicted the existence of gravitational waves as a consequence of general relativity. Kip Thorne and Ron Drever at Caltech, along with Rai Weiss at MIT, were instrumental in initiating the LIGO project in the 1980s.

But in snowflake research, Libbrecht is pretty much a one-man show. And he says there's something about the exclusivity that he likes.

"It suits some of my hermit-like tendencies," comments Libbrecht. "As Daniel Boone once said, if you can smell the smoke of another person's fire, then it's time to move on. My research on snow crystal growth is the one thing I do that simply wouldn't get done otherwise."

Additional information about the 2006 stamps is available at http://www.usps.com/communications/news/stamps/2005/sr05_054.htm

Writer: 
Robert Tindol

Physicists Achieve Quantum Entanglement Between Remote Ensembles of Atoms

PASADENA, Calif.—Physicists have managed to "entangle" the physical state of a group of atoms with that of another group of atoms across the room. This research represents an important advance relevant to the foundations of quantum mechanics and to quantum information science, including the possibility of scalable quantum networks (i.e., a quantum Internet) in the future.

Reporting in the December 8 issue of the journal Nature, California Institute of Technology physicist H. Jeff Kimble and his colleagues announce the first realization of entanglement for one "spin excitation" stored jointly between two samples of atoms. In the Caltech experiment, the atomic ensembles are located in a pair of apparatuses 2.8 meters apart, with each ensemble composed of about 100,000 individual atoms.

The entanglement generated by the Caltech researchers consisted of a quantum state for which, when one quantum spin (i.e., one quantum bit) flipped for the atoms at the site L of one ensemble, invariably none flipped at the site R of the other ensemble, and when one spin flipped at R, invariably none flipped at L. Yet, remarkably, because of the entanglement, both possibilities existed simultaneously.

According to Kimble, who is the Valentine Professor and professor of physics at Caltech, this research significantly extends laboratory capabilities for entanglement generation, with now-entangled "quantum bits" of matter stored with separation several thousand times greater than was heretofore possible.

Moreover, the experiment provides the first example of an entangled state stored in a quantum memory that can be transferred from the memory to another physical system (in this case, from matter to light).

Since the work of Schrödinger and Einstein in the 1930s, entanglement has remained one of the most profound aspects and persistent mysteries of quantum theory. Entanglement leads to strong correlations between the various components of a physical system, even if those components are very far apart. Such correlations cannot be explained by classical physics and have been the subject of active experimental investigation for more than 40 years, including pioneering demonstrations that used entangled states of photons, carried out by John Clauser (son of Caltech's Millikan Professor of Engineering, Emeritus, Francis Clauser).

In more recent times, entangled quantum states have emerged as a critical resource for enabling tasks in information science that are otherwise impossible in the classical realm of conventional information processing and distribution. Some tasks in quantum information science (for instance, the implementation of scalable quantum networks) require that entangled states be stored in massive particles, which was first accomplished for trapped ions separated by a few hundred micrometers in experiments at the National Institute of Standards and Technology in Boulder, Colorado, in 1998.

In the Caltech experiment, the entanglement involves "collective atomic spin excitations." To generate such excitations, an ensemble of cold atoms initially all in level "a" of two possible ground levels is addressed with a suitable "writing" laser pulse. For weak excitation with the write laser, one atom in the sample is sometimes transferred to ground level "b," thereby emitting a photon.

Because of the impossibility of determining which particular atom emitted the photon, detection of this first write photon projects the ensemble of atoms into a state with a single collective spin excitation distributed over all the atoms. The presence (one atom in state b) or absence (all atoms in state a) of this symmetrized spin excitation behaves as a single quantum bit.

To generate entanglement between spatially separated ensembles at sites L and R, the write fields emitted at both locations are combined in a fashion that erases any information about their origin. Under this condition, if a photon is detected, it is impossible in principle to determine from which ensemble, L or R, it came, so that both possibilities must be included in the subsequent description of the quantum state of the ensembles.

The resulting quantum state is an entangled state with "1" stored in the L ensemble and "0" in the R ensemble, and vice versa. That is, there exist simultaneously the complementary possibilities for one spin excitation to be present in level b at site L ("1") and all atoms in the ground level a at site R ("0"), as well as for no spin excitation to be present in level b at site L ("0") and one excitation to be present at site R ("1").
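In standard notation the state described is (|1>_L |0>_R + |0>_L |1>_R)/√2. A minimal numerical sketch, idealized and ignoring the losses and vacuum component present in the real experiment, reproduces the anticorrelation described above:

```python
import numpy as np

# Single-qubit basis: |0> = no spin excitation, |1> = one excitation.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

# Idealized entangled state of ensembles L and R:
#   (|1>_L |0>_R + |0>_L |1>_R) / sqrt(2)
psi = (np.kron(ket1, ket0) + np.kron(ket0, ket1)) / np.sqrt(2)

# Probabilities of the four joint outcomes |LR>: 00, 01, 10, 11.
probs = (np.abs(psi) ** 2).round(3).tolist()
print(dict(zip(["00", "01", "10", "11"], probs)))
# The "both flipped" (11) and "neither flipped" (00) joint outcomes never
# occur; the single excitation is shared equally between the two ensembles.
```

Measuring either ensemble alone gives a random result, yet the joint outcomes are perfectly anticorrelated, which is precisely the signature that cannot be reproduced by any classical description.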

This entangled state can be stored in the atoms for a programmable time, and then transferred into propagating light fields, which had not been possible before now. The Caltech researchers devised a method to determine unambiguously the presence of entanglement for the propagating light fields, and hence for the atomic ensembles.

The Caltech experiment confirms for the first time experimentally that entanglement between two independent, remote, massive quantum objects can be created by quantum interference in the detection of a photon emitted by one of the objects.

In addition to Kimble, the other authors are Chin-Wen Chou, a graduate student in physics; Hugues de Riedmatten, Daniel Felinto, and Sergey Polyakov, all postdoctoral scholars in Kimble's group; and Steven J. van Enk of Bell Labs, Lucent Technologies.

Writer: 
Robert Tindol

World Network Speed Record Shattered for Third Consecutive Year

Caltech, SLAC, Fermilab, CERN, Michigan, Florida, Brookhaven, Vanderbilt and Partners in the UK, Brazil, Korea and Japan Set 131.6 Gigabit Per Second Mark During the SuperComputing 2005 Bandwidth Challenge

SEATTLE, Wash.—An international team of scientists and engineers for the third consecutive year has smashed the network speed record, moving data along at an average rate of 100 gigabits per second (Gbps) for several hours at a time. A rate of 100 Gbps is sufficient for transmitting five feature-length DVD movies on the Internet from one location to another in a single second.

The winning "High-Energy Physics" team is made up of physicists, computer scientists, and network engineers led by the California Institute of Technology, the Stanford Linear Accelerator Center (SLAC), Fermilab, CERN, and the University of Michigan and partners at the University of Florida, Vanderbilt, and the Brookhaven National Lab, as well as international participants from the UK (University of Manchester and UKLight), Brazil (Rio de Janeiro State University, UERJ, and the State Universities of São Paulo, USP and UNESP), Korea (Kyungpook National University, KISTI) and Japan (the KEK Laboratory in Tsukuba), who joined forces to set a new world record for data transfer, capturing first prize at the Supercomputing 2005 (SC|05) Bandwidth Challenge (BWC).

The HEP team's demonstration of "Distributed TeraByte Particle Physics Data Sample Analysis" achieved a peak throughput of 151 Gbps and an official mark of 131.6 Gbps measured by the BWC judges on 17 of the 22 optical fiber links used by the team, beating their previous mark for peak throughput of 101 Gbps by 50 percent. In addition to the impressive transfer rate for DVD movies, the new record data transfer speed is also equivalent to serving 10,000 MPEG2 HDTV movies simultaneously in real time, or transmitting all of the printed content of the Library of Congress in 10 minutes.
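Those equivalences hold up under simple arithmetic. The per-movie size and per-stream bit rate below are rough assumptions typical of the era, not figures from the official record:

```python
# Back-of-the-envelope checks on the record's movie equivalences.
dvd_movie_gigabytes = 2.5        # single-layer DVD feature film (assumed)
hdtv_stream_mbps = 13.0          # MPEG2 HDTV broadcast rate (assumed)

# Five DVD movies per second at the sustained 100 Gbps rate:
print(100 / (dvd_movie_gigabytes * 8), "DVD movies per second")

# Roughly 10,000 simultaneous HDTV streams at the official 131.6 Gbps mark:
print(round(131.6 * 1000 / hdtv_stream_mbps), "HDTV streams")
```

With these assumed sizes, 100 Gbps does indeed move five DVD-length films per second, and the 131.6 Gbps official mark supports on the order of ten thousand real-time HDTV streams.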

The team sustained average data rates above the 100 Gbps level for several hours for the first time, and transferred a total of 475 terabytes of physics data among the team's sites throughout the U.S. and overseas within 24 hours. The extraordinary data transport rates were made possible in part through the use of the FAST TCP protocol developed by Associate Professor of Computer Science and Electrical Engineering Steven Low and his Caltech Netlab team, as well as new data transport applications developed at SLAC and Fermilab and an optimized Linux kernel developed at Michigan.
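As a cross-check, a total of 475 terabytes in 24 hours implies a round-the-clock average of roughly 44 Gbps, consistent with rates above 100 Gbps being sustained for only part of the day:

```python
# Average rate implied by the 24-hour total.
# Decimal units assumed: 1 TB = 1e12 bytes.
total_bytes = 475e12
seconds = 24 * 3600

avg_gbps = total_bytes * 8 / seconds / 1e9   # ~44 Gbps
print(f"24-hour average: {avg_gbps:.1f} Gbps")
```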

Professor of Physics Harvey Newman of Caltech, head of the HEP team and US CMS Collaboration Board Chair, who originated the LHC Data Grid Hierarchy concept, said, "This demonstration allowed us to preview the globally distributed Grid system of more than 100 laboratory and university-based computing facilities that is now being developed in the U.S., Latin America, and Europe in preparation for the next generation of high-energy physics experiments at CERN's Large Hadron Collider (LHC) that will begin operation in 2007.

"We used a realistic mixture of streams, including the organized transfer of multiterabyte datasets among the laboratory centers at CERN, Fermilab, SLAC, and KEK, plus numerous other flows of physics data to and from university-based centers represented by Caltech, Michigan, Florida, Rio de Janeiro and São Paulo in Brazil, and Korea, to effectively use the remainder of the network capacity.

"The analysis of this data will allow physicists at CERN to search for the Higgs particles thought to be responsible for mass in the universe, supersymmetry, and other fundamentally new phenomena bearing on the nature of matter and space-time, in an energy range made accessible by the LHC for the first time."

The largest physics collaborations at the LHC, CMS and ATLAS, each encompass more than 2,000 physicists and engineers from 160 universities and laboratories. In order to fully exploit the potential for scientific discoveries, the many petabytes of data produced by the experiments will be processed, distributed, and analyzed using a global Grid. The key to discovery is the analysis phase, where individual physicists and small groups repeatedly access, and sometimes extract and transport terabyte-scale data samples on demand, in order to optimally select the rare "signals" of new physics from potentially overwhelming "backgrounds" of already-understood particle interactions. This data will amount to many tens of petabytes in the early years of LHC operation, rising to the exabyte range within the coming decade.

Matt Crawford, head of the Fermilab network team at SC|05, said, "The realism of this year's demonstration represents a major step in our ability to show that the unprecedented systems required to support the next round of high-energy physics discoveries are indeed practical. Our data sources in the bandwidth challenge were some of our mainstream production storage systems and file servers, which are now helping to drive the searches for new physics at the high-energy frontier at Fermilab's Tevatron, as well as the explorations of the far reaches of the universe by the Sloan Digital Sky Survey."

Les Cottrell, leader of the SLAC team and assistant director of scientific computing and computing services, said, "Some of the pleasant surprises at this year's challenge were the advances in throughput we achieved using real applications to transport physics data, including bbcp and xrootd developed at SLAC. The performance of bbcp used together with Caltech's FAST protocol and an optimized Linux kernel developed at Michigan, as well as our xrootd system, were particularly striking. We were able to match the performance of the artificial data transfer tools we used to reach the peak rates in past years."

Future optical networks incorporating multiple 10 Gbps links are the foundation of the Grid system that will drive the scientific discoveries. A "hybrid" network integrating both traditional switching and routing of packets and dynamically constructed optical paths to support the largest data flows is a central part of the near-term future vision that the scientific community has adopted to meet the challenges of data-intensive science in many fields. By demonstrating that many 10 Gbps wavelengths can be used efficiently over continental and transoceanic distances (often in both directions simultaneously), the high-energy physics team showed that this vision of a worldwide dynamic Grid supporting many terabyte and larger data transactions is practical.

Shawn McKee, associate research scientist in the University of Michigan Department of Physics and leader of the UltraLight Network technical group, said, "This achievement is an impressive example of what a focused network effort can accomplish. It is an important step towards the goal of delivering a highly capable end-to-end network-aware system and architecture that meet the needs of next-generation e-science."

The team hopes this new demonstration will encourage scientists and engineers in many sectors of society to develop and plan to deploy a new generation of revolutionary Internet applications. Multigigabit end-to-end network performance will empower scientists to form "virtual organizations" on a planetary scale, sharing their collective computing and data resources in a flexible way. In particular, this is vital for projects on the frontiers of science and engineering in "data intensive" fields such as particle physics, astronomy, bioinformatics, global climate modeling, geosciences, fusion, and neutron science.

The new bandwidth record was achieved through extensive use of the SCInet network infrastructure at SC|05. The team used fifteen 10 Gbps links to Cisco Systems Catalyst 6500 Series Switches at the Caltech Center for Advanced Computing Research (CACR) booth, and seven 10 Gbps links to a Catalyst 6500 Series Switch at the SLAC/Fermilab booth, together with computing clusters provided by Hewlett Packard, Sun Microsystems, and IBM, and a large number of 10 gigabit Ethernet server interfaces: more than 80 provided by Neterion, and 14 by Chelsio.

The external network connections to Los Angeles, Sunnyvale, the Starlight facility in Chicago, and Florida included the Cisco Research, Internet2/HOPI, UltraScience Net and ESnet wavelengths carried by National Lambda Rail (NLR); Internet2's Abilene backbone; the three wavelengths of TeraGrid; an ESnet link provided by Qwest; the Pacific Wave link; and Canada's CANARIE network. International connections included the US LHCNet links (provisioned by Global Crossing and Colt) between Chicago, New York, and CERN, the CHEPREO/WHREN link (provisioned by LANautilus) between Miami and Sao Paulo, the UKLight link, the Gloriad link to Korea, and the JGN2 link to Japan.

Regional connections included six 10 Gbps wavelengths provided with the help of CIENA to Fermilab; two 10 Gbps wavelengths to the Caltech campus provided by Cisco Systems' research waves across NLR and California's CENIC network; two 10 Gbps wavelengths to SLAC provided by ESnet and UltraScienceNet; three wavelengths between Starlight and the University of Michigan over Michigan Lambda Rail (MiLR); and wavelengths to Jacksonville and Miami across Florida Lambda Rail (FLR). During the test, several of the network links were shown to operate at full capacity for sustained periods.

While the SC|05 demonstration required a major effort by the teams involved and their sponsors, in partnership with major research and education network organizations in the U.S., Europe, Latin America, and Pacific Asia, it is expected that networking on this scale in support of the largest science projects (such as the LHC) will be commonplace within the next three to five years. The demonstration also appeared to stress the network and server systems used, so the team is continuing its test program to put the technologies and methods used at SC|05 into production use, with the goal of attaining the necessary level of reliability in time for the start of the LHC research program.

As part of the SC|05 demonstrations, a distributed analysis of simulated LHC physics data was done using the Grid-enabled Analysis Environment (GAE) developed at Caltech for the LHC and many other major particle physics experiments, as part of the Particle Physics Data Grid (PPDG), GriPhyN/iVDGL, Open Science Grid, and DISUN projects. This involved transferring data to CERN, Florida, Fermilab, Caltech, and Brazil for processing by clusters of computers, and finally aggregating the results back to the show floor to create a dynamic visual display of quantities of interest to the physicists. In another part of the demonstration, file servers at the SLAC/FNAL booth and in Manchester also were used for disk-to-disk transfers between Seattle and the UK.

The team used Caltech's MonALISA (MONitoring Agents using a Large Integrated Services Architecture) system to monitor and display the real-time data for all the network links used in the demonstration. It simultaneously monitored more than 14,000 grid nodes in 200 computing clusters. MonALISA (http://monalisa.caltech.edu) is a highly scalable set of autonomous self-describing agent-based subsystems that are able to collaborate and cooperate in performing a wide range of monitoring tasks for networks and Grid systems, as well as the scientific applications themselves.

The network has been deployed through exceptional support by Cisco Systems, Hewlett Packard, Neterion, Chelsio, Sun Microsystems, IBM, and Boston Ltd., as well as the network engineering staffs of National LambdaRail, Internet2's Abilene Network, ESnet, TeraGrid, CENIC, MiLR, FLR, Pacific Wave, AMPATH, RNP and ANSP/FAPESP in Brazil, KISTI in Korea, UKLight in the UK, JGN2 in Japan, and the Starlight international peering point in Chicago.

The demonstration and the developments leading up to it were made possible through the strong support of the U.S. Department of Energy Office of Science and the National Science Foundation, in cooperation with the funding agencies of the international partners.

Further information about the demonstration may be found at:
http://ultralight.caltech.edu/web-site/sc05
http://www-iepm.slac.stanford.edu/monitoring/bulk/sc2005/hiperf.html
http://supercomputing.fnal.gov/
http://monalisa.caltech.edu:8080/Slides/SC2005BWC/SC2005_BWCTalk11705.ppt
http://scinet.supercomp.org/2005/bwc/results/summary.html

About Caltech: With an outstanding faculty, including five Nobel laureates, and such off-campus facilities as the Jet Propulsion Laboratory, Palomar Observatory, and the W. M. Keck Observatory, the California Institute of Technology is one of the world's major research centers. The Institute also conducts instruction in science and engineering for a student body of approximately 900 undergraduates and 1,000 graduate students who maintain a high level of scholarship and intellectual achievement. Caltech's 124-acre campus is situated in Pasadena, California, a city of 135,000 at the foot of the San Gabriel Mountains, approximately 30 miles inland from the Pacific Ocean and 10 miles northeast of the Los Angeles Civic Center. Caltech is an independent, privately supported university, and is not affiliated with either the University of California system or the California State Polytechnic universities. http://www.caltech.edu

About SLAC: The Stanford Linear Accelerator Center (SLAC) is one of the world's leading research laboratories. Its mission is to design, construct, and operate state-of-the-art electron accelerators and related experimental facilities for use in high-energy physics and synchrotron radiation research. In the course of doing so, it has established the largest known database in the world, which grows at 1 terabyte per day. That, and its central role in the world of high-energy physics collaboration, places SLAC at the forefront of the international drive to optimize the worldwide, high-speed transfer of bulk data. http://www.slac.stanford.edu/

About CACR: Caltech's Center for Advanced Computing Research (CACR) performs research and development on leading edge networking and computing systems, and methods for computational science and engineering. Some current efforts at CACR include the National Virtual Observatory, ASC Center for Simulation of Dynamic Response of Materials, Particle Physics Data Grid, GriPhyN, Computational Infrastructure for Geophysics, Cascade High Productivity Computing System, and the TeraGrid. http://www.cacr.caltech.edu/

About Netlab: Netlab is the Networking Laboratory at Caltech led by Steven Low, where FAST TCP has been developed. The group does research in the control and optimization of protocols and networks, and designs, analyzes, implements, and experiments with new algorithms and systems. http://netlab.caltech.edu/FAST/

About the University of Michigan: The University of Michigan, with its size, complexity, and academic strength, the breadth of its scholarly resources, and the quality of its faculty and students, is one of America's great public universities and one of the world's premier research institutions. The university was founded in 1817 and has a total enrollment of 54,300 on all campuses. The main campus is in Ann Arbor, Michigan, and has 39,533 students (fall 2004). With over 600 degree programs and $739M in FY05 research funding, the university is one of the leaders in innovation and research. For more information, see http://www.umich.edu.

About the University of Florida: The University of Florida (UF), located in Gainesville, is a major public, comprehensive, land-grant, research university. The state's oldest, largest, and most comprehensive university, UF is among the nation's most academically diverse public universities. It has a long history of established programs in international education, research, and service and has a student population of approximately 49,000. UF is the lead institution for the GriPhyN and iVDGL projects and is a Tier-2 facility for the CMS experiment. For more information, see http://www.ufl.edu.

About Fermilab: Fermi National Accelerator Laboratory (Fermilab) is a national laboratory funded by the Office of Science of the U.S. Department of Energy, operated by Universities Research Association, Inc. Experiments at Fermilab's Tevatron, the world's highest-energy particle accelerator, generate petabyte-scale data per year, and involve large, international collaborations with requirements for high-volume data movement to their home institutions. The laboratory actively works to remain on the leading edge of advanced wide-area network technology in support of its collaborations.

About CERN: CERN, the European Organization for Nuclear Research, has its headquarters in Geneva. At present, its member states are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, and the United Kingdom. Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission, and UNESCO have observer status. For more information, see http://www.cern.ch.

About StarLight: StarLight is an advanced optical infrastructure and proving ground for network services optimized for high-performance applications. Operational since summer 2001, StarLight is a 1 GE and 10 GE switch/router facility for high-performance access to participating networks, and also offers true optical switching for wavelengths. StarLight is being developed by the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC), the International Center for Advanced Internet Research (iCAIR) at Northwestern University, and the Mathematics and Computer Science Division at Argonne National Laboratory, in partnership with Canada's CANARIE and the Netherlands' SURFnet. STAR TAP and StarLight are made possible by major funding from the U.S. National Science Foundation to UIC. StarLight is a service mark of the Board of Trustees of the University of Illinois. See www.startap.net/starlight.

About the University of Manchester: The University of Manchester has been created by combining the strengths of UMIST (founded in 1824) and the Victoria University of Manchester (founded in 1851) to form the largest single-site university in the UK, with 34,000 students. On Friday, October 22, 2004, it received its Royal Charter from Queen Elizabeth II, with an unprecedented £300M capital investment program. Twenty-three Nobel Prize winners have studied at Manchester, continuing a proud tradition of innovation and excellence. Rutherford conducted the research that led to the splitting of the atom there, and the world's first stored-program electronic digital computer successfully executed its first program there in June 1948. The Schools of Physics, Computational Science, Computer Science and the Network Group, together with the E-Science North West Centre research facility, are very active in developing a wide range of e-science projects and Grid technologies. See www.manchester.ac.uk.

About UERJ (Rio de Janeiro): Founded in 1950, the Rio de Janeiro State University (UERJ; http://www.uerj.br) ranks among the ten largest universities in Brazil, with more than 23,000 students. UERJ's five campuses are home to 22 libraries, 412 classrooms, 50 lecture halls and auditoriums, and 205 laboratories. UERJ is responsible for important public welfare and health projects through its centers of medical excellence, the Pedro Ernesto University Hospital (HUPE) and the Piquet Carneiro Day-care Policlinic Centre, and it is committed to the preservation of the environment. The UERJ High Energy Physics group includes 15 faculty, postdoctoral, and visiting PhD physicists, and 12 PhD and master's students, working on experiments at Fermilab (D0) and CERN (CMS). The group has constructed a Tier2 center to enable it to take part in the Grid-based data analysis planned for the LHC, and has originated the concept of a Brazilian "HEP Grid," working in cooperation with USP and several other universities in Rio and São Paulo.

About UNESP (São Paulo): Created in 1976 with the administrative union of several isolated institutes of higher education in the state of São Paulo, the São Paulo State University, UNESP, has campuses in 24 different cities in the State of São Paulo. The university has 25,000 undergraduate students and almost 10,000 graduate students. Since 1999 the university has had a group participating in the DZero Collaboration of Fermilab, which is operating the São Paulo Regional Analysis Center (SPRACE). See http://www.unesp.br.

About USP (São Paulo): The University of São Paulo, USP, is the largest institution of higher education and research in Brazil, and the third largest in Latin America. The university has most of its 35 units located on the campus in the state capital. It has around 40,000 undergraduate students and around 25,000 graduate students. It is responsible for almost 25 percent of all Brazilian papers and publications indexed by the Institute for Scientific Information (ISI). The SPRACE cluster is located at the Physics Institute. See http://www.usp.br.

About Kyungpook National University (Daegu): Kyungpook National University is one of the leading universities in Korea, especially in physics and information science. The university has 13 colleges and nine graduate schools with 24,000 students. It houses the Center for High-Energy Physics (CHEP) in which most Korean high-energy physicists participate. CHEP (chep.knu.ac.kr) was approved as one of the designated Excellent Research Centers supported by the Korean Ministry of Science.

About Vanderbilt: One of America's top 20 universities, Vanderbilt University is a private research university of 6,319 undergraduates and 4,566 graduate and professional students. The university comprises 10 schools, a public policy institute, a distinguished medical center, and the Freedom Forum First Amendment Center. Located a mile and a half southwest of downtown Nashville, the campus is in a park-like setting. Buildings on the original campus date to its founding in 1873, and the Peabody section of campus has been registered as a National Historic Landmark since 1966. Vanderbilt ranks 24th in the value of federal research grants awarded to faculty members, according to the National Science Foundation.

About the Particle Physics Data Grid (PPDG): The Particle Physics Data Grid (PPDG; www.ppdg.net) is developing and deploying production Grid systems that vertically integrate experiment-specific applications, Grid technologies, Grid and facility computation, and storage resources to form effective end-to-end capabilities. PPDG is a collaboration of computer scientists with a strong record in Grid technology and physicists with leading roles in the software and network infrastructures for major high-energy and nuclear experiments. PPDG's goals and plans are guided by the immediate and medium-term needs of the physics experiments and by the research and development agenda of the computer science groups.

About GriPhyN and iVDGL: GriPhyN (www.griphyn.org) and iVDGL (www.ivdgl.org) are developing and deploying Grid infrastructure for several frontier experiments in physics and astronomy. These experiments together will utilize petaflops of CPU power and generate hundreds of petabytes of data that must be archived, processed, and analyzed by thousands of researchers at laboratories, universities, and small colleges and institutes spread around the world. The scale and complexity of this "petascale" science drive GriPhyN's research program to develop Grid-based architectures, using "virtual data" as a unifying concept. iVDGL is deploying a Grid laboratory where these technologies can be tested at large scale and where advanced technologies can be implemented for extended studies by a variety of disciplines.

About CHEPREO: Florida International University (FIU), in collaboration with partners at Florida State University, the University of Florida, and the California Institute of Technology, has been awarded an NSF grant to create and operate an interregional Grid-enabled Center for High-Energy Physics Research and Educational Outreach (CHEPREO; www.chepreo.org) at FIU. CHEPREO encompasses an integrated program of collaborative physics research on CMS, network infrastructure development, and educational outreach at one of the largest minority universities in the US. The center is funded by four NSF directorates: Mathematical and Physical Sciences; Scientific Computing Infrastructure; Elementary, Secondary and Informal Education; and International Programs.

About the Open Science Grid: The OSG makes innovative science possible by bringing multidisciplinary collaborations together with the latest advances in distributed computing technologies. This shared cyberinfrastructure, built by research groups from U.S. universities and national laboratories, receives support from the National Science Foundation and the U.S. Department of Energy's Office of Science. For more information about the OSG, visit www.opensciencegrid.org.

About Internet2®: Led by more than 200 U.S. universities working with industry and government, Internet2 develops and deploys advanced network applications and technologies for research and higher education, accelerating the creation of tomorrow's Internet. Internet2 recreates the partnerships among academia, industry, and government that helped foster today's Internet in its infancy. For more information, visit: www.internet2.edu.

About the Abilene Network: Abilene, developed in partnership with Qwest Communications, Juniper Networks, Nortel Networks and Indiana University, provides nationwide high-performance networking capabilities for more than 225 universities and research facilities in all 50 states, the District of Columbia, and Puerto Rico. For more information on Abilene, see http://abilene.internet2.edu/.

About the TeraGrid: The TeraGrid, funded by the National Science Foundation, is a multiyear effort to build a distributed national cyberinfrastructure. TeraGrid entered full production mode in October 2004, providing a coordinated set of services for the nation's science and engineering community. TeraGrid's unified user support infrastructure and software environment allow users to access storage and information resources as well as over a dozen major computing systems at nine partner sites via a single allocation, either as stand-alone resources or as components of a distributed application using Grid software capabilities. Over 40 teraflops of computing power, 1.5 petabytes of online storage, and multiple visualization, data collection, and instrument resources are integrated at the nine TeraGrid partner sites. Coordinated by the University of Chicago and Argonne National Laboratory, the TeraGrid partners include the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign (UIUC), San Diego Supercomputer Center (SDSC) at the University of California, San Diego (UCSD), the Center for Advanced Computing Research (CACR) at the California Institute of Technology (Caltech), the Pittsburgh Supercomputing Center (PSC), Oak Ridge National Laboratory, Indiana University, Purdue University, and the Texas Advanced Computing Center (TACC) at the University of Texas-Austin.

About National LambdaRail: National LambdaRail (NLR) is a major initiative of U.S. research universities and private sector technology companies to provide a national-scale infrastructure for research and experimentation in networking technologies and applications. NLR puts the control, the power, and the promise of experimental network infrastructure in the hands of the nation's scientists and researchers. Visit http://www.nlr.net for more information.

About CENIC: CENIC (www.cenic.org) is a not-for-profit corporation serving the California Institute of Technology, California State University, Stanford University, University of California, University of Southern California, California Community Colleges, and the statewide K-12 school system. CENIC's mission is to facilitate and coordinate the development, deployment, and operation of a set of robust multi-tiered advanced network services for this research and education community.

About ESnet: The Energy Sciences Network (ESnet; www.es.net) is a high-speed network serving thousands of Department of Energy scientists and collaborators worldwide. A pioneer in providing high-bandwidth, reliable connections, ESnet enables researchers at national laboratories, universities, and other institutions to communicate with each other using the collaborative capabilities needed to address some of the world's most important scientific challenges. Managed and operated by the ESnet staff at Lawrence Berkeley National Laboratory, ESnet provides direct high-bandwidth connections to all major DOE sites, multiple cross connections with Internet2/Abilene, connections to Europe via GEANT and to Japan via SuperSINET, and fast interconnections to more than 100 other networks. Funded principally by DOE's Office of Science, ESnet services allow scientists to make effective use of unique DOE research facilities and computing resources, independent of time and geographic location.

About Qwest: Qwest Communications International Inc. (NYSE: Q) is a leading provider of voice, video, and data services. With more than 40,000 employees, Qwest is committed to the "Spirit of Service" and to providing world-class services that exceed customers' expectations for quality, value, and reliability. For more information, please visit the Qwest Web site at www.qwest.com.

About UKLight: The UKLight facility (www.uklight.ac.uk) was set up in 2003 with a grant of £6.5M from HEFCE (the Higher Education Funding Council for England) to provide an international experimental testbed for optical networking and support projects working on developments towards optical networks and the applications that will use them. UKLight will bring together leading-edge applications, Internet engineering for the future, and optical communications engineering, and enable UK researchers to join the growing international consortium that currently spans Europe and North America. A "Point of Access" (PoA) in London provides international connectivity with 10 Gbit network connections to peer facilities in Chicago (StarLight) and Amsterdam (NetherLight). UK research groups gain access to the facility via extensions to the 10Gbit SuperJANET development network, and a national dark fiber facility is under development for use by the photonics research community. Management of the UKLight facility is being undertaken by UKERNA on behalf of the Joint Information Systems Committee (JISC).

About AMPATH: Florida International University's Center for Internet Augmented Research and Assessment (CIARA) has developed an international, high-performance research connection point in Miami, Florida, called AMPATH (AMericasPATH; www.ampath.fiu.edu). AMPATH's goal is to enable wide-bandwidth digital communications between U.S. and international research and education networks, as well as between a variety of U.S. research programs in the region. AMPATH in Miami acts as a major international exchange point (IXP) for the research and education networks in South America, Central America, Mexico, and the Caribbean. The AMPATH IXP is home for the WHREN-LILA high-performance network link connecting Latin America to the U.S., funded by the NSF (award #0441095) and the Academic Network of São Paulo (award #2003/13708-0).

About the Academic Network of São Paulo (ANSP): ANSP unites São Paulo's university networks with scientific and technological research centers in São Paulo, and is managed by the State of São Paulo Research Foundation (FAPESP). Through its connection to WHREN-LILA, all of the institutions connected to ANSP can collaborate directly with U.S. universities and research centers, contributing to the development of new applications and services and improving the quality of the scientific data they share. See http://www.ansp.br.

About RNP: RNP, the National Education and Research Network of Brazil, is a not-for-profit company that promotes the innovative use of advanced networking with the joint support of the Ministry of Science and Technology and the Ministry of Education. In the early 1990s, RNP was responsible for the introduction and adoption of Internet technology in Brazil. Today, RNP operates a nationally deployed multigigabit network used for collaboration and communication in research and education throughout the country, reaching all 26 states and the Federal District, and provides both commodity and advanced research Internet connectivity to more than 300 universities, research centers, and technical schools. See http://www.rnp.br.

About KISTI: KISTI (Korea Institute of Science and Technology Information), which was assigned to play the pivotal role in establishing the national science and technology knowledge information infrastructure, was founded through the merger of the Korea Institute of Industry and Technology Information (KINITI) and the Korea Research and Development Information Center (KORDIC) in January, 2001. KISTI is under the supervision of the Office of the Prime Minister and will play a leading role in building the nationwide infrastructure for knowledge and information by linking the high-performance research network with its supercomputers.

About Hewlett Packard: HP is a technology solutions provider to consumers, businesses, and institutions globally. The company's offerings span IT infrastructure, global services, business and home computing, and imaging and printing. More information about HP (NYSE, Nasdaq: HPQ) is available at www.hp.com.

About Sun Microsystems: Since its inception in 1982, a singular vision, "The Network Is The Computer(TM)," has propelled Sun Microsystems, Inc. (Nasdaq: SUNW) to its position as a leading provider of industrial-strength hardware, software, and services that make the Net work. Sun can be found in more than 100 countries and on the World Wide Web at http://sun.com.

About IBM: IBM is the world's largest information technology company, with 80 years of leadership in helping businesses innovate. Drawing on resources from across IBM and key business partners, IBM offers a wide range of services, solutions, and technologies that enable customers, large and small, to take full advantage of the new era of e-business. For more information about IBM, visit www.ibm.com.

About Boston Limited: With over 12 years of experience, Boston Limited (www.boston.co.uk) is a UK-based specialist in high-end workstation, server, and storage hardware. Boston's solutions bring the latest innovations to market, such as PCI-Express, DDR II, and Infiniband technologies. As the pan-European distributor for Supermicro, Boston Limited works very closely with key manufacturing partners, as well as strategic clients within the academic and commercial sectors, to provide cost-effective solutions with exceptional performance.

About Neterion, Inc.: Founded in 2001, Neterion Inc. has locations in Cupertino, California, and Ottawa, Canada. Neterion delivers 10 Gigabit Ethernet hardware and software solutions that solve customers' high-end networking problems. The Xframe® line of products is based on Neterion-developed technologies that deliver new levels of performance, availability, and reliability in the datacenter. Xframe, Xframe II, and Xframe E include full IPv4 and IPv6 support and comprehensive stateless offloads that preserve the integrity of current TCP/IP implementations without "breaking the stack." Xframe drivers are available for all major operating systems, including Microsoft Windows, Linux, Hewlett-Packard's HP-UX, IBM's AIX, Sun's Solaris, and SGI's Irix. Neterion has raised over $42M in funding, with its latest C round taking place in June 2004. Formerly known as S2io, the company changed its name to Neterion in January 2005. Further information on the company can be found at http://www.neterion.com/.

About Chelsio Communications: Chelsio Communications is leading the convergence of networking, storage, and clustering interconnects with its robust, high-performance, and proven protocol acceleration technology. Featuring a highly scalable and programmable architecture, Chelsio is shipping 10-Gigabit Ethernet adapter cards with protocol offload, delivering the low latency and superior throughput required for high-performance computing applications. For more information, visit the company online at www.chelsio.com.

About the National Science Foundation: The NSF is an independent federal agency created by Congress in 1950 "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense...." With an annual budget of about $5.5 billion, it is the funding source for approximately 20 percent of all federally supported basic research conducted by America's colleges and universities. In many fields such as mathematics, computer science, and the social sciences, NSF is the major source of federal backing.

About the DOE Office of Science: DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the nation, and ensures U.S. world leadership across a broad range of scientific disciplines. The Office of Science also manages 10 world-class national laboratories with unmatched capabilities for solving complex interdisciplinary problems, and it builds and operates some of the nation's most advanced R&D user facilities, located at national laboratories and universities. These facilities are used by more than 19,000 researchers from universities, other government agencies, and private industry each year.

Writer: 
Robert Tindol

Watson Lecture: Exploring Einstein's Legacy

PASADENA, Calif.--November 25 marks the 90th anniversary of Einstein's formulation of his theory of general relativity, which describes gravity as a consequence of the warping of space and time.

Since then, physicists have been trying to understand and test general relativity's predictions, including the existence of black holes (which are made not of matter but of whirling space and warped time), gravitational waves, and the acceleration of the universe. "We don't understand the predictions very well because we are not clever enough to solve Einstein's equations when spacetime is highly warped and dynamical," says Kip Thorne, the Richard P. Feynman Professor of Theoretical Physics at the California Institute of Technology.

In his November 16 talk, "Einstein's General Relativity, from 1905 to 2005: Warped Spacetime, Black Holes, Gravitational Waves, and the Accelerating Universe," Thorne will discuss the progress that physicists have made in understanding warped spacetime, and he will discuss prospects for rapid future progress using gravitational wave detectors such as LIGO and supercomputer simulations.

"Einstein's predictions have turned out to reach into the domain of our everyday technology. For example, time flows more slowly on the earth than it does in the Global Positioning System's satellites high above the surface of the earth. The software that computes where we are from the GPS signals must correct for the warping of time from there to here, or the system would fail," Thorne says.
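The size of this correction can be worked out from first principles. The sketch below is a simplified estimate, not the GPS system's actual correction algorithm: it assumes a circular GPS orbit and ignores Earth's rotation and oblateness. Combining the gravitational blueshift with special-relativistic time dilation gives roughly 38 microseconds of accumulated error per day if left uncorrected:

```python
# Standard physical constants; orbit values are round-number approximations.
GM = 3.986004418e14    # Earth's gravitational parameter, m^3/s^2
C = 2.99792458e8       # speed of light, m/s
R_EARTH = 6.371e6      # mean Earth radius, m
R_GPS = 2.656e7        # GPS orbital radius (~20,200 km altitude), m

# Gravitational blueshift: satellite clocks sit higher in Earth's potential,
# so they tick faster than ground clocks.
grav = GM / C**2 * (1.0 / R_EARTH - 1.0 / R_GPS)

# Special-relativistic time dilation from orbital speed (v^2 = GM/r),
# which slows the satellite clocks.
kinematic = (GM / R_GPS) / (2.0 * C**2)

# Net fractional rate difference, accumulated over one day (86,400 s):
net_per_day = (grav - kinematic) * 86400.0
print(net_per_day * 1e6)  # seconds gained per day, in microseconds
```

The gravitational term dominates (about +46 microseconds/day) over the velocity term (about -7 microseconds/day), which is why GPS satellite clocks are deliberately set to run slightly slow before launch.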

Cosmologists deal with the warping of space and time "all over the sky," Thorne says, because the whole universe is warped. In the Big Bang, the birth of the universe, "everything came out of a singularity, a place where space and time were infinitely warped," he says. "My hope is that after this lecture the listener will understand what we mean by warped spacetime, and how Einstein came up with such a crazy idea in the first place."

The talk is the third program of the 2005-2006 Earnest C. Watson Lecture Series, and the last of four special lectures in Caltech's Einstein Centennial Lecture Series. The series celebrates the centennial of Einstein's annus mirabilis (miracle year) in 1905, when, at the age of 26, he published a series of seminal papers proving the dual particle and wave nature of light and the existence and size of molecules, creating the special theory of relativity, and deriving his revolutionary E=mc2 equation.

Thorne's lecture will take place at 8 p.m. in Beckman Auditorium, 332 S. Michigan Avenue south of Del Mar Boulevard, on the Caltech campus in Pasadena. Seating is available on a free, no-ticket-required, first-come, first-served basis. Caltech has offered the Watson Lecture Series since 1922, when it was conceived by the late Caltech physicist Earnest Watson as a way to explain science to the local community.

For more information, call 1(888) 2CALTECH (1-888-222-5832) or (626) 395-4652. ###

Contact: Kathy Svitil (626) 395-8022 ksvitil@caltech.edu

Visit the Caltech Media Relations Web site at: http://pr.caltech.edu/media

Writer: 
KS

Survey of Early Universe Uncovers Mature Galaxy Eight Times More Massive Than Milky Way

PASADENA, Calif.--A massive galaxy seen when the universe was only 800 million years old has been discovered by teams of astronomers using NASA's Spitzer and Hubble Space Telescopes.

The galaxy's large mass and maturity come as a surprise, because experts previously thought that early galaxies in the young universe should be modest agglomerations of stars, not giant collections of hundreds of billions of stars as populous as, or more populous than, the Milky Way. The researchers are particularly intrigued by the fact that star formation in the galaxy seems to have already been completed. This implies that the bulk of the activity that built up the galaxy had occurred even earlier.

"This is truly a significant object," says Richard Ellis, who is the Steele Family Professor of Astronomy at the California Institute of Technology and a member of the discovery team. "Although we are looking back to when the universe was only 6 percent of its present age, this galaxy has already built up a mass in stars eight times that of the Milky Way.

"If the distance measurement to this object holds up to further scrutiny, the fact such a galaxy has already completed its star formation implies a yet earlier period of intense activity," Ellis adds. "It's like crossing the ocean and meeting a lone seagull, a forerunner of land ahead. There is now every reason to search beyond this object for the cosmic dawn when the first such systems switched on!"
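The "800 million years" and "6 percent" figures follow from the standard cosmological model. As an illustration only, the sketch below numerically integrates the Friedmann equation for a flat universe with assumed parameters (H0 = 70 km/s/Mpc, Omega_m = 0.3, Omega_Lambda = 0.7) at an assumed redshift of z = 6.5; the article itself does not quote the galaxy's redshift:

```python
import math

# Assumed flat LambdaCDM parameters (illustrative, not from the article)
H0_KM_S_MPC = 70.0
OMEGA_M, OMEGA_L = 0.3, 0.7

H0 = H0_KM_S_MPC * 1000.0 / 3.0857e22   # convert to s^-1
SEC_PER_GYR = 3.156e16

def age_at_redshift(z, n=100_000):
    """Age of the universe at redshift z, in Gyr, from
    t = integral_0^a da' / (a' * H(a')), with a = 1/(1+z)."""
    a_end = 1.0 / (1.0 + z)
    da = a_end / n
    total = 0.0
    for i in range(n):
        a = (i + 0.5) * da                          # midpoint rule
        H = H0 * math.sqrt(OMEGA_M / a**3 + OMEGA_L)
        total += da / (a * H)
    return total / SEC_PER_GYR

# At z ~ 6.5 the universe is ~0.8 Gyr old -- about 6 percent of
# its present (z = 0) age of ~13.5 Gyr for these parameters.
print(age_at_redshift(6.5), age_at_redshift(0.0))
```

At high redshift the matter term dominates, so the result is close to the matter-only approximation t ~ (2/3) * H0^-1 * Omega_m^-1/2 * (1+z)^-3/2.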

The galaxy was pinpointed among approximately 10,000 others in a small patch of sky called the Hubble Ultra Deep Field (UDF). It is believed to be about as far away as the most distant galaxies known.

Bahram Mobasher of the Space Telescope Science Institute, leader of the science team, explains, "We found this galaxy in Hubble's infrared images of the UDF and expected it to be young and small, like other known galaxies at similar distances. Instead, we found evidence that it is remarkably mature and much more massive. This is the surprising discovery."

The galaxy's great distance was deduced from the fact that Hubble does not see the galaxy in visible light (despite the fact that the UDF is the deepest image ever taken in optical light). This indicates that the galaxy's blue light has been absorbed by traveling billions of light-years through intervening hydrogen gas. The galaxy was detected using Hubble's Near Infrared Camera and Multi-Object Spectrometer (NICMOS), and with an infrared camera on the Very Large Telescope (VLT) at the European Southern Observatory. At those near-infrared wavelengths it is very faint and red.

The big surprise is how much brighter the galaxy is in images at slightly longer infrared wavelengths from the Spitzer Space Telescope. Spitzer is sensitive to the light from older, redder stars, which should make up most of the mass in a galaxy. The infrared brightness of the galaxy suggests that it is very massive.

Two other Spitzer observations, one reported earlier by Ellis and his colleagues at the University of Exeter, UK, and the other by Haojing Yan of the Spitzer Science Center, had already revealed evidence for mature stars in more ordinary, less massive galaxies at similar distances, when the universe was less than one billion years old. However, the new observation extends this notion of surprisingly mature galaxies to an object which is perhaps ten times more massive, and which seemed to form its stars even earlier in the history of the universe.

The team estimated the distance to this galaxy by combining the information provided by the Hubble, Spitzer, and VLT observations. The relative brightness of the galaxy at different wavelengths is influenced by the expanding universe, and allows astronomers to estimate its distance. At the same time, they can also get an idea of the make-up of the galaxy in terms of the mass and age of its stars.

Efforts by Dan Stark, a graduate student at Caltech, using both the giant 10 m Keck and 8 m Gemini telescopes failed to pinpoint the galaxy's distance via spectroscopic methods--the astronomers' conventional tool for estimating cosmic distances. "We have to admit," says Stark, "that we have now reached the point where we are studying sources which lie beyond the spectroscopic capabilities of our current ground-based facilities. It may take the next generation of telescopes, such as the James Webb Space Telescope and Caltech's proposed Thirty Meter Telescope, to confirm the galaxy's distance."

While astronomers generally believe most galaxies were built up piecewise by mergers of smaller galaxies, the discovery of this object suggests that at least a few galaxies formed quickly and in their entirety long ago. For such a large galaxy, this would have been a tremendously explosive event of star birth.

The findings will be published in the December 20, 2005, issue of the Astrophysical Journal.

JPL manages the Spitzer Space Telescope mission for NASA. Science operations are conducted at the Spitzer Science Center at the California Institute of Technology in Pasadena. JPL is a division of Caltech. Spitzer's infrared array camera, which took the picture of the galaxy, was built by NASA Goddard Space Flight Center, Greenbelt, Md.

Electronic images and additional information are available at

http://hubblesite.org/news/2005/28 http://www.spitzer.caltech.edu/Media/releases/ssc2005-19/

Further information relating to the James Webb Space Telescope and the proposed Thirty Meter Telescope (a collaboration between the California Institute of Technology, the University of California, the Association of Universities for Research in Astronomy, and the Association of Canadian Universities for Research in Astronomy) can be found at:

http://www.jwst.nasa.gov/ http://www.tmt.org/

Contacts:

Professor Richard Ellis (cell) 626-676-5530 rse@astro.caltech.edu

Daniel Stark (cell) 626-315-2939 dps@astro.caltech.edu

Dr. Bahram Mobasher (cell) 443-812-8149 mobasher@stsci.edu

Robert Tindol (Media Relations, Caltech): (office) 626-395-3631 tindol@cal

Writer: 
RT

Most Distant Explosion in Universe Detected; Smashes Previous Record

WASHINGTON, D.C.--Scientists using the NASA Swift satellite and several ground-based telescopes, including Palomar Observatory's robotic 60-inch telescope, have detected the most distant explosion yet, a gamma-ray burst from the edge of the visible universe.

This powerful burst, likely marking the death of a massive star as it collapsed into a black hole, was detected on September 4. It comes from an era soon after stars and galaxies first formed, about 500 million to 1 billion years after the Big Bang. The science team cannot yet determine the nature of the exploded star; a detailed analysis is forthcoming.

The 60-inch telescope at Palomar Observatory, which is owned and operated by the California Institute of Technology, observed the burst at visible wavelengths. At about the same time, a team led by Daniel Reichart of the University of North Carolina undertook near-infrared observations with the SOAR (Southern Observatory for Astrophysical Research) telescope, located in Chile. A bright near-infrared source was detected in the SOAR observations but completely absent in the Palomar data.

Building upon these pieces of information, a team led by Nobuyuki Kawai of the Tokyo Institute of Technology used the Subaru Observatory on Mauna Kea, in Hawaii, to confirm the distance and fine-tune the redshift measurement to 6.29 via a technique called spectroscopy. A redshift of 6.29 translates to a distance of about 13 billion light-years from Earth. The universe is thought to be 13.7 billion years old.

An ordinary afterglow would have been easily detected by both facilities. The fact that the afterglow was not seen by the Palomar 60-inch visible-light imager, yet was readily detected by the SOAR infrared imager, alerted Reichart to the possibility that the afterglow was located at the edge of the universe. For such distant objects, hydrogen in the intergalactic medium absorbs visible light but not infrared light; invisibility in the visible spectrum therefore indicated that the object was extremely far away.
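The arithmetic behind this "Lyman break" technique is simple: intervening neutral hydrogen absorbs essentially all light blueward of the Lyman-alpha line (121.6 nm in the source's rest frame), and cosmic expansion stretches that cutoff by a factor of 1 + z. A minimal sketch:

```python
LYMAN_ALPHA_NM = 121.567  # rest-frame Lyman-alpha wavelength, nm

def observed_break_nm(z):
    """Observed wavelength of the Lyman-alpha absorption break at
    redshift z; light blueward of this is absorbed by intervening
    hydrogen along the line of sight."""
    return LYMAN_ALPHA_NM * (1.0 + z)

# At z = 6.29 the break lands near 886 nm, past the ~750 nm red edge
# of the visible band -- hence no optical detection, but a bright
# near-infrared source.
print(observed_break_nm(6.29))
```

This is why the dual signature of infrared detection plus optical non-detection immediately flags a candidate as extremely distant, even before spectroscopy pins down the redshift.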

"This is uncharted territory," said Reichart, who spearheaded the distance measurement. "This burst smashes the old distance record by 500 million light-years. We are finally starting to see the remnants of some of the oldest objects in the universe."

The Caltech team, led by Derek Fox, until recently a postdoctoral fellow and now a professor at the Pennsylvania State University, and Caltech graduate student Brad Cenko, did the actual observations with the Palomar telescope.

"The key step in identifying these much sought-after gamma-ray bursts is the dual combination of detection in the near-infrared and non-detection in the visible," says Fox. "We sincerely hope that the 60-inch will continue not detecting more gamma-ray bursts!"

"I am elated that the Palomar 60-inch telescope contributed to the first vital step in inferring the great distance to this burst," added Cenko, who spent the last two years roboticizing the 60-inch telescope. "The existence of such distant GRBs has been postulated for quite some time, but the detection makes me feel secure that I have a sound thesis topic."

Shri Kulkarni, the principal investigator of the Palomar 60-inch robotic telescope and the MacArthur Professor of Astronomy and Planetary Science at Caltech, noted that "the discovery has highlighted the important niche for smaller telescopes, especially robotic telescopes, in undertaking cutting-edge research for objects literally at the edge of the universe. We have hit paydirt, thanks to Caltech's investment in the 60-inch telescope."

The only object ever discovered at a greater distance was a quasar at a redshift of 6.4. Whereas quasars are supermassive black holes containing the mass of billions of stars, this burst comes from a single star. Yet such distant gamma-ray bursts might be plentiful, according to Donald Lamb of the University of Chicago.

Scientists measure cosmic distances via redshift, the extent to which light is "shifted" towards the red (lower energy) part of the electromagnetic spectrum during its long journey across the universe. The greater the distance, the higher the redshift.

The September 4 burst is named GRB 050904, for the date it was detected. The previous most distant gamma-ray burst had a redshift of 4.5.

"We designed Swift to look for faint bursts coming from the edge of the universe," said Neil Gehrels of NASA's Goddard Space Flight Center, who is the Swift principal investigator. "Now we've got one, and it's fascinating. For the first time we can learn about individual stars from near the beginning of time. There are surely many more out there."

Writer: 
Robert Tindol

Caltech Scientists Create Tiny Photon Clock

PASADENA--In a new development that could be useful for future electronic devices, applied physicists at the California Institute of Technology have created a tiny disk that vibrates steadily like a tuning fork while it is pumped with light. This is the first micro-mechanical device that has been operated at a steady frequency by the action of photons alone.

Reporting in recently published issues of the journals Optics Express (July 11) and Physical Review Letters (June 10 and July 11), Kerry Vahala and group members Hossein Rokhsari, Tal Carmon, and Tobias Kippenberg explain how a tiny, disk-shaped silica resonator can be made to vibrate mechanically when hit by laser light. The disk, whose diameter is less than the width of a human hair, vibrates about 80 million times per second when its rim is pumped with light.

According to Vahala, who is the Jenkins Professor of Information Science and Technology and Professor of Applied Physics, the effect is due to properties of the disk that allow it to store light very efficiently, and also to the fact that light exerts "radiation pressure." In much the same way that NASA's solar sails will catch photons from the sun to power spaceships to other worlds, the disk builds up light energy so that the disk itself swells.
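Radiation pressure is minuscule at ordinary power levels, which is why a resonator that recirculates the light hundreds of thousands of times is needed to build up a mechanically significant force. A rough back-of-the-envelope sketch (the beam power is illustrative, not from the paper):

```python
C = 2.998e8  # speed of light, m/s

def radiation_force(power_watts, reflective=True):
    """Force exerted by a light beam of a given power.
    Reflected light transfers twice the momentum of absorbed light,
    so a perfect mirror feels F = 2P/c."""
    factor = 2.0 if reflective else 1.0
    return factor * power_watts / C

# Illustrative: a fully reflected 1 mW beam pushes with only a few
# piconewtons -- hence the need for a high-quality resonator that
# lets the same photons push on the rim many times over.
print(f"{radiation_force(1e-3):.2e} N")
```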

"The light makes hundreds of thousands of orbits around the rim of the disk," Vahala explains. "This causes the disk to literally stretch, owing to the radiation pressure of the photons."

Once the disk has inflated, its physical properties change so that the light energy is lost, and the disk then deflates. The cycle then repeats itself, and this repetition continues in a very orderly fashion as long as the light is pumped into the disk.

In effect, this repetitive process makes the disk a very efficient clock, somewhat similar to the quartz crystal that is made to vibrate from electrical current for the regulation of a battery-powered wristwatch. The differences between the optically driven clock and the traditional electrical one, however, create a design element that could provide new electro-optic functions within the context of integrated circuits.

The researchers also note that whereas the basic operation of the device can be understood at the classical level, such a device could be used to study interactions between radiation and macroscopic mechanical motion at the quantum level. Several groups have already proposed theoretically using radiation pressure as a mechanism to investigate such interactions.

Also, the device could be of help in designing the next-generation Laser Interferometer Gravitational-Wave Observatory (LIGO). A National Science Foundation-funded project operated by Caltech and MIT, LIGO has been created to detect gravitational waves, a phenomenon predicted by Einstein decades ago.

LIGO is designed in such a way that laser light bounces between mirrors along a five-mile right-angle circuit. The light is allowed to build up in the two arms of the detector so as to increase the possibility that gravitational waves will eventually be detected from exotic astrophysical objects such as colliding black holes and supernovae.

But designers must ensure that the same radiation-pressure-driven instability does not appear in the LIGO system as its sensitivity is boosted. The work by the Vahala group, though at a vastly smaller size scale, therefore could be of help in the current plans for improving the LIGO detectors in Hanford, Washington, and Livingston, Louisiana.

"This work demonstrates a mechanism that needs to be understood better," Vahala explains. "It has moved from theory to existence, and that is always exciting."

The paper, "Radiation-pressure-driven micro-mechanical oscillator," appearing in the July 11 issue of the journal Optics Express, is available on-line at http://www.opticsexpress.org/abstract.cfm?URI=OPEX-13-14-5293.


Writer: 
Robert Tindol

KamLAND Detector Provides New Way to Study Heat from Radioactive Materials Within Earth

PASADENA, Calif.--Much of the heat within our planet is caused by the radioactive decay of the elements uranium and thorium. Now, an international team of particle physicists using a special detector in Japan has demonstrated a novel method of measuring that radioactive heat.

In the July 28 issue of the journal Nature, the physicists report on measurements of electron antineutrinos they have detected from within Earth by using KamLAND, the Kamioka Liquid Scintillator Anti-Neutrino Detector. These data indicate that Earth itself generates about 20 terawatts (20 billion kilowatts) of power from underground radioactive decays.

According to Robert McKeown, a physicist at the California Institute of Technology and one of the authors of the paper, the results show that this novel approach to geophysical research is feasible. "Neutrinos and their corresponding antiparticles, antineutrinos, are remarkable for their ability to pass unhindered through large bodies of matter like the entire Earth, and so can give geophysicists a powerful method to access the composition of the planet's interior."

McKeown credits the unique KamLAND experimental apparatus for the discovery. The antineutrino detector was primarily built to study antineutrinos emitted by Japanese nuclear power plants, and the experiment has already produced several breakthroughs in experimental particle physics, including the 2002 discovery that antineutrinos emitted by the power plants do indeed change flavor as they travel through space. That result helped solve a longstanding mystery: the number of neutrinos arriving from the sun appeared too small to be reconciled with our current understanding of nuclear fusion.

The new results turn from nuclear reactors and the sun to the Earth below. To detect geoneutrinos (antineutrinos arising from radioactive decays within the planet), the researchers carefully shielded the detector from background radiation and cosmic sources, and also compensated for the antineutrinos coming from Japan's 53 nuclear power reactors.

The decays of both uranium and thorium have been well understood for decades, with both decays eventually resulting in stable isotopes of lead. KamLAND is the first detector built with the capability to detect the antineutrinos from these radioactive decays.

The researchers plan to continue running the KamLAND experiments for several years. By reducing the trace residual radioactivity in the detector, they hope to increase the sensitivity of the experiment to geoneutrinos and neutrinos from the sun. The additional data will also allow them to better constrain the oscillation of neutrinos as they change their flavors, and perhaps to catch neutrinos from interstellar space if any supernovae occur in our galaxy.

Other members of McKeown's team at Caltech's Kellogg Radiation Lab are Christopher Mauger, a postdoctoral scholar in physics, and Petr Vogel, a senior research associate emeritus in physics. Other partners in the study include the Research Center for Neutrino Science at Tohoku University in Japan, the University of Alabama, the University of California at Berkeley and the Lawrence Berkeley National Laboratory, Drexel University, the University of Hawaii, the University of New Mexico, the University of North Carolina, Kansas State University, Louisiana State University, Stanford University, Duke University, North Carolina State University, the University of Tennessee, the Institute of High Energy Physics in Beijing, and the University of Bordeaux in France.

The project is supported in part by the U.S. Department of Energy.


Writer: 
Robert Tindol

Researchers Devise Plasma Experiment That Shows How Astrophysical Jets Are Formed

PASADENA, Calif.--Applied physicists at the California Institute of Technology have devised a plasma experiment that shows how the huge, thin jets of material seen shooting out of exotic astrophysical objects such as young stars, black holes, and galactic nuclei are formed.

Reporting in an upcoming issue of the journal Physical Review Letters, applied physics professor Paul Bellan, his graduate student Gunsu Yun, and postdoctoral scholar Setthivoine You describe how they create jets of plasma at will in an experimental device known as a "planar spheromak gun." The researchers form the jets by sending an intense electric current through a gas to form a plasma, after applying a background magnetic field to the whole system. The magnetized plasma then naturally tends to shoot out of the gun in the form of a long collimated filament.

According to Bellan, his research group is the first to achieve an experimental result showing how astrophysical jets are formed. Theorists have done mathematical modeling and computer simulations to show how known magnetohydrodynamic effects could explain the jet phenomenon, but the Bellan experiment actually creates similar jets in a lab device.

"We're not claiming to make scale models, but I think we've captured the essence of astrophysical jets," says Bellan, who has been working on this and related projects at Caltech since the late 1990s.

Although there are differences between astrophysical jets and the ones created in the spheromak gun, Bellan says there are also important similarities that link the 13-inch-long plasma jets created in the lab to the enormous jets in outer space. The similarity is primarily in the way that the magnetic flux tubes are straightened through a sort of squeezing effect that points to a common collimation process.

Astrophysical jets are accelerated by magnetic forces, but also carry along magnetic fields, the researchers explain. These magnetic fields are frozen into the plasma that makes up the jet and wrapped around the jet like rubber bands around a paper tube. The flowing plasma piles up, much like fast traffic coming up on slower traffic on a freeway, and this pile-up increases the plasma density just like the density of cars increases in a traffic jam.

The frozen-in bandlike magnetic field lines also become squeezed together in this "traffic jam," and so, just like rubber bands piling up on a paper tube, pinch down the diameter of the plasma jet, making it thin and even more dense.
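The squeezing the rubber-band analogy describes is magnetic pressure: a field of strength B pushes on the plasma with pressure B²/2μ₀, so when the wrapped field lines pile up and B rises, the pinch tightens. A minimal numeric sketch (the field strength is illustrative, not a figure from the experiment):

```python
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, T*m/A

def magnetic_pressure(b_tesla):
    """Pressure exerted by a magnetic field: P = B**2 / (2 * mu0).
    An azimuthal field wrapped around a current-carrying plasma
    squeezes it inward -- the pinch behind jet collimation."""
    return b_tesla**2 / (2 * MU0)

# Illustrative: a 0.1-tesla field exerts roughly 4 kPa of pinch
# pressure; doubling B quadruples the squeeze.
print(f"{magnetic_pressure(0.1):.0f} Pa")
```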

Why do the researchers think this is an accurate portrayal of astrophysical jets? Because this is precisely how they make similar but smaller jets in their experiment.

"Very dense, fast, thin plasma jets observed in our laboratory experiments have been shown to be in good agreement with this picture," explains You.

Bellan says that the research stems from work he and his group have done for years in the formidable and longstanding effort to make fusion power an eventual reality. The current results have implications for the goal of containing the extremely hot plasma required for fusion, as well as for explaining certain exotic events in the cosmos.


Writer: 
Robert Tindol

Andromeda Galaxy Three Times Bigger in Diameter Than Previously Thought

MINNEAPOLIS--The lovely Andromeda galaxy appeared as a warm fuzzy blob to the ancients. To modern astronomers millennia later, it appeared as an excellent opportunity to better understand the universe. In the latter regard, our nearest galactic neighbor is a gift that keeps on giving.

Scott Chapman, from the California Institute of Technology, and Rodrigo Ibata, from the Observatoire Astronomique de Strasbourg in France, have led a team of astronomers in a project to map out the detailed motions of stars in the outskirts of the Andromeda galaxy. Their recent observations with the Keck telescopes show that the tenuous sprinkle of stars extending outward from the galaxy are actually part of the main disk itself. This means that the spiral disk of stars in Andromeda is three times larger in diameter than previously estimated.

At the annual summer meeting of the American Astronomical Society today, Chapman will outline the evidence that there is a vast, extended stellar disk that makes the galaxy more than 220,000 light-years in diameter. Previously, astronomers looking at the visible evidence thought Andromeda was about 70,000 to 80,000 light-years across. Andromeda itself is about 2 million light-years from Earth.

The new dimensional measure is based on the motions of about 3,000 stars lying some distance from the main disk, stars once thought to be merely part of the galaxy's "halo" rather than of the disk itself. By taking very careful measurements of their "radial velocities," the researchers were able to determine precisely how each star is moving in relation to the galaxy.

The results showed that the outlying stars are sitting in the plane of the Andromeda disk itself and, moreover, are moving at a velocity that shows them to be in orbit around the center of the galaxy. In essence, this means that the disk of stars is vastly larger than previously known.

Further, the researchers have determined that the nature of the "inhomogeneous rotating disk"--in other words, the clumpy and blobby outer fringes of the disk--shows that Andromeda must be the result of satellite galaxies long ago slamming together. If that were not the case, the stars would be spaced more evenly.

Ibata says, "This giant disk discovery will be very hard to reconcile with computer simulations of forming galaxies. You just don't get giant rotating disks from the accretion of small galaxy fragments."

The current results, which are the subject of two papers already available and a third yet to be published, are made possible by technological advances in astrophysics. In this case, the DEIMOS multi-object spectrograph on the Keck II Telescope combines the telescope's mirror size and light-gathering capacity, needed to image very faint stars, with the spectrographic sensitivity needed to obtain highly accurate radial velocities.

A spectrograph is necessary for the work because, within reasonable human time spans, the motion of a star in a faraway galaxy can only be detected by inferring whether the star is moving toward us or away from us. This can be done because the star's light reaches us at discrete frequencies--spectral lines characteristic of the elements that make up the star.

If the star is moving toward us, then the light tends to cram together, so to speak, making the light higher in frequency and "bluer." If the star is moving away from us, the light has more breathing room and becomes lower in frequency and "redder."
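For speeds far below that of light, the fractional shift of a spectral line translates directly into a radial velocity. A minimal sketch with illustrative numbers (not measurements from the Andromeda survey):

```python
C_KM_S = 299_792.458  # speed of light, km/s

def radial_velocity(observed_nm, rest_nm):
    """Non-relativistic Doppler shift: v = c * (observed - rest) / rest.
    Positive = receding (redshifted), negative = approaching (blueshifted)."""
    return C_KM_S * (observed_nm - rest_nm) / rest_nm

# Illustrative: the 656.28 nm hydrogen line observed at 656.06 nm.
# The negative sign marks a star on the side of the disk rotating
# toward us.
v = radial_velocity(656.06, 656.28)
print(f"{v:.0f} km/s")
```

Mapping such velocities across the face of Andromeda is how the team showed the outlying stars orbit the galactic center in an orderly rotation.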

If stars on one side of Andromeda appear to be coming toward us, while stars on the opposite side appear to be going away from us, then the stars can be assumed to orbit the central object.

The extended stellar disk has gone undetected in the past because stars that appear in the region of the disk could not be known to be a part of the disk until their motions were calculated. In addition, the inhomogeneous "fuzz" that makes up the extended disk does not look like a disk, but rather appears to be a fragmented, messy halo built up from many previous galaxies' crashing into Andromeda, and it was assumed that stars in this region would be going every which way.

"Finding all these stars in an orderly rotation was the last explanation anyone would think of," says Chapman.

On the flip side, finding that the bulk of the complex structure in Andromeda's outer region is rotating with the disk is a blessing for studying the true underlying stellar halo of the galaxy. Using this new information, the researchers have been able to carefully measure the random motions of stars in the stellar halo, probing its mass and the form of the elusive dark matter that surrounds it.

Although the main work was done at the Keck Observatory, the original images that posed the possibility of an extended disk were taken with the Isaac Newton Telescope's Wide-Field Camera. The telescope, located in the Canary Islands, is intended for surveys, and in the case of this study, served well as a companion instrument.

Chapman says that further work will be needed to determine whether the extended disk is merely a quirk of the Andromeda galaxy, or is perhaps typical of other galaxies.

The main paper with which today's AAS news conference is concerned will be published this year in The Astrophysical Journal with the title "On the Accretion Origin of a Vast Extended Stellar Disk Around the Andromeda Galaxy." In addition to Chapman and Ibata, the other authors are Annette Ferguson, University of Edinburgh; Geraint Lewis, University of Sydney; Mike Irwin, Cambridge University; and Nial Tanvir, University of Hertfordshire.


Writer: 
Robert Tindol
