Physicist Hirosi Ooguri Awarded for Novel Research on Black Holes

PASADENA, Calif.--Hirosi Ooguri, the Kavli Professor of Theoretical Physics at the California Institute of Technology, is a corecipient of the first-ever Leonard Eisenbud Prize for Mathematics and Physics, awarded by the American Mathematical Society (AMS). The prize, created in 2006, has gone to Ooguri and coauthors Andrew Strominger and Cumrun Vafa of Harvard University for their paper "Black hole attractors and the topological string," published in 2004.

This work stems from concepts formulated by physicists Jacob Bekenstein and Stephen Hawking. Scientists originally thought that a black hole must be simple in structure and somewhat dull as a phenomenon. In the 1970s, however, Bekenstein and Hawking proposed that a black hole would have entropy, and that its quantum state could be realized in an exponentially large number of configurations, much as there are many ways to arrange the furniture in a bedroom.
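The entropy they proposed is quantified by the Bekenstein-Hawking formula, which ties it to the area A of the black hole's event horizon:

    S_{BH} = \frac{k_B c^3 A}{4 G \hbar}

The number of quantum states this entropy counts is of order e^{S_{BH}/k_B}, which is what makes the count exponentially large.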

In what the AMS calls a "beautiful and highly unexpected proposal," Ooguri and his coauthors related this property of black holes to state-of-the-art mathematics in higher dimensions. A new geometric method in six dimensions called topological string theory, whose development was inspired by superstring theory, turned out to be essential in explaining the origin of black hole entropy.

"We had an answer, which was topological string theory," says Ooguri. But they did not know how it could be applied. "It turns out counting the states of black holes was the question we had been looking for. This work was the discovery of the question." Ooguri says that this prize is exciting not just for his work, but because it recognizes the connection between physics and mathematics. Ooguri had trouble understanding physics while in high school until he took calculus.

"Mathematics is a language, and we need that language to understand the physics of our universe," says Ooguri. Mathematics and physics complement each other. Discoveries in physics can catalyze developments in mathematics, and vice versa.

The $5,000 prize was awarded to Ooguri, Strominger, and Vafa on January 7, 2008, at the AMS meeting in San Diego, the largest annual gathering of mathematicians in the world.

The AMS was founded in 1888 to advance mathematical research and scholarship. It aims to promote mathematical research and its uses through programs and services, to strengthen mathematical education, and to foster awareness and appreciation of mathematics and its connection to other disciplines and to everyday life. The society has 28,000 individual members in the United States and around the world.

David Eisenbud, a former president of the AMS, established the Leonard Eisenbud Prize for Mathematics and Physics in memory of his father, a mathematical physicist who died in 2004. The prize honors work that connects the two fields. The prize will be awarded every three years for a work published in the preceding six years.

John Schwarz, one of Ooguri's colleagues and the Brown Professor of Theoretical Physics at Caltech, says, "Hirosi Ooguri is one of the leading theoretical physicists in the world. Research on string theory and quantum field theory has had a profound impact on fundamental mathematics in recent times, and this is epitomized by Ooguri's contributions. I am delighted that he is receiving this richly deserved recognition."

Writer: Jacqueline Scahill

LIGO Sheds Light on Cosmic Event

PASADENA, Calif.-- An analysis by the international LIGO (Laser Interferometer Gravitational-Wave Observatory) Scientific Collaboration has excluded one previously leading explanation for the origin of an intense gamma-ray burst that occurred last winter. Gamma-ray bursts are among the most violent and energetic events in the universe, and scientists have only recently begun to understand their origins.

The LIGO project, which is funded by the National Science Foundation, was designed and is operated by the California Institute of Technology and the Massachusetts Institute of Technology for the purpose of detecting cosmic gravitational waves and for the development of gravitational-wave observations as an astronomical tool. Research is carried out by the LIGO Scientific Collaboration, a group of 580 scientists at universities around the United States and in 11 foreign countries. The LIGO Scientific Collaboration interferometer network includes the GEO600 interferometer, located in Hannover, Germany, funded by the Max-Planck-Gesellschaft and the UK's Science and Technology Facilities Council, and designed and operated by scientists from the Max Planck Institute for Gravitational Physics and partners in the United Kingdom.

Each of the L-shaped LIGO interferometers (including the 2-km and 4-km detectors in Hanford, Washington, and a 4-km instrument in Livingston, Louisiana) uses a laser split into two beams that travel back and forth down long arms, each of which is a beam tube from which the air has been evacuated. The beams are used to monitor the distance between precisely configured mirrors. According to Albert Einstein's 1916 general theory of relativity, the relative distance between the mirrors will change very slightly when a gravitational wave--a distortion in space-time, produced by massive accelerating objects, that propagates outward through the universe--passes by. The interferometer is constructed in such a way that it can detect a change of less than a thousandth the diameter of an atomic nucleus in the lengths of the arms relative to each other.
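To put that sensitivity in numbers: taking an atomic nucleus to be roughly 10^{-15} meters across (an illustrative round figure), a thousandth of that diameter over a 4-km arm corresponds to a strain of about

    h = \frac{\Delta L}{L} \approx \frac{10^{-18}\ \mathrm{m}}{4 \times 10^{3}\ \mathrm{m}} \approx 2.5 \times 10^{-22}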

On February 1, 2007, the Konus-Wind, Integral, Messenger, and Swift gamma-ray satellites measured a short but intense outburst of energetic gamma rays originating in the direction of M31, the Andromeda galaxy, located 2.5 million light-years away. The majority of such short (less than two seconds in duration) gamma-ray bursts (GRBs) are thought to emanate from the merger and coalescence of two massive but compact objects, such as neutron stars or black holes. GRBs can also come from astronomical objects known as soft gamma-ray repeaters, which are less common than binary coalescence events and emit less energetic gamma rays.

During the intense blast of gamma rays, known as GRB070201, the 4-km and 2-km gravitational-wave interferometers at the Hanford facility were in science mode and collecting data. They did not, however, measure any gravitational waves in the aftermath of the burst.

That non-detection was itself significant.

The burst had occurred along a line of sight consistent with an origin in one of Andromeda's spiral arms, and a binary coalescence event--the merger of two neutron stars or black holes, for example--was considered among the most likely explanations. Such a monumental cosmic event occurring in a nearby galaxy should have generated gravitational waves easily measured by the ultrasensitive LIGO detectors. The absence of a gravitational-wave signal meant that GRB070201 could not have originated in this way in Andromeda. Other causes for the event, such as a soft gamma-ray repeater or a binary merger at a much greater distance, are now the most likely contenders.

LIGO's contribution to the study of GRB070201 marks a milestone for the project, says Caltech's Jay Marx, LIGO's executive director: "Having achieved its design goals two years ago, LIGO is now producing significant scientific results. The nondetection of a signal from GRB070201 is an important step toward a very productive synergy between gravitational-wave and other astronomical communities that will contribute to our understanding of the most energetic events in the cosmos."

"This is the first time that the field of gravitational-wave physics has made a significant contribution to the gamma-ray astronomical community, by searching for GRBs in a way that electromagnetic observations cannot," adds David Reitze, a professor of physics at the University of Florida and spokesperson for the LIGO Scientific Collaboration.

Up until now, Reitze says, astronomers studying GRBs relied solely on data obtained from telescopes conducting visible, infrared, radio, X-ray, and gamma-ray observations. Gravitational waves offer a new window into the nature of these events.

"We are still baffled by short GRBs. The LIGO observation gives a tantalizing hint that some short GRBs are caused by soft gamma repeaters. It is an important step forward," says Neil Gehrels, the lead scientist of the Swift mission at NASA's Goddard Space Flight Center.

"This result is not only a breakthrough in connecting observations in the electromagnetic spectrum to gravitational-wave searches, but also in the constructive integration of teams of complementary expertise. Our findings imply that multimessenger astronomy will become a reality within the next decade, opening a wonderful opportunity to gain insight on some of the most elusive phenomena of the universe," says Szabolcs Márka, an assistant professor of physics at Columbia University.

The next major construction milestone for LIGO will be the Advanced LIGO Project, which is expected to start in 2008. Advanced LIGO, which will utilize the infrastructure of LIGO, will be 10 times more sensitive. It will incorporate advanced designs and technologies for mirrors and lasers that were developed by the GEO project and have allowed the GEO detector to achieve enough sensitivity to participate in this analysis despite its smaller size.

The increased sensitivity will be important because it will allow scientists to detect cataclysmic events such as black-hole and neutron-star collisions at 10-times-greater distances, encompassing a volume of space, and hence an expected event rate, roughly a thousand times larger.

Writer: Kathy Svitil

High Energy Physicists Set New Record for Network Data Transfer

80+ Gbps Sustained Rates for Hours Set a New Standard and Demonstrate that Current and Next Generation Long-Range Networks Can Be Used Efficiently by Small Computing Clusters

RENO, Nevada--Building on six years of record-breaking developments, an international team of physicists, computer scientists, and network engineers--led by the California Institute of Technology, the University of Michigan, the NUST Institute of Information Technology (NIIT) in Pakistan, Polytehnica University in Romania, Fermilab, Brookhaven National Laboratory, and CERN, with partners from Brazil (Rio de Janeiro State University, UERJ, and two of the State Universities of São Paulo, USP and UNESP) and Korea (Kyungpook National University, KISTI)--joined forces to set new records for sustained data transfer among storage systems during the SuperComputing 2007 (SC07) conference.

Caltech's exhibit at SC07 by the High Energy Physics (HEP) group and the Center for Advanced Computing Research (CACR) demonstrated applications for globally distributed data analysis for the Large Hadron Collider (LHC) at CERN, along with Caltech's collaboration system EVO (Enabling Virtual Organizations; http://evo.caltech.edu), near real-time simulations of earthquakes in the Southern California region, experiences in time-domain astronomy with Google Sky, and recent results in multiphysics multiscale modeling.

The focus of the exhibit was the HEP team's record-breaking demonstration of storage-to-storage data transfer over wide-area networks from a single rack of servers on the exhibit floor. The high-energy physics team's demonstration of "High Speed Data Distribution for Physics Discoveries at the Large Hadron Collider" achieved a bidirectional peak throughput of 88 gigabits per second (Gbps) and a sustained data flow of more than 80 Gbps for two hours among clusters of servers on the show floor and at Caltech, Michigan, Fermilab, CERN, Brazil, Korea, and locations in the US LHCNet network in Chicago, New York, and Amsterdam.

Following up on the previous record transfer of 17.8 Gbps among storage systems on a single 10-Gbps link at SC06, the team used just four pairs of servers and disk arrays to sustain bidirectional transfers of 18 to 19 Gbps on the "UltraLight" link between Caltech and Reno for more than a day, often touching the 20-Gbps theoretical limit of a single bidirectional 10-Gbps link. By sustaining rates of more than 40 Gbps at times in both directions, the results showed that a well-designed and configured single rack of servers is now capable of saturating the next generation of wide-area network links, which have a capacity of 40 Gbps in each direction.

The record-setting demonstration was made possible through the use of seven 10-Gbps links to SC07 provided by SCinet, CENIC, National Lambda Rail, and Internet2, together with a fully populated Cisco 6500E series switch-router, 10-gigabit Ethernet network interfaces provided by Intel and Myricom, and a Fibre Channel disk array provided by DataDirect Networks equipped with 4-Gbps host bus adapters from QLogic. The server equipment consisted of 36 widely available Supermicro systems using dual quad-core Intel Xeon processors and Western Digital SATA disks.

One of the key advances in this demonstration was Fast Data Transfer (FDT; http://monalisa.cern.ch/FDT), a Java application developed by the Caltech team in close collaboration with the Polytehnica Bucharest team. FDT runs on all major platforms and uses Java's NIO libraries to achieve stable disk reads and writes coordinated with smooth data flow across long-range networks. The FDT application streams a large set of files across an open TCP socket, so that a large data set composed of thousands of files, as is typical in high-energy physics applications, can be sent or received at full speed, without the network transfer restarting between files. FDT works with Caltech's MonALISA system to dynamically monitor the capability of the storage systems as well as the network path in real time, and sends data out to the network at a moderated rate that achieves smooth data flow across long-range networks.
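The pattern FDT uses can be sketched in a few lines of Java NIO. The following is a minimal illustration, not FDT's actual code, of streaming many files over one persistent TCP socket; the host, port, and file names are hypothetical, and the real FDT adds managed buffer pools, parallel streams, restart handling, and MonALISA-driven rate control.

    import java.io.IOException;
    import java.net.InetSocketAddress;
    import java.nio.ByteBuffer;
    import java.nio.channels.FileChannel;
    import java.nio.channels.SocketChannel;
    import java.nio.file.Path;
    import java.nio.file.StandardOpenOption;
    import java.util.List;

    // Sketch: hold one TCP socket open and stream many files across it back
    // to back, so the transfer never stalls to set up a new connection (and
    // never lets the TCP congestion window reset) between files.
    public class SingleSocketFileStreamer {
        public static void main(String[] args) throws IOException {
            List<Path> files = List.of(Path.of("events-0001.dat"),   // hypothetical
                                       Path.of("events-0002.dat"));  // file names
            try (SocketChannel socket = SocketChannel.open(
                    new InetSocketAddress("receiver.example.org", 54321))) {
                for (Path file : files) {
                    try (FileChannel in = FileChannel.open(file, StandardOpenOption.READ)) {
                        long size = in.size();
                        // Send an 8-byte length header so the receiver knows where
                        // this file ends and the next one begins.
                        ByteBuffer header = ByteBuffer.allocate(Long.BYTES);
                        header.putLong(size).flip();
                        while (header.hasRemaining()) {
                            socket.write(header);
                        }
                        // transferTo() lets the kernel move bytes from disk to the
                        // socket directly, avoiding copies through user space.
                        long sent = 0;
                        while (sent < size) {
                            sent += in.transferTo(sent, size - sent, socket);
                        }
                    }
                }
            }
        }
    }

Because the socket stays open from one file to the next, TCP's congestion window never collapses between files, which is what lets a data set of thousands of files move at full speed.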

By combining FDT with FAST TCP, developed by Steven Low of Caltech's computer science department, and with an optimized Linux kernel known as the "UltraLight kernel," provided by Shawn McKee of Michigan, the team reached an unprecedented throughput level of 10 gigabytes/sec with a single rack of servers, limited only by the speed of the disk systems. Additionally, the team found that this combination of an advanced application, TCP protocol stack, and kernel, together with real-time monitoring and multiple threads to sustain the data flow, performed extremely well even on network links with significant levels of packet loss.
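FAST TCP's key departure from conventional, loss-based TCP is that it uses measured queueing delay, rather than packet loss, as its congestion signal. As a rough sketch drawn from the published FAST TCP papers (an illustration of the idea, not the production implementation), the congestion window w is periodically updated as

    w \leftarrow \min\left\{ 2w,\; (1-\gamma)\,w + \gamma\left(\frac{\mathrm{baseRTT}}{\mathrm{RTT}}\,w + \alpha\right) \right\}

where baseRTT is the smallest round-trip time observed, RTT is the current estimate, \gamma \in (0,1] is a smoothing factor, and \alpha sets how many packets the flow tries to keep queued in the network. When queueing delay rises, RTT exceeds baseRTT and the window eases off smoothly instead of halving on loss, which is what lets a flow hold a long, high-bandwidth link near full utilization.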

The 10-Gbps network connections used included a dedicated link via CENIC to Caltech; two National Lambda Rail (NLR) FrameNet links and two NLR PacketNet links to Los Angeles and Seattle; Pacific Wave, and the Internet2 network to Chicago. Onward links included multiple links to Fermilab provided by ESnet; two links between Starlight in Chicago and Michigan provided by MiLR; US LHCNet (co-managed by Caltech and CERN) across the Atlantic; the GLORIAD link to Korea; and Florida Lambda Rail and the CHEPREO/WHREN-LILA link.

Overall, this year's demonstration, following the team's record memory-to-memory transfer rate of 151 Gbps at SuperComputing 2005 and its storage-to-storage record on a single link at SuperComputing 2006, represents a major milestone in providing practical, widely deployable applications capable of massive data transfers. The applications at SC07 exploited advances in state-of-the-art TCP-based data transport, servers (Intel Woodcrest-based systems), storage systems, and the Linux kernel over the last 24 months. FDT also represents a clear advance in basic data transport capability over wide-area networks compared to 2005, in that 20 Gbps could be sustained in a few streams memory-to-memory over long distances very stably for many hours, using a single 10-gigabit Ethernet link very close to full capacity in both directions.

When the LHC accelerator and their experiments begin operation next summer, the two largest physics collaborations at the LHC, CMS and ATLAS, each encompassing more than 2,000 physicists and engineers from 170 universities and laboratories, will embark on a new round of exploration at the frontier of high energies, breaking new ground in our understanding of the nature of matter and space-time and searching for new particles. In order to fully exploit the potential for scientific discoveries, the many petabytes of data produced by the experiments will be processed, distributed, and analyzed using a global Grid of 130 computing and storage facilities located at laboratories and universities around the world.

The key to discovery is the analysis phase, where individual physicists and small groups repeatedly access, and sometimes extract and transport, multiterabyte data sets on demand, in order to optimally select the rare "signals" of new physics from potentially overwhelming "backgrounds" from already-understood particle interactions. This data will amount to many tens of petabytes in the early years of LHC operation, rising to the exabyte range within the coming decade.

Professor Harvey Newman of Caltech, head of the HEP team and US CMS Collaboration board chair, who originated the LHC Data Grid Hierarchy concept, said, "This demonstration was a major milestone showing the robustness and production-readiness of a new class of data-transport applications, with which each of the LHC Tier-1 and Tier-2 computing clusters could acquire or distribute data using the full capacity of the current generation of 10-Gbps network links, as well as the next generation of 40-Gbps links. We also demonstrated the real-time analysis of some of the data using 'ROOTlets,' a distributed form of the ROOT system (http://root.cern.ch) that is an essential element of high-energy physicists' arsenal of tools for large-scale data analysis.

"These demonstrations provided a new, more agile and flexible view of the globally distributed LHC Grid system that spans the U.S., Europe, Asia, and Latin America, along with several hundred computing clusters serving individual groups of physicists. By substantially reducing the difficulty of transporting terabyte-and-larger-scale data sets among the sites, we are enabling physicists throughout the world to have a much greater role in the next round of physics discoveries expected soon after the LHC starts."

David Foster, head of Communications and Networking at CERN, said, "The efficient use of high-speed networks to transfer large data sets is an essential component of CERN's LHC Computing Grid (LCG) infrastructure that will enable the LHC experiments to carry out their scientific missions."

Iosif Legrand, senior software and distributed system engineer at Caltech, the technical coordinator of the MonALISA and FDT projects, said, "We demonstrated a realistic, worldwide deployment of distributed, data-intensive applications capable of effectively using and coordinating high-performance networks. A distributed agent-based system was used to dynamically discover network and storage resources, and to monitor, control, and orchestrate efficient data transfers among hundreds of computers."

Richard Cavanaugh of the University of Florida, technical coordinator of the UltraLight project that is developing the next generation of network-integrated grids aimed at LHC data analysis, said, "By demonstrating that many 10-Gbps wavelengths can be used efficiently over continental and transoceanic distances (often in both directions simultaneously), the high-energy physics team showed that this vision of a worldwide dynamic Grid supporting many terabyte-and-larger data transactions is practical."

Shawn McKee, associate research scientist in the University of Michigan department of physics and leader of the UltraLight network technical group, said, "This achievement is an impressive example of what a focused network and storage system effort can accomplish. It is an important step towards the goal of delivering a highly capable end-to-end network-aware system and architecture that meet the needs of next-generation e-Science."

Paul Sheldon of Vanderbilt University, who leads the NSF-funded Research and Education Data Depot Network (REDDnet) project that is deploying a distributed storage infrastructure, commented on the innovative network storage technology that helped the group achieve such high performance in wide-area, disk-to-disk transfers. "When you combine this network-storage technology, including its cost profile, with the remarkable tools that Harvey Newman's networking team has produced, I think we are well positioned to address the incredible infrastructure demands that the LHC experiments are going to make on our community worldwide."

The team hopes this new demonstration will encourage scientists and engineers in many sectors of society to develop and plan to deploy a new generation of revolutionary Internet applications. Multigigabit/s end-to-end network performance will empower scientists to form "virtual organizations" on a planetary scale, sharing their collective computing and data resources in a flexible way. In particular, this is vital for projects on the frontiers of science and engineering, in data-intensive fields such as particle physics, astronomy, bioinformatics, global climate modeling, geosciences, fusion, and neutron science.

The new bandwidth record was achieved through extensive use of the SCinet network infrastructure at SC07. The team used all 10 of the 10-Gbps links coming into the show floor, connected to two Cisco Systems Catalyst 6500 Series switches at the Caltech booth, together with 10-gigabit Ethernet server interfaces provided by Intel and Myricom.

As part of the SC07 demonstration, a distributed analysis of simulated LHC physics data was carried out using the Grid-enabled Analysis Environment (GAE) developed at Caltech for the LHC. This demonstration involved the use of the Clarens Web Services portal developed at Caltech, the use of ROOT-based analysis software, and numerous architectural components developed in the framework of Caltech's "Grid Analysis Environment." The analysis made use of a new component in the Grid system: "ROOTlets" hosted by Clarens servers. Each ROOTlet is a full instantiation of CERN's ROOT tool, created on demand by the distributed clients in the Grid. The design and deployment of the ROOTlets/Clarens system was carried out under the auspices of an STTR grant for collaboration between Deep Web Technologies (www.deepwebtech.com) of New Mexico, Caltech, and Indiana University.

The team used Caltech's MonALISA (MONitoring Agents using a Large Integrated Services Architecture; http://monalisa.caltech.edu) system to monitor and display the real-time data for all the network links used in the demonstration. MonALISA is a Dynamic, Distributed Service System that is capable of collecting any type of information from different systems, analyzing it in near-real time, and providing support for automated control decisions and global optimization of workflows in complex grid systems. It is currently used to monitor 340 sites, more than 50,000 computing nodes, and tens of thousands of concurrent jobs running on different grid systems and scientific communities.

MonALISA is a highly scalable set of autonomous, self-describing, agent-based subsystems which are able to collaborate and cooperate in performing a wide range of monitoring tasks for networks and Grid systems, as well as the scientific applications themselves.

Vanderbilt demonstrated the capabilities of their L-Store middleware, a scalable, open-source, and wide-area-capable form of storage virtualization that builds on the Internet Backplane Protocol (IBP) and logistical networking technology developed at the University of Tennessee. Offering both scalable metadata management and software-based fault tolerance, L-Store creates an integrated system that provides a single file-system image across many IBP storage "depots" distributed across wide-area and/or local-area networks. Reading and writing between sets of depots at the Vanderbilt booth and Caltech in California, the team achieved a network throughput, disk to disk, of more than one GByte/sec. On the floor, the team was able to sustain throughputs of 3.5 GByte/sec between a rack of client computers and a rack of storage depots. These two racks communicated across SCinet via four 10-GigE connections.

Further information about the demonstration may be found at: http://supercomputing.caltech.edu

About Caltech: With an outstanding faculty, including five Nobel laureates, and such off-campus facilities as the Jet Propulsion Laboratory, Palomar Observatory, and the W. M. Keck Observatory, the California Institute of Technology is one of the world's major research centers. The Institute also conducts instruction in science and engineering for a student body of approximately 900 undergraduates and 1,200 graduate students who maintain a high level of scholarship and intellectual achievement. Caltech's 124-acre campus is situated in Pasadena, California, a city of 135,000 at the foot of the San Gabriel Mountains, approximately 30 miles inland from the Pacific Ocean and 10 miles northeast of the Los Angeles Civic Center. Caltech is an independent, privately supported university, and is not affiliated with either the University of California system or the California State Polytechnic universities. http://www.caltech.edu

About CACR: Caltech's Center for Advanced Computing Research (CACR) performs research and development on leading-edge networking and computing systems, and methods for computational science and engineering. Some current efforts at CACR include the National Virtual Observatory, ASC Center for Simulation of Dynamic Response of Materials, Computational Infrastructure for Geophysics, Cascade High Productivity Computing System, and the TeraGrid. http://www.cacr.caltech.edu/

About CERN: CERN, the European Organization for Nuclear Research, has its headquarters in Geneva. At present, its member states are Austria, Belgium, Bulgaria, the Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Italy, the Netherlands, Norway, Poland, Portugal, Slovakia, Spain, Sweden, Switzerland, and the United Kingdom. Israel, Japan, the Russian Federation, the United States of America, Turkey, the European Commission, and UNESCO have observer status. For more information, see http://www.cern.ch.

About Netlab: Caltech's Networking Laboratory, led by Professor Steven Low, developed FAST TCP. The group does research in the control and optimization of protocols and networks, and designs, analyzes, implements, and experiments with new algorithms and systems. http://netlab.caltech.edu

About NIIT: NIIT (NUST Institute of Information Technology), established in 1999 as the IT wing of the National University of Sciences and Technology, ranks today among the premier engineering institutions in Pakistan. It possesses state-of-the-art equipment and prides itself on its faculty, a team of highly capable and dedicated professionals. NIIT is also known abroad for its collaborative research linkages with CERN, Geneva; Stanford (SLAC), U.S.A.; Caltech, U.S.A.; and EPFL, Switzerland, to name a few. NIIT continually seeks to stay at the forefront as a center of academic and research excellence. Efforts are under way to seek collaboration with sound Pakistani universities willing to participate in joint research projects. NIIT is also engaged in close interaction with indigenous industrial entrepreneurs in IT, electronics, and communication engineering through NIIT-Corporate Advisory councils. http://www.niit.edu.pk

About the University of Michigan: The University of Michigan, with its size, complexity, and academic strength, the breadth of its scholarly resources, and the quality of its faculty and students, is one of America's great public universities and one of the world's premier research institutions. The university was founded in 1817 and has a total enrollment of 54,300 on all campuses. The main campus is in Ann Arbor, Michigan, and has 39,533 students (fall 2004). With over 600 degree programs and $739M in FY05 research funding, the university is one of the leaders in innovation and research. For more information, see http://www.umich.edu.

About Fermilab: Fermi National Accelerator Laboratory (Fermilab) is a national laboratory funded by the Office of Science of the U.S. Department of Energy and operated by Fermi Research Alliance, LLC. Experiments at Fermilab's Tevatron, the world's highest-energy particle accelerator, generate petabytes of data per year and involve large, international collaborations with requirements for high-volume data movement to their home institutions. Fermilab is also the Western Hemisphere Tier-1 data host for the upcoming CMS experiment at the LHC. The laboratory actively works to remain on the leading edge of advanced wide-area network technology in support of its science collaborations.

About UERJ (Rio de Janeiro): Founded in 1950, the Rio de Janeiro State University (UERJ; http://www.uerj.br) ranks among the ten largest universities in Brazil, with more than 23,000 students. UERJ's five campuses are home to 22 libraries, 412 classrooms, 50 lecture halls and auditoriums, and 205 laboratories. UERJ is responsible for important public welfare and health projects through its centers of medical excellence, the Pedro Ernesto University Hospital (HUPE) and the Piquet Carneiro Day-care Policlinic Centre, and it is committed to the preservation of the environment. The UERJ High Energy Physics group includes 15 faculty, postdoctoral, and visiting PhD physicists and 12 PhD and master's students, working on experiments at Fermilab (D0) and CERN (CMS). The group has constructed a Tier2 center to enable it to take part in the Grid-based data analysis planned for the LHC, and has originated the concept of a Brazilian "HEP Grid," working in cooperation with USP and several other universities in Rio and São Paulo.

About UNESP (São Paulo): Created in 1976 with the administrative union of several isolated institutes of higher education in the State of São Paulo, the São Paulo State University, UNESP, has 39 institutes in 23 different cities in the State of São Paulo. The university has 33,500 undergraduate students in 168 different courses and almost 13,000 graduate students. Since 1999 the university has had a group participating in the DZero Collaboration at Fermilab; this group operates the São Paulo Regional Analysis Center (SPRACE) and is now a member of the CMS Collaboration at CERN. See http://www.unesp.br.

About USP (São Paulo): The University of São Paulo, USP, is the largest institution of higher education and research in Brazil, and the third largest in Latin America. The university has most of its 35 units located on its campus in the state capital. It has around 40,000 undergraduate students and around 25,000 graduate students. It is responsible for almost 25 percent of all Brazilian papers and publications indexed by the Institute for Scientific Information (ISI). The SPRACE cluster is located at the Physics Institute. See http://www.usp.br.

About Kyungpook National University (Daegu): Kyungpook National University is one of the leading universities in Korea, especially in physics and information science. The university has 13 colleges and 9 graduate schools with 24,000 students. It houses the Center for High Energy Physics (CHEP), in which most Korean high-energy physicists participate. CHEP (chep.knu.ac.kr) was approved as one of the designated Excellent Research Centers supported by the Korean Ministry of Science.

About GLORIAD: GLORIAD (GLObal RIng network for Advanced application development) is the first round-the-world high-performance ring network, jointly established by Korea, the United States, Russia, China, Canada, the Netherlands, and the Nordic countries, with optical networking tools that improve networked collaboration with e-Science and Grid applications. It is currently constructing a dedicated lightwave link connecting scientific organizations in the GLORIAD partner countries. See http://www.gloriad.org/.

About CHEPREO: Florida International University (FIU), in collaboration with partners at Florida State University, the University of Florida, and the California Institute of Technology, has been awarded an NSF grant to create and operate an interregional Grid-enabled Center for High-Energy Physics Research and Educational Outreach (CHEPREO; www.chepreo.org) at FIU. CHEPREO encompasses an integrated program of collaborative physics research on CMS, network infrastructure development, and educational outreach at one of the largest minority universities in the U.S. The center is funded by four NSF directorates: Mathematical and Physical Sciences; Scientific Computing Infrastructure; Elementary, Secondary and Informal Education; and International Programs.

About Internet2®: Led by more than 200 U.S. universities working with industry and government, Internet2 develops and deploys advanced network applications and technologies for research and higher education, accelerating the creation of tomorrow's Internet. Internet2 recreates the partnerships among academia, industry, and government that helped foster today's Internet in its infancy. For more information, visit: www.internet2.edu.

About National LambdaRail: National LambdaRail (NLR) is a major initiative of U.S. research universities and private-sector technology companies to provide a national-scale infrastructure for research and experimentation in networking technologies and applications. NLR puts the control, the power, and the promise of experimental network infrastructure in the hands of the nation's scientists and researchers. Visit http://www.nlr.net for more information.

About StarLight: StarLight is an advanced optical infrastructure and proving ground for network services optimized for high-performance applications. Operational since summer 2001, StarLight is a 1 GE and 10 GE switch/router facility for high-performance access to participating networks and also offers true optical switching for wavelengths. StarLight is being developed by the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC), the International Center for Advanced Internet Research (iCAIR) at Northwestern University, and the Mathematics and Computer Science Division at Argonne National Laboratory, in partnership with Canada's CANARIE and the Netherlands' SURFnet. STAR TAP and StarLight are made possible by major funding from the U.S. National Science Foundation to UIC. StarLight is a service mark of the Board of Trustees of the University of Illinois. See www.startap.net/starlight.

About the Florida LambdaRail: Florida LambdaRail LLC (FLR) is a Florida limited liability company formed by member higher education institutions to advance optical research and education networking within Florida. Florida LambdaRail is a high-bandwidth optical network that links Florida's research institutions and provides a next-generation network in support of large-scale research, education outreach, public/private partnerships, and information technology infrastructure essential to Florida's economic development. For more information: http://www.flrnet.org.

About CENIC: CENIC (www.cenic.org) is a not-for-profit corporation serving the California Institute of Technology, California State University, Stanford University, University of California, University of Southern California, California Community Colleges, and the statewide K-12 school system. CENIC's mission is to facilitate and coordinate the development, deployment, and operation of a set of robust multitiered advanced network services for this research and education community.

About ESnet: The Energy Sciences Network (ESnet; www.es.net) is a high-speed network serving thousands of Department of Energy scientists and collaborators worldwide. A pioneer in providing high-bandwidth, reliable connections, ESnet enables researchers at national laboratories, universities, and other institutions to communicate with each other using the collaborative capabilities needed to address some of the world's most important scientific challenges. Managed and operated by the ESnet staff at Lawrence Berkeley National Laboratory, ESnet provides direct high-bandwidth connections to all major DOE sites, multiple cross connections with Internet2/Abilene, and connections to Europe via GEANT and to Japan via SuperSINET, as well as fast interconnections to more than 100 other networks. Funded principally by DOE's Office of Science, ESnet services allow scientists to make effective use of unique DOE research facilities and computing resources, independent of time and geographic location.

About AMPATH: Florida International University's Center for Internet Augmented Research and Assessment (CIARA) has developed an international, high-performance research connection point in Miami, Florida, called AMPATH (AMericasPATH; www.ampath.fiu.edu). AMPATH's goal is to enable wide-bandwidth digital communications between U.S. and international research and education networks, as well as a variety of U.S. research programs in the region. AMPATH in Miami acts as a major international exchange point (IXP) for the research and education networks in South America, Central America, Mexico, and the Caribbean. The AMPATH IXP is home for the WHREN-LILA high-performance network link connecting Latin America to the U.S., funded by the NSF and the Academic Network of São Paulo.

About the Academic Network of São Paulo (ANSP): ANSP unites São Paulo's university networks with scientific and technological research centers in São Paulo, and is managed by the State of São Paulo Research Foundation (FAPESP). The ANSP Network is another example of international collaboration and exploration. Through its connection to WHREN-LILA, all of the institutions connected to ANSP can take part in research with U.S. universities and research centers, offering significant contributions and the potential to develop new applications and services. This connectivity will give researchers access to higher-quality data, improving the quality of new scientific work. http://www.ansp.br

About RNP: RNP, the National Education and Research Network of Brazil, is a not-for-profit company that promotes the innovative use of advanced networking, with the joint support of the Ministry of Science and Technology and the Ministry of Education. In the early 1990s, RNP was responsible for the introduction and adoption of Internet technology in Brazil. Today, RNP operates a nationally deployed multigigabit network used for collaboration and communication in research and education throughout the country, reaching all 26 states and the Federal District, and provides both commodity and advanced research Internet connectivity to more than 300 universities, research centers, and technical schools. http://www.rnp.br

About KISTI: KISTI (Korea Institute of Science and Technology Information) is a national institute under the supervision of MOST (Ministry of Science and Technology) of Korea, and it plays a leading role in building the nationwide infrastructure for advanced application research by linking supercomputing resources with the optical research network KREONet2. The National Supercomputing Center at KISTI is carrying out national e-Science and Grid projects, as well as the GLORIAD-KR project, and aims to become a leading institution for e-Science and advanced network technologies. See http://www.kisti.re.kr.

About Intel: Intel, the world leader in silicon innovation, develops technologies, products, and initiatives to continually advance how people work and live. Additional information about Intel is available at www.intel.com/pressroom and blogs.intel.com.

About Myricom: Founded in 1994, Myricom Inc. created Myrinet, the High-Performance Computing (HPC) interconnect technology used in many thousands of computing clusters in more than 50 countries. With its Myri-10G solutions, Myricom achieved a convergence at 10-Gigabit data rates between its low-latency Myrinet technology and mainstream Ethernet. Myri-10G bridges the gap between the rigorous demands of traditional HPC and the growing need for affordable computing speed in enterprise data centers. Myricom solutions are sold direct and through channels. Myri-10G clusters are supplied by OEM computer companies and by leading cluster integrators worldwide. Privately held and based in Arcadia, California, Myricom achieved and has sustained profitability since 1995 with 12 consecutive profitable years through 2006.

About DataDirect Networks: DataDirect Networks is the leading provider of scalable storage systems for performance- and capacity-driven applications. DataDirect's S2A (Silicon Storage Appliance) architecture enables modern applications such as video streaming, content delivery, modeling and simulation, backup and archiving, cluster and supercomputing, and real-time collaborative workflows that are driving the explosive demand for storage performance and capacity. DataDirect's S2A technology and solutions solve today's most challenging storage requirements, including providing shared, high-speed access to a common pool of data, minimizing data center footprints and storage costs for massive archives, reducing simulation computational times, and capturing and serving massive amounts of digital content. www.datadirectnet.com

About QLogic: QLogic is a leading supplier of high-performance storage networking solutions, including Fibre Channel host bus adapters (HBAs), blade server embedded Fibre Channel switches, Fibre Channel stackable switches, iSCSI HBAs, iSCSI routers and storage services platforms for enabling advanced storage-management applications. The company is also a leading supplier of server networking products, including InfiniBand host channel adapters that accelerate cluster performance. QLogic products are delivered to small-to-medium businesses and large enterprises around the world via its channel partner community. QLogic products are also powering solutions from leading companies like Cisco, Dell, EMC, Hitachi Data Systems, HP, IBM, NEC, Network Appliance, and Sun Microsystems. QLogic is a member of the S&P 500 Index. For more information go to www.qlogic.com.

About the National Science Foundation: The NSF is an independent federal agency created by Congress in 1950 "to promote the progress of science; to advance the national health, prosperity, and welfare; to secure the national defense...." With an annual budget of about $5.5 billion, it is the funding source for approximately 20 percent of all federally supported basic research conducted by America's colleges and universities. In many fields such as mathematics, computer science, and the social sciences, NSF is the major source of federal backing.

About the DOE Office of Science: DOE's Office of Science is the single largest supporter of basic research in the physical sciences in the nation and ensures U.S. world leadership across a broad range of scientific disciplines. The Office of Science also manages 10 world-class national laboratories with unmatched capabilities for solving complex interdisciplinary problems, and it builds and operates some of the nation's most advanced R&D user facilities, located at national laboratories and universities. These facilities are used by more than 19,000 researchers from universities, other government agencies, and private industry each year.

Acknowledgements: The demonstration and the developments leading up to it were made possible through the strong support of the partner network organizations mentioned, the U.S. Department of Energy Office of Science, and the National Science Foundation, in cooperation with the funding agencies of the international partners, through the following grants: US LHCNet (DOE DE-FG02-05-ER41359), WAN In Lab (NSF EIA-0303620), UltraLight (NSF PHY-0427110), DISUN (NSF PHY-0533280), and CHEPREO/WHREN-LILA (NSF PHY-0312038/OCI-0441095 and FAPESP Projeto 04/14414-2), as well as the NSF-funded PLaNetS, FAST TCP, and CAIGEE projects, and the US LHC Research Program funded jointly by DOE and NSF.

Writer: Jill Perry

A Giant Step toward Infinitesimal Machinery

PASADENA, Calif.--What are the ultimate limits to miniaturization? How small can machinery--with internal workings that move, turn, and vibrate--be produced? What is the smallest scale on which computers can be built? These are questions that the legendary Caltech physicist Richard Feynman, with uncanny and characteristic insight, asked himself in the period leading up to a famous 1959 lecture, the first on a topic now called nanotechnology. In a newly announced global Alliance for Nanosystems VLSI (very-large-scale integration), researchers at Caltech's Kavli Nanoscience Institute (KNI) in Pasadena, California, and at the Laboratoire d'Electronique et de Technologie de l'Information-Micro- and Nano-Technologies (CEA/LETI-MINATEC) in Grenoble, France, are working together to take the pursuit of this vision to an entirely new level.

For about three decades after Feynman's lecture, scientists paid little heed to what were apparently viewed as fanciful dreams. More recently, however, particularly in the past two decades, the field of nanotechnology has become solidly established. Underlying it is an immense amount of careful research, carried out in laboratories worldwide--work that has been realized one advance at a time.

To date, almost all of these pioneering investigations have focused upon solitary components and individual physical effects at the nanoscale. (One nanometer is a billionth of a meter, about ten times the size of a hydrogen atom and a million times smaller than the period at the end of this sentence.) These components hold great promise as the fundamental building blocks of complex future nanosystems, that is, as the ultraminiature machines and computers of Feynman's dreams. But, so far, very little work has actually been carried out to assemble these individual elements into complex architectures.

The Alliance for Nanosystems VLSI (NanoVLSI) is an unprecedented partnership founded to break this impasse. It is an international collaboration between researchers in nanoscience at Caltech and in microsystems science and engineering at CEA/LETI-MINATEC, one of the world's premier, state-of-the-art microelectronics research foundries.

Michael Roukes, professor of physics, applied physics, and bioengineering at Caltech, who was the founding director of Caltech's KNI, has spearheaded the initiative from the academic side. "There is a lot of hype about 'nano' being the solution to many different problems," says Roukes. "It's time for us to start delivering, but to do that we have to think about how to assemble and produce complex systems containing thousands of devices all singing in harmony."

Why complex systems? Huge programs, with millions of lines of code, make up the operating software for today's laptop computers. These must run on microelectronic chips that now integrate several hundred million transistors to achieve their immense computational power. Nanotechnology has the potential to carry this kind of complexity into entirely new realms, going beyond electronic computation to include capabilities, for example, for detecting very small amounts of chemical and biological molecules, or for making measurements on individual living cells within complex microfluidic systems, to name just a few. The first generation of these new chemical processors-on-a-chip is still quite simple compared to its ultimate potential. But already these devices are spawning new tools for research in the life sciences and medicine and new applications in clinical diagnosis.

A systems approach to nanotechnology is required to ramp up the complexity of these systems-on-a-chip. But achieving this requires access to the kind of multibillion-dollar fabrication capabilities used to build today's microprocessor chips. In such environments, standardized processes are the rule without exception. Experimentation with unconventional materials and techniques is strenuously avoided, since cutting-edge processes are highly susceptible to contamination. Extremely high quality at these foundries must be preserved to maintain production yield. But innovation must occur somewhere. For three decades, CEA/LETI-MINATEC has been fulfilling a critical role, pioneering the introduction of novel processes into state-of-the-art protocols used to produce VLSI microelectronic systems en masse. Within this new alliance, CEA/LETI-MINATEC researchers are now turning their attention to new challenges at the nanoscale. "The Alliance for Nanosystems VLSI is a perfect illustration of the potential for innovation generated by the meeting of science and technology," says Dr. Laurent Malier, the director of CEA/LETI-MINATEC. "I am excited to see Caltech and CEA/LETI-MINATEC share this ambition."

Those working today to advance nanoscale research and technology still find much inspiration in Feynman's early insights. He saw no fundamental reasons barring the assembly of machines and computers down to the smallest possible dimensions, namely, those of nature's basic building blocks--atoms and molecules. Step by step, with the help of partnerships like NanoVLSI, nanotechnology is approaching this dream.

For more information about the Alliance for Nanosystems VLSI, visit http://www.nanovlsi.org. For KNI, visit http://kni.caltech.edu, and for CEA/LETI-MINATEC, http://www-leti.cea.fr.

Writer: Jill Perry

Small Explorer Mission to Detect Black Holes Scheduled for 2011 Launch

PASADENA, Calif.--NASA has given the go-ahead to restart an astrophysics mission that will provide a greater capability for using high-energy X-rays to detect black holes than any existing instrument.

The Nuclear Spectroscopic Telescope Array, or NuSTAR, has been designed to answer fundamental questions about the universe, such as: How are black holes distributed through the cosmos? How were the elements of the universe created? What powers the most extreme active galaxies? NuSTAR will expand our ability to understand the origins and to predict the destinies of stars and galaxies.

NASA had cancelled the NuSTAR mission in 2006 due to funding pressures within the Science Mission Directorate, but is now ready to proceed to flight development. Expected launch is 2011.

In November 2003, NuSTAR was one of six proposals, selected from 36 submitted, to be funded by NASA's Explorer Program, which supports lower-cost, highly focused, rapid-development scientific spacecraft. Fiona Harrison, professor of physics and astronomy at the California Institute of Technology, is the NuSTAR principal investigator. "It's great that NASA was able to restart the mission," says Harrison. "I'm incredibly excited about our planned science program, as well as the unanticipated things we are bound to discover with a new telescope this sensitive."

Harrison's team has been working on NuSTAR technology for more than 10 years. They began with a balloon payload, the High Energy Focusing Telescope (HEFT), developing optics and detectors that together could image the universe at X-ray energies above those at which any previous mission has operated. They tested these new technologies on the HEFT balloon experiment, and then combined them in NuSTAR to make a telescope far more sensitive than any that has observed the high-energy X-ray sky. The mission also incorporates an extendable structure, developed by the Jet Propulsion Laboratory and Alliant Techsystems Inc. for the Shuttle Radar Topography Mission, that is now being used to fit the NuSTAR telescope into a small, inexpensive launch vehicle.

"We are very excited to be able restart the NuSTAR mission," says Alan Stern, associate administrator for the Science Mission Directorate at NASA Headquarters in Washington. "NuSTAR has more than 500 times the sensitivity of previous instruments that detect black holes. It's a great opportunity for us to explore an important astronomical frontier." Both Stern and Harrison point out that instruments like these have become smaller and more efficient, thereby reducing the mission's cost. "It's amazing that by using NASA's smallest mission platform, the Small Explorers, we can build something more capable than large missions that have operated at these energies," says Harrison.

NASA anticipates that NuSTAR will bridge a gap in astrophysics mission flights between the 2009 launch of the Wide-field Infrared Survey Explorer and the 2013 launch of the James Webb Space Telescope. The spacecraft will use high-energy X-rays to map areas of the sky and will complement astrophysics missions that explore the cosmos in other regions of the electromagnetic spectrum.

"NuSTAR will perform deep observations in hard Xrays to detect black holes of all sizes, and other exotic phenomena. It will perform cutting-edge science using advanced technologies and help to provide a balance between small and large missions in the astrophysics portfolio," says Jon Morse, director of the Astrophysics Division at NASA Headquarters.

NASA's Jet Propulsion Laboratory, Pasadena, California, manages the NuSTAR mission. The Goddard Space Flight Center, Greenbelt, Maryland, manages the Explorer Program for the Science Mission Directorate. Orbital Sciences Corporation, Dulles, Virginia, is the industry partner for the mission.

For more information about the NuSTAR mission, visit http://www.nustar.caltech.edu.

Writer: Elisabeth Nadin

Smallest Galaxies Solve a Big Problem

PASADENA, Calif.--An unusual population of the darkest, most lightweight galaxies known has shed new light on a cosmic conundrum. Astronomers used the W. M. Keck Observatory in Hawaii to show that each of the recently uncovered dwarf galaxies is made of 99 percent dark matter, a mysterious type of matter that has gravitational effects on ordinary atoms but does not produce any light. Dark matter accounts for the majority of the mass in the universe.

New observations of eight of these galaxies now suggest that the "Missing Dwarf Galaxy" problem--a discrepancy between the number of extremely small, faint galaxies that cosmological theories predict should exist near the Milky Way, and the number that have actually been observed--is not as severe as previously thought, and may have been solved completely.

"These new dwarf galaxies are fascinating systems, not only because of their major contribution to the Missing Dwarf problem, but also as individual galaxies," says Josh Simon, a Millikan Postdoctoral Scholar at the California Institute of Technology and the lead author of the study. "We had no idea that such small galaxies could even exist until these objects were discovered last year."

The Missing Dwarf Galaxy puzzle comes from a prediction of the "Cold Dark Matter" model, which explains the growth and evolution of the universe. The theory predicts that large galaxies like the Milky Way should be surrounded by a swarm of up to several hundred smaller galaxies, known as "dwarf galaxies" because of their diminutive size. But until recently, only 11 such companions were known to be orbiting the Milky Way. To explain why the missing dwarfs were not seen, theorists suggested that although hundreds of these galaxies may indeed exist near the Milky Way, most have few, if any, stars. If so, they would be composed almost entirely of dark matter.

In the past two years, researchers struggling to prove the existence of nearly invisible galaxies were aided by images from the Sloan Digital Sky Survey that revealed as many as 12 additional very faint dwarf galaxies near the Milky Way. The new systems are unusually small, even compared to other dwarf galaxies; the least massive among them contain only 1 percent as many stars as the most minuscule galaxies previously known.

"When they were discovered, we didn't know if all of these ultrafaint objects were actually galaxies," says Marla Geha, a Plaskett Research Fellow at the National Research Council Canada's Herzberg Institute of Astrophysics. "We thought some of them might simply be globular star clusters, or that they could be the shredded remnants of ancient galaxies torn apart by the Milky Way long ago. To test these possibilities, we needed to measure their masses."

Simon and Geha used the DEIMOS spectrograph on the 10-meter Keck II telescope at the W. M. Keck Observatory in Hawaii to conduct follow-up studies of eight of the new galaxies. The duo used the Doppler effect--a shift in the wavelength of the light coming from the galaxies caused by their motion with respect to the earth--to determine the speeds of stars within the dwarf galaxies. "Because stars in galaxies move only under the influence of gravity, their speeds are determined by the total mass of the galaxy," says Simon. To the researchers' surprise, each system was among the smallest ever measured, more than 10,000 times less massive than the Milky Way.
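A back-of-the-envelope version of that statement: for a self-gravitating system of characteristic radius r whose stars move with velocity dispersion \sigma, the enclosed mass is of order

    M \approx \frac{\sigma^{2} r}{G}

Taking \sigma of about 5 km/s and a radius of order 100 parsecs (an illustrative value assumed here, not one quoted in the study) gives a mass of order 10^6 solar masses, compared with roughly 10^12 solar masses for the Milky Way and its dark-matter halo.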

"It seems that very small, ultrafaint galaxies are far more plentiful than we thought. I'm astonished that so many tiny, dark matter-dominated galaxies have now been discovered," says Geha.

"The formation of such small galaxies is not very well understood from a theoretical perspective," adds Simon. "Explaining how stars form inside these remarkably tiny galaxies is difficult, and so it is hard to predict exactly how many star-containing dwarfs we should find near the Milky Way. Our work narrows the gap between the Cold Dark Matter theory and observations by significantly increasing the number of Milky Way dwarf galaxies and telling us more about the properties of these galaxies."

Repeated measurements of 814 stars in the eight dwarf galaxies were obtained at the W. M. Keck Observatory. The speeds of the stars, ranging from about 4 to 7 km/s, were much slower than the stellar velocities in any other known galaxy. For comparison, the sun orbits the center of the Milky Way at about 220 km/s. The astronomers measured precise speeds for 18 to 214 stars in each galaxy, three times more stars per galaxy than any previous study.

"This is a significant paper," says Taft Armandroff, director of the W. M. Keck Observatory, whose research includes the study of dwarf galaxies. "It is a compelling example of how large, ground-based telescopes can precisely measure the orbits of distant stars on the sky to just a few kilometers per second. I expect DEIMOS will soon tell us about the chemical composition of these stars to help us better understand how star formation takes place in such small galaxies."

Some parameters of the Cold Dark Matter theory can now be updated to match observed conditions in the local universe. Based on the masses measured for the new dwarf galaxies, Simon and Geha conclude that the fierce ultraviolet radiation given off by the first stars, born just a few hundred million years after the Big Bang, may have blown away all of the hydrogen gas from dwarf galaxies also forming at that time. The loss of gas prevented the galaxies from creating new stars, leaving them very faint, or, in many cases, completely dark. When this effect is included in theoretical models, the number of expected dwarf galaxies agrees with the number of observed dwarf galaxies.

"One implication of our results is that up to a few hundred completely dark galaxies really should exist in the Milky Way's cosmic neighborhood," says Geha. "If the Cold Dark Matter model is correct they have to be out there, and the next challenge for astronomers will be finding a way to detect their presence."

Although the Sloan Digital Sky Survey was successful in finding a dozen ultrafaint dwarfs, it covered only about 25 percent of the sky. Future surveys that scan the remainder of the sky are expected to discover as many as 50 additional dark matter-dominated dwarf galaxies orbiting the Milky Way. Telescopes for one such effort, the Pan-STARRS project on Maui, are now under construction.

The paper, "Kinematics of the Ultra-Faint Milky Way Satellites: Solving the Missing Satellite Problem," will be published in the November 10 issue of The Astrophysical Journal. Funding for the project was provided by Caltech under the Millikan Fellowship program and by the Herzberg Institute of Astrophysics of the National Research Council Canada, and through a grant from the National Science Foundation (AST 0071048).

# # #

Scientists available for comment:

Joshua D. Simon, California Institute of Technology, Pasadena, California, USA. Phone: (626) 395-3693. Email: jsimon@astro.caltech.edu

Marla Geha, National Research Council Canada, Herzberg Institute of Astrophysics, Victoria, BC, Canada. Phone: (250) 363-8103. Email: marla.geha@nrc-cnrc.gc.ca

Taft Armandroff, Director, W. M. Keck Observatory. Phone: (808) 881-3855. Email: newsletter@keck.hawaii.edu

Writer: Kathy Svitil

Astronomers Find Largest Exoplanet to Date

PASADENA, Calif.—An international team of astronomers has discovered the largest-radius and lowest-density exoplanet of all those whose mass and radius are known. It is a gas-giant planet about twice the size of Jupiter, and is likely to have a curved cometlike tail. It has been named TrES-4, to indicate that it is the fourth planet detected by the Trans-atlantic Exoplanet Survey (TrES) network of telescopes.

TrES-4 is in the constellation Hercules and is the 19th transiting planet discovered so far. It orbits the star catalogued as GSC02620-00648, which is about 440 parsecs (1,435 light-years) away from Earth.

A transiting planet is one that passes directly in front of its host star as seen from Earth. When a transiting planet passes between its star and Earth, it blocks some of the star's light, much as the moon blocks the sun's light when it passes between the sun and Earth during a solar eclipse. In the case of TrES-4, this reduces the starlight by one percent, a tiny yet detectable effect.
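The size of that dip follows from simple geometry: the blocked fraction of starlight is the ratio of the planet's and star's projected areas. The short Python sketch below illustrates the relation; the host star's radius is an assumed value for illustration, since the article quotes only the one-percent depth.

    # Transit-depth geometry: blocked fraction = (R_planet / R_star)^2.
    # The host-star radius below is an assumption for illustration.
    R_JUPITER_KM = 71_492
    R_SUN_KM = 696_000

    def transit_depth(r_planet_km, r_star_km):
        """Fractional dip in starlight during a central transit."""
        return (r_planet_km / r_star_km) ** 2

    r_planet = 1.67 * R_JUPITER_KM   # TrES-4's radius, per the article
    r_star = 1.8 * R_SUN_KM          # assumed host-star radius
    print(f"transit depth: {100 * transit_depth(r_planet, r_star):.2f}%")
    # -> roughly a one-percent dip, as described above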

TrES-4 is noteworthy for having a radius 1.67 times that of Jupiter, yet a mass only 0.84 times Jupiter's, resulting in an extremely low density of 0.222 grams per cubic centimeter. In comparison, Jupiter has a density of 1.3 grams per cubic centimeter. The density of TrES-4 is so low that the planet would float on water.
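That figure is easy to verify with a back-of-the-envelope calculation, treating the planet as a uniform sphere; standard values for Jupiter's mass and equatorial radius are assumed here.

    # Quick check of the quoted density from the mass and radius above.
    import math

    M_JUPITER_KG = 1.898e27
    R_JUPITER_M = 7.1492e7   # equatorial radius

    mass = 0.84 * M_JUPITER_KG
    radius = 1.67 * R_JUPITER_M
    volume = (4.0 / 3.0) * math.pi * radius**3
    density_g_cm3 = (mass / volume) / 1000.0  # kg/m^3 -> g/cm^3
    print(f"density: {density_g_cm3:.3f} g/cm^3")  # ~0.22, well below water's 1.0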

The transiting planet also causes the star to undergo a small orbital motion, but measuring this effect, from which the planet's mass can be determined, requires much larger telescopes, such as the 10-meter Keck telescope in Hawaii, which was used in the case of TrES-4. Measuring the mass of the planet is a vital step in confirming that the transiting object is indeed a planet and not a star.

"We continue to be surprised by how relatively large these giant planets can be", says Francis O'Donovan, a graduate student in astronomy at the California Institute of Technology who operates one of the TrES telescopes. "But if we can explain the sizes of these bloated planets in their harsh environments, it may help us better understand our own solar system's planets and their formation."

The study's lead author, Georgi Mandushev of Lowell Observatory in Arizona, noted the challenges such big planets present for theories of planet formation and evolution: "This find presents a new puzzle for astronomers who model the structure and atmospheres of giant planets. It highlights the diversity of physical properties among giant planets around other stars and indicates that we can expect more discoveries of unusual and enigmatic exoplanets in the near future."

TrES is a global network of three small telescopes built largely from amateur-astronomy components and off-the-shelf four-inch camera lenses: the Sleuth telescope at Caltech's Palomar Observatory in San Diego County; the Planet Search Survey Telescope (PSST) at Lowell Observatory; and the STellar Astrophysics and Research on Exoplanets (STARE) telescope in the Canary Islands.

Planet TrES-4 makes a complete revolution around its parent star every 3.55 days, so a year on this planet is shorter than a week on Earth. The planet is about seven million kilometers away from its star, roughly eight times closer than Mercury is to the sun, and so it is heated by the intense starlight to about 1,600 kelvins (roughly 2,400 degrees Fahrenheit).
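The quoted separation is consistent with Kepler's third law for a star somewhat heavier than the sun. In the sketch below, the host star's mass is an assumed value for illustration; the article quotes only the period and the distance.

    # Kepler's third law: orbital separation from period and stellar mass.
    import math

    G = 6.674e-11          # m^3 kg^-1 s^-2
    M_SUN_KG = 1.989e30

    def semi_major_axis_km(period_days, star_mass_suns):
        """Circular-orbit separation in km from Kepler's third law."""
        period_s = period_days * 86_400
        a_cubed = G * star_mass_suns * M_SUN_KG * period_s**2 / (4 * math.pi**2)
        return a_cubed ** (1.0 / 3.0) / 1000.0

    # Assumed 1.4-solar-mass host star:
    print(f"{semi_major_axis_km(3.55, 1.4):.2e} km")  # ~7.6e6 km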

In terms of mass and distance from its star, TrES-4 is similar to HD 209458b, and like that planet, it may have an extended outer atmosphere. Astronomers hypothesize that the outer atmospheric layers may be able to escape the planet's gravity and form a curved cometlike tail.

To look for transits, the small telescopes are automated to take wide-field timed exposures of the clear skies on as many nights as possible. When an observing run is completed for a particular field—usually over an approximately two-month period—the data are run through software that corrects for various sources of distortion and noise.

The end result is a "light curve" for each of the thousands of stars in the field. If the software detects regular variations in the light curve for an individual star, then the astronomers do additional work to see if the source of the variation is indeed a transiting planet. One possible alternative is that the object passing in front of the star is another star, fainter and smaller.
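A common way to make such regular variations stand out is to "fold" the light curve at a trial period, so that every dip lands at the same orbital phase. The toy Python example below illustrates the idea on simulated data; it is not the TrES survey's actual software, and all the numbers in it are made up for the demonstration.

    # Toy phase-folding demo on simulated data (not the TrES pipeline).
    import numpy as np

    rng = np.random.default_rng(0)
    period, depth, duration = 3.55, 0.01, 0.1   # days, fraction, days
    t = np.sort(rng.uniform(0, 60, 2000))       # observation times (days)
    flux = 1.0 + rng.normal(0, 0.002, t.size)   # noisy flat baseline
    flux[(t % period) < duration] -= depth      # a 1% dip once per orbit

    phase = (t % period) / period               # fold at the trial period
    edges = np.linspace(0, 1, 51)
    binned = [flux[(phase >= lo) & (phase < hi)].mean()
              for lo, hi in zip(edges[:-1], edges[1:])]
    print(f"deepest bin: {min(binned):.4f} (out-of-transit level ~1.0)")

Binning the folded curve averages down the noise, so the one-percent dip emerges clearly even though it is invisible in any single exposure.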

To measure the size and other properties of TrES-4 accurately, astronomers used the 0.8-meter telescope at Lowell Observatory, the 1.2-meter telescope at the Whipple Observatory (both in Arizona), and the 10-meter Keck Telescope in Hawaii.

Observations were carried out from September 2006 to April 2007.

The paper about the discovery of this extrasolar planet, "TrES-4: A Transiting Hot Jupiter of Very Low Density," has been accepted for publication by the Astrophysical Journal.

The paper's authors are Georgi Mandushev and Edward Dunham of Lowell Observatory; Francis O'Donovan, a graduate student at Caltech; Lynne Hillenbrand, an associate professor of astronomy at Caltech; David Charbonneau (Alfred P. Sloan Research Fellow), Guillermo Torres, David W. Latham, Gáspár Bakos (Hubble Fellow), Alessandro Sozzetti, José Fernández, and Gilbert Esquerdo of the Harvard-Smithsonian Center for Astrophysics in Cambridge, Massachusetts; Mark Everett of the Planetary Science Institute in Tucson, Arizona; Timothy Brown of Las Cumbres Observatory Global Telescope Network; and Markus Rabus and Juan A. Belmonte of the Instituto de Astrofísica de Canarias in Tenerife, Spain.

This research is funded by NASA through its Origins program. The paper is available online at http://arxiv.org/.

Writer: Robert Tindol

International Consortium Is Created to Build World's Largest Submillimeter Telescope

PASADENA, Calif.—Five institutions from North America and Europe have created a consortium to oversee the building of a 25-meter submillimeter telescope at a high-elevation site in Chile. When completed in 2013, the $100 million instrument will be the premier telescope of its kind in the world.

The project is formally known as the Cornell Caltech Atacama Telescope (CCAT), and has been in the works since a $2 million feasibility/concept design study was begun in 2004 by the California Institute of Technology and Cornell University. Now that the study has been completed, the partners are moving to the next phase of the process.

The consortium members are the California Institute of Technology and its Jet Propulsion Laboratory (JPL), Cornell University, the University of Colorado at Boulder, the University of British Columbia, and the United Kingdom Astronomy Technology Centre, which is part of the Science and Technology Facilities Council.

According to deputy project manager Simon Radford, who is based on the Caltech campus, the telescope will employ recent advances in technology that will provide unprecedented views of astronomical phenomena that cannot be studied at other wavelengths.

Because submillimeter-wavelength astronomy is especially effective for imaging phenomena that do not emit much visible light, the Atacama telescope will allow observations of stars and planets forming from swirling disks of gas and dust, will make measurements to determine the composition of the molecular clouds from which the stars are born, and could even discover large numbers of galaxies undergoing huge bursts of star formation in the very distant universe.

Also, the 25-meter telescope could be used to study the origin of large-scale structure in the universe.

The Atacama telescope will be located at an 18,400-foot altitude atop Cerro Chajnantor in Chile's Atacama Desert. The high altitude and dry conditions are important for submillimeter research, which is hampered by moisture in the air.

Of the projected $100 million cost, $20 million will go to state-of-the-art instrumentation. In particular, large submillimeter cameras will complement the huge size of the dish, which, at 25 meters, will have more than twice the area of the largest submillimeter telescope currently in existence.

The new cameras are made possible by recent developments in sensitive superconducting detectors, an area in which Caltech physics professor Jonas Zmuidzinas and his colleagues have been making important contributions. The new wide-field cameras will produce very sensitive panoramic images of the submillimeter sky.

Scientists from Caltech and JPL who will be involved in the project include Andrew Blain, Geoff Blake, Paul Goldsmith, Sunil Golwala, Andrew Lange, Tom Phillips, Anthony Readhead, Anneila Sargent, Eugene Serabyn, Tom Soifer, and Michael Werner, among others. The director of CCAT is Riccardo Giovanelli of Cornell, and the project manager is Thomas Sebring, also based at Cornell.

The 25-meter telescope is a natural progression in Caltech and JPL's long-standing interest in submillimeter and infrared astronomy, which started in the 1960s with the first infrared sky survey, carried out by professors Robert Leighton and Gerry Neugebauer on Mount Wilson.

In 1983, under Neugebauer's leadership, JPL launched the Infrared Astronomical Satellite, or IRAS, which discovered huge numbers of infrared-bright objects. This success paved the way to JPL's current infrared mission, the Spitzer Space Telescope. Meanwhile, Leighton went on to design a 10.4-meter submillimeter telescope, which by 1987 led to the construction and operation of the Caltech Submillimeter Observatory (CSO) on Mauna Kea, Hawaii. The CSO is funded by the National Science Foundation, and Tom Phillips, a professor of physics at Caltech, serves as director.

The CSO is fitted with sensitive submillimeter detectors and cameras, making it ideal for seeking out and observing the diffuse gases and their constituent molecules, crucial to understanding star formation. This experience served as the foundation for JPL's participation in the European Space Agency's Herschel Space Observatory.

In addition to its advanced instrumentation and the dry skies of the Atacama region, the new telescope will benefit from a larger and more accurate mirror. The 25-meter telescope should provide six to 12 times the light-gathering ability of the CSO, depending on the exact wavelength. The larger diameter and better surface will also result in much sharper images of the sky.
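The raw geometric part of that gain is easy to check, since collecting area scales with the square of the dish diameter; the six- to twelvefold figure quoted above presumably also folds in wavelength-dependent surface accuracy and efficiency.

    # Collecting area scales as diameter squared.
    d_ccat, d_cso = 25.0, 10.4   # dish diameters in meters
    print(f"area ratio: {(d_ccat / d_cso) ** 2:.1f}")  # ~5.8x from area alone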

The CCAT is designed to emphasize wide-field surveys of the submillimeter sky that will guide follow-up observations with telescope arrays such as the Combined Array for Research in Millimeter-wave Astronomy (CARMA), which Caltech played a leading role in developing, and the international Atacama Large Millimeter Array (ALMA), also located in northern Chile.

"CCAT will be a particularly important complement to ALMA," said Caltech astronomy professor Anneila Sargent, director of CARMA and chair of the interim CCAT board. "CCAT will enable consortium scientists to make optimal use of ALMA's submillimeter capabilities to address fundamental questions about star and galaxy formation."

A great opportunity therefore exists for submillimeter astronomy. In fact, an independent blue-ribbon panel chaired by Robert W. Wilson, 1978 Nobel Laureate who earned his doctorate in physics at Caltech, recently reported that the Atacama project "will revolutionize astronomy in the submillimeter/far infrared band and enable significant progress in unraveling the cosmic origin of stars, planets, and galaxies.

"CCAT is very timely and cannot wait," the panel said.

"It is a very exciting time for submillimeter astronomy," Zmuidzinas said when the 2004 feasibility study began. "We are making rapid progress on all fronts-in detectors, instruments, and new facilities-and this is leading to important scientific discoveries."

For more information, go to http://www.submm.org.

Writer: Robert Tindol

Dwarf Star Gulps Giant to Form Supernova

PASADENA, Calif.—A team of European and American astronomers has announced the discovery of the best evidence yet for the nature of the star systems that explode as type Ia supernovae. The team obtained a unique set of observations with the European Southern Observatory's Very Large Telescope and the Keck I 10-meter telescope in Hawaii.

The researchers were able to detect the signature of the material that surrounded the star before it exploded. The evidence strongly supports the scenario in which the explosion occurred in a double-star system where a white dwarf is fed by a red giant.

"The powerful 10 meter Keck Telescope with its recently refurbished high-resolution spectrograph finally gives us the capability to follow these supernovae for months, as we have done here. We are now busy taking advantage of this new window of opportunity," says Avishay Gal-Yam, an astronomer at the California Institute of Technology who led the Caltech research team. "This is really an exciting avenue that Keck opens up to us."

Because type Ia supernovae are extremely luminous and quite similar to one another, these explosions have been used extensively as cosmological reference beacons to trace the expansion of the universe.

However, despite significant recent progress, the nature of the stars that explode and the physics that governs these powerful explosions have remained poorly understood.

In the most widely accepted models of type Ia supernovae, the pre-explosion white dwarf star interacts with a much larger companion star. Because of the proximity of the two stars and the strong gravitational attraction produced by the very compact white dwarf, the companion star continuously loses mass, "feeding" the white dwarf. When the mass of the white dwarf exceeds a critical value slightly higher than the mass of the sun, it explodes.

"To shed some light on the systems that explode as type Ia supernovae, we decided to search for signatures of the material transferred to the white dwarf in the surrounding material," says Ferdinando Patat, lead author of the paper reporting the results in this week's issue of Science Express, the online version of the research journal Science.

The team of astronomers studied in great detail SN 2006X, a type Ia supernova that exploded 70 million light-years away from us, in the stunning spiral galaxy Messier 100.

The observations were made with the Ultraviolet and Visual Echelle Spectrograph mounted on ESO's 8.2-meter Very Large Telescope on four different occasions, over a time span of four months. A fifth, late-time spectrum of SN 2006X was secured with the Keck Telescope. The astronomers also made use of radio data obtained with the National Radio Astronomy Observatory's Very Large Array, as well as images extracted from the NASA/European Space Agency Hubble Space Telescope archive.

The most remarkable finding is the clear evolution seen in the absorption profile of the sodium lines over the few months following the explosion. This, the astronomers deduce, is linked to the presence of a number of expanding shells surrounding the system. These shells are left over from the star that was force-feeding the white dwarf until its sudden catastrophic and spectacular death.

"The material we have uncovered probably lies in a series of shells having a radius of the order of 0.05 light-years, or roughly 3,000 times the distance between Earth and the sun," explains Patat. "The material is moving with a velocity of 50 km/s, implying that the material would have been ejected some 50 years before the explosion."

Such a velocity is typical for the winds of red giant stars. The system that exploded was thus most likely composed of a white dwarf that acted as a giant "vacuum cleaner," drawing gas off of its red giant companion. In this case, however, the cannibal act proved mortal for the white dwarf. This is the first time that any clear and direct evidence for material surrounding the exploding star has been found.

"We are still not certain whether SN 2006X is a unique case, or is instead representative of all type Ia supernovae. Additional studies of similar objects will be crucial for determining that-and we are already working on observations of another type Ia supernova," says Caltech astronomer Josh Simon, who is leading a similar study of the recent event SN 2007af using observations collected at Keck and other facilities. "We should know even more within the next year," he concludes.

The team is composed of Patat and Luca Pasquini (ESO), Poonam Chandra and Roger Chevalier (University of Virginia), Stephen Justham, Philipp Podsiadlowski, and Christian Wolf (University of Oxford), Gal-Yam and Simon (Caltech), Ian Crawford (Birkbeck College London), Paolo Mazzali, Wolfgang Hillebrandt, and Nancy Elias-Rosa (Max Planck Institute for Astrophysics), Adi Pauldrach (Ludwig Maximilians University), Kenichi Nomoto (University of Tokyo), Stefano Benetti, Enrico Cappellaro, Alvio Renzini, Franco Sabbadin, and Massimo Turatto (INAF-Astronomical Observatory), Douglas Leonard (San Diego State University), and Andrea Pastorello (Queen's University Belfast).

A high-resolution image of SN 2006X in the spiral galaxy Messier 100 is available at http://www.eso.org/public/outreach/press-rel/pr-2006/phot-08-06.html

Writer: Robert Tindol

Caltech, JPL, Northrop Grumman to Celebrate 50 Years of Space Exploration

PASADENA, Calif.--Before October 1957, space flight was a thing of fantasy. Today we are experienced space explorers with unlimited voyages to undertake. Where is space flight's next horizon? What constitutes sensible space investment? How did the space pioneers accomplish their goals? These topics will be addressed at "50 Years in Space: An International Aerospace Conference Celebrating 50 Years of Space Technology," which will take place from September 19 to 21 at the California Institute of Technology.

The conference is hosted by Caltech, the Graduate Aeronautical Laboratories at Caltech (GALCIT), Northrop Grumman Corporation, and the Jet Propulsion Laboratory.

NASA Administrator Michael Griffin, astronaut Harrison "Jack" Schmitt, space industry pioneers and experts, and representatives from foreign space programs will speak on the history of space exploration, sensible space investment, and the future of space exploration from the perspectives of the aerospace industry, academia, government, and science. The opening keynote speaker will be the chairman of Northrop Grumman, Ronald Sugar.

"Our speakers represent all the institutions that essentially created and successfully sustained space exploration," said Ares Rosakis, Theodore von Karman Professor of Aeronautics and Mechanical Engineering and GALCIT director, and co-organizer of the conference with Dwight Streit, vice president, foundation technologies in Northrop Grumman's Space Technology sector. "This group crosses international and institutional boundaries. Each of our speakers is a preeminent expert in at least one of the many disciplines required for space travel. Their passion for space science and technology will make this conference the definitive observance worldwide commemorating 50 years in space," Streit noted.

"Many technologies developed as a result of space exploration have become integral terrestrial technologies--and our efforts benefit society in surprising ways that are completely separate from their initial impetus. As we look to the future, we will see how this important aspect of aeronautics continues--especially in the areas of tracking weather changes, global temperatures, and greenhouse gases, as well as the formations of the earth's crust related to seismic activity," Rosakis said.

The launch of Sputnik on October 4, 1957, began the space age. Within weeks, the Ramo-Wooldridge Corporation spun off Space Technology Laboratories (STL), with Simon Ramo as its president. STL and Ramo-Wooldridge became part of TRW Inc. in 1958, and then eventually part of Northrop Grumman in 2002.

In 1958, the JPL-built Explorer 1 put the U.S. in the space race, followed soon thereafter by Pioneer 1, built by TRW and the first spacecraft launched by NASA.

Ramo, the "R" in TRW, earned his PhD at Caltech in 1936. TRW's Space and Electronics Group became the Space Technology sector at Northrop Grumman. The president of the company's Space Technology sector, Alexis Livanos (also a Caltech graduate, having earned his bachelor's, master's, and PhD at Caltech), will give a special tribute to Ramo, 94, at the conference.

Livanos will join JPL director Charles Elachi (who earned his MS and PhD at Caltech) and Caltech president Jean-Lou Chameau as chairs of the conference. Elachi and Chameau will also be speaking.

Caltech alumnus Harrison "Jack" Schmitt, a geologist, one of the last two men to walk on the moon, and a NASA adviser, will be joined by Ed Stone, former director of JPL, and Gentry Lee, chief engineer for the Planetary Flight Systems Directorate at JPL, for a "look back" at the accomplishments of the past 50 years, many of which they bravely spearheaded. JPL, which became part of NASA soon after the agency's formation in 1958, remains at the center of robotic planetary exploration and Earth-observing science. JPL is managed by Caltech.

Representatives of the top-tier space programs around the globe will also be present, including NASA's Griffin; European Space Agency Director General Jean-Jacques Dordain; President of Centre National d'Études Spatiales Yannick d'Escatha; and Masato Nakamura of the Japanese Institute of Space and Astronautical Science, all of whom will discuss the future of space exploration.

Miles O'Brien, CNN chief technology and environment correspondent, will moderate a panel discussion titled "Space and the Environment: Sensible Space Investment." Participating in the panel, and also presenting a separate talk, is A.P.J. Abdul Kalam, the 11th president of India and a noted scientist and aeronautical engineer.

Other distinguished guests include keynote speaker John C. Mather, James Webb Space Telescope senior project scientist; Elon Musk, SpaceX CEO; Burt Rutan, founder of Scaled Composites; and Hayden Planetarium Director Neil deGrasse Tyson. Mather shared the 2006 Nobel Prize in Physics for the discovery of the blackbody form and anisotropy of the cosmic microwave background radiation, work that helped cement the Big Bang theory. PayPal creator Musk, whose space-transportation company, SpaceX, has opened up a whole new segment of the aerospace industry, will speak on a panel discussing the future of space exploration from an industry perspective. Closing keynote speaker Tyson is the recipient of eight honorary doctorates and was named one of Time magazine's 100 Most Influential People of 2007.

Several speakers will address the aerospace industry's perspective on the future of space flight. These include Musk; David Thompson, chairman and CEO of Orbital Sciences Corporation; Joanne Maguire, executive vice president, space systems, at Lockheed Martin; and David Whelan, corporate vice president, Boeing.

The perspective from academia will come from, among others, France Córdova, a Caltech alumna and the president of Purdue University, and Charles Kennel, former director of the Scripps Institution of Oceanography. Ronald Sega, undersecretary of the United States Air Force and the Defense Department's executive agent for space, will also speak on the future of space exploration.

Participants will be able to view large replicas of spacecraft, rovers, and satellites. "This is more than a sit-and-listen event," said Rosakis. "It is an interactive learning experience. Guests will meet and exchange ideas with like-minded people and professionals between formal presentations. The displays and replicas will also add to the guests' visual understanding of space exploration. They will be able to get a sense of what the presence of these structures really feels like."

Full registration is $550. To register, go to http://www.galcit.caltech.edu/space50/. Registration is on a first-come, first-served basis, and seating is limited.

Employees of Caltech, JPL, Northrop Grumman, and the California Space Authority, as well as Southern California high-school and college students and teachers with ID, are welcome to attend the talks free of charge, but they must register via the website.

Writer: Jill Perry
