Caltech to Offer Online Courses through edX

To expand its involvement in online learning, the California Institute of Technology will offer courses through the online education platform edX beginning this October.

The edX course platform is an online learning initiative launched in 2012 by founding partners Harvard University and the Massachusetts Institute of Technology (MIT). Caltech's rigorous online course offerings will join those of 28 other prestigious colleges and universities in the edX platform's "xConsortium."

This new partnership with edX comes one year after Caltech offered three courses through the online learning platform Coursera in fall 2012. The Institute will now offer courses through both platforms.

"Coursera and edX have some foundational differences which are of interest to the faculty," says Cassandra Horii, director of teaching and learning programs at Caltech. Both organizations offer their courses at no cost to participating students; edX, however, operates as a nonprofit and plans to partner with only a small number of institutions, whereas Coursera—a for-profit, self-described "social entrepreneurship company"—partners with many institutions and state university systems.

The two platforms also emphasize different learning strategies, says Horii. "Coursera has a strong organizational principle built around lectures, so a lot of the interactivity is tied right into the video," she says. Though edX still enables the use of video lectures, a student can customize when he or she would like to take quizzes and use learning resources. In addition, edX allows faculty to embed a variety of learning materials—like textbook chapters, discussions, diagrams, and tables—directly into the platform's layout.

In the future, data collected from both platforms could provide valuable information about how students best learn certain material, especially in the sciences. "Caltech occupies this advanced, really rigorous scientific education space, and in general our interest in these online courses is to maintain that rigor and quality," Horii says. "So, with these learning data, we have some potential contributions to make to the general understanding of learning in this niche that we occupy."

Even before joining edX and Coursera, Caltech had already become an early example of the growing trend of Massive Open Online Courses (MOOCs). Yaser Abu-Mostafa, professor of electrical engineering and computer science, developed his own MOOC on machine learning, called "Learning from Data," and offered it on YouTube and iTunes U beginning in April 2012.

Since its debut, Abu-Mostafa's MOOC has reached more than 200,000 participants, and it received mention in the NMC Horizon Report: 2013 Higher Education Edition—the latest edition of an annual report highlighting important trends in higher education. The course will be offered again in fall 2013 on iTunes U, and is now also open for enrollment in edX.

Although Caltech is now actively exploring several outlets for online learning, the Institute's commitment to educational outreach is not a recent phenomenon. In the early 1960s, Caltech physicist Richard Feynman reorganized the Institute's introductory physics course, incorporating contemporary research topics and making the course more engaging for students. His lectures were recorded and eventually incorporated into a widely popular physics book, The Feynman Lectures on Physics, which has sold millions of copies in a dozen languages.

Continuing in the tradition set by Feynman, the MOOCs at Caltech seek to provide a high-quality learning environment that is rigorous but accessible. "No dumbing down of courses for popular consumption . . . no talking over people's heads either; at Caltech, we explain things well because we understand them well," adds Abu-Mostafa.

More information on Caltech's online learning opportunities is available on the Online Education website.

News Type: In Our Community
Friday, October 4, 2013


Thirty Meter Telescope Project Partners Sign Master Agreement

Document formalizes Caltech's collaboration with several international institutions.

Scientific authorities on the Thirty Meter Telescope (TMT) project announced on Friday that they have now signed a master agreement formalizing project goals and providing a governing framework for the international collaboration. TMT is a partnership among Caltech, the University of California, and a number of astronomical observatories and institutions from Canada, China, India, and Japan.

The agreement establishes an official commitment among the partners and delineates the rights and obligations of its global collaborators. These measures are intended to ensure steady progress for the TMT, which is planned to start construction in April 2014 and begin scientific operations in 2022.

"We are pleased with this vote of confidence from the scientific authorities," said Edward Stone, the David Morrisroe Professor of Physics and vice provost for special projects at Caltech, and vice chair of the TMT Board. "Their signing of this master agreement is a key endorsement of TMT's scientific merits as well as the project's overall implementation plan."


Voyager: Getting to Know the Magnetic Highway

Since last August, NASA's farthest-flung robotic envoy, Voyager 1, has been exploring a distant region just shy of interstellar space, some 11 billion miles (18 billion kilometers) away. Now, in three new papers in Science Express, Voyager scientists, including Caltech's Ed Stone, are sharing what they have learned about the peculiar region known as the "magnetic highway."

In this unexpected region, energetic particles from inside the heliosphere—the "bubble" containing charged particles streaming away from the sun—have escaped and disappeared along a highway of magnetic field lines. Their disappearance has allowed Voyager to see even low-energy cosmic rays that are zipping into the heliosphere from interstellar space.

Stone, the David Morrisroe Professor of Physics at Caltech and the mission's project scientist since 1972, says that what's really interesting is that, although Voyager is still within the realm of the sun's magnetic field, the spacecraft is already getting a taste of what's outside the bubble.

"In this region, Voyager is providing us with a preview of what we're going to see once we reach interstellar space," Stone says. "We're no longer just measuring the high-energy cosmic rays that can get inside the bubble. We're now seeing lower energy cosmic rays as well. We believe we have gotten the first indication of what the total intensity of galactic cosmic rays is in nearby interstellar space."

For more on the findings, read the full NASA release.

Writer: Kimm Fesenmaier
News Type: Research News

Notes from the Back Row: "Quantum Entanglement and Quantum Computing"

John Preskill, the Richard P. Feynman Professor of Theoretical Physics, is hooked on quanta. He was applying quantum theory to black holes back in 1994 when mathematician Peter Shor (BS '81), then at Bell Labs, showed that a quantum computer could factor a very large number in a very short time. Much of the world's confidential information is protected by codes whose security depends on numerical "keys" large enough not to be factorable in the lifetime of your average evildoer, so, Preskill says, "When I heard about this, I was awestruck." The largest number ever factored by a real computer had 193 digits, and it took "several months for a network of hundreds of workstations collaborating over the Internet," Preskill continues. "If we wanted to factor a 500-digit number instead, it would take longer than the age of the universe." And yet, a quantum computer running at the same processor speed could polish off 193 digits in one-tenth of a second, he says. Factoring a 500-digit number would take all of two seconds.
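The classical difficulty comes from the size of the search space: the naive approach must test divisors up to the square root of the number, so the work grows roughly as 10 to the power of half the digit count. A minimal Python sketch illustrates the idea (the 193-digit record was set with far more sophisticated sieve methods; this is only a toy):

```python
def trial_division(n):
    """Factor n by testing divisors up to sqrt(n), the naive classical method.

    The outer loop runs at most ~sqrt(n) times, so for a d-digit number the
    work grows roughly as 10**(d/2) -- hopeless long before 193 digits.
    """
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # whatever remains is prime
    return factors

assert trial_division(3 * 5 * 7) == [3, 5, 7]
```

Shor's insight was that a quantum computer sidesteps this search entirely, reducing factoring to a period-finding problem it can solve in polynomial time.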

While an ordinary computer chews through a calculation one bite at a time, a quantum computer arrives at its answer almost instantaneously because it essentially swallows the problem whole. It can do so because quantum information is "entangled," a state of being that is fundamental to the quantum world and completely foreign to ours. In the world we're used to, the two socks in a pair are always the same color. It doesn't matter who looks at them, where they are, or how they're looked at. There's no such independent reality in the quantum world, where the act of opening one of a matched pair of quantum boxes determines the contents of the other one—even if the two boxes are at opposite ends of the universe—but only if the other box is opened in exactly the same way. "Quantum boxes are not like soxes," Preskill says. (If entanglement sounds like a load of hooey to you, you're not alone. Preskill notes that Albert Einstein famously derided it back in the 1930s. "He called it 'spooky action at a distance,' and that sounds even more derisive when you say it in German—'Spukhafte Fernwirkungen!'")

An ordinary computer processes "bits," which are units of information encoded in batches of electrons, patches of magnetic field, or some other physical form. The "qubits" of a quantum computer are encoded by their entanglement, and these entanglements come with a big Do Not Disturb sign. Because the informational content of a quantum "box" is unknown until you open it and look inside, qubits exist only in secret, making them ideal for spies and high finance. However, this impenetrable security is also the quantum computer's downfall. Such a machine would be morbidly sensitive—the slightest encroachment from the outside world would demolish the entanglement and crash the system.

Ordinary computers cope with errors by storing information in triplicate. If one copy of a bit gets corrupted, it will no longer match the other two; error-detecting software constantly checks the three copies against one another and returns the flipped bit to its original state. Fixing flipped bits when you're not allowed to look at them seems an impossible challenge on the face of it, but after reading Shor's paper Preskill decided to give it a shot. Over the next few years, he and his grad student Daniel Gottesman (PhD '97) worked on quantum error correction, eventually arriving at a mathematical procedure by which indirectly measuring the states of five qubits would allow an error in any one of them to be fixed.
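The classical triplicate scheme is simple enough to sketch: a majority vote over three copies outvotes any single flipped bit. (The quantum five-qubit code achieves the analogous feat without ever reading the qubits directly.) A toy Python illustration, purely classical:

```python
def encode(bit):
    """Store one bit in triplicate -- the classical repetition code."""
    return [bit, bit, bit]

def decode(copies):
    """Majority vote: a single corrupted copy is outvoted by the other two."""
    return 1 if sum(copies) >= 2 else 0

codeword = encode(1)
codeword[0] ^= 1              # a stray error flips one copy
assert decode(codeword) == 1  # the original bit survives
```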

This changed the barriers facing practical quantum computation from insurmountable to merely incredibly difficult. The first working quantum computers, built in several labs in the early 2000s, were based on lasers interacting with what Preskill describes as "a handful" of trapped ions to perform "a modest number of [logic] operations." An ion trap is about the size of a thermos bottle, but the laser systems and their associated electronics take up several hundred square feet of lab space. With several million logic gates on a typical computer chip, scaling up this technology is a really big problem. Is there a better way? Perhaps. According to Preskill, his colleagues at Caltech's Institute for Quantum Information and Matter are working out the details of a "potentially transformative" approach that would allow quantum computers to be made using the same silicon-based technologies as ordinary ones.

"Quantum Entanglement and Quantum Computing" is available for download in HD from Caltech on iTunes U. (Episode 19)

Writer: Douglas Smith
News Type: In Our Community

Birth of a Black Hole

A new kind of cosmic flash may reveal something never seen before: the birth of a black hole.

When a massive star exhausts its fuel, it collapses under its own gravity and produces a black hole, an object so dense that not even light can escape its gravitational grip. According to a new analysis by an astrophysicist at the California Institute of Technology (Caltech), just before the black hole forms, the dying star may generate a distinct burst of light that will allow astronomers to witness the birth of a new black hole for the first time.

Tony Piro, a postdoctoral scholar at Caltech, describes this signature light burst in a paper published in the May 1 issue of the Astrophysical Journal Letters. While some dying stars that result in black holes explode as gamma-ray bursts, which are among the most energetic phenomena in the universe, those cases are rare, requiring exotic circumstances, Piro explains. "We don't think most run-of-the-mill black holes are created that way." In most cases, according to one hypothesis, a dying star produces a black hole without a bang or a flash: the star would seemingly vanish from the sky—an event dubbed an unnova. "You don't see a burst," he says. "You see a disappearance."

But, Piro hypothesizes, that may not be the case. "Maybe they're not as boring as we thought," he says.

According to well-established theory, when a massive star dies, its core collapses under its own weight. As it collapses, the protons and electrons that make up the core merge and produce neutrons. For a few seconds—before it ultimately collapses into a black hole—the core becomes an extremely dense object called a neutron star, which is as dense as the sun would be if squeezed into a sphere with a radius of about 10 kilometers (roughly 6 miles). This collapsing process also creates neutrinos, which are particles that zip through almost all matter at nearly the speed of light. As the neutrinos stream out from the core, they carry away a lot of energy—representing about a tenth of the sun's mass (since energy and mass are equivalent, per E = mc²).
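As a rough check of that figure, plugging a tenth of a solar mass (about $2\times10^{30}$ kg) into the mass–energy relation gives

```latex
E = \Delta m\,c^{2}
  \approx \left(0.1 \times 2\times10^{30}\,\mathrm{kg}\right)
          \left(3\times10^{8}\,\mathrm{m/s}\right)^{2}
  \approx 2\times10^{46}\,\mathrm{J},
```

an enormous reservoir of energy released in just those few seconds.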

According to a little-known paper written in 1980 by Dmitry Nadezhin of the Alikhanov Institute for Theoretical and Experimental Physics in Russia, this rapid loss of mass means that the gravitational strength of the dying star's core would abruptly drop. When that happens, the outer gaseous layers—mainly hydrogen—still surrounding the core would rush outward, generating a shock wave that would hurtle through the outer layers at about 1,000 kilometers per second (more than 2 million miles per hour).
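The two quoted speeds are consistent: with 1 mile ≈ 1.609 km, the conversion works out to

```latex
1000\,\mathrm{km/s} \times 3600\,\mathrm{s/h}
  = 3.6\times10^{6}\,\mathrm{km/h}
  \approx 2.2\times10^{6}\,\mathrm{mph}.
```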

Using computer simulations, two astronomers at UC Santa Cruz, Elizabeth Lovegrove and Stan Woosley, recently found that when the shock wave strikes the outer surface of the gaseous layers, it would heat the gas at the surface, producing a glow that would shine for about a year—a potentially promising signal of a black-hole birth. Although about a million times brighter than the sun, this glow would be relatively dim compared to other stars. "It would be hard to see, even in galaxies that are relatively close to us," says Piro.

But now Piro says he has found a more promising signal. In his new study, he examines in more detail what might happen at the moment when the shock wave hits the star's surface, and he calculates that the impact itself would make a flash 10 to 100 times brighter than the glow predicted by Lovegrove and Woosley. "That flash is going to be very bright, and it gives us the best chance for actually observing that this event occurred," Piro explains. "This is what you really want to look for."

Such a flash would be dim compared to exploding stars called supernovae, for example, but it would be luminous enough to be detectable in nearby galaxies, he says. The flash, which would shine for 3 to 10 days before fading, would be very bright in optical wavelengths—and at its very brightest in ultraviolet wavelengths.

Piro estimates that astronomers should be able to see one of these events per year on average. Surveys that watch the skies for flashes of light like supernovae—surveys such as the Palomar Transient Factory (PTF), led by Caltech—are well suited to discover these unique events, he says. The intermediate Palomar Transient Factory (iPTF), which improves on the PTF and just began surveying in February, may be able to find a couple of these events per year.

Neither survey has observed any black-hole flashes as of yet, says Piro, but that does not rule out their existence. "Eventually we're going to start getting worried if we don't find these things." But for now, he says, the absence of detections is still consistent with his predictions.

With Piro's analysis in hand, astronomers should be able to design and fine-tune additional surveys to maximize their chances of witnessing a black-hole birth in the near future. In 2015, the next generation of PTF, called the Zwicky Transient Facility (ZTF), is slated to begin; it will be even more sensitive, improving by several times the chances of finding those flashes. "Caltech is therefore really well-positioned to look for transient events like this," Piro says.

Within the next decade, the Large Synoptic Survey Telescope (LSST) will begin a massive survey of the entire night sky. "If LSST isn't regularly seeing these kinds of events, then that's going to tell us that maybe there's something wrong with this picture, or that black-hole formation is much rarer than we thought," he says.

The Astrophysical Journal Letters paper is titled "Taking the 'un' out of unnovae." This research was supported by the National Science Foundation, NASA, and the Sherman Fairchild Foundation.

Writer: Marcus Woo

Astronomers Discover Massive Star Factory in Early Universe

Star-forming galaxy is the most distant ever found

PASADENA, Calif.—Smaller begets bigger.

Such is often the case for galaxies, at least: the first galaxies were small, then eventually merged together to form the behemoths we see in the present universe.

Those smaller galaxies produced stars at a modest rate; only later—when the universe was a couple of billion years old—did the vast majority of larger galaxies begin to form and accumulate enough gas and dust to become prolific star factories. Indeed, astronomers have observed that these star factories—called starburst galaxies—became prevalent a couple of billion years after the Big Bang.

But now a team of astronomers, which includes several from the California Institute of Technology (Caltech), has discovered a dust-filled, massive galaxy churning out stars when the cosmos was a mere 880 million years old—making it the earliest starburst galaxy ever observed.

The galaxy is about as massive as our Milky Way but produces stars at a rate 2,000 times greater, a rate as high as that of any galaxy in the universe. Generating the mass equivalent of 2,900 suns per year, the galaxy is especially prodigious—prompting the team to call it a "maximum-starburst" galaxy.

"Massive, intense starburst galaxies are expected to only appear at later cosmic times," says Dominik Riechers, who led the research while a senior research fellow at Caltech. "Yet, we have discovered this colossal starburst just 880 million years after the Big Bang, when the universe was little more than 6 percent of its current age." Now an assistant professor at Cornell, Riechers is the first author of the paper describing the findings in the April 18 issue of the journal Nature.

While the discovery of this single galaxy isn't enough to overturn current theories of galaxy formation, finding more galaxies like this one could challenge those theories, the astronomers say. At the very least, theories will have to be modified to explain how this galaxy, dubbed HFLS3, formed, Riechers says.

"This galaxy is just one spectacular example, but it's telling us that extremely vigorous star formation was possible early in the universe," says Jamie Bock, professor of physics at Caltech and a coauthor of the paper.

The astronomers found HFLS3 chock full of molecules such as carbon monoxide, ammonia, hydroxide, and even water. Because most of the elements in the universe—other than hydrogen and helium—are fused in the nuclear furnaces of stars, such a rich and diverse chemical composition is indicative of active star formation. And indeed, Bock says, the chemical composition of HFLS3 is similar to those of other known starburst galaxies that existed later in cosmic history.

Last month, a Caltech-led team of astronomers—a few of whom are also authors on this newer work—discovered dozens of similar galaxies that were producing stars as early as 1.5 billion years after the Big Bang. But none of them existed as early as HFLS3, which has been studied in much greater detail.

Those previous observations were made possible by gravitational lensing, in which large foreground galaxies act as cosmic magnifying glasses, bending the light of the starburst galaxies and making their detection easier. HFLS3, however, is only weakly lensed, if at all. The fact that it was detectable without the help of lensing means that it is intrinsically a bright galaxy in far-infrared light—nearly 30 trillion times as luminous as the sun and 2,000 times more luminous than the Milky Way.

Because the galaxy is enshrouded in dust, it's very faint in visible light. The galaxy's stars, however, heat up the dust, causing it to radiate in infrared wavelengths. The astronomers were able to find HFLS3 as they sifted through data taken by the European Space Agency's Herschel Space Observatory, which studies the infrared universe. The data was part of the Herschel Multi-tiered Extragalactic Survey (HerMES), an effort co-coordinated by Bock to observe a large patch of the sky (roughly 1,300 times the size of the moon) with Herschel.

Amid the thousands of galaxies detected in the survey, HFLS3 appeared as just a faint dot—but a particularly red one. That caught the attention of Darren Dowell, a visiting associate at Caltech who was analyzing the HerMES data. The object's redness meant that its light was being substantially stretched toward longer (and redder) wavelengths by the expansion of the universe. The more distant an object, the more its light is stretched, and so a very red source would be very far away. The only other possibility would be that—because cooler objects emit light at longer wavelengths—the object might be unusually cold; the astronomers' analysis, however, ruled out that possibility. Because it takes the light billions of years to travel across space, seeing such a distant object is equivalent to looking deep into the past. "We were hoping to find a massive starburst galaxy at vast distances, but we did not expect that one would even exist that early in the universe," Riechers says.
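The stretching is quantified by redshift; for HFLS3, the Nature paper reports a redshift of z = 6.34, so every wavelength arrives stretched by a factor of 1 + z:

```latex
\lambda_{\mathrm{obs}} = (1 + z)\,\lambda_{\mathrm{emit}} = 7.34\,\lambda_{\mathrm{emit}}.
```

That factor pushes the galaxy's peak dust emission well into the far-infrared bands Herschel was built to survey.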

To study HFLS3 further, the astronomers zoomed in with several other telescopes. Using the Combined Array for Research in Millimeter-Wave Astronomy (CARMA)—a series of telescope dishes that Caltech helps operate in the Inyo Mountains of California—as well as the Z-Spec instrument on the Caltech Submillimeter Observatory on Mauna Kea in Hawaii, the team was able to study the chemical composition of the galaxy in detail—in particular, the presence of water and carbon monoxide—and measure its distance. The researchers also used the 10-meter telescope at the W. M. Keck Observatory on Mauna Kea to determine to what extent HFLS3 was gravitationally lensed.

This galaxy is the first such object in the HerMES survey to be analyzed in detail. This type of galaxy is rare, the astronomers say, but to determine just how rare, they will pursue more follow-up studies to see if they can find more of them lurking in the HerMES data. These results also hint at what may soon be discovered with larger infrared observatories, such as the new Atacama Large Millimeter/submillimeter Array (ALMA) in Chile and the planned Cerro Chajnantor Atacama Telescope (CCAT), of which Caltech is a partner institution.

The title of the Nature paper is "A Dust-Obscured Massive Maximum-Starburst Galaxy at a Redshift of 6.34." In addition to Riechers, Bock, and Dowell, the other Caltech authors of the paper are visiting associates in physics Matt Bradford, Asantha Cooray, and Hien Nguyen; postdoctoral scholars Carrie Bridge, Attila Kovacs, Joaquin Vieira, Marco Viero, and Michael Zemcov; staff research scientist Eric Murphy; and Jonas Zmuidzinas, the Merle Kingsley Professor of Physics and the Chief Technologist at NASA's Jet Propulsion Laboratory (JPL). There are a total of 64 authors. Bock, Dowell, and Nguyen helped build the Spectral and Photometric Imaging Receiver (SPIRE) instrument on Herschel.

Herschel is a European Space Agency cornerstone mission, with science instruments provided by consortia of European institutes and with important participation by NASA. NASA's Herschel Project Office is based at JPL in Pasadena, California. JPL contributed mission-enabling technology for two of Herschel's three science instruments. The NASA Herschel Science Center, part of the Infrared Processing and Analysis Center at Caltech in Pasadena, supports the U.S. astronomical community. Caltech manages JPL for NASA.

The W. M. Keck Observatory operates the largest, most scientifically productive telescopes on Earth. The two 10-meter optical/infrared telescopes on the summit of Mauna Kea on the island of Hawaii feature a suite of advanced instruments including imagers, multi-object spectrographs, high-resolution spectrographs, integral-field spectroscopy and a world-leading laser guide-star adaptive optics system. The observatory is operated by a private 501(c)(3) nonprofit organization and is a scientific partnership of the California Institute of Technology, the University of California, and NASA.

Writer: Marcus Woo
News Type: Research News
