Caltech, JPL Team Captures Movement on Nepal Earthquake Fault Rupture

Using a combination of satellite radar imaging data, GPS data measured in and near Nepal, and seismic observations from instruments around the world, Caltech and JPL scientists have constructed a preliminary picture of what happened below Earth's surface during the recent 7.8-magnitude Gorkha earthquake in Nepal.

The team's observations and models of the April 25, 2015, earthquake, produced through the Advanced Rapid Imaging and Analysis (ARIA) project—a collaboration between Caltech and JPL—include preliminary estimates of the slippage of the fault beneath Earth's surface that resulted in the deaths of thousands of people. In addition, the ARIA scientists have provided first responders and key officials in Nepal with information and maps that show block-by-block building devastation as well as measurements of ground movement at individual locations around the country.

"As the number of orbiting imaging radar and optical satellites that form the international constellation increases, the expected amount of time it takes to acquire an image of an impacted area will decrease, allowing for products such as those we have made for Nepal to become more commonly and rapidly available," says Mark Simons, professor of geophysics at Caltech and a member of the ARIA team. "I fully expect that within five years, this kind of information will be available within hours of a big disaster, ultimately resulting in an ability to save more lives after a disaster and to make assessment and response more efficient in both developed and developing nations."

Over the last five years, Simons and his colleagues in Caltech's Seismological Laboratory and at JPL have been developing the approaches, infrastructure, and technology to rapidly and automatically use satellite-based observations to measure the movement of Earth's surface associated with earthquakes, volcanoes, landslides, and other geophysical processes.

"ARIA is ultimately aimed at providing tools and data—for use by groups ranging from first responders, to government agencies, and individual scientists—that can help improve situational awareness, response, and recovery after many natural disasters," Simons says. "The same products also provide key observational constraints on our physical understanding of the underlying processes such as the basic physics controlling seismogenic behavior of major faults."

ARIA is funded through a combination of support from JPL, Caltech, and NASA.

Frontpage Title: 
Caltech, JPL Team Get Clearer Picture of Nepal Earthquake
Listing Title: 
Caltech, JPL Team Get Clearer Picture of Nepal Earthquake
Writer: 
Exclude from News Hub: 
No
News Type: 
Research News

Tracking Photosynthesis from Space

Watching plants perform photosynthesis from space sounds like a futuristic proposal, but a new application of data from NASA's Orbiting Carbon Observatory-2 (OCO-2) satellite may enable scientists to do just that. The new technique, which allows researchers to analyze plant productivity from far above Earth, will provide a clearer picture of the global carbon cycle and may one day help researchers determine the best regional farming practices and even spot early signs of drought.

When plants are alive and healthy, they engage in photosynthesis, absorbing sunlight and carbon dioxide to produce food for the plant, and generating oxygen as a by-product. But photosynthesis does more than keep plants alive. On a global scale, the process takes up some of the man-made emissions of atmospheric carbon dioxide—a greenhouse gas that traps the sun's heat down on Earth—meaning that plants also have an important role in mitigating climate change.

To perform photosynthesis, the chlorophyll in leaves absorbs sunlight—most of which is used to create food for the plants or is lost as heat. However, a small fraction of that absorbed light is reemitted as near-infrared light. We cannot see in the near-infrared portion of the spectrum with the naked eye, but if we could, this reemitted light would make the plants appear to glow—a property called solar-induced fluorescence (SIF). Because this reemitted light is only produced when the chlorophyll in plants is also absorbing sunlight for photosynthesis, SIF can be used as a way to determine a plant's photosynthetic activity and productivity.
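
The proportionality described above can be sketched in a few lines of code. This is a toy model: the light levels and the one-percent fluorescence yield are illustrative assumptions, not values from the research.

```python
# Toy model of solar-induced fluorescence (SIF).
# The 1% fluorescence yield and the light levels are illustrative assumptions.

def sif_proxy(absorbed_light, fluorescence_yield=0.01):
    """A small, roughly constant fraction of the sunlight a leaf
    absorbs is reemitted as near-infrared fluorescence."""
    return absorbed_light * fluorescence_yield

# More absorbed sunlight (more photosynthesis) -> a stronger SIF signal.
low = sif_proxy(absorbed_light=200.0)    # shaded canopy (arbitrary units)
high = sif_proxy(absorbed_light=1000.0)  # full sun

assert high > low
```

Because the reemitted fraction tracks what the chlorophyll is actually absorbing, the SIF signal rises and falls with photosynthetic activity, which is the basis of the measurement.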

"The intensity of the SIF appears to be very correlated with the total productivity of the plant," says JPL scientist Christian Frankenberg, who is lead for the SIF product and will join the Caltech faculty in September as an associate professor of environmental science and engineering in the Division of Geological and Planetary Sciences.

Usually, when researchers try to estimate photosynthetic activity from satellites, they utilize a measure called the greenness index, which uses reflections in the near-infrared spectrum of light to determine the amount of chlorophyll in the plant. However, this is not a direct measurement of plant productivity; a plant that contains chlorophyll is not necessarily undergoing photosynthesis. "For example," Frankenberg says, "evergreen trees are green in the winter even when they are dormant."

He adds, "When a plant starts to undergo stress situations, like in California during a summer day when it's getting very hot and dry, the plants still have chlorophyll"—chlorophyll that would still appear to be active in the greenness index—"but they usually close the tiny pores in their leaves to reduce water loss, and that time of stress is also when SIF is reduced. So photosynthesis is being very strongly reduced at the same time that the fluorescence signal is also getting weaker, albeit at a smaller rate."

The Caltech and JPL team, as well as colleagues from NASA Goddard, discovered that they could measure SIF from orbit using spectrometers—standard instruments that can detect light intensity—that are already on board satellites like Japan's Greenhouse Gases Observing Satellite (GOSAT) and NASA's OCO-2.

In 2014, using this new technique with data from GOSAT and the European Global Ozone Monitoring Experiment–2 satellite, the researchers scoured the globe for the most productive plants and determined that the U.S. "Corn Belt"—the farming region stretching from Ohio to Nebraska—is the most photosynthetically active place on the planet. Although it stands to reason that a cornfield during growing season would be actively undergoing photosynthesis, the high-resolution measurements from a satellite enabled global comparison to other plant-heavy regions—such as tropical rainforests.

"Before, when people used the greenness index to represent active photosynthesis, they had trouble determining the productivity of very dense plant areas, such as forests or cornfields. With enough green plant material in the field of view, these greenness indexes can saturate; they reach a maximum value they can't exceed," Frankenberg says. Because of the sensitivity of the SIF measurements, researchers can now compare the true productivity of fields from different regions without this saturation—information that could potentially be used to compare the efficiency of farming practices around the world.
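
The saturation effect Frankenberg describes can be sketched with a toy comparison. The response curves and coefficients below are illustrative assumptions, not the actual index definitions.

```python
import math

# Toy comparison of a saturating greenness index with a SIF-like signal.
# The response curves and coefficients are illustrative assumptions.

def greenness_like(leaf_area):
    """Greenness indexes rise quickly with plant material, then flatten:
    past a point, adding more leaves barely changes the value."""
    return 1.0 - math.exp(-0.7 * leaf_area)

def sif_like(leaf_area):
    """SIF keeps scaling with total absorbed light (simplified as linear
    here), so dense canopies remain distinguishable."""
    return 0.01 * leaf_area

# A moderately dense and a very dense field look nearly identical to the
# greenness index, but not to the SIF-like signal.
moderate, dense = 4.0, 8.0
assert greenness_like(dense) - greenness_like(moderate) < 0.1
assert sif_like(dense) > 1.5 * sif_like(moderate)
```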

Now that OCO-2 is online and producing data, Frankenberg says that it is capable of achieving higher resolution than the preliminary experiments with GOSAT. Therefore, OCO-2 will be able to provide an even clearer picture of plant productivity worldwide. However, to get more specific information about how plants influence the global carbon cycle, an evenly distributed ground-based network of spectrometers will be needed. Such a network—located down among the plants rather than miles above—will provide more information about regional uptake of carbon dioxide via photosynthesis and the mechanistic link between SIF and actual carbon exchange.

One existing network, called FLUXNET, uses ground-based towers at more than 600 locations worldwide to measure the exchange of carbon dioxide, or carbon flux, between the land and the atmosphere. However, the towers only measure the exchange of carbon dioxide and are unable to directly observe the activities of the biosphere that drive this exchange.

The new ground-based measurements will ideally take place at existing FLUXNET sites, but they will be performed with a small set of high-resolution spectrometers—similar to the kind that OCO-2 uses—to allow the researchers to use the same measurement principles they developed for space. The revamped ground network was initially proposed in a 2012 workshop at the Keck Institute for Space Studies and is expected to go online sometime in the next two years.

In the future, a clear picture of global plant productivity could influence a range of decisions relevant to farmers, commodity traders, and policymakers. "Right now, the SIF data we can gather from space is too coarse of a picture to be really helpful for these conversations, but, in principle, with the satellite and ground-based measurements you could track the fluorescence in fields at different times of day," he says. This hourly tracking would not only allow researchers to detect the productivity of the plants, but it could also spot the first signs of plant stress—a factor that impacts crop prices and food security around the world.

"The measurements of SIF from OCO-2 greatly extend the science of this mission," says Paul Wennberg, R. Stanton Avery Professor of Atmospheric Chemistry and Environmental Science and Engineering, director of the Ronald and Maxine Linde Center for Global Environmental Science, and a member of the OCO-2 science team. "OCO-2 was designed to map carbon dioxide, and scientists plan to use these measurements to determine the underlying sources and sinks of this important gas. The new SIF measurements will allow us to diagnose the efficiency of the plants—a key component of the sinks of carbon dioxide."

By using OCO-2 to diagnose plant activity around the globe, this new research could also contribute to understanding the variability in crop primary productivity and, eventually, to the development of technologies that can improve crop efficiency—a goal that could greatly benefit humankind, Frankenberg says.

This project is funded by the Keck Institute for Space Studies and JPL. Wennberg is also an executive officer for the Environmental Science and Engineering (ESE) program. ESE is a joint program of the Division of Engineering and Applied Science, the Division of Chemistry and Chemical Engineering, and the Division of Geological and Planetary Sciences.

Writer: 
Exclude from News Hub: 
No
News Type: 
Research News

Caltech’s Linde Center Helps Navigate the Southern Ocean

At Caltech's Ronald and Maxine Linde Center for Global Environmental Science, researchers from diverse disciplines work together to investigate Earth's climate and its atmosphere, oceans, and biosphere; their evolution; and how they may change in the future.

In early February, the center hosted a three-day workshop focused on the Southern Ocean around Antarctica. Scientists from around the world working at the intersection of fluid dynamics and biochemistry gathered to summarize our current knowledge of the physical, chemical, and biological processes that are critical to the Southern Ocean's circulation and marine ecosystems. The researchers set out to identify areas where collaboration across disciplines is needed to push that understanding forward. Here are a few of the topics they covered.

The Use of Autonomous Underwater Vehicles for Observation 


Credit: Sunke Schmidtko

The Southern Ocean is one of the most inhospitable places on Earth. Despite the area's importance to the global climate, measurements and data are hard to come by because it is difficult to deploy research vessels in the region, especially in winter. Little, if any, data have been collected in some areas, especially in the deep ocean and underneath ice shelves.

But many new tools now exist to improve data collection and measurement in these remote regions. Autonomous gliders (shown above) have gathered information on currents, water density, and temperature at many depths, helping researchers such as workshop participants Nicole Couto (Rutgers University) and Mike Meredith (British Antarctic Survey), as well as Caltech's Andrew Thompson, assistant professor of environmental science and engineering, understand how warm waters are causing ice sheets to melt. Meanwhile, an extensive system of autonomous floats monitors temperature, salinity, dissolved gases, and currents in Earth's oceans; moored instruments track what is happening beneath ice shelves; and even Antarctic seals outfitted with sensors provide scientists access to, and information about, some of the ocean's coldest and most inaccessible waters.

Iron Limitation on Phytoplankton Growth


Credit: NASA/Suomi NPP/Norman Kuring

Phytoplankton, microscopic algae that perform photosynthesis, form the base of the Southern Ocean food web. These organisms require both nutrients and sunlight to survive. Nutrients and sunlight (at least in summer) are plentiful in the Southern Ocean, yet many parts of it have extremely low phytoplankton concentrations. This is because not all nutrients are treated equally. Take iron, for example. Although iron is needed only in small amounts by phytoplankton, it is scarce throughout most of the Southern Ocean. Iron enters ocean waters by way of dust falling out of the atmosphere, from melting icebergs or glaciers, and from the ocean floor. How these sources of iron will respond to changing atmospheric and oceanic conditions, and how Southern Ocean ecosystems will adapt, are important research questions that meeting participants Phil Boyd (University of Tasmania) and Nicolas Cassar (Duke University) are working to answer.

Phytoplankton distributions are largely observed by measuring ocean color from space. This image shows data from NASA's MODIS (MODerate resolution Imaging Spectroradiometer) instrument, which measures light coming off the ocean; NASA scientists use this information to determine the concentration of phytoplankton in the water. Here, yellow and orange colors indicate the presence of more phytoplankton.

The Importance of High Spatial Resolution in Ocean Models


Credit: Jeff Schmalz/NASA

The ocean is similar to the atmosphere in that much of the variability is contained in "weather systems," or high- and low-pressure areas. These weather systems create swirling currents, called eddies, that are the ocean equivalent of atmospheric storms. While storms in the atmosphere span hundreds of kilometers, ocean eddies cover only a few tens of kilometers. When numerical models, such as those run by meeting participant Andy Hogg (Australian National University), capture these smaller scales, the simulations explode with previously unseen dynamics and produce an energetic circulation that is more vigorous than that seen in models that only simulate larger scales.
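
A common rule of thumb lies behind this: a model can only capture features that span several of its grid cells. The sketch below uses illustrative sizes and an assumed five-points-across criterion.

```python
# Rule-of-thumb resolution check: a model resolves a feature only if
# several grid points span it. The feature sizes and the five-point
# criterion are illustrative assumptions.

def resolves(feature_km, grid_km, points_needed=5):
    return feature_km / grid_km >= points_needed

atmos_storm = resolves(feature_km=500, grid_km=100)  # storm, coarse grid
coarse_eddy = resolves(feature_km=30, grid_km=100)   # eddy, coarse grid
fine_eddy = resolves(feature_km=30, grid_km=5)       # eddy-resolving grid

print(atmos_storm, coarse_eddy, fine_eddy)  # True False True
```

A grid fine enough for atmospheric storms is still an order of magnitude too coarse for ocean eddies, which is why eddy-resolving ocean models are so much more expensive to run.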

This image of Chatham Island, off the coast of New Zealand, was taken by MODIS. The blue wispy pattern (upper right) is a phytoplankton bloom that is being stretched and stirred by ocean eddies. Images like this one verify that high-resolution numerical models accurately reproduce oceanic motions and provide insight into how these small-scale currents influence Southern Ocean ecosystems.

Heat Input


Credit: Courtesy of Whit Anderson/The Geophysical Fluid Dynamics Lab in Princeton, NJ

Increasing carbon dioxide concentrations in the atmosphere warm the planet, with roughly 90 percent of the extra energy going into the oceans. The ocean warming that results is not uniform around the globe. Numerical models from the group of meeting participant John Marshall (MIT) suggest that the warming of the Southern Ocean will occur later than that of other oceans. The reason? The Southern Ocean provides a gateway where cold, dense waters, stored in the deep ocean, are brought up to the surface by the ocean circulation and are exposed to the atmosphere. These cold waters have the potential to store a large amount of heat. Understanding when this reservoir will be exhausted is critical to predicting future Southern Ocean temperature changes.

In this sea-surface temperature map created by a NOAA Geophysical Fluid Dynamics Laboratory model, Southern Ocean waters (green and blue) represent regions where cold water rises up to the surface, warms, and moves northward.

The Distribution of Sea Ice


Credit: Hannah Joy-Warren, Stanford graduate student, taken during the Phantastic II cruise to the west Antarctic Peninsula (October/November 2014).

The distribution of sea ice in the Southern Ocean is important for many reasons. For instance, sea ice can act as a cap on the ocean, limiting atmospheric interactions with the ocean surface that may trap carbon in the deep ocean. Recently, Caltech researchers including Thompson and Jess Adkins, professor of geochemistry and global environmental science, discovered a link between the distribution of sea ice in the Southern Ocean and differences in the ocean circulation in our present climate and at the Last Glacial Maximum.

As sea ice retreats, additional melting can be a source of iron to the ocean, influencing phytoplankton growth. The capacity for plankton and other organisms to survive the Antarctic winter is only just beginning to be understood, as explained in a recent review article on sea ice ecosystems by meeting participant Kevin Arrigo (Stanford University). Future under-ice observations are needed to improve our ability to estimate ecosystem changes in polar regions.

Frontpage Title: 
Navigating the Southern Ocean
Listing Title: 
Navigating the Southern Ocean
Exclude from News Hub: 
No
Short Title: 
Navigating the Southern Ocean
News Type: 
Research News

Chemists Create “Comb” that Detects Terahertz Waves with Extreme Precision

Light can come in many frequencies, only a small fraction of which can be seen by humans. Between the invisible low-frequency radio waves used by cell phones and the high frequencies associated with infrared light lies a fairly wide swath of the electromagnetic spectrum occupied by what are called terahertz, or sometimes submillimeter, waves. Exploitation of these waves could lead to many new applications in fields ranging from medical imaging to astronomy, but terahertz waves have proven tricky to produce and study in the laboratory. Now, Caltech chemists have created a device that generates and detects terahertz waves over a wide spectral range with extreme precision, allowing it to be used as an unparalleled tool for measuring terahertz waves.

The new device is an example of what is known as a frequency comb, which uses ultrafast pulsed lasers, or oscillators, to produce thousands of unique frequencies of radiation distributed evenly across a spectrum like the teeth of a comb. Scientists can then use them like rulers, lining up the teeth like tick marks to very precisely measure light frequencies. The first frequency combs, developed in the 1990s, earned their creators (John Hall of JILA and Theodor Hänsch of the Max Planck Institute of Quantum Optics and Ludwig Maximilians University Munich) the 2005 Nobel Prize in Physics. These combs, which originated in the visible part of the spectrum, have revolutionized how scientists measure light, leading, for example, to the development of today's most accurate timekeepers, known as optical atomic clocks.

The team at Caltech combined commercially available lasers and optics with custom-built electronics to extend this technology to the terahertz, creating a terahertz frequency comb with an unprecedented combination of spectral coverage and precision. Its thousands of "teeth" are evenly spaced across the majority of the terahertz region of the spectrum (0.15-2.4 THz), giving scientists a way to simultaneously measure absorption in a sample at all of those frequencies.
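
The comb's structure can be sketched directly: with the offset frequency taken as zero for simplicity, tooth n sits at n times the repetition rate. The 100 MHz spacing below is an illustrative assumption; the 0.15-2.4 THz span is from the article.

```python
import math

# Sketch of a frequency comb's evenly spaced teeth, f_n = n * f_rep
# (carrier-offset frequency taken as zero for simplicity).
# The 100 MHz tooth spacing is an illustrative assumption.

f_rep = 100e6                  # tooth spacing in Hz (assumed)
f_lo, f_hi = 0.15e12, 2.4e12   # terahertz span covered by the comb

n_lo = math.ceil(f_lo / f_rep)   # first tooth index inside the span
n_hi = math.floor(f_hi / f_rep)  # last tooth index inside the span
teeth = [n * f_rep for n in range(n_lo, n_hi + 1)]

# Tens of thousands of evenly spaced frequencies, all usable at once.
assert len(teeth) > 10000
assert teeth[0] >= f_lo and teeth[-1] <= f_hi
```

Even at this assumed spacing, the span holds tens of thousands of teeth, consistent with the more than 10,000 simultaneous frequencies Blake describes below.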

The work is described in a paper that appears in the online version of the journal Physical Review Letters and will be published in the April 24 issue. The lead author is graduate student and National Science Foundation fellow Ian Finneran, who works in the lab of Geoffrey A. Blake, professor of cosmochemistry and planetary sciences and professor of chemistry at Caltech.

Blake explains the utility of the new device, contrasting it with a common radio tuner. "With radio waves, most tuners let you zero in on and listen to just one station, or frequency, at a time," he says. "Here, in our terahertz approach, we can separate and process more than 10,000 frequencies all at once. In the near future, we hope to bump that number up to more than 100,000."

That is important because the terahertz region of the spectrum is chock-full of information. Everything in the universe that is warmer than about 10 kelvins (-263 degrees Celsius) gives off terahertz radiation. Even at these very low temperatures, molecules can rotate in space, yielding unique fingerprints in the terahertz. Astronomers using telescopes such as the Caltech Submillimeter Observatory, the Atacama Large Millimeter Array, and the Herschel Space Observatory are searching stellar nurseries and planet-forming disks at terahertz frequencies, looking for such chemical fingerprints to try to determine the kinds of molecules that are present and thus available to planetary systems. But in just a single chunk of the sky, it would not be unusual to find signatures of 25 or more different molecules.

To be able to definitively identify specific molecules within such a tangle of terahertz signals, scientists first need to determine exact measurements of the chemical fingerprints associated with various molecules. This requires a precise source of terahertz waves, in addition to a sensitive detector, and the terahertz frequency comb is ideal for making such measurements in the lab.

"When we look up into space with terahertz light, we basically see this forest of lines related to the tumbling motions of various molecules," says Finneran. "Unraveling and understanding these lines is difficult, as you must trek across that forest one point and one molecule at a time in the lab. It can take weeks, and you would have to use many different instruments. What we've developed, this terahertz comb, is a way to analyze the entire forest all at once."

After the device generates its tens of thousands of evenly spaced frequencies, the waves travel through a sample—in the paper, the researchers provide the example of water vapor. The instrument then measures what light passes through the sample and what gets absorbed by molecules at each tooth along the comb. If a detected tooth gets shorter, the sample absorbed that particular terahertz wave; if it comes through at the baseline height, the sample did not absorb at that frequency.
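
That comparison amounts to a simple per-tooth ratio. In this sketch, the baseline height and the handful of detected tooth heights are illustrative values, not measurements from the paper.

```python
# Per-tooth absorption readout: compare each detected tooth's height
# with its no-sample baseline. All heights here are illustrative values.

baseline = 1.0  # tooth height with no sample in the beam (normalized)

# frequency (Hz) -> detected tooth height with the sample in place
detected = {0.55e12: 0.35, 0.75e12: 1.0, 1.10e12: 0.62}

for freq, height in detected.items():
    transmitted = height / baseline  # fraction that passed through
    absorbed = 1.0 - transmitted     # fraction the sample took up
    tag = "absorbed" if absorbed > 0 else "no absorption"
    print(f"{freq / 1e12:.2f} THz: {tag} ({absorbed:.0%})")
```

Repeating this ratio at every tooth yields an absorption spectrum across the comb's full span in a single measurement.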

"Since we know exactly where each of the tick marks on our ruler is to about nine digits, we can use this as a diagnostic tool to get these frequencies really, really precisely," says Finneran. "When you look up in space, you want to make sure that you have such very exact measurements from the lab."

In addition to the astrochemical application of identifying molecules in space, the terahertz comb will also be useful for studying fundamental interactions between molecules. "The terahertz is unique in that it is really the only direct way to look not only at vibrations within individual large molecules that are important to life, but also at vibrations between different molecules that govern the behavior of liquids such as water," says Blake.

Additional coauthors on the paper, "Decade-Spanning High-Precision Terahertz Frequency Comb," include current Caltech graduate students Jacob Good, P. Brandon Carroll, and Marco Allodi, as well as recent graduate Daniel Holland (PhD '14). The work was supported by funding from the National Science Foundation.

Writer: 
Kimm Fesenmaier
Frontpage Title: 
“Combing” Through Terahertz Waves
Listing Title: 
“Combing” Through Terahertz Waves
Contact: 
Writer: 
Exclude from News Hub: 
No
Short Title: 
“Combing” Through Terahertz Waves
News Type: 
Research News

Understanding the Earth at Caltech

Created by: 
Teaser Image: 
Listing Title: 
Understanding the Earth at Caltech
Frontpage Title: 
Understanding the Earth at Caltech
Slideshow: 
Credit: Courtesy J. Andrade/Caltech

The ground beneath our feet may seem unexceptional, but it has a profound impact on the mechanics of landslides, earthquakes, and even Mars rovers. That is why civil and mechanical engineer Jose Andrade studies soils as well as other granular materials. Andrade creates computational models that capture the behavior of these materials—simulating a landslide or the interaction of a rover wheel and Martian soil, for instance. Though modeling a few grains of sand may be simple, predicting their action as a bulk material is very complex. "This dichotomy…leads to some really cool work," says Andrade. "The challenge is to capture the essence of the physics without the complexity of applying it to each grain in order to devise models that work at the landslide level."

Credit: Kelly Lance ©2013 MBARI

Geobiologist Victoria Orphan looks deep into the ocean to learn how microbes influence carbon, nitrogen, and sulfur cycling. For more than 20 years, her lab has been studying methane-breathing marine microorganisms that inhabit rocky mounds on the ocean floor. "Methane is a much more powerful greenhouse gas than carbon dioxide, so tracing its flow through the environment is really a priority for climate models and for understanding the carbon cycle," says Orphan. Her team recently discovered a significantly wider habitat for these microbes than was previously known. The microbes, she thinks, could be preventing large volumes of the potent greenhouse gas from entering the oceans and reaching the atmosphere.

Credit: NASA/JPL-Caltech

Researchers know that aerosols—tiny particles in the atmosphere—scatter and absorb incoming sunlight, affecting the formation and properties of clouds. But it is not well understood how these effects might influence climate change. Enter chemical engineer John Seinfeld. His team conducted a global survey of the impact of changing aerosol levels on low-level marine clouds—clouds with the largest impact on the amount of incoming sunlight Earth reflects back into space—and found that varying aerosol levels altered both the quantity of atmospheric clouds and the clouds' internal properties. These results offer climatologists "unique guidance on how warm cloud processes should be incorporated in climate models with changing aerosol levels," Seinfeld says.

Credit: Yan Hu/Aroian Lab/UC San Diego

Tiny parasitic worms infect nearly half a billion people worldwide, causing gastrointestinal issues, cognitive impairment, and other health problems. Biologist Paul Sternberg is on the case. His lab recently analyzed the entire 313-million-nucleotide genome of the hookworm Ancylostoma ceylanicum to determine which genes turn on when the worm infects its host. A new family of proteins unique to parasitic worms and related to the early infection process was identified; the discovery could lead to new treatments targeting those genes. "A parasitic infection is a balance between the parasites trying to suppress the immune system and the host trying to attack the parasite," Sternberg observes, "and by analyzing the genome, we can uncover clues that might help us alter that balance in favor of the host."

Credit: K.Batygin/Caltech

Earth is special, not least because our solar system has a unique (as far as we know) orbital architecture: its rocky planets have relatively low masses compared to those around other sun-like stars. Planetary scientist Konstantin Batygin has an explanation. Using computer simulations to describe the solar system's early evolution, he and his colleagues showed that Jupiter's primordial wandering initiated a collisional cascade that ultimately destroyed the first generation of more massive planets that once resided in Earth's current orbital neighborhood. This process wiped the inner solar system's slate clean and set the stage for the formation of the planets that exist today. "Ultimately, what this means," says Batygin, "is that planets truly like Earth are intrinsically not very common."

Credit: Nicolás Wey-Gómez/Caltech

Human understanding of the world has evolved over centuries, anchored to scientific and technological advancements and our ability to map uncharted territories. Historian Nicolás Wey-Gómez traces this evolution and how the age of discovery helped shape culture and politics in the modern era. Using primary sources such as letters and diaries, he examines the assumptions behind Europe's encounter with the Americas, focusing on early portrayals of native peoples by Europeans. "The science and technology that early modern Europeans recovered from antiquity by way of the Arab world enabled them to imagine lands far beyond their own," says Wey-Gómez. "This knowledge provided them with an essential framework to begin to comprehend the peoples they encountered around the globe."

Body: 

At Caltech, researchers study the Earth from many angles—from investigating its origins and evolution to exploring its geology and inner workings to examining its biological systems. Taken together, their findings enable a more nuanced understanding of our planet in all its complexity, helping to ensure that it—and we—endure. This slideshow highlights just a few of the Earth-centered projects happening right now at Caltech.

Exclude from News Hub: 
Yes

An Earthquake Warning System in Our Pockets?

Researchers Test Smartphones for Advance-Notice System

While you are checking your email, scrolling through social-media feeds, or just going about your daily life with your trusty smartphone in your pocket, the sensors in that little computer could also be contributing to an earthquake early warning system. So says a new study led by researchers at Caltech and the United States Geological Survey (USGS). The study suggests that all of our phones and other personal electronic devices could function as a distributed network, detecting any ground movements caused by a large earthquake, and, ultimately, giving people crucial seconds to prepare for a temblor.

"Crowd-sourced alerting means that the community will benefit by data generated by the community," said Sarah Minson (PhD '10), a USGS geophysicist and lead author of the study, which appears in the April 10 issue of the new journal Science Advances. Minson completed the work while a postdoctoral scholar at Caltech in the laboratory of Thomas Heaton, professor of engineering seismology.

Earthquake early warning (EEW) systems detect the start of an earthquake and rapidly transmit warnings to people and automated systems before they experience shaking at their location. While much of the world's population is susceptible to damaging earthquakes, EEW systems are currently operating in only a few regions around the globe, including Japan and Mexico. "Most of the world does not receive earthquake warnings mainly due to the cost of building the necessary scientific monitoring networks," says USGS geophysicist and project lead Benjamin Brooks.

Despite being less accurate than scientific-grade equipment, the GPS receivers in smartphones are sufficient to detect the permanent ground movement, or displacement, caused by fault motion in earthquakes that are approximately magnitude 7 and larger. And, of course, they are already widely distributed. Once displacements are detected by participating users' phones, the collected information could be analyzed quickly in order to produce customized earthquake alerts that would then be transmitted back to users.

"Thirty years ago it took months to assemble a crude picture of the deformations from an earthquake. This new technology promises to provide a near-instantaneous picture with much greater resolution," says Heaton, a coauthor of the new study.

In the study, the researchers tested the feasibility of crowd-sourced EEW with a simulation of a hypothetical magnitude 7 earthquake, and with real data from the 2011 magnitude 9 Tohoku-oki, Japan earthquake. The results show that crowd-sourced EEW could be achieved with only a tiny percentage of people in a given area contributing information from their smartphones. For example, if phones from fewer than 5,000 people in a large metropolitan area responded, the earthquake could be detected and analyzed fast enough to issue a warning to areas farther away before the onset of strong shaking.
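The crowd-sourcing logic, declaring an event only when many phones trigger at nearly the same time so that no single faulty phone can raise a false alarm, can be sketched as a sliding-window count. The phone count and window length below are illustrative assumptions, not the study's actual parameters.

```python
from collections import deque

def crowd_trigger(report_times_s, min_phones=100, window_s=5.0):
    """Illustrative crowd-sourced event declaration (assumed parameters).

    Declares an event once at least `min_phones` individual phone
    triggers arrive within a sliding window of `window_s` seconds.
    Returns the declaration time in seconds, or None if never reached.
    """
    window = deque()
    for t in sorted(report_times_s):
        window.append(t)
        # Drop triggers that have aged out of the window.
        while window and window[0] < t - window_s:
            window.popleft()
        if len(window) >= min_phones:
            return t
    return None
```

Because seismic shaking travels at only a few kilometers per second (crustal S waves move at roughly 3.5 km/s), a detection made within a second or two of this kind can reach a city 100 kilometers from the rupture tens of seconds before the strong shaking arrives.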

The researchers note that the GPS receivers in smartphones and similar devices would not be sufficient to detect earthquakes smaller than magnitude 7, which could still be potentially damaging. However, smartphones also have microelectromechanical systems (MEMS) accelerometers that are capable of recording any earthquake motions large enough to be felt; this means that smartphones may be useful in earthquakes as small as magnitude 5. In a separate project, Caltech's Community Seismic Network Project has been developing the framework to record and utilize data from an inexpensive array of such MEMS accelerometers.
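One standard way to trigger on felt shaking in accelerometer data, shown here purely as an illustration and not as the Community Seismic Network's actual algorithm, is a short-term-average over long-term-average (STA/LTA) detector: a sudden burst of motion raises the short-term average well above the long-running background level.

```python
def sta_lta_trigger(samples, fs_hz=100, sta_s=0.5, lta_s=10.0, ratio=4.0):
    """Classic STA/LTA picker on absolute acceleration samples.

    Returns the sample index where the short-term average first exceeds
    `ratio` times the long-term average, or None. All parameters are
    typical illustrative values, not tied to any deployed system.
    """
    n_sta, n_lta = int(sta_s * fs_hz), int(lta_s * fs_hz)
    a = [abs(x) for x in samples]
    for i in range(n_lta, len(a)):
        sta = sum(a[i - n_sta:i]) / n_sta
        lta = sum(a[i - n_lta:i]) / n_lta
        if lta > 0 and sta / lta > ratio:
            return i
    return None
```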

Comprehensive EEW requires a dense network of scientific instruments. Scientific-grade EEW, such as the USGS's ShakeAlert system that is currently being implemented on the west coast of the United States, will be able to help minimize the impact of earthquakes over a wide range of magnitudes. However, in many parts of the world where there are insufficient resources to build and maintain scientific networks but consumer electronics are increasingly common, crowd-sourced EEW has significant potential.

"The U.S. earthquake early warning system is being built on our high-quality scientific earthquake networks, but crowd-sourced approaches can augment our system and have real potential to make warnings possible in places that don't have high-quality networks," says Douglas Given, USGS coordinator of the ShakeAlert Earthquake Early Warning System. The U.S. Agency for International Development has already agreed to fund a pilot project, in collaboration with the Chilean Centro Sismológico Nacional, to test a hybrid earthquake warning system comprising stand-alone smartphone sensors and scientific-grade sensors along the Chilean coast.

"Crowd-sourced data are less precise, but for larger earthquakes that cause large shifts in the ground surface, they contain enough information to detect that an earthquake has occurred, information necessary for early warning," says study coauthor Susan Owen of JPL.

Additional coauthors on the paper, "Crowdsourced earthquake early warning," are from the USGS, Carnegie Mellon University–Silicon Valley, and the University of Houston. The work was supported in part by the Gordon and Betty Moore Foundation, the USGS Innovation Center for Earth Sciences, and the U.S. Department of Transportation Office of the Assistant Secretary for Research and Technology.

Writer: Kimm Fesenmaier

Explaining Saturn’s Great White Spots

Every 20 to 30 years, Saturn's atmosphere roils with giant, planet-encircling thunderstorms that produce intense lightning and enormous cloud disturbances. The head of one of these storms—popularly called "great white spots," by analogy with Jupiter's Great Red Spot—can be as large as Earth. Unlike Jupiter's spot, which is calm at the center and has no lightning, Saturn's spots are active in the center and have long tails that eventually wrap around the planet.

Six such storms have been observed on Saturn over the past 140 years, alternating between the equator and midlatitudes, with the most recent emerging in December 2010 and encircling the planet within six months. The storms usually occur when Saturn's northern hemisphere is most tilted toward the sun. Just what triggers them and why they occur so infrequently, however, has been unclear.

Now, a new study by two Caltech planetary scientists suggests a possible cause for these storms. The study was published April 13 in the advance online issue of the journal Nature Geoscience.

Using numerical modeling, Professor of Planetary Science Andrew Ingersoll and his graduate student Cheng Li simulated the formation of the storms and found that they may be caused by the weight of the water molecules in the planet's atmosphere. Because water molecules are heavy compared to the hydrogen and helium that make up most of the gas-giant planet's atmosphere, the upper atmosphere becomes lighter when they rain out, and that suppresses convection.

Over time, this leads to a cooling of the upper atmosphere. But that cooling eventually overrides the suppressed convection, and warm moist air rapidly rises and triggers a thunderstorm. "The upper atmosphere is so cold and so massive that it takes 20 to 30 years for this cooling to trigger another storm," says Ingersoll.

Ingersoll and Li found that this mechanism matches observations of the great white spot of 2010 taken by NASA's Cassini spacecraft, which has been observing Saturn and its moons since 2004.

The researchers also propose that the absence of planet-encircling storms on Jupiter could be explained if Jupiter's atmosphere contains less water vapor than Saturn's atmosphere. That is because saturated gas (gas that contains the maximum amount of moisture that it can hold at a particular temperature) in a hydrogen-helium atmosphere goes through a density minimum as it cools. That is, it first becomes less dense as the water precipitates out, and then it becomes more dense as cooling proceeds further. "Going through that minimum is key to suppressing the convection, but there has to be enough water vapor to start with," says Li.
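Li's density-minimum argument can be illustrated with a toy model: an ideal hydrogen-helium gas carrying a trace of water vapor at fixed pressure, with the water mole fraction capped at its saturation value as the gas cools. All numerical values below (pressure level, abundances, Clausius-Clapeyron constants) are rough assumptions chosen for demonstration, not values from the paper.

```python
import numpy as np

P = 1.0e6                            # Pa (~10 bar), assumed condensation level
MU_DRY, MU_H2O = 2.3e-3, 18.0e-3     # kg/mol: H2/He mixture vs. water vapor
R = 8.314                            # J/(mol K), universal gas constant
L_OVER_RV = 5420.0                   # K: water latent heat / vapor gas constant

def density(T, x_deep):
    """Gas density vs. temperature when water condenses out at saturation."""
    # Clausius-Clapeyron saturation vapor pressure, referenced to the
    # triple point of water (611 Pa at 273.16 K).
    e_sat = 611.0 * np.exp(L_OVER_RV * (1.0 / 273.16 - 1.0 / T))
    x_w = np.minimum(x_deep, e_sat / P)       # water mole fraction, capped
    mu = x_w * MU_H2O + (1.0 - x_w) * MU_DRY  # mean molar mass of the mix
    return P * mu / (R * T)                   # ideal-gas density

T = np.arange(350.0, 200.0, -1.0)    # cooling from 350 K down to ~200 K
rho_wet = density(T, 0.02)           # water-rich ("Saturn-like") column
rho_dry = density(T, 0.002)          # water-poor ("Jupiter-like") column
```

In this toy setup the water-rich column first becomes less dense as water rains out and then denser as cooling continues, passing through the interior density minimum, while the water-poor column simply grows denser monotonically, mirroring the Saturn/Jupiter contrast the researchers describe.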

Ingersoll and Li note that observations by the Galileo spacecraft and the Hubble Space Telescope indicate that Saturn does indeed have enough water to go through this density minimum, whereas Jupiter does not. In November 2016, NASA's Juno spacecraft, now en route to Jupiter, will start measuring the water abundance on that planet. "That should help us understand not only the meteorology but also the planet's formation, since water is expected to be the third most abundant molecule after hydrogen and helium in a giant planet atmosphere," Ingersoll says.

The work in the paper, "Moist convection in hydrogen atmospheres and the frequency of Saturn's giant storms," was supported by the National Science Foundation and the Cassini Project of NASA.

Writer: Kathy Svitil
