Yeh and Schwab Named Kavli Nanoscience Institute Codirectors

Caltech professors Nai-Chang Yeh and Keith Schwab have been named codirectors of the Kavli Nanoscience Institute (KNI), a center supporting multidisciplinary nanoscience research on campus and beyond. Yeh and Schwab have both served as KNI board members and are the first to hold the title Fletcher Jones Foundation Codirector of the KNI; the position was recently endowed by a gift from The Fletcher Jones Foundation.

"I look forward to the energy and creativity that Nai-Chang and Keith will bring to the continued evolution of the KNI as a preeminent organization propelling nanoscience forward in diverse application areas ranging from medical engineering to nanophotonics," says Ares Rosakis, Otis Booth Leadership Chair of the Division of Engineering and Applied Science.

Yeh and Schwab follow in the footsteps of professors Michael Roukes and Oskar Painter (MS '95, PhD '01).

"It is an exciting time to conduct research in nanoscience and nanotechnology," Yeh says. "As the new codirectors of KNI, our vision is not only to maintain the current role of KNI but also to make KNI an intellectual hub that facilitates Caltech research in the areas of quantum frontiers, medical/bioengineering, and sustainability," she says.  "We look forward to working with the nanoresearch community on campus and the Caltech administration to advance frontiers of nanoscience and nanotechnology."


Arnold Appointed New Director of Rosen Bioengineering Center

Now in its sixth year of exploring the intersection between biology and engineering, the Donna and Benjamin M. Rosen Bioengineering Center has chosen Caltech professor Frances Arnold as its new director. Arnold, the Dick and Barbara Dickinson Professor of Chemical Engineering, Bioengineering and Biochemistry, began her tenure as director on June 1.

A recipient of the 2011 National Medal of Technology and Innovation, Arnold pioneered methods of "directed evolution" – processes now widely used to create biological catalysts that are important in the production of fuels from renewable resources. She was selected for the directorship because of her "demonstrated leadership in the field of bioengineering," says Stephen Mayo, William K. Bowes Jr. Foundation Chair of the Division of Biology and Biological Engineering.

The Rosen Center supports bioengineering research through the funding of fellows and faculty from many disciplines, including applied physics, chemical engineering, synthetic biology, and computer science.

"Bioengineering is an incredibly exciting field right now," Arnold says. "Solutions to some of the biggest problems in science, medicine, and sustainability will come from the interface between biology and engineering, and Caltech is well positioned to be at the forefront. The Rosen Center will help make that happen with innovative programs for bioengineering research and education."


Caltech Team Produces Squeezed Light Using a Silicon Micromechanical System

One of the many counterintuitive and bizarre insights of quantum mechanics is that even in a vacuum—what many of us think of as an empty void—all is not completely still. Low levels of noise, known as quantum fluctuations, are always present. Always, that is, unless you can pull off a quantum trick. And that's just what a team led by researchers at the California Institute of Technology (Caltech) has done. The group has engineered a miniature silicon system that produces a type of light that is quieter at certain frequencies—meaning it has fewer quantum fluctuations—than what is usually present in a vacuum.

This special type of light with fewer fluctuations is known as squeezed light and is useful for making precise measurements at lower power levels than are required when using normal light. Although other research groups previously have produced squeezed light, the Caltech team's new system, which is miniaturized on a silicon microchip, generates the ultraquiet light in a way that can be more easily adapted to a variety of sensor applications.

"This system should enable a new set of precision microsensors capable of beating standard limits set by quantum mechanics," says Oskar Painter, a professor of applied physics at Caltech and the senior author on a paper that describes the system; the paper appears in the August 8 issue of the journal Nature. "Our experiment brings together, in a tiny microchip package, many aspects of work that has been done in quantum optics and precision measurement over the last 40 years."

The history of squeezed light is closely associated with Caltech. More than 30 years ago, Kip Thorne, Caltech's Richard P. Feynman Professor of Theoretical Physics, Emeritus, and physicist Carlton Caves (PhD '79) theorized that squeezed light would enable scientists to build more sensitive detectors that could make more precise measurements. A decade later, Caltech's Jeff Kimble, the William L. Valentine Professor and professor of physics, and his colleagues conducted some of the first experiments using squeezed light. Since then, the LIGO (Laser Interferometer Gravitational-Wave Observatory) Scientific Collaboration has invested heavily in research on squeezed light because of its potential to enhance the sensitivity of gravitational-wave detectors.

In the past, squeezed light has been made using so-called nonlinear materials, which have unusual optical properties. This latest Caltech work marks the first time that squeezed light has been produced using silicon, a standard material. "We work with a material that's very plain in terms of its optical properties," says Amir Safavi-Naeini (PhD '13), a graduate student in Painter's group and one of three lead authors on the new paper. "We make it special by engineering or punching holes into it, making these mechanical structures that respond to light in a very novel way. Of course, silicon is also a material that is technologically very amenable to fabrication and integration, enabling a great many applications in electronics."

In this new system, a waveguide feeds laser light into a cavity created by two tiny silicon beams. Once there, the light bounces back and forth a bit thanks to the engineered holes, which effectively turn the beams into mirrors. When photons—particles of light—strike the beams, they cause the beams to vibrate. And the particulate nature of the light introduces quantum fluctuations that affect those vibrations.

Typically, such fluctuations mean that in order to get a good reading of a signal, you would have to increase the power of the light to overcome the noise. But increasing the power also creates other problems, such as adding excess heat to the system.

Ideally, then, any measurements should be made with as low a power as possible. "One way to do that," says Safavi-Naeini, "is to use light that has less noise."

And that's exactly what the new system does; it has been engineered so that the light and beams interact strongly with each other—so strongly, in fact, that the beams impart the quantum fluctuations they experience back on the light. And, as is the case with the noise-canceling technology used, for example, in some headphones, the fluctuations that shake the beams interfere with the fluctuations of the light. They effectively cancel each other out, eliminating the noise in the light.

"This is a demonstration of what quantum mechanics really says: Light is neither a particle nor a wave; you need both explanations to understand this experiment," says Safavi-Naeini. "You need the particle nature of light to explain these quantum fluctuations, and you need the wave nature of light to understand this interference."

In the experiment, a detector measuring the noise in the light as a function of frequency showed that in a frequency range centered around 28 MHz, the system produces light with less noise than what is present in a vacuum—the standard quantum limit. "But one of the interesting things," Safavi-Naeini adds, "is that by carefully designing our structures, we can actually choose the frequency at which we go below the vacuum." Many signals are specific to a particular frequency range—a certain audio band in the case of acoustic signals, or, in the case of LIGO, a frequency intimately related to the dynamics of astrophysical objects such as circling black holes. Because the optical squeezing occurs near the mechanical resonance frequency where an individual device is most sensitive to external forces, this feature would enable the system studied by the Caltech team to be optimized for targeting specific signals.
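
The frequency dependence Safavi-Naeini describes can be sketched with a toy model: a noise spectrum, normalized to the vacuum level, that dips below 1 only in a band around a designed frequency. Everything here (the Lorentzian line shape, the width, the 30 percent dip) is an illustrative assumption, not data from the experiment:

```python
import numpy as np

def noise_spectrum(freq_mhz, center_mhz=28.0, width_mhz=2.0, depth=0.3):
    """Toy noise spectrum, normalized so the vacuum level equals 1.0.

    A Lorentzian dip of fractional `depth` sits at `center_mhz`,
    mimicking squeezing near a mechanical resonance. All numbers here
    are illustrative placeholders, not values from the experiment.
    """
    lorentzian = width_mhz**2 / ((freq_mhz - center_mhz)**2 + width_mhz**2)
    return 1.0 - depth * lorentzian

freqs = np.linspace(20.0, 36.0, 1601)
spectrum = noise_spectrum(freqs)

# Far from resonance the light sits at the vacuum level; near 28 MHz
# it dips below it -- the signature of squeezing.
print(f"Noise at 28 MHz: {noise_spectrum(28.0):.2f} (vacuum level = 1.00)")
```

Designing the structure shifts `center_mhz`, which is the sense in which the device can be tuned to a signal band of interest.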

"This new way of 'squeezing light' in a silicon micro-device may provide new, significant applications in sensor technology," said Siu Au Lee, program officer at the National Science Foundation, which provided support for the work through the Institute for Quantum Information and Matter, a Physics Frontier Center. "For decades, NSF's Physics Division has been supporting basic research in quantum optics, precision measurements and nanotechnology that laid the foundation for today's accomplishments."

The paper is titled "Squeezed light from a silicon micromechanical resonator." Along with Painter and Safavi-Naeini, additional coauthors on the paper include current and former Painter-group researchers Jeff Hill (PhD '13), Simon Gröblacher (both lead authors on the paper with Safavi-Naeini), and Jasper Chan (PhD '12), as well as Markus Aspelmeyer of the Vienna Center for Quantum Science and Technology and the University of Vienna. The work was also supported by the Gordon and Betty Moore Foundation, by DARPA/MTO ORCHID through a grant from the Air Force Office of Scientific Research, and by the Kavli Nanoscience Institute at Caltech.

Writer:
Kimm Fesenmaier

India Establishes Caltech Aerospace Fellowship

The Indian Department of Space / Indian Space Research Organisation (ISRO) has established a fellowship at the California Institute of Technology (Caltech) in the name of Satish Dhawan (1920–2002), a Caltech alumnus (Eng '49, PhD '51) and a pioneer of India's space program.

The Satish Dhawan Fellowship enables one aerospace engineering graduate per year from the Indian Institute of Space Science and Technology (IIST) to study at the Graduate Aerospace Laboratories at Caltech (GALCIT), as Dhawan himself did more than 60 years ago, when GALCIT was the Guggenheim Aeronautical Laboratory. Dhawan went on to serve as the director of the Indian Institute of Science (IISc), chairman of the Indian Space Commission and the ISRO, and president of the Indian Academy of Sciences. In 1969 he was named a Caltech Distinguished Alumnus.

Chaphalkar Aaditya Nitin, an IIST graduate with a Bachelor of Technology degree in aerospace engineering, has been named the first student to receive the fellowship and will start classes at GALCIT in October.

According to GALCIT director Guruswami (Ravi) Ravichandran, the John E. Goode, Jr., Professor of Aerospace and professor of mechanical engineering, ISRO established the fellowship to create a permanent pipeline of aerospace engineering leaders who will guide India's space program into the future.

"India has a very strong domestically grown space program," explains Ravichandran. "The ISRO is hoping to maintain its momentum by training students in much the same way that Dhawan was trained when he went through GALCIT decades ago."

The first three directors of what is now India's National Aerospace Laboratories were GALCIT alumni—Parameswar Nilakantan (MS '42), S. R. Valluri (MS '50, PhD '54), and Roddam Narasimha (PhD '61).

"The ISRO is honoring Dhawan and Caltech with this fellowship, and it is also recognizing the historical connections between engineers and scientists in the United States and India," says Ares Rosakis, Caltech's Theodore von Kármán Professor of Aeronautics and Mechanical Engineering and Otis Booth Leadership Chair, Division of Engineering and Applied Science. "It is an endorsement of GALCIT's fundamental research approach and rigorous curriculum.

"Most academic fellowships come from private philanthropy. It is extremely rare for a government institution to endow a fellowship intended for a private research institution such as Caltech," he says.

"GALCIT has had impact on aeronautics and aerospace development in the United States and abroad, not by training engineers in large numbers but by training engineering leaders," Rosakis says. "GALCIT graduates include CEOs of aerospace companies and the heads of departments at places like MIT, Georgia Tech, and the University of Illinois. If we were to create one von Kármán every half century, as well as a few aerospace CEOs, and a few Dhawans, we would be happy."

Writer:
Brian Bell

Figuring Out Flow Dynamics

Engineers gain insight into turbulence formation and evolution in fluids

Turbulence is all around us—in the patterns that natural gas makes as it swirls through a transcontinental pipeline or in the drag that occurs as a plane soars through the sky. Reducing such turbulence on, say, an airplane wing would cut down on the amount of power the plane has to put out just to get through the air, thereby saving fuel. But in order to reduce turbulence—a very complicated phenomenon—you need to understand it, a task that has proven to be quite a challenge.

Since 2006, Beverley McKeon, professor of aeronautics and associate director of the Graduate Aerospace Laboratories at the California Institute of Technology (Caltech), and collaborator Ati Sharma, a senior lecturer in aerodynamics and flight mechanics at the University of Southampton in the U.K., have been working together to build models of turbulent flow. Recently, they developed a new and improved way of looking at the composition of turbulence near walls, the type of flow that dominates our everyday life.

Their research could lead to significant fuel savings, as a large amount of energy is consumed by ships and planes, for example, to counteract turbulence-induced drag. Finding a way to reduce that turbulence by 30 percent would save the global economy billions of dollars in fuel costs and associated emissions annually, says McKeon, a coauthor of a study describing the new method published online in the Journal of Fluid Mechanics on July 8.

"This kind of turbulence is responsible for a large amount of the fuel that is burned to move humans, freight, and fluids such as water, oil, and natural gas, around the world," she says. "[Caltech physicist Richard] Feynman described turbulence as 'one of the last unsolved problems of classical physics,' so it is also a major academic challenge."

Wall turbulence develops when fluids—liquid or gas—flow past solid surfaces at anything but the slowest flow rates. Progress in understanding and controlling wall turbulence has been somewhat incremental because of the massive range of scales of motion involved—from the width of a human hair to the height of a multi-floor building in relative terms—says McKeon, who has been studying turbulence for 16 years. Her latest work, however, now provides a way of analyzing a large-scale flow by breaking it down into discrete, more easily analyzed bits. 

McKeon and Sharma devised a new method of looking at wall turbulence by reformulating the equations that govern the motion of fluids—called the Navier-Stokes equations—into an infinite set of smaller, simpler subequations, or "blocks," with the characteristic that they can be simply added together to introduce more complexity and eventually get back to the full equations. But the benefit comes in what can be learned without needing the complexity of the full equations. Calling the results from analysis of each one of those blocks a "response mode," the researchers have shown that commonly observed features of wall turbulence can be explained by superposing, or adding together, a very small number of these response modes, even as few as three. 

In 2010, McKeon and Sharma showed that analysis of these blocks can be used to reproduce some of the characteristics of the velocity field, like the tendency of wall turbulence to favor eddies of certain sizes and distributions. Now, the researchers also are using the method to capture coherent vortical structure, caused by the interaction of distinct, horseshoe-shaped spinning motions that occur in turbulent flow. Increasing the number of blocks included in an analysis increases the complexity with which the vortices are woven together, McKeon says. With very few blocks, things look a lot like the results of an extremely expensive, real-flow simulation or a full laboratory experiment, she says, but the mathematics are simple enough to be performed, mode-by-mode, on a laptop computer.
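
The superposition idea can be illustrated generically with a singular value decomposition, which likewise expresses a complicated field as a sum of simple modes. This sketch uses a synthetic field and NumPy's SVD; it is a stand-in analogy, not the authors' resolvent formulation of the Navier-Stokes equations:

```python
import numpy as np

rng = np.random.default_rng(0)

# A synthetic space-time "flow field" built from three coherent patterns
# plus weak noise -- invented data, not a real turbulence measurement.
x = np.linspace(0, 2 * np.pi, 64)
t = np.linspace(0, 2 * np.pi, 50)
X, T = np.meshgrid(x, t)
field = (np.sin(X) * np.cos(T)
         + 0.5 * np.sin(2 * X) * np.cos(2 * T)
         + 0.25 * np.sin(3 * X) * np.cos(3 * T)
         + 0.01 * rng.standard_normal(X.shape))

# Superpose only the three largest modes from the SVD.
U, s, Vt = np.linalg.svd(field, full_matrices=False)
rank3 = (U[:, :3] * s[:3]) @ Vt[:3, :]

# A handful of modes reproduces the field almost exactly.
residual = np.linalg.norm(field - rank3) / np.linalg.norm(field)
print(f"Relative error of the 3-mode superposition: {residual:.3f}")
```

The point of the analogy: when a flow is dominated by a few coherent patterns, adding together a very small number of modes recovers most of the structure, and each mode is cheap to compute on its own.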

"We now have a low-cost way of looking at the 'skeleton' of wall turbulence," says McKeon, explaining that similar previous experiments required the use of a supercomputer. "It was surprising to find that turbulence condenses to these essential building blocks so easily. It's almost like discovering a lens that you can use to focus in on particular patterns in turbulence."

Using this lens helps to reduce the complexity of what the engineers are trying to understand, giving them a template that can be used to try to visually—and mathematically—identify order from flows that may appear to be chaotic, she says. Scientists had proposed the existence of some of the patterns based on observations of real flows; using the new technique, these patterns now can be derived mathematically from the governing equations, allowing researchers to verify previous models of how turbulence works and improve upon those ideas.

Understanding how the formulation can capture the skeleton of turbulence, McKeon says, will allow the researchers to modify turbulence in order to control flow and, for example, reduce drag or noise.

"Imagine being able to shape not just an aircraft wing but the characteristics of the turbulence in the flow over it to optimize aircraft performance," she says. "It opens the doors for entirely new capabilities in vehicle performance that may reduce the consumption of even renewable or non-fossil fuels."

Funding for the research outlined in the Journal of Fluid Mechanics paper, titled "On coherent structure in wall turbulence," was provided by the Air Force Office of Scientific Research. The paper is the subject of a "Focus on Fluids" feature article that will appear in an upcoming print issue of the same journal and was written by Joseph Klewicki of the University of New Hampshire. 

Writer:
Katie Neith

Pushing Microscopy Beyond Standard Limits

Caltech engineers show how to make cost-effective, ultra-high-performance microscopes

Engineers at the California Institute of Technology (Caltech) have devised a method to convert a relatively inexpensive conventional microscope into a billion-pixel imaging system that significantly outperforms the best available standard microscope. Such a system could greatly improve the efficiency of digital pathology, in which specialists need to review large numbers of tissue samples. By making it possible to produce robust microscopes at low cost, the approach also has the potential to bring high-performance microscopy capabilities to medical clinics in developing countries.

"In my view, what we've come up with is very exciting because it changes the way we tackle high-performance microscopy," says Changhuei Yang, professor of electrical engineering, bioengineering and medical engineering at Caltech.  

Yang is senior author on a paper that describes the new imaging strategy, which appears in the July 28 early online version of the journal Nature Photonics.

Until now, the physical limitations of microscope objectives—their optical lenses—have posed a challenge in terms of improving conventional microscopes. Microscope makers tackle these limitations by using ever more complicated stacks of lens elements in microscope objectives to mitigate optical aberrations. Even with these efforts, these physical limitations have forced researchers to decide between high resolution and a small field of view on the one hand, or low resolution and a large field of view on the other. That has meant that scientists have either been able to see a lot of detail very clearly but only in a small area, or they have gotten a coarser view of a much larger area.

"We found a way to actually have the best of both worlds," says Guoan Zheng, lead author on the new paper and the initiator of this new microscopy approach from Yang's lab. "We used a computational approach to bypass the limitations of the optics. The optical performance of the objective lens is rendered almost irrelevant, as we can improve the resolution and correct for aberrations computationally."

Indeed, using the new approach, the researchers were able to improve the resolution of a conventional 2X objective lens to the level of a 20X objective lens. Therefore, the new system combines the field-of-view advantage of a 2X lens with the resolution advantage of a 20X lens. The final images produced by the new system contain 100 times more information than those produced by conventional microscope platforms. And building upon a conventional microscope, the new system costs only about $200 to implement.
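
The 100-fold figure is consistent with simple scaling: pairing the field of view of a 2X lens with the resolution of a 20X lens multiplies the number of resolvable pixels by roughly the squared magnification ratio. A back-of-the-envelope check, assuming idealized scaling rather than the paper's exact accounting:

```python
# Back-of-the-envelope check on the "100 times more information" figure.
# Assumes field-of-view area scales as 1/magnification**2 and resolvable
# detail per unit area scales as magnification**2 -- idealized optics.
low_mag, high_mag = 2, 20

# FPM keeps the 2X field of view but reaches 20X-level resolution,
# so the information gain is the squared ratio of magnifications.
information_gain = (high_mag / low_mag) ** 2
print(information_gain)  # -> 100.0
```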

"One big advantage of this new approach is the hardware compatibility," Zheng says. "You only need to add an LED array to an existing microscope. No other hardware modification is needed. The rest of the job is done by the computer."

The new system acquires about 150 low-resolution images of a sample, each corresponding to one element in the LED array, so that in each image the sample is illuminated from a different known direction. A novel computational approach, termed Fourier ptychographic microscopy (FPM), is then used to stitch together these low-resolution images to form the high-resolution intensity and phase information of the sample—a much more complete picture of the entire light field of the sample.
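
The Fourier-domain stitching can be caricatured in one dimension: each LED tilts the illumination, which shifts the sample's spectrum relative to the objective's fixed passband, so the set of images collectively covers far more spatial frequencies than any single image. The numbers below (passband width, LED spacing) are arbitrary illustrations, and this is only the coverage idea, not the FPM reconstruction algorithm itself:

```python
import numpy as np

# Toy 1-D illustration of the Fourier-coverage idea behind FPM.
n = 256                             # spatial-frequency bins
pupil_halfwidth = 20                # objective passband: low NA, low resolution
led_shifts = range(-100, 101, 25)   # illumination angles -> spectrum shifts

freqs = np.arange(n) - n // 2
covered = np.zeros(n, dtype=bool)
for shift in led_shifts:
    # Tilted illumination shifts the object's spectrum, so the fixed
    # pupil window samples a different band for each LED.
    covered |= np.abs(freqs - shift) <= pupil_halfwidth

single = int((np.abs(freqs) <= pupil_halfwidth).sum())
total = int(covered.sum())
print(f"One LED covers {single} frequency bins; the LED array covers {total}.")
```

Because the shifted bands overlap, the individual captures can be stitched into one wide, self-consistent spectrum, which is what yields the higher resolution.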

Yang explains that when we look at light from an object, we are only able to sense variations in intensity. But light varies in terms of both its intensity and its phase, which is related to the angle at which light is traveling.

"What this project has developed is a means of taking low-resolution images and managing to tease out both the intensity and the phase of the light field of the target sample," Yang says. "Using that information, you can actually correct for optical aberration issues that otherwise confound your ability to resolve objects well."

The very large field of view that the new system can image could be particularly useful for digital pathology applications, where the typical process of using a microscope to scan the entirety of a sample can take tens of minutes. Using FPM, a microscope does not need to scan over the various parts of a sample—the whole thing can be imaged all at once. Furthermore, because the system acquires a complete set of data about the light field, it can computationally correct errors—such as out-of-focus images—so samples do not need to be rescanned.

"It will take the same data and allow you to perform refocusing computationally," Yang says.
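
Computational refocusing of a complex field is commonly done with the angular-spectrum method, which multiplies the field's spectrum by a free-space propagation phase. This is a generic sketch of that standard technique with illustrative wavelength and pixel-size values; it is not the specific pipeline used in the paper:

```python
import numpy as np

def refocus(field, dz, wavelength=0.5e-6, pixel=1e-6):
    """Numerically propagate a complex field by distance dz (meters).

    Generic angular-spectrum method; parameter values are illustrative.
    `field` must carry phase as well as intensity -- which is exactly
    the kind of information an FPM-style reconstruction recovers.
    """
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pixel)
    FX, FY = np.meshgrid(fx, fx)
    # Free-space transfer function (evanescent components suppressed).
    arg = 1.0 - (wavelength * FX)**2 - (wavelength * FY)**2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * dz))

# Propagating forward and then backward returns the original field,
# i.e. refocusing is a reversible, purely computational operation.
rng = np.random.default_rng(1)
f0 = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
roundtrip = refocus(refocus(f0, 10e-6), -10e-6)
print(np.allclose(roundtrip, f0))
```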

The researchers say that the new method could have wide applications not only in digital pathology but also in everything from hematology to wafer inspection to forensic photography. Zheng says the strategy could also be extended to other imaging methodologies, such as X-ray imaging and electron microscopy.

The paper is titled "Wide-field, high-resolution Fourier ptychographic microscopy." Along with Yang and Zheng, Caltech graduate student Roarke Horstmeyer is also a coauthor. The work was supported by a grant from the National Institutes of Health.

Writer:
Kimm Fesenmaier

Otis Booth Leadership Chair Established at Caltech

$10 Million Will Accelerate Innovation

With a $10 million gift, the Los Angeles–based Otis Booth Foundation has created and endowed the Otis Booth Leadership Chair for the Division of Engineering and Applied Science (EAS) at Caltech. This endowment will provide a permanent source of funds to foster novel research ventures and educational programs.

The $10 million endowment will accelerate the invention of technologies with the potential to benefit society by giving the EAS division chair flexible funds to invest in promising ideas that arise within the division. Caltech has had the highest subject-specific rankings for engineering and technology in the world for three years running (according to the Times Higher Education World University Rankings).

"The first funds from the endowment will support time-sensitive research that is too high risk for most traditional grants," says Ares Rosakis, the inaugural holder of the Booth Leadership Chair, Chair of the Division of Engineering and Applied Science, and the Theodore von Kármán Professor of Aeronautics and Professor of Mechanical Engineering. He also plans to use the endowment to support teaching innovations—including future online courses co-taught by EAS faculty and JPL scientists—and to increase funds for student aid, faculty start-up packages, and cutting-edge research equipment.

"EAS faculty and students make an impact by driving fast-growing fields like medical engineering and addressing urgent challenges such as the development of affordable renewable energy," Rosakis says. "The Booth Leadership Chair will help me and future division chairs support potentially transformational research and educational efforts that could not go forward without early funds. This gift has the potential to benefit the world by fostering EAS's development of vital new technologies and even more vital new young leaders. I am thankful to the Booth family and want to acknowledge their tremendous vision."

"I am excited to see what inventions and ideas become realities as Dr. Rosakis and his successors at the helm of EAS use this endowment now and far into the future," says Lynn Booth, president of the Otis Booth Foundation, a Caltech trustee, and a prominent Los Angeles philanthropist. Booth's late husband, Franklin Otis Booth Jr., established the foundation in 1967. He became an investor, newspaper executive, rancher, and philanthropist after graduating from Caltech in 1944 with a BS in electrical engineering. 

"My husband held Caltech in high regard," says Booth. "I am thrilled that this gift in his honor will connect his name with the pioneering work in EAS forever and give Caltech greater financial flexibility to respond to special opportunities and unforeseen challenges."

"The Booth Leadership Chair will help fuel groundbreaking research and education in engineering and applied sciences, as well as support the development of novel technologies," says President Jean-Lou Chameau. "Our trustee, friend, and visionary partner Lynn Booth and her family and friends at the Otis Booth Foundation are investing in changing the world—the example they set is inspiring."

Over the years, six faculty members in EAS have won the National Medal of Technology and Innovation and two have won the National Medal of Science. Of the 96 professors currently in the division, 30 percent are members of the National Academy of Engineering, 11 percent are members of the National Academy of Sciences, and 16 percent are fellows of the American Academy of Arts and Sciences.

 


Michael Roukes Honored by the French Republic

The French Republic honored Michael Roukes, Robert M. Abbey Professor of Physics, Applied Physics, and Bioengineering at the California Institute of Technology (Caltech), with the Chevalier de l'Ordre des Palmes Académiques (Knight of the Order of Academic Palms).

Roukes received his medal and diploma in a luncheon ceremony June 11 on the Caltech campus. Attending the ceremony were representatives from the French government, including Axel Cruau, French consul general in Los Angeles, who presented Roukes with the award; Caltech president Jean-Lou Chameau; and Roukes's family and Institute colleagues.

Emperor Napoleon Bonaparte established the Palmes Académiques awards in 1808 to honor faculty from the University of Paris. During the following two centuries, the French Republic extended eligibility for the award to include other citizens and foreigners who have made significant contributions to French education and culture. There are three grades of Palmes Académiques medals: commandeur, officier, and chevalier.

"I am honored to receive the Order of Academic Palms from the French Republic," Roukes says. "Through the cooperation and collaboration we have fostered, French and American scientists greatly increase their chances of making new discoveries that benefit both nations."

Roukes, who came to Caltech in 1992, is an expert in the field of quantum measurement and applied biotechnology. He was the founding director of the Kavli Nanoscience Institute at Caltech from 2004 to 2006 and codirector from 2008 to 2013. He has led numerous cross-disciplinary collaborations at Caltech and with researchers from other institutions that explore the frontiers of nanoscience. In 2010, Roukes received the Director's Pioneer Award from the National Institutes of Health for his work developing nanoscale tools for use in medical research.

In his presentation remarks, Cruau cited Roukes's collaborations with French research institutions, in particular the Laboratoire d'Electronique et des Technologies de l'Information (LETI) in Grenoble, France. Roukes and LETI CEO Laurent Malier cofounded the Alliance for Nanosystems VLSI, which focuses on creating complex nanosystems-based tools for science and industry. Cruau also noted Roukes's instrumental role in helping to start the Brain Research through Advancing Innovative Neurotechnologies, or BRAIN, Initiative, a national program proposed by President Barack Obama to build a comprehensive map of activity in the human brain.

Other Palmes Académiques recipients at Caltech are Ares J. Rosakis, von Kármán Professor of Aeronautics and professor of mechanical engineering, and chair of the Division of Engineering and Applied Science, who was named a Commandeur de l'Ordre des Palmes Académiques in 2012, and Guruswami Ravichandran, John E. Goode, Jr., Professor of Aerospace and professor of mechanical engineering, and director of the Graduate Aerospace Laboratories, who received a Chevalier de l'Ordre des Palmes Académiques in 2011.

Writer:
Brian Bell

Seeing Data

More data are being created, consumed, and transported than ever before, and in all areas of society, including business, government, health care, and science. The hope and promise is that this influx of information—known as big data—will be transformative: armed with insightful information, businesses will make more money while providing improved customer service, governments will be more efficient, medical science will be better able to prevent and treat diseases, and science will make discoveries that otherwise would not be possible.

But to do all that, people have to be able to make sense of the data. Scientists and engineers are employing new computational techniques and algorithms to do just that, but sometimes, to gain significant insights from data, you have to see it—and as the data become increasingly complex and bountiful, traditional bar graphs and scatter plots just won't do. And so, not only will scientists and engineers have to overcome the technical challenges of processing and analyzing such large amounts of complex data—often in real time, as the data are being collected—but they also will need new ways to visualize and interact with that information.

In a recent symposium hosted at Caltech in collaboration with the Jet Propulsion Laboratory (JPL) and Art Center College of Design in Pasadena, computer scientists, artists, and designers gathered to discuss what they called the "emerging science of big-data visualization." The speakers laid out their vision for the potential of data visualization and demonstrated its utility, power, and beauty with myriad examples.

Data visualization represents a natural intersection of art and science, says Hillary Mushkin, a visiting professor of art and design in mechanical and civil engineering at Caltech and one of the co-organizers of the symposium. Artists and scientists alike pursue endeavors that involve questions, research, and creativity, she explains. "Visualization is another entry point to the same practice—another kind of inquiry that we are already engaged in."

Traditionally, data visualization tends to be reductionist, said self-described data artist Jer Thorp in his talk at the symposium. Charts and graphs are usually used to distill complex data into a simpler message or idea. But the promise of big data is that it contains hidden insight and knowledge. To gain that deeper understanding, he explained, we must embrace the inherent complexity of data. Data visualization, therefore, should be revelatory instead of just reductionist—not simply a way to convey information or find answers, Thorp said, but a way to generate and cultivate questions. Or, as he put it, the goal is question farming rather than answer farming.

For example, Thorp used Twitter to generate a model of worldwide travel—one that, he said, was inspired by the desire to model how viruses spread. He searched for tweets containing the phrase "just landed in" and recorded the tweeted destinations. Combining that information with the travelers' home locations, as listed in their Twitter profiles, Thorp created an animated graphic depicting air travel. Since one way that diseases spread globally is through air travel, the graphic—while rudimentary, he admitted—could be a starting point for epidemiological models of disease outbreaks.
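The core of that pipeline can be sketched in a few lines. This is a minimal illustration, not Thorp's actual code: the tweet records are hard-coded stand-ins for real Twitter search results, and the field names and the pattern-matching rule are assumptions made for the example.

```python
# Hypothetical sketch of the "just landed in" pipeline: scan tweet records
# for the trigger phrase, extract the stated destination, and pair it with
# the home location from the author's profile.
import re

# Stand-in records; a real version would come from a Twitter search query.
tweets = [
    {"text": "Just landed in Tokyo, jetlagged already", "profile_location": "New York"},
    {"text": "just landed in Paris for the weekend", "profile_location": "London"},
    {"text": "Nothing to see here", "profile_location": "Berlin"},
]

def extract_trips(records):
    """Return (origin, destination) pairs for tweets containing 'just landed in'."""
    pattern = re.compile(r"just landed in (\w+)", re.IGNORECASE)
    trips = []
    for rec in records:
        match = pattern.search(rec["text"])
        if match:
            trips.append((rec["profile_location"], match.group(1)))
    return trips

print(extract_trips(tweets))
# [('New York', 'Tokyo'), ('London', 'Paris')]
```

Each resulting origin–destination pair would then be drawn as one arc in the animated travel graphic; the third record shows why filtering on the phrase is the essential first step.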

Data visualization also may be interactive, allowing users to manipulate the data and peel back multiple layers of information. Thorp created such a graphic to represent the first four months of data from NASA's Kepler mission—the space telescope that has discovered thousands of possible planets. A user not only can visualize all the planet candidates from the data set but also can reorganize the planets in terms of size, temperature, or other variables.
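The reordering interaction at the heart of such a graphic reduces to re-sorting the candidate list by whichever attribute the user selects. The sketch below uses made-up values, not actual Kepler data, and the attribute names are assumptions for illustration.

```python
# Minimal sketch of interactive reordering: planet candidates re-sorted
# by a user-chosen attribute (radius, temperature, etc.).
planets = [
    {"name": "KOI-1", "radius_earths": 2.3, "temp_k": 800},
    {"name": "KOI-2", "radius_earths": 0.9, "temp_k": 290},
    {"name": "KOI-3", "radius_earths": 11.0, "temp_k": 1500},
]

def reorder(candidates, attribute):
    """Return candidates sorted by the selected attribute."""
    return sorted(candidates, key=lambda p: p[attribute])

print([p["name"] for p in reorder(planets, "radius_earths")])
# ['KOI-2', 'KOI-1', 'KOI-3']
```

In an interactive graphic, a control widget would call a function like this each time the user picks a new variable, and the display would animate between the two orderings.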

Artist Golan Levin demonstrated how art and data can be used to provoke thoughts and ideas. One example is a project called The Secret Life of Numbers, in which he counted the number of pages that Google returned when searching for each integer from 0 to 1,000,000. He used those data to create an interactive graphic that shows interesting trends, such as the popularity of certain numbers like 911, 1234, and 90210.

Anja-Silvia Goeing, a lecturer in history at Caltech, described the history of data visualization, highlighting centuries-old depictions of data, including maps of diseases in London; drawings and etchings of architecture, mechanical devices, and human anatomy; letter collections and address books as manifestations of social networks; and physical models of crystals and gemstones.

Data visualization, Goeing noted, has been around for many generations. What's new now is the need to visualize lots of complex data, and that, the symposium speakers argued, means we need to change how we think about data visualization. The idea that it is simply a technique or a tool is limiting, said designer Eric Rodenbeck during his presentation. Instead, we must think of visualization as a medium through which data can be explored, understood, and communicated.

This summer, Mushkin, along with the other co-organizers of the symposium—Scott Davidoff, the manager of the human interfaces group at JPL, and Maggie Hendrie, the chair of interaction design at Art Center—are mentoring undergraduate students from around the country to work on data-visualization research projects at Caltech and JPL. One group of students will work with two Caltech researchers—Ralph Adolphs, Bren Professor of Psychology and Neuroscience and professor of biology, and director of the Caltech Brain Imaging Center, and Mike Tyszka, associate director of the Caltech Brain Imaging Center—to visualize neural interactions deep inside the brain. Another will work with Caltech professor of aeronautics Beverley McKeon to visualize how fluid flows over walls. The third project involves systems engineering at JPL, challenging students to create a tool to visualize the complex process by which space missions are designed. Mushkin, Davidoff, and Hendrie also plan to invite more speakers to Caltech to talk about data visualization later in the summer.

You can watch all of the symposium talks on Caltech's YouTube channel.


Writer: 
Marcus Woo
Tags: 
Listing Title: 
Symposium: "Emerging Science of Big-Data Visualization"
Contact: 
Exclude from News Hub: 
No
News Type: 
In Our Community
