Fred Raichlen (1932–2014)

Fredric ("Fred") Raichlen, professor emeritus of civil and mechanical engineering at Caltech, passed away on December 13, 2014. He was 82 years old. Raichlen was an expert on the mechanics of tsunamis, the waves created by underwater earthquakes, volcanic eruptions, and other geologic events.

Most waves propagate through the water at and just below the surface. A tsunami is fundamentally different. Because it is driven by a movement in the earth's crust, the wave extends from the seafloor all the way up to the surface. As the tsunami approaches land, the transition from deep to shallow water concentrates the wave's energy. The local topography off- and onshore focuses the onrushing wall of water, and utter devastation can follow. (The tsunami caused by the 2011 Tohoku earthquake reached six miles inland in some places and killed more than 18,000 people.)
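The energy concentration in shallow water can be illustrated with Green's law, a standard shallow-water approximation under which wave amplitude grows as the fourth root of the depth ratio. A minimal sketch, with illustrative numbers of my choosing rather than figures from Raichlen's work:

```python
# Green's law: a shallow-water wave moving from depth h1 to depth h2
# has its amplitude scale as a2 = a1 * (h1 / h2) ** 0.25.
# Illustrative numbers only, not measurements from any specific event.

def shoaled_amplitude(a1, h1, h2):
    """Amplitude after traveling from depth h1 (m) to depth h2 (m)."""
    return a1 * (h1 / h2) ** 0.25

# A 1 m tsunami in 4000 m of open ocean reaching 10 m of coastal water:
a_coast = shoaled_amplitude(1.0, 4000.0, 10.0)
print(f"Amplitude near shore: {a_coast:.1f} m")  # roughly 4.5 m
```

This simple scaling ignores the local focusing effects that Raichlen's harbor-scale experiments were designed to capture, which is precisely why physical models remained necessary.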

Raichlen's wave-tank experiments enabled him to develop three-dimensional computer models of how tsunamis originate, propagate through the open ocean, and eventually run up on land. He simulated the run-up in a tank 31 feet long by 15 feet wide that was big enough to hold a scale model of an entire harbor. These experiments allowed him to design features such as breakwaters to protect a specific port or to determine where not to site vulnerable structures such as railroad tracks and oil tanks. In other experiments he determined how fast different regions within a wave move as the wave breaks, which in turn allowed him to calculate the force of the wave's impact. He also investigated the effects of the waves on floating structures such as ships moored in the harbor.

Raichlen earned his bachelor's degree in engineering from the Johns Hopkins University in 1953, and his master's and doctoral degrees at MIT in 1955 and 1962. He came to Caltech as an assistant professor of civil engineering in 1962, and he was promoted to associate professor in 1967 and to professor in 1972. In 1969, he became one of the founding faculty members of Caltech's doctoral program in environmental engineering science. He was appointed professor of civil and mechanical engineering in 1997 and professor emeritus in 2001.

Raichlen was a member of the National Academy of Engineering, and in 1994 he received the American Society of Civil Engineers' John G. Moffatt–Frank E. Nichol Harbor and Coastal Engineering Award.

The funeral will be Thursday, December 18th at 11:30 a.m. at Mt. Sinai Memorial Park, 5950 Forest Lawn Drive, Los Angeles.

A full obituary will be published at a later date.

Writer: dsmith

Controlling Light on a Chip at the Single-Photon Level

Watson Lecture Preview

Integrating optics and electronics into systems such as fiber-optic data links has revolutionized how we transmit information. A second revolution awaits as researchers develop chips in which individual atoms control the movement of light through on-chip optical "wires," and in which photons could replace electrons as the vehicle for computation. Andrei Faraon (BS '04), an assistant professor of applied physics and materials science in the Division of Engineering and Applied Science, presents a preview of this revolution at 8 p.m. on Wednesday, December 17, in Caltech's Beckman Auditorium. Admission is free.

 

Q: What do you do?

A: My field is nanophotonics, which means that I build ultrasmall devices to control light. Electronic devices use silicon chips to control the flow of electrical charge and to store it. We're trying to do the same thing with light, using single atoms or small ensembles of atoms that act like the transistors in electronic circuits. We're building devices out of complex oxides that are similar to glass, or to quartz, but contain atoms called lanthanides that have very interesting optical properties. Lanthanide atoms can absorb light and then release it on demand at some later time. So we're making a memory device. We want to take a single photon, put it in this device, and then push a button and release that photon back into a larger optical circuit.

These devices look like the world's smallest Toblerone bars, except that each tiny triangle is attached to its neighbors at the tip instead of the base. Each bar is about ten microns long, which is about the diameter of a red blood cell, and we make them using techniques similar to those used to make computer chips. The only difference is that we're not using silicon, but these complex oxides containing lanthanide atoms.

In order to use these devices, we would have them connected to one another by on-chip optical waveguides, or to a larger system by optical fibers. The signal would come in through the fiber, get stored in our Toblerone bar for a while, and then be released back into the fiber at the appropriate time. We dream of someday having billions of components on the same chip, but even one bit of optical memory can be useful.

 

Q: What would you use a one-bit memory for?

A: It can be used in systems that transmit information with absolute security by what's called quantum communication. You can create an absolutely secure cryptographic key by sending a sequence of single photons, one at a time. Then you use the key to encrypt your data, which you can now send openly over the Internet. The systems that generate these keys over long distances work by having one party send a series of single photons from point A to point C, while another party sends a second series from point B to point C. At point C you perform an operation that combines the photons, and the result is used to establish the key.

However, optical fibers are not 100 percent efficient, especially over long distances. When you transmit photons through hundreds of kilometers of cable, some of them are going to get absorbed by the fiber. This isn't a big problem when each bit of information is encoded in a big pulse of light, but quantum communication depends on being able to interfere two paired single photons with each other. Often, when the photon from point A arrives at point C, you will have to store it until the photon from point B comes in. It might take a few tries before the second photon arrives, and until then the first photon just waits for it.

 

Q: How did you get into this line of work?

A: When I was an undergraduate here, I really liked quantum physics. But I also had a passion for doing things that have some applications, so I went into applied physics for my PhD at Stanford. Transmitting information via cryptographic keys is one of the applications of quantum physics, so I got into this field for my PhD, and continued it during my postdoc at HP Laboratories, and now as a faculty member here.

 

Q: Bonus question: What's it like coming back as a professor after having been here as a student?

A: It feels great. Coming back to Caltech is a privilege. As a faculty member you have the satisfaction of working with all these extremely smart kids. And my wife [Assistant Professor of Biology Viviana Gradinaru (BS '05)] and I go to many undergraduate events where we connect with the students, so it feels good.

 

Named for the late Caltech professor Earnest C. Watson, who founded the series in 1922, the Watson Lectures present Caltech and JPL researchers describing their work to the public. Many past Watson Lectures are available online at Caltech's iTunes U site.

 

Writer: Douglas Smith

Quantum Code-Cracking: An Interview with Thomas Vidick

Quantum computers, looked to as the next generation of computing technology, are expected to one day vastly outperform conventional computers. Using the laws of quantum mechanics—the physics that governs the behavior of matter and light at the atomic and subatomic scales—these computers will allow us to move and analyze enormous amounts of information with unprecedented speed. Although engineers have yet to actually build such a machine, Assistant Professor of Computing and Mathematical Sciences Thomas Vidick is figuring out how some of the principles of quantum computing can be applied right now, using today's technology.

Originally from Belgium, Vidick received his BS from École Normale Supérieure in Paris in 2007 and his master's degree from Université Paris Diderot, also in 2007. He earned his doctorate from UC Berkeley in 2011. Vidick joined the Division of Engineering and Applied Science at Caltech in June from MIT, where he was a postdoctoral associate.

This fall, he spoke with us about quantum methods for encrypting information, what he's looking forward to at Caltech, and his ongoing search for the best croissants in Los Angeles.

 

What are your research interests?

My area is quantum computing, so it's the computer science of quantum physics. Classical computers—like the computer on my desk—work based on the laws of classical mechanics. They just manipulate bits and do various operations. However, in the 1970s people started to wonder what kinds of computational processes could be realized using quantum-mechanical systems. They ended up discovering algorithms that in some cases can be more efficient, or that can implement certain tasks that were not possible with classical computers.

In my research, I look at two things. One, what are the kinds of procedures that you can implement more efficiently using quantum computers? And, two, what kinds of cryptographic systems—ways to encrypt information securely—can you come up with using quantum systems that could be more secure than classical systems? It's all about this exploration of what quantum systems allow us to do that classical systems don't or can't.

 

Quantum computers haven't been invented yet, so how do you do this work?

That's a good question, and there are several different answers. Some of my research is very theoretical, and it's just about saying, "If we had a quantum computer, what could we do with it?" We don't have a quantum computer yet because it's very hard to manipulate and control quantum systems on a large scale. But that is just an engineering problem, so what people say is that yes it's very hard, but in 10 years, we'll get to it. And the theory is also very hard, so we might as well get started right now.

That's one answer. But the better answer is that a lot of what I do and a lot of what I'm interested in doesn't require or depend on whether we can actually build a quantum computer or not. For instance, the cryptographic aspects of quantum computing are already being implemented. There are start-ups that already sell quantum cryptographic systems on the Internet because these systems only require the manipulation of very-small-scale quantum systems.

We can also do some computations about properties of quantum-mechanical systems on a classical computer. One branch of my research has to do with how you can come up with classical algorithms for computing the properties of systems that are described by the laws of quantum mechanics. The most natural way to understand the systems would be to have a quantum computer and then use the quantum computer to simulate the evolution of the quantum-mechanical system. Since we don't have a quantum computer, we have to develop these algorithms using a classical computer and our understanding of the quantum-mechanical system.

 

Can you give a real-world example of how this work might affect the ways in which an average person uses a computer in the future?

One of the most basic quantum cryptographic tasks is to come up with a secret key or passcode to encrypt communication. For instance, the two of us, we trust one another, but we're far away from each other. We want to come up with a secret key—just some sort of passcode that we're going to use to encrypt our communication later. I could dream up the passcode and then tell it to you over the phone, but if someone listens to the line, it's not secure. Someone might always be listening in on the line, so classically there is no procedure to exchange secret keys between us, unless we meet up in person.

However, it is known that if we are able to send quantum messages, then actually we could do it. How this works is that, instead of sending you a passcode of my choice, I would send you a bunch of photons, which are quantum particles, prepared in a completely random state. There is then a whole quantum protocol in which you need to measure the photons, but the main point is that at the end, we'll each be able to extract the exact same passcode: me from the way the photons were prepared, and you from the measurement results. The code will be random, but we'll both know it.
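The general idea can be caricatured in a few lines of code. Below is a toy BB84-style "sifting" step (my sketch of this family of protocols, not Vidick's own work), assuming an ideal channel with no losses and no eavesdropper: the sender encodes random bits in randomly chosen bases, the receiver measures in randomly chosen bases, and the positions where the bases happen to match yield identical bits on both sides.

```python
import random

# Toy BB84-style key sifting (illustrative sketch, ideal channel assumed).
# When sender and receiver happen to use the same basis, the measurement
# deterministically returns the sent bit; mismatched positions give random
# results and are discarded ("sifting").

def bb84_sift(n, rng):
    sender_bits  = [rng.randint(0, 1) for _ in range(n)]
    sender_bases = [rng.randint(0, 1) for _ in range(n)]
    recv_bases   = [rng.randint(0, 1) for _ in range(n)]
    key_sender, key_recv = [], []
    for bit, b_s, b_r in zip(sender_bits, sender_bases, recv_bases):
        if b_s == b_r:
            key_sender.append(bit)
            key_recv.append(bit)  # no eavesdropper, so bits agree exactly
    return key_sender, key_recv

rng = random.Random(0)
key_a, key_b = bb84_sift(16, rng)
print("shared key:", key_a)
assert key_a == key_b  # both parties now hold the same random passcode
```

In a real implementation the two parties would also compare a random sample of their bits publicly; as the next answer explains, any eavesdropping disturbs the photons in a detectable way.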

And because of the laws of quantum mechanics, if anyone has been listening on the line—intercepting the photons—we'll be able to tell. The reason for this is that any disturbance of the photons' quantum-mechanical states can be detected from the correlations between the outcomes of the measurements and the initial state of the photons. This is called an "entropy-disturbance tradeoff": if the eavesdropper perturbs the photons, then the outcome distribution you observe is affected in a way that can be checked. This is a uniquely quantum phenomenon, and it allows distant parties to establish a secret key or a passcode between them in a perfectly secure way.

 

How does your work address this?

This system of sending quantum messages was discovered in the '80s, and, as I said before, people are already implementing it. But there is one big drawback to quantum cryptography, and that's that you need quantum equipment to do it—and this quantum equipment tends to be really clunky. It's very hard to build a machine that sends photons one by one, and since single photons are easily lost, it's also hard to make accurate measurements. Yet for the message to be secure, you need a machine that can generate single photons and a machine that can detect them.

In practice, we don't have such machines. We have these huge clunky machines that can sort of do it, but they're never perfect. My work tries to bypass the need for these machines, with cryptographic protocols and proofs of security that are secure even if you can't make or see the quantum part of the protocol. To do this, we model the quantum equipment just as a black box. So my work has been to try to get these strong proofs of security into a model where we only really trust the interactions we can see in the classical world. It's a proof of security that holds independently of whether the quantum part of the device works in the way that we think it does.

 

How did you get into this field?

I was doing math originally. I was doing number theory as an undergrad and I liked it a lot. But then I did an internship, and I realized that I couldn't tell anyone why I was asking the questions I was asking. So I thought, "I need a break from this. Whatever I do for my life, I need to know why I'm doing it." The best alternative I could think of was computer science, because it seemed more concrete. And this was when I learned that quantum computing existed—I didn't know before. I think what's most interesting about it is that you're talking about the world—because the world is quantum mechanical. Physics describes the world.

That's what I really like, because from my point of view everything I do is still very theoretical work and I like doing theoretical work. I like the beauty of it. I like the abstractness of it. I like that you have well-posed problems and you can give well-defined answers. But I also like the fact that in the end you are talking about or trying to talk about real-world physics. So every time I think "Why am I doing this?" or "What should I do?" I try to think of how I can connect it to a real, concrete question.

 

How did you get interested in math and computer science when you were a kid?

My dad was a chemist, but he worked as an engineer, and he would come home from work and bring different experiments with liquid nitrogen or whatever.

I guess he gave me sort of a scientific mind, but then why did I do math problems? Probably like most people good at math, I was just good at it for some reason and it was just easy. Math is so beautiful when you understand it. Throughout middle school and high school, I just enjoyed it so much. But then, as I said, eventually I stretched my limits in math a little bit.

 

What are you excited about in terms of coming to Caltech?

I really like the Computing and Mathematical Sciences department here—it's a young department and it's a small department. For me it's very unique in that there's a very strong group in quantum information—especially the physics side of quantum information, like my neighbor here, John Preskill. Caltech has a very strong group in quantum information and also has a very strong group in computer science. And so, from the point of view of my research, this is just the perfect place.

And then there are the mountains. I love the mountains—they're just beautiful. This is how I abandoned the smoky Paris cafes. I had to think about the mountains. You can't beat the view from my office, and I can go hike up there.

 

Other than hiking, do you have any hobbies or interests that are outside your research?

I also like to bike up mountains. I did that a lot when I came here, but then I fractured my collarbone while biking. It's almost better now, but I still haven't gotten back on the bike yet. Another thing that is an investment of time—and I'm really worried about that one—is croissant hunting. I really like croissants and chocolates. I'm from Belgium, and Belgium is pretty big on chocolate. I've already been to a lot of famous croissant and chocolate places in L.A., but I haven't found something that has lived up to my standards yet. I haven't done everything though, so I'm open to recommendations.


New Technique Could Harvest More of the Sun's Energy

As solar panels become less expensive and capable of generating more power, solar energy is becoming a more commercially viable alternative source of electricity. However, the photovoltaic cells now used to turn sunlight into electricity can only absorb and use a small fraction of that light, and that means a significant amount of solar energy goes untapped.

A new technology created by researchers from Caltech, and described in a paper published online in the October 30 issue of Science Express, represents a first step toward harnessing that lost energy.

Sunlight is composed of many wavelengths of light. In a traditional solar panel, silicon atoms are struck by sunlight and the atoms' outermost electrons absorb energy from some of these wavelengths of sunlight, causing the electrons to get excited. Once the excited electrons absorb enough energy to jump free from the silicon atoms, they can flow independently through the material to produce electricity. This is called the photovoltaic effect—a phenomenon that takes place in a solar panel's photovoltaic cells.

Although silicon-based photovoltaic cells can absorb light wavelengths that fall in the visible spectrum—light that is visible to the human eye—longer wavelengths such as infrared pass right through the silicon and never get converted to electricity; in the case of infrared, that energy is normally lost as unwanted heat.
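The cutoff can be made concrete with a back-of-the-envelope calculation: a photon can free an electron in silicon only if its energy hc/λ exceeds the material's band gap, roughly 1.1 eV. A minimal sketch using textbook constants (my illustration, not a calculation from the paper):

```python
# Photon energy E = h*c / wavelength, compared with silicon's ~1.1 eV band gap.
H = 6.626e-34        # Planck constant, J*s
C = 2.998e8          # speed of light, m/s
EV = 1.602e-19       # joules per electronvolt
SI_BANDGAP_EV = 1.1  # approximate band gap of crystalline silicon

def photon_energy_ev(wavelength_nm):
    """Energy of a photon of the given wavelength, in electronvolts."""
    return H * C / (wavelength_nm * 1e-9) / EV

for name, wl in [("green (visible)", 550), ("near cutoff", 1100), ("infrared", 1600)]:
    e = photon_energy_ev(wl)
    print(f"{name}: {wl} nm -> {e:.2f} eV, absorbed by silicon: {e > SI_BANDGAP_EV}")
```

A 550 nm green photon carries about 2.3 eV and is absorbed; a 1600 nm infrared photon carries under 0.8 eV and sails through, which is the energy the technique described below aims to capture.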

"The silicon absorbs only a certain fraction of the spectrum, and it's transparent to the rest. If I put a photovoltaic module on my roof, the silicon absorbs that portion of the spectrum, and some of that light gets converted into power. But the rest of it ends up just heating up my roof," says Harry A. Atwater, the Howard Hughes Professor of Applied Physics and Materials Science and director of the Resnick Sustainability Institute, who led the study.

Now, Atwater and his colleagues have found a way to absorb and make use of these infrared waves with a structure composed not of silicon, but entirely of metal.

The new technique they've developed is based on a phenomenon observed in metallic structures known as plasmon resonance. Plasmons are coordinated waves, or ripples, of electrons that exist on the surfaces of metals at the point where the metal meets the air.

While the plasmon resonances of metals are predetermined in nature, Atwater and his colleagues found that those resonances can be tuned to other wavelengths when the metals are fashioned into tiny nanostructures in the lab.

"Normally in a metal like silver or copper or gold, the density of electrons in that metal is fixed; it's just a property of the material," Atwater says. "But in the lab, I can add electrons to the atoms of metal nanostructures and charge them up. And when I do that, the resonance frequency will change."

"We've demonstrated that these resonantly excited metal surfaces can produce a potential"—an effect very similar to rubbing a glass rod with a piece of fur, which deposits electrons on the rod. "You charge it up, or build up an electrostatic charge that can be discharged as a mild shock," he says. "So similarly, exciting these metal nanostructures near their resonance charges up those structures, producing an electrostatic potential that you can measure."

This electrostatic potential is a first step in the creation of electricity, Atwater says. "If we can develop a way to produce a steady-state current, this could potentially be a power source." He envisions a solar cell using the plasmoelectric effect someday being used in tandem with photovoltaic cells to harness both visible and infrared light for the creation of electricity.

Although such solar cells are still on the horizon, the new technique could even now be incorporated into new types of sensors that detect light based on the electrostatic potential.

"Like all such inventions or discoveries, the path of this technology is unpredictable," Atwater says. "But any time you can demonstrate a new effect to create a sensor for light, that finding has almost always yielded some kind of new product."

This work was published in a paper titled "Plasmoelectric Potentials in Metal Nanostructures." Other coauthors include first author Matthew T. Sheldon, a former postdoctoral scholar at Caltech; Ana M. Brown, an applied physics graduate student at Caltech; and Jorik van de Groep and Albert Polman from the FOM Institute AMOLF in Amsterdam. The study was funded by the Department of Energy, the Netherlands Organization for Scientific Research, and an NSF Graduate Research Fellowship.


New Center Supports Data-Driven Research

With the advanced capabilities of today's computer technologies, researchers can now collect vast amounts of information with unprecedented speed. However, gathering information is only one half of a scientific discovery, as the data also need to be analyzed and interpreted. A new center on campus aims to hasten such data-driven discoveries by making expertise and advanced computational tools available to Caltech researchers in many disciplines within the sciences and the humanities.

The new Center for Data-Driven Discovery (CD3), which became operational this fall, is a hub for researchers to apply advanced data exploration and analysis tools to their work in fields such as biology, environmental science, physics, astronomy, chemistry, engineering, and the humanities.

The Caltech center will also complement the resources available at JPL's Center for Data Science and Technology, says director of CD3 and professor of astronomy George Djorgovski.

"Bringing together the research, technical expertise, and respective disciplines of the two centers to form this joint initiative creates a wonderful synergy that will allow us opportunities to explore and innovate new capabilities in data-driven science for many of our sponsors," adds Daniel Crichton, director of the Center for Data Science and Technology at JPL.

At the core of the Caltech center are staff members who specialize in both computational methodology and various domains of science, such as biology, chemistry, and physics. Faculty-led research groups from each of Caltech's six divisions and JPL will be able to collaborate with center staff to find new ways to get the most from their research data. Resources at CD3 will range from data storage and cataloguing that meet the highest "housekeeping" standards, to custom data-analysis methods that combine statistics with machine learning—the development of algorithms that can "learn" from data. The staff will also help develop new research projects that could benefit from large amounts of existing data.

"The volume, quality, and complexity of data are growing such that the tools that we used to use—on our desktops or even on serious computing machines—10 years ago are no longer adequate. These are not problems that can be solved by just buying a bigger computer or better software; we need to actually invent new methods that allow us to make discoveries from these data sets," says Djorgovski.

Rather than turning to off-the-shelf data-analysis methods, Caltech researchers can now collaborate with CD3 staff to develop new customized computational methods and tools that are specialized for their unique goals. For example, astronomers like Djorgovski can use data-driven computing in the development of new ways to quickly scan large digital sky surveys for rare or interesting targets, such as distant quasars or new kinds of supernova explosions—targets that can be examined more closely with telescopes, such as those at the W. M. Keck Observatory, he says.

Mary Kennedy, the Allen and Lenabelle Davis Professor of Biology and a coleader of CD3, says that the center will serve as a bridge between the laboratory-science and computer-science communities at Caltech. In addition to matching up Caltech faculty members with the expertise they will need to analyze their data, the center will also minimize the gap between those communities by providing educational opportunities for undergraduate and graduate students.

"Scientific development has moved so quickly that the education of most experimental scientists has not included the techniques one needs to synthesize or mine large data sets efficiently," Kennedy says. "Another way to say this is that 'domain' sciences—biology, engineering, astronomy, geology, chemistry, sociology, etc.—have developed in isolation from theoretical computer science and mathematics aimed at analysis of high-dimensional data. The goal of the new center is to provide a link between the two."

Work in Kennedy's laboratory focuses on understanding what takes place at the molecular level in the brain when neuronal synapses are altered to store information during learning. She says that methods and tools developed at the new center will assist her group in creating computer simulations that can help them understand how synapses are regulated by enzymes during learning.

"The ability to simulate molecular mechanisms in detail and then test predictions of the simulations with experiments will revolutionize our understanding of highly interconnected control mechanisms in cells," she says. "To some, this seems like science fiction, but it won't stay fictional for long. Caltech needs to lead in these endeavors."

Assistant Professor of Biology Mitchell Guttman says that the center will also be an asset to groups like his that are trying to make sense out of big sets of genomic data. "Biology is becoming a big-data science—genome sequences are available at an unprecedented pace. Whereas it took more than $1 billion to sequence the first genome, it now costs less than $1,000," he says. "Making sense of all this data is a challenge, but it is the future of biomedical research."

In his own work, Guttman studies the genetic code of lncRNAs, a new class of gene that he discovered, largely through computational methods like those available at the new center. "I am excited about the new CD3 center because it represents an opportunity to leverage the best ideas and approaches across disciplines to solve a major challenge in our own research," he says.

But the most valuable findings from the center could be those that stem not from a single project, but from the multidisciplinary collaborations that CD3 will enable, Djorgovski says. "To me, the most interesting outcome is to have successful methodology transfers between different fields—for example, to see if a solution developed in astronomy can be used in biology," he says.

In fact, one such crossover method has already been identified, says Matthew Graham, a computational scientist at the center. "One of the challenges in data-rich science is dealing with very heterogeneous data—data of different types from different instruments," says Graham. "Using the experience and the methods we developed in astronomy for the Virtual Observatory, I worked with biologists to develop a smart data-management system for a collection of expression and gene-integration data for genetic lines in zebrafish. We are now starting a project along similar methodology transfer lines with Professor Barbara Wold's group on RNA genomics."

And, through the discovery of more tools and methods like these, "the center could really develop new projects that bridge the boundaries between different traditional fields through new collaborations," Djorgovski says.


Converting Data Into Knowledge: An Interview with Yisong Yue

When a movie-streaming service recommends a new film you might like, sometimes that recommendation becomes a new favorite; other times, the computer's suggestion really misses the mark. Yisong Yue, assistant professor of computing and mathematical sciences, is interested in how systems like these can better "learn" from human behavior as they turn raw data into actionable knowledge—a concept called machine learning.

Yue joined the Division of Engineering and Applied Science at Caltech in September after spending a year as a research scientist at Disney Research. Born in Beijing and raised in Chicago, Yue completed a bachelor's degree at the University of Illinois in 2005 and a doctorate at Cornell University in 2010, then worked as a postdoctoral researcher at Carnegie Mellon until 2013.

Recently he spoke with us about his research interests, his hobbies, and what he is looking forward to here at Caltech.

 

What is your main area of research?

My main research interests are in machine learning. Machine learning is the study of how computers can take raw data or annotated data and convert that into knowledge and actionable items, ideally in a fully automated way—because it's one thing to just have a lot of data, but it's another thing to have knowledge that you can derive from that data.

 

Is machine learning a general concept that can be applied to many different fields?

That's right. Machine learning is becoming a more and more general tool as we become a more digital society. In the past, some of my research has been applied in areas such as data-driven animation, sports analytics, personalized recommender systems, and adaptive urban transportation systems.

 

What application of this work are you most excited about right now?

This is tough because I'm excited about all of them, really, but if I had to just pick one, it would be human-in-the-loop machine learning. The idea is that although we would love to have computers that can derive knowledge from data in a fully automated way, oftentimes the problem is too difficult or it would take too long. So machine learning with humans in the loop acknowledges that we can learn from how humans behave in a system.

I think that we are entering a society where we depend on digital systems for basically everything we do. And that means we have an opportunity to learn from humans how to optimize our daily lives. Because human interaction with digital systems is so ubiquitous, I think learning with humans in the loop is a very compelling research agenda moving forward.

 

Can you give an example of human-in-the-loop machine learning that we experience on a daily basis?

One example of human-in-the-loop learning that we experience fairly regularly is a personalized recommender system. Many websites have a recommendation system built into them, and the system would like to provide personalized recommendations to maximize each user's feedback and engagement with the system. However, when there is a brand-new user, the system doesn't really understand their interests. What the system can do is recommend some stuff and see if the user likes it or not, and their response—thumbs up, thumbs down, or whatever—is an indicator of the topics or content this user is interested in. You see this sort of closed loop between a machine learning system that's trying to learn how best to personalize to a user and a user who's using the system and providing feedback on the fly.
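The closed loop Yue describes can be illustrated with a toy bandit-style learner. This is only a minimal sketch of the general idea, not Yue's actual system; the class name, topics, and epsilon-greedy strategy are all assumptions chosen for illustration.

```python
import random

class RecommenderLoop:
    """Toy epsilon-greedy recommender: show items, observe thumbs up/down,
    and shift future recommendations toward well-received topics.
    (Illustrative sketch only; names and logic are hypothetical.)"""

    def __init__(self, topics, epsilon=0.1):
        self.epsilon = epsilon
        self.clicks = {t: 0 for t in topics}  # thumbs-up count per topic
        self.shows = {t: 0 for t in topics}   # times each topic was shown

    def recommend(self):
        # Mostly exploit the best-rated topic; occasionally explore,
        # which is how the system learns a brand-new user's interests.
        if random.random() < self.epsilon:
            return random.choice(list(self.shows))
        return max(self.shows, key=lambda t:
                   self.clicks[t] / self.shows[t] if self.shows[t] else 1.0)

    def feedback(self, topic, liked):
        # The user's thumbs up/down closes the loop.
        self.shows[topic] += 1
        if liked:
            self.clicks[topic] += 1
```

Each `recommend`/`feedback` round is one pass through the closed loop: the system acts, the human responds, and the response updates what the system shows next.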

 

You also mentioned animation. How is your work applied in that field?

Before I came to Caltech, I spent one year as a research scientist at Disney Research. I worked on both sports analytics and data-driven animation. With regard to the animation, the basic idea is as follows: you take data about how humans talk in a natural sentence-speaking setting, and then you try to automatically generate natural lip movements or facial movements that correspond to the types of sentences that people would normally say. This is something that people at Disney Research have been working on for a while, so they have a lot of expertise here.

One of the things that you notice many times with animation is that either the character's lip movements are fairly unrealistic—like their mouths just open and close—or in the big-budget movies, it takes a team of artists to manually animate the character's lips. An interesting in-between technology would be to have fairly realistic automatically generated lip movements and facial movements to any type of sentence.

 

What are you looking forward to now that you're at Caltech?

Here I have a combination of research independence, talented colleagues, and support for my research endeavors—and a great culture of intellectual curiosity.

It's such a tight-knit community. It's one of the smallest institutions that I'm familiar with, and what that implies is that basically everyone knows everyone else. The great thing about that is that if you have a question about something that you may not be so knowledgeable about, it's really not that big of a deal to go down the block to talk to someone who works in that field, and you can get information and insight from that person.

 

Have you already begun collaborating with any of your new colleagues?

I'm starting a collaboration with Professor Pietro Perona [Allen E. Puckett Professor of Electrical Engineering] from electrical engineering and Professor Frederick Eberhardt [Professor of Philosophy]. In that collaboration, we'll be addressing a problem that biologists and neuroscientists at Caltech face in assessing how genes affect behavior. These researchers modify the genes of animals—such as fruit flies—and then they video the animal's resulting behaviors. The problem is that researchers don't have time to manually inspect hours upon hours of video to find the particular behavior they're interested in. Professor Perona has been working on this challenge in the past few years, and I was recently brought in to become a part of this collaboration because I work on machine learning and big-data analysis.

The goal is to develop a way to take raw video data of animals under various conditions and try to automatically digest, process, and summarize the significant behaviors in that video data, such as an aggressive attack or attempt to mate.

 

Tell us a little bit about your background.

It is a bit all over the place. I was born in Beijing. I moved to Chicago when I was fairly young, and I spent most of my childhood in Chicago and the surrounding areas. But my parents actually moved out of Chicago after my sister and I left for college, and so I really don't have any relatives or strong ties to Chicago anymore. Where I call home is … I don't really know where I call home. I guess Pasadena is my home.

 

Do you have any hobbies outside of your research?

I like hiking and photography, and I'm really excited to try some of the hiking trails in the area and to bring my camera and my tripod with me.

I have a few other hobbies, although I don't really have the time to do them as much now. I was part of an improv group in high school, and I did a fair amount of comedic acting. I wasn't very good at it, so it's not something I can really brag about, but it was fun. I am also an avid eSports fan. For instance, I love watching and playing StarCraft.

Writer: 
Exclude from News Hub: 
No
News Type: 
In Our Community

Heat Transfer Sets the Noise Floor for Ultrasensitive Electronics

A team of engineers and scientists has identified a source of electronic noise that could affect the functioning of instruments operating at very low temperatures, such as devices used in radio telescopes and advanced physics experiments.

The findings, detailed in the November 10 issue of the journal Nature Materials, could have implications for the future design of transistors and other electronic components.

The electronic noise the team identified is related to the temperature of the electrons in a given device, which in turn is governed by heat transfer due to packets of vibrational energy, called phonons, that are present in all crystals. "A phonon is similar to a photon, which is a discrete packet of light," says Austin Minnich, an assistant professor of mechanical engineering and applied physics in Caltech's Division of Engineering and Applied Science and corresponding author of the new paper. "In many crystals, from ordinary table salt to the indium phosphide crystals used to make transistors, heat is carried mostly by phonons."

Phonons are important for electronics because they help carry away the thermal energy that is injected into devices in the form of electrons. How swiftly and efficiently phonons ferry away heat is partly dependent on the temperature at which the device is operated: at high temperatures, phonons collide with one another and with imperfections in the crystal in a phenomenon called scattering, and this creates phonon traffic jams that result in a temperature rise.

One way that engineers have traditionally reduced phonon scattering is to use high-quality materials that contain as few defects as possible. "The fewer defects you have, the fewer 'road blocks' there are for the moving phonons," Minnich says.

A more common solution, however, is to operate electronics in extremely cold conditions because scattering drops off dramatically when the temperature dips below about 50 kelvins, or about –370 degrees Fahrenheit. "As a result, the main strategy for reducing noise is to operate the devices at colder and colder temperatures," Minnich says.

But the new findings by Minnich's team suggest that while this strategy is effective, another phonon transfer mechanism comes into play at extremely low temperatures and severely restricts the heat transfer away from a device.

Using a combination of computer simulations and real-world experiments, Minnich and his team showed that at around 20 kelvins, or –424 degrees Fahrenheit, the high-energy phonons that are most efficient at transporting heat away quickly are unlikely to be present in a crystal. "At 20 kelvins, many phonon modes become deactivated, and the crystal has only low-energy phonons that don't have enough energy to carry away the heat," Minnich says. "As a result, the transistor heats up until the temperature has increased enough that high-energy phonons become available again."

As an analogy, Minnich says to imagine an object that is heated until it is white hot. "When something is white hot, the full spectrum of photons, from red to blue, contribute to the heat transfer, and we know from everyday experience that something white hot is extremely hot," he says. "When something is not as hot it glows red, and in this case heat is only carried by red photons with low energy. The physics for phonons is exactly the same—even the equations are the same."
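The freeze-out Minnich describes follows from Bose-Einstein statistics, which set the average number of phonons occupying a mode of a given energy at a given temperature. The short calculation below is only an illustration of that statistical effect; the 30 meV mode energy is a representative optical-phonon scale chosen for the example, not a value from the paper.

```python
import math

def bose_einstein_occupation(energy_mev, temperature_k):
    """Mean number of phonons in a mode of the given energy (meV)
    at temperature T (K), from Bose-Einstein statistics."""
    k_b = 8.617e-2  # Boltzmann constant in meV per kelvin
    # expm1 computes exp(x) - 1 accurately for small x
    return 1.0 / math.expm1(energy_mev / (k_b * temperature_k))

# A representative high-energy phonon mode (~30 meV), evaluated at
# room temperature and at cryogenic temperatures:
for t in (300, 77, 20):
    n = bose_einstein_occupation(30.0, t)
    print(f"T = {t:3d} K: <n> = {n:.2e}")
```

At room temperature such a mode is well populated, but by 20 kelvins its occupation collapses by many orders of magnitude: the high-energy, heat-carrying phonons are effectively absent, just as the "white hot" spectrum narrows to only low-energy red photons as an object cools.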

The electronic noise that the team identified has been known about for many years, but until now it was not thought to play an important role at low temperatures. That discovery happened because of a chance encounter between Minnich and Joel Schleeh, a postdoctoral scholar from Chalmers University of Technology in Sweden and first author of the new study, who was at Caltech visiting the lab of Sander Weinreb, a senior faculty associate in electrical engineering.

Schleeh had noticed that the noise he was measuring in an amplifier was higher than what theory predicted. Schleeh mentioned the problem to Weinreb, and Weinreb recommended he connect with Minnich, whose lab studies heat transfer by phonons. "At another university, I don't think I would have had this chance," Minnich says. "Neither of us would have had the chance to interact like we did here. Caltech is a small campus, so when you talk to someone, almost by definition they're outside of your field."

The pair's findings could have implications for numerous fields of science that rely on superchilled instruments to make sensitive measurements. "In radio astronomy, you're trying to detect very weak electromagnetic waves from space, so you need the lowest noise possible," Minnich says.

Electronic noise poses a similar problem for quantum-physics experiments. "Here at Caltech, we have physicists trying to observe certain quantum-physics effects. The signal that they're looking for is very tiny, and it's essential to use the lowest-noise electronics possible," Minnich says.

The news is not all gloomy, however, because the team's findings also suggest that it may be possible to develop engineering strategies to make phonon heat transfer more efficient at low temperatures. For example, one possibility might be to change the design of transistors so that phonon generation takes place over a broader volume. "If you can make the phonon generation more spread out, then in principle you could reduce the temperature rise that occurs," Minnich says.

"We don't know what the precise strategy will be yet, but now we know the direction we should be going. That's an improvement."

In addition to Minnich and Schleeh, the other coauthors of the paper, "Phonon blackbody radiation limit for heat dissipation in electronics," are Javier Mateos and Ignacio Iñiguez-de-la-Torre of the Universidad de Salamanca in Salamanca, Spain; Niklas Wadefalk of the Low Noise Factory AB in Mölndal, Sweden; and Per A. Nilsson and Jan Grahn of Chalmers University of Technology. Minnich's work on the project at Caltech was funded by a Caltech start-up fund and by the National Science Foundation.

Written by Ker Than


Making Hotter Engines and Lasting Artwork: An Interview with Katherine Faber

Ceramics are extremely versatile materials. Because they can be formed into a variety of shapes and serve as effective insulators from heat, they're used in thousands of applications ranging from dainty porcelain teacups to hardy barrier coatings in engines. However, the characteristic brittle nature of ceramics can often be the material's Achilles' heel. New faculty member Katherine Faber, Simon Ramo Professor of Materials Science in Caltech's Division of Engineering and Applied Science, studies the reasons why brittle ceramics fracture—and how these materials can be made stronger and tougher in the future.

Faber, who comes to Caltech from Northwestern University, received her bachelor's degree in ceramic engineering from Alfred University, her master's degree in ceramic science from Penn State, and a doctorate in materials science from UC Berkeley.

Recently, she spoke about her work, her background, and how her research interests have been applied to sustainability and the arts.

 

What will you be working on in your laboratory at Caltech?

My training is in ceramic materials, studying the fracture of brittle solids. If we understand enough about how brittle materials fail, we can then design new materials that are more robust. In particular, I am interested in high-temperature materials—those desirable for energy-related applications. This includes, for example, the ceramic coatings that are used in power-generation applications, as thermal-barrier coatings in engines. These very-high-temperature materials provide insulation, protecting underlying metallic materials, so that an engine can run hotter and hence more efficiently. Right now, we are characterizing new coating systems for next-generation engines.

 

Does your laboratory also make brand-new materials?

Yes, we do. One thing that I have found through the years is that oftentimes the materials I want to study are ones that don't exist. That has moved my research into ceramic processing. Currently we are looking at strategies to make ceramic materials that are filled with pores. Such ceramics might be used as filters, in fuel cells, or as biological scaffolds into which cells can grow. Beyond processing, it is essential to characterize the pores. It's not just the pore size, or what fraction of the material is porous, but how tortuous the pathway through the porous network is. This will determine the ease of flow or filtration capabilities. Each application may require a different level of connectivity and tortuosity. Here, we rely on three-dimensional imaging to illustrate these features.

 

Your work also touches on art conservation. How did that come about?

Yes, that's the other part of my work. It all happened rather serendipitously more than a decade ago. When I was chair of my department at Northwestern, the Art Institute of Chicago received funding to hire its very first PhD-level conservation scientist. Museum staff approached me to see if our department of materials science and engineering might be interested in collaborating with their scientist in order to provide a scholarly community and possible opportunities for research. We gladly forged the partnership. My personal involvement included projects on jades and porcelains. But I also became the matchmaker, finding the right people within the university for the right problems at the Art Institute.

Art conservation is not what I was trained to do, but involvement in museum work has become an important part of my career. It's still materials science and engineering, it's just that the "materials" in the museum projects happen to be valuable works of art.

We've been able to involve students through the years, and they, too, see this as a thrilling opportunity to take their training in materials science and engineering to one of the great museums of the world.

 

Do you also enjoy the arts in your spare time?

I've always loved going to art museums, and I'm really looking forward to exploring the museums here in Los Angeles. To be within walking distance of the Huntington Library is just extraordinary. I also love the theater, so I'm ready to discover the Pasadena Playhouse and beyond. We already have tickets for a couple of shows.

 

Are there any other reasons that you're excited to have made the move from Chicago to Southern California?

Well, given the winter that we had in Chicago last year, I am very excited that I don't have to shovel snow this winter. To wake up in the morning and go outside your door to grab a fresh grapefruit off of a tree—that's pretty cool.

 

What else are you looking forward to about being at Caltech?

One of the best things about moving to a new institution later in one's career is that it provides an opportunity to make new connections and work on new problems. That's what I'm most excited about. I suspect that as I meet people across the campus and at JPL I'll learn about a host of new research problems that will intrigue me.
