Caltech Appoints Diana Jergovic to Newly Created Position of Vice President for Strategy Implementation

Caltech has named Diana Jergovic as its vice president for strategy implementation. In the newly created position, Jergovic will collaborate closely with the president and provost, and with the division chairs, faculty, and senior leadership on campus and at the Jet Propulsion Laboratory, to execute and integrate Caltech's strategic initiatives and projects and ensure that they complement and support the overall education and research missions of the campus and JPL. This appointment returns the number of vice presidents at the Institute to six.

"Supporting the faculty is Caltech's highest priority," says Edward Stolper, provost and interim president, "and as we pursue complex interdisciplinary and institutional initiatives, we do so with the expectation that they will evolve over a long time horizon. The VP for strategy implementation will help the Institute ensure long-term success for our most important new activities."

In her present role as associate provost for academic and budgetary initiatives at the University of Chicago, Jergovic serves as a liaison between the Office of the Provost and the other academic and administrative offices on campus, and advances campus-wide strategic initiatives. She engages in efforts spanning every university function, including development, major construction, and budgeting, as well as with faculty governance and stewardship matters. Jergovic also serves as chief of staff to University of Chicago provost Thomas F. Rosenbaum, Caltech's president-elect.

"In order to continue Caltech's leadership role and to define new areas of eminence, we will inevitably have to forge new partnerships and collaborations—some internal, some external, some both," Rosenbaum says. "The VP for strategy implementation is intended to provide support for the faculty and faculty leaders in realizing their goals for the most ambitious projects and collaborations, implementing ideas and helping create the structures that make them possible. I was looking for a person who had experience in delivering large-scale projects, understood deeply the culture of a top-tier research university, and could think creatively about a national treasure like JPL."

"My career has evolved in an environment where faculty governance is paramount," Jergovic says. "Over the years, I have cultivated a collaborative approach working alongside a very dedicated faculty leadership. My hope is to bring this experience to Caltech and to integrate it into the existing leadership team in a manner that simultaneously leverages my strengths and allows us together to ensure that the Institute continues to flourish, to retain its position as the world's leading research university, and to retain its recognition as such."

Prior to her position as associate provost, Jergovic was the University of Chicago's assistant vice president for research and education, responsible for the financial management and oversight of all administrative aspects of the Office of the Vice President for Research and Argonne National Laboratory. She engaged in research-related programmatic planning with a special emphasis on the interface between the university and Argonne National Laboratory. This ranged from the development of the university's Science and Technology Outreach and Mentoring Program (STOMP), a weekly outreach program administered by university faculty, staff, and students in low-income neighborhood schools on the South Side of Chicago, to extensive responsibilities with the university's successful bid to retain management of Argonne National Laboratory.

From 1994 to 2001, Jergovic was a research scientist with the university-affiliated National Opinion Research Center (NORC) and, in 2001, served as project director for NORC's Florida Ballot Project, an initiative that examined, classified, and created an archive of the markings on Florida's 175,000 uncertified ballots from its contested 2000 presidential election.

Jergovic earned a BS in psychology and an MA and PhD in developmental psychology, all from Loyola University Chicago, and an MBA from the Booth School of Business at the University of Chicago.


An Equation to Describe the Competition Between Genes

Caltech researchers develop and verify predictive mathematical model

In biology, scientists typically conduct experiments first, and then develop mathematical or computer models afterward to show how the collected data fit with theory. In his work, Rob Phillips flips that practice on its head. The Caltech biophysicist tackles questions in cellular biology as a physicist would—by first formulating a model that can make predictions and then testing those predictions. Using this strategy, Phillips and his group have recently developed a mathematical model that accounts for the way genes compete with each other for the proteins that regulate their expression.

A paper describing the work appears in the current issue of the journal Cell. The lead authors on the paper are Robert Brewster and Franz Weinert, postdoctoral scholars in Phillips's lab.

"The thing that makes this study really interesting is that we did our calculations before we ever did any experiments," says Phillips, the Fred and Nancy Morris Professor of Biophysics and Biology at Caltech and principal investigator on the study. "Just as it is amazing that we have equations for the orbits of planets around stars, I think it's amazing that we are beginning to be able to write equations that predict the complex behaviors of a living cell."

A number of research teams are interested in modeling gene expression—accurately describing all the processes involved in going from a gene to the protein or other product encoded by that DNA. For simplicity's sake, though, most such models do not take competition into consideration. Instead, they assume that each gene has plenty of whatever it needs in order to be expressed—including the regulatory proteins called transcription factors. However, Phillips points out, there often is not enough transcription factor around to regulate all of the genes in a cell.  For one thing, multiple copies of a gene can exist within the same cell. For example, in the case of genes expressed on circular pieces of DNA known as plasmids, it is common to find hundreds of copies in a single cell. In addition, many transcription factors are capable of binding to a variety of different genes. So, as in a game of musical chairs, the genes must compete for a scarce resource—the transcription factors.

Phillips and his colleagues wanted to create a more realistic model by adding in this competition. To do so, they looked at how the level of gene expression varies depending on the amount of transcription factor present in the cell. To limit complexity, they worked with a relatively simple case—a gene in the bacterium E. coli that has just one binding site where a transcription factor can attach. In this case, when the transcription factor binds to the gene, it actually prevents the gene from making its product—it represses expression.

To build their mathematical model, the researchers first considered all the various ways in which the available transcription factor can interact with the copies of this particular gene that are present in the cell, and then developed a statistical theory to represent the situation.

"Imagine that you go into an auditorium, and you know there are a certain number of seats and a certain number of people. There are many different seating arrangements that could accommodate all of those people," Phillips says. "If you wanted to, you could systematically enumerate all of those arrangements and figure out things about the statistics—how often two people will be sitting next to each other if it's purely random, and so on. That's basically what we did with these genes and transcription factors."

Using the resulting model, the researchers were able to make predictions about what would happen if the level of transcription factor and the number of gene copies were independently varied so that the proteins were either in high demand or there were plenty to go around, for example.

With predictions in hand, the researchers next conducted experiments while looking at E. coli cells under a microscope. To begin, they introduced the genes on plasmids into the cells. They needed to track exactly how much transcription factor was present and the rate of gene expression in the presence of that level of transcription factor. Using fluorescent proteins, they were able to follow these changes in the cell over time: the transcription factor lit up red, while the protein expressed by the gene without the transcription factor attached glowed green. Using video fluorescence microscopy and a method, developed in the lab of Caltech biologist Michael Elowitz, for determining the brightness of a single molecule, the researchers were able to count the level of transcription factor present and the rate at which the green protein was produced as the cells grew and divided.

The team found that the experimental data matched the predictions they had made extremely well. "As expected, we find that there are two interesting regimes," says Brewster. "One is that there's just not enough protein to fill the demand. Therefore, all copies of the gene cannot be repressed simultaneously, and some portion will glow green all the time. In that case, there are correlations between the various copies of the genes. They know, in some sense, that the others exist. The second case is that there is a ton of this transcription factor around; in that case, the genes act almost exactly as if the other genes aren't there—there is enough protein to shut off all of the genes simultaneously."

The data fit so well with their model, in fact, that Phillips and his colleagues were able to use plots of the data to predict how many copies of the plasmid would be found in a cell as it grew and multiplied at various points throughout the cell cycle.

"Many times in science you start out trying to understand something, and then you get so good at understanding it that you are able to use it as a tool to measure something else," says Phillips. "Our model has become a tool for measuring the dynamics of how plasmids multiply. And the dynamics of how they multiply isn't what we would have naively expected. That's a little hint that we're pursuing right now."

Overall, he says, "this shows that the assertion that biology is too complicated to be predictive might be overly pessimistic, at least in the context of bacteria."

The work described in the paper, "The Transcription Factor Titration Effect Dictates Level of Gene Expression," was supported by the National Institutes of Health and by the Fondation Pierre-Gilles de Gennes. Additional coauthors are Mattias Rydenfelt, a graduate student in physics at Caltech; Hernan Garcia, a former member of Phillips's lab who is now at Princeton University; and Dan Song, a graduate student at Harvard Medical School.

Writer: Kimm Fesenmaier

Bending the Light with a Tiny Chip

A silicon chip developed by Caltech researchers acts as a lens-free projector—and could one day end up in your cell phone.

Imagine that you are in a meeting with coworkers or at a gathering of friends. You pull out your cell phone to show a presentation or a video on YouTube. But you don't use the tiny screen; your phone projects a bright, clear image onto a wall or a big screen. Such a technology may be on its way, thanks to a new light-bending silicon chip developed by researchers at Caltech.

The chip was developed by Ali Hajimiri, Thomas G. Myers Professor of Electrical Engineering, and researchers in his laboratory. The results were presented at the Optical Fiber Communication (OFC) conference in San Francisco on March 10.

Traditional projectors—like those used to project a film or classroom lecture notes—pass a beam of light through a tiny image, using lenses to map each point of the small picture to corresponding, yet expanded, points on a large screen. The Caltech chip eliminates the need for bulky, expensive lenses and bulbs, instead using a so-called integrated optical phased array (OPA) to project the image electronically, with only a single laser diode as its light source and no mechanically moving parts.

Hajimiri and his colleagues were able to bypass traditional optics by manipulating the coherence of light—a property that allows the researchers to "bend" the light waves on the surface of the chip without lenses or any mechanical movement. If two waves are coherent and in phase—meaning that the peaks and troughs of one wave are exactly aligned with those of the second wave—they combine into a single beam with twice the amplitude and four times the energy of either initial wave, moving in the direction of propagation of the coherent waves.

"By changing the relative timing of the waves, you can change the direction of the light beam," says Hajimiri. For example, if 10 people kneeling in line by a swimming pool slap the water at the exact same instant, they will make one big wave that travels directly away from them. But if the 10 separate slaps are staggered—each person hitting the water a half a second after the last—there will still be one big, combined wave, but with the wave bending to travel at an angle, he says.

Using a series of pipes for the light—called phase shifters—the OPA chip similarly slows down or speeds up the timing of the waves, thus controlling the direction of the light beam. To form an image, electronic data from a computer are converted into multiple electrical currents; by applying stronger or weaker currents to the light within the phase shifter, the number of electrons within each light path changes—which, in turn, changes the timing of the light wave in that path. The timed light waves are then delivered to tiny array elements within a grid on the chip. The light is then projected from each array in the grid, the individual array beams combining coherently in the air to form a single light beam and a spot on the screen.

As the electronic signal rapidly steers the beam left, right, up, and down, the light acts as a very fast pen, drawing an image made of light on the projection surface. Because the direction of the light beam is controlled electronically—not mechanically—it can be redirected very quickly. And since the beam redraws the image many times per second, the eye perceives a single steady image rather than a moving light beam, says Hajimiri.

"The new thing about our work is really that we can do this on a tiny, one-millimeter-square silicon chip, and the fact that we can do it very rapidly—rapidly enough to form images, since we phase-shift electronically in two dimensions," says Behrooz Abiri, a graduate student in Hajimiri's group and a coauthor on the paper. So far, the images Hajimiri and his team can project with the current version of the chip are somewhat simple—a triangle, a smiley face, or single letters, for example. However, the researchers are currently experimenting with larger chips that include more light-delivering array elements that—like using a larger lens on a camera—can improve the resolution and increase the complexity of the projected images.

In their recent experiments, Hajimiri and his colleagues have used the silicon chip to project images in infrared light, but additional work with different types of semiconductors will also allow the researchers to expand the tiny projector's capabilities into the visible spectrum. "Right now we are using silicon technology, which works better with infrared light. If you want to project visible light, you can take the exact same architecture and do it in what's called compound semiconductor III-V technology," says Firooz Aflatouni, another coauthor on the paper, who in January finished his two-year postdoctoral appointment at Caltech and joined the University of Pennsylvania as an assistant professor. "Silicon is good because it can be easily integrated into electronics, but these other compound semiconductors could be used to do the same thing."

"In the future, this can be incorporated into a phone, and since there is no need for a lens, you can have a phone that acts as a projector all by itself," Hajimiri says. However, although the chip could easily be incorporated into a cell phone, he points out that a tiny projection device can have many applications—including light-based radar systems (called "LIDAR"), which are used in positioning, robotics, geographical measurements, and mapmaking. Such equipment already exists, but current LIDAR technology requires complex, bulky, and expensive equipment—equipment that could be streamlined and simplified to a single chip at a much lower cost.

"But I don't want to limit the device to just a few purposes. The beauty of this thing is that these chips are small and can be made at a very low cost—and this opens up lots of interesting possibilities," he says.

These results were described in a presentation titled "Electronic Two-Dimensional Beam Steering for Integrated Optical Phased Arrays." Along with Hajimiri, Abiri, and Aflatouni, Caltech senior Angad Rekhi also contributed to the work. The study was funded by grants from the Caltech Innovation Initiative, and the Information Science and Technology initiative at Caltech.


Building Artificial Cells Will Be a Noisy Business

Engineers like to make things that work. And if one wants to make something work using nanoscale components—the size of proteins, antibodies, and viruses—mimicking the behavior of cells is a good place to start, since cells carry an enormous amount of information in a very tiny packet. As Erik Winfree, professor of computer science, computation and neural systems, and bioengineering, explains, "I tend to think of cells as really small robots. Biology has programmed natural cells, but now engineers are starting to think about how we can program artificial cells. We want to program something about a micron in size, finer than the dimension of a human hair, that can interact with its chemical environment and carry out the spectrum of tasks that biological things do, but according to our instructions."

Getting tiny things to behave is, however, a daunting task. A central problem bioengineers face when working at this scale is that when biochemical circuits, such as the one Winfree has designed, are restricted to an extremely small volume, they may cease to function as expected, even though the circuit works well in a regular test tube. Smaller populations of molecules simply do not behave the same as larger populations of the same molecules, as a recent paper in Nature Chemistry demonstrates.

Winfree and his coauthors began their investigation of the effect of small sample size on biochemical processes with a biochemical oscillator designed in Winfree's lab at Caltech. This oscillator is a solution composed of small synthetic DNA molecules that are activated by RNA transcripts and enzymes. When the DNA molecules are activated by the other components in the solution, a biological circuit is created. This circuit fluoresces in a rhythmic pulse for approximately 15 hours until its chemical reactions slow and eventually stop.

The researchers then "compartmentalized" the oscillator by reducing it from one large system in a test tube to many tiny oscillators. Using an approach developed by Maximilian Weitz and colleagues at the Technical University of Munich and former Caltech graduate student Elisa Franco, currently an assistant professor of mechanical engineering at UC Riverside, an aqueous solution of the DNA, RNA, and enzymes that make up the biochemical oscillator was mixed with oil and shaken until small portions of the solution, each containing a tiny oscillator, were isolated within droplets surrounded by oil.

"After the oil is added and shaken, the mixture turns into a cream, called an emulsion, that looks somewhat like a light mayonnaise," says Winfree. "We then take this cream, pour it on a glass slide and spread it out, and observe the patterns of pulsing fluorescence in each droplet under a microscope." 

When a large sample of the solution is active, it fluoresces in regular pulses. The largest droplets behave as the entire solution does: fluorescing mostly in phase with one another, as though separate but still acting in concert. But the behavior of the smaller droplets was found to be much less consistent, and their pulses of fluorescence quickly moved out of phase with the larger droplets.

Researchers had expected that the various droplets, especially the smaller ones, would behave differently from one another due to an effect known as stochastic reaction dynamics. The specific reactions that make up a biochemical circuit may happen at slightly different times in different parts of a solution. If the solution sample is large enough, this effect is averaged out, but if the sample is very small, these slight differences in the timing of reactions will be amplified. The sensitivity to droplet size can be even more significant depending on the nature of the reactions. As Winfree explains, "If you have two competing reactions—say x could get converted to y, or x could get converted to z, each at the same rate—then if you have a test tube–sized sample, you will end up with something that is half y and half z. But if you only have four molecules in a droplet, then perhaps they will all convert to y, and that's that: there's no z to be found."

In their experiments on the biochemical oscillator, however, Winfree and his colleagues discovered that this source of noise—stochastic reaction dynamics—was relatively small compared to a source of noise that they did not anticipate: partitioning effects. In other words, the molecules that were captured in each droplet were not exactly the same. Some droplets initially had more molecules, while others had fewer; also, the ratio between the various elements was different in different droplets. So even before the differential timing of reactions could create stochastic dynamics, these tiny populations of molecules started out with dissimilar features. The differences between them were then further amplified as the biochemical reactions proceeded.
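The partitioning effect on its own is easy to see in simulation. The sketch below is a hypothetical illustration, not the analysis from the Nature Chemistry paper: it assumes each droplet captures molecules at random from a well-mixed solution, so the counts are Poisson-distributed, and compares the droplet-to-droplet spread for small and large droplets.

```python
import numpy as np

rng = np.random.default_rng(0)

def partition_cv(mean_copies, n_droplets=100_000):
    """Relative droplet-to-droplet variation (coefficient of variation)
    in molecule counts when droplets capture `mean_copies` on average.
    Poisson capture gives CV ~ 1/sqrt(mean_copies), so small droplets
    start out far more dissimilar than large ones."""
    counts = rng.poisson(mean_copies, size=n_droplets)
    return counts.std() / counts.mean()
```

Under these assumptions, droplets holding about 4 molecules on average differ from one another by roughly 50 percent before a single reaction has run, while droplets holding about 400 differ by only about 5 percent—the kind of head start in dissimilarity that the reactions then amplify.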

"To make an artificial cell work," says Winfree, "you need to know what your sources of noise are. The dominant thought was that the noise you're confronted with when you're engineering nanometer-scale components has to do with randomness of chemical reactions at that scale. But this experience has taught us that these stochastic reaction dynamics are really the next-level challenge. To get to that next level, first we have to learn how to deal with partitioning noise."

For Winfree, this is an exciting challenge: "When I program my computer, I can think entirely in terms of deterministic processes. But when I try to engineer what is essentially a program at the molecular scale, I have to think in terms of probabilities and stochastic (random) processes. This is inherently more difficult, but I like challenges. And if we are ever to succeed in creating artificial cells, these are the sorts of problems we need to address."

Coauthors of the paper, "Diversity in the dynamical behaviour of a compartmentalized programmable biochemical oscillator," include Maximilian Weitz, Korbinian Kapsner, and Friedrich C. Simmel of the Technical University of Munich; Jongmin Kim of Caltech; and Elisa Franco of UC Riverside. The project was funded by the National Science Foundation, UC Riverside, the European Commission, the German Research Foundation Cluster of Excellence Nanosystems Initiative Munich, and the Elite Network of Bavaria.

Writer: Cynthia Eller

A New Laser for a Faster Internet

A new laser developed by a research group at Caltech holds the potential to increase by orders of magnitude the rate of data transmission in the optical-fiber network—the backbone of the Internet.

The study was published the week of February 10–14 in the online edition of the Proceedings of the National Academy of Sciences. The work is the result of a five-year effort by researchers in the laboratory of Amnon Yariv, Martin and Eileen Summerfield Professor of Applied Physics and professor of electrical engineering; the project was led by postdoctoral scholar Christos Santis (PhD '13) and graduate student Scott Steger.

Light is capable of carrying vast amounts of information—approximately 10,000 times more bandwidth than microwaves, the earlier carrier of long-distance communications. But to utilize this potential, the laser light needs to be as spectrally pure—as close to a single frequency—as possible. The purer the tone, the more information it can carry, and for decades researchers have been trying to develop a laser that comes as close as possible to emitting just one frequency.

Today's worldwide optical-fiber network is still powered by a laser known as the semiconductor distributed-feedback (S-DFB) laser, developed in the mid-1970s in Yariv's research group. The S-DFB laser's unusual longevity in optical communications stemmed from its then-unparalleled spectral purity—the degree to which the light emitted matched a single frequency. That spectral purity translated directly into a larger information bandwidth of the laser beam and longer possible transmission distances in the optical fiber—with the result that more information could be carried farther and faster than ever before.

At the time, this unprecedented spectral purity was a direct consequence of the incorporation of a nanoscale corrugation within the multilayered structure of the laser. The washboard-like surface acted as a sort of internal filter, discriminating against spurious "noisy" waves contaminating the ideal wave frequency. Although the old S-DFB laser had a successful 40-year run in optical communications—and was cited as the main reason for Yariv receiving the 2010 National Medal of Science—the spectral purity, or coherence, of the laser no longer satisfies the ever-increasing demand for bandwidth.

"What became the prime motivator for our project was that the present-day laser designs—even our S-DFB laser—have an internal architecture which is unfavorable for high spectral-purity operation. This is because they allow a large and theoretically unavoidable optical noise to comingle with the coherent laser and thus degrade its spectral purity," he says.

The old S-DFB laser consists of continuous crystalline layers of materials called III-V semiconductors—typically gallium arsenide and indium phosphide—that convert into light the applied electrical current flowing through the structure. Once generated, the light is stored within the same material. Since III-V semiconductors are also strong light absorbers—and this absorption leads to a degradation of spectral purity—the researchers sought a different solution for the new laser.

The high-coherence new laser still converts current to light using the III-V material, but in a fundamental departure from the S-DFB laser, it stores the light in a layer of silicon, which does not absorb light. Spatial patterning of this silicon layer—a variant of the corrugated surface of the S-DFB laser—causes the silicon to act as a light concentrator, pulling the newly generated light away from the light-absorbing III-V material and into the near absorption-free silicon.

This newly achieved high spectral purity—a 20 times narrower range of frequencies than possible with the S-DFB laser—could be especially important for the future of fiber-optic communications. Originally, laser beams in optic fibers carried information in pulses of light; data signals were impressed on the beam by rapidly turning the laser on and off, and the resulting light pulses were carried through the optic fibers. However, to meet the increasing demand for bandwidth, communications system engineers are now adopting a new method of impressing the data on laser beams that no longer requires this "on-off" technique. This method is called coherent phase communication.

In coherent phase communications, the data resides in small delays in the arrival time of the waves; the delays—a tiny fraction (10⁻¹⁶) of a second in duration—can then accurately relay the information even over thousands of miles. The digital electronic bits carrying video, data, or other information are converted at the laser into these small delays in the otherwise rock-steady light wave. But the number of possible delays, and thus the data-carrying capacity of the channel, is fundamentally limited by the degree of spectral purity of the laser beam. This purity can never be absolute—a limitation of the laws of physics—but with the new laser, Yariv and his team have tried to come as close to absolute purity as is possible.
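The link between spectral purity and capacity can be sketched with a toy phase-encoding simulation. This is a generic illustration of coherent phase communication with made-up noise figures, not measured laser linewidths: phase jitter from an impure source blurs the transmitted phase levels, and a purer source lets the channel pack more levels (and thus more bits) into each symbol.

```python
import numpy as np

rng = np.random.default_rng(1)

def phase_ser(bits_per_symbol, phase_jitter_std, n_symbols=50_000):
    """Symbol-error rate when data is encoded purely in optical phase.
    phase_jitter_std (radians) models linewidth-induced phase noise;
    more phase levels demand a spectrally purer source."""
    m = 2 ** bits_per_symbol              # number of phase levels
    symbols = rng.integers(0, m, n_symbols)
    tx_phase = 2 * np.pi * symbols / m    # ideal transmitted phases
    # Receiver sees the phase blurred by the source's jitter.
    rx_phase = tx_phase + rng.normal(0, phase_jitter_std, n_symbols)
    # Decode to the nearest phase level.
    decoded = np.round(rx_phase / (2 * np.pi / m)) % m
    return np.mean(decoded != symbols)
```

With 0.15 rad of jitter, 4 phase levels (2 bits per symbol) decode almost error-free, while 16 levels (4 bits per symbol) collide constantly—which is why a 20-fold narrower linewidth translates into real capacity gains.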

These findings were published in a paper titled "High-coherence semiconductor lasers based on integral high-Q resonators in hybrid Si/III-V platforms." In addition to Yariv, Santis, and Steger, other Caltech coauthors include graduate student Yaakov Vilenchik and former graduate student Arseny Vasilyev (PhD '13). The work was funded by the Army Research Office, the National Science Foundation, and the Defense Advanced Research Projects Agency. The lasers were fabricated at the Kavli Nanoscience Institute at Caltech.
