Elachi to Retire as JPL Director

Charles Elachi (MS '69, PhD '71) has announced his intention to retire as director of the Jet Propulsion Laboratory on June 30, 2016, and move to campus as professor emeritus. A national search is underway to identify his successor.

"A frequently consulted national and international expert on space science, Charles is known for his broad expertise, boundless energy, conceptual acuity, and deep devotion to JPL, campus, and NASA," said Caltech president Thomas F. Rosenbaum in a statement to the Caltech community. "Over the course of his 45-year career at JPL, Charles has tirelessly pursued new opportunities, enhanced the Laboratory, and demonstrated expert and nimble leadership. Under Charles' leadership over the last 15 years, JPL has become a prized performer in the NASA system and is widely regarded as a model for conceiving and implementing robotic space science missions."

With Elachi at JPL's helm, an array of missions has provided new understanding of our planet, our moon, our sun, our solar system, and the larger universe. The GRAIL mission mapped the moon's gravity; the Genesis space probe returned to Earth samples of the solar wind; Deep Impact intentionally collided with a comet; Dawn pioneered the use of ion propulsion to visit the asteroid Vesta and the dwarf planet Ceres; and Voyager 1 became the first human-made object to reach interstellar space. A suite of missions to Mars, from orbiters to the rovers Spirit, Opportunity, and Curiosity, has provided exquisite detail of the red planet; Cassini continues its exploration of Saturn and its moons; and the Juno spacecraft, en route to a July 2016 rendezvous, promises to provide new insights about Jupiter. Missions such as the Galaxy Evolution Explorer, the Spitzer Space Telescope, Kepler, WISE, and NuSTAR have revolutionized our understanding of our place in the universe.

Future JPL missions developed under Elachi's guidance include Mars 2020, Europa Clipper, the Asteroid Redirect Mission, Jason-3, Aquarius, OCO-2, SWOT, and NISAR.

Elachi joined JPL in 1970 as a student intern and was appointed director and Caltech vice president in 2001. During his more than four decades at JPL, he led a team that pioneered the use of space-based radar imaging of the Earth and the planets, served as principal investigator on a number of NASA-sponsored studies and flight projects, authored more than 230 publications in the fields of active microwave remote sensing and electromagnetic theory, received several patents, and became the director for space and earth science missions and instruments. At Caltech, he taught a course on the physics of remote sensing for nearly 20 years.

Born in Lebanon, Elachi received his B.Sc. ('68) in physics from the University of Grenoble, France, and the Dipl. Ing. ('68) in engineering from the Polytechnic Institute, Grenoble. In addition to his MS and PhD degrees in electrical science from Caltech, he also holds an MBA from the University of Southern California and a master's degree in geology from UCLA.

Elachi was elected to the National Academy of Engineering in 1989 and is the recipient of numerous other awards including an honorary doctorate from the American University of Beirut (2013), the National Academy of Engineering Arthur M. Bueche Award (2011), the Chevalier de la Légion d'Honneur from the French Republic (2011), the American Institute of Aeronautics and Astronautics Carl Sagan Award (2011), the Royal Society of London Massey Award (2006), the Lebanon Order of Cedars (2006 and 2012), the International von Kármán Wings Award (2007), the American Astronautical Society Space Flight Award (2005), the NASA Outstanding Leadership Medal (2004, 2002, 1994), and the NASA Distinguished Service Medal (1999).


Seeing Sound

A busy kitchen is a place where all of the senses are on high alert—your brain is processing the sound of sizzling oil, the aroma of spices, the visual aesthetic of food arranged on a plate, the feel and taste of taking a bite. While these signals may seem distinct and independent, they actually interact and integrate within the brain's network of sensory neurons.

Caltech researchers have now discovered that intrinsic neural connections—called crossmodal mappings—can be used by assistive devices to help the blind detect their environment without requiring intense concentration or hundreds of hours of training. This new multisensory perspective on such aids (called sensory substitution devices) could make tasks that were previously attention-consuming much easier, allowing nonsighted people to acquire a new sensory functionality similar to vision. The work is described in a paper published in the October 22 issue of the journal Scientific Reports.

"Many neuroscience textbooks really only devote a few pages to multisensory interaction," says Shinsuke Shimojo, the Gertrude Baltimore Professor of Experimental Psychology and principal investigator on the study. "But 99 percent of our daily life depends on multisensory—also called multimodal—processing." As an example, he says, if you are talking on the phone with someone you know very well, and they are crying, you will not just hear the sound but will visualize their face in tears. "This is an example of the way sensory causality is not unidirectional—vision can influence sound, and sound can influence vision."

Shimojo and postdoctoral scholar Noelle Stiles have exploited these crossmodal mappings to stimulate the visual cortex with auditory signals that encode information about the environment. They explain that crossmodal mappings are ubiquitous; everyone already has them. Mappings include the intuitive matching of high pitch to elevated locations in space or the matching of noisy sounds with bright lights. Multimodal processing, like these mappings, may be the key to making sensory substitution devices more automatic.

The researchers conducted trials with both sighted and blind people using a sensory substitution device called the vOICe, which translates images into sound.

The vOICe device is made up of a small computer connected to a camera that is attached to darkened glasses, allowing it to "see" what a human eye would. A computer algorithm scans each camera image from left to right and, for every column of pixels, generates an associated sound whose frequency and volume depend upon the vertical location and brightness of the pixels. A large number of bright pixels at the top of a column translates into a loud, high-frequency sound, whereas dark pixels near the bottom of a column produce a quieter, lower-pitched sound. A blind person wearing the camera-equipped glasses can then associate different sounds with features of the environment.
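To make the encoding concrete, here is a minimal sketch of this style of image-to-sound mapping in Python. It is an illustrative reconstruction, not the actual vOICe software; the frequency range, column duration, and function names are all assumptions.

```python
import numpy as np

SAMPLE_RATE = 44100                   # audio samples per second
COLUMN_DURATION = 0.02                # seconds of sound per image column (assumed)
FREQ_LOW, FREQ_HIGH = 500.0, 5000.0   # pitch range in Hz (assumed)

def column_to_sound(column):
    """Turn one column of grayscale pixels (values 0-1, index 0 = top row)
    into a short audio clip: row height sets pitch, brightness sets volume."""
    n_rows = len(column)
    t = np.linspace(0.0, COLUMN_DURATION,
                    int(SAMPLE_RATE * COLUMN_DURATION), endpoint=False)
    clip = np.zeros_like(t)
    for row, brightness in enumerate(column):
        # Pixels near the top map to high frequencies, near the bottom to low.
        height = 1.0 - row / max(n_rows - 1, 1)
        freq = FREQ_LOW + height * (FREQ_HIGH - FREQ_LOW)
        clip += brightness * np.sin(2.0 * np.pi * freq * t)
    return clip / max(n_rows, 1)      # keep the amplitude bounded

def image_to_sound(image):
    """Scan the image left to right, one column of pixels at a time."""
    return np.concatenate([column_to_sound(image[:, c])
                           for c in range(image.shape[1])])

# A bright blob in the top-left corner yields a loud, high pitch early in
# the sound; the dark remainder of the image stays quiet and low.
demo = np.zeros((64, 64))
demo[:8, :8] = 1.0
audio = image_to_sound(demo)          # 64 columns * 0.02 s = 1.28 s of audio
```

Reversing such a mapping (flipping the pitch axis or inverting the brightness-to-volume rule in a sketch like this) is essentially what the researchers later did to test how much the encoding itself matters, as described below.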

In the trials, sighted people with no training or instruction were asked to match images to sounds, while blind subjects were asked to feel textures and match them to sounds. Tactile textures relate to visual textures (patterns) like a topographic map—bright regions of an image translate to greater tactile height relative to the page, while dark regions are flatter. Both groups showed an intuitive ability to identify textures and images from their associated sounds. Surprisingly, the performance of the untrained (also called "naive") group was significantly above chance and not very different from that of the trained group.

The intuitively identified textures used in the experiments exploited the crossmodal mappings already within the vOICe encoding algorithm. "When we reverse the crossmodal mappings in the vOICe auditory-to-visual translation, the naive performance significantly decreased, showing that the mappings are important to the intuitive interpretation of the sound," explains Stiles.

"We found that using this device to look at textures—patterns of light and dark—illustrated 'intuitive' neural connections between textures and sounds, implying that there is some preexisting crossmodality," says Shimojo. One common example of crossmodality is a condition called synesthesia, in which the activation of one sense leads to a different involuntary sensory experience, such as seeing a certain color when hearing a specific sound. "Now, we have discovered that crossmodal connections, preexisting in everyone, can be used to make sensory substitution intuitive with no instruction or training."

The researchers do not exactly know yet what each sensory region of the brain is doing when processing these various signals, but they have a rough idea. "Auditory regions are activated upon hearing sound, as are the visual regions, which we think will process the sound for its spatial qualities and elements. The visual part of the brain, when processing images, maps objects to spatial location, fitting them together like a puzzle piece," Stiles says. To learn more about how the crossmodal processing happens in the brain, the group is currently using functional magnetic resonance imaging (fMRI) data to analyze the crossmodal neural network.

These preexisting neural connections provide an important starting point for training visually impaired people to use devices that will help them see. A sighted person simply has to open their eyes, and the brain automatically processes images and information for seamless interaction with the environment. Current devices for the blind and visually impaired are not so automatic or intuitive to use, generally requiring a user's full concentration and attention to interpret information about the environment. The Shimojo lab's new finding on the role of multimodal processing and crossmodal mappings starts to address this issue.

Beyond its practical implications, Shimojo says, the research raises an important philosophical question: What is seeing?

"It seems like such an obvious question, but it gets complicated," says Shimojo. "Is seeing what happens when you open your eyes? No, because opening your eyes is not enough if the retina [the light-sensitive layer of tissue in the eye] is damaged. Is it when your visual cortex is activated? But our research has shown that the visual cortex can be activated by sound, indicating that we don't really need our eyes to see. It's very profound—we're trying to give blind people a visual experience through other senses."

The paper is titled "Auditory Sensory Substitution Is Intuitive and Automatic with Texture Stimuli" and was funded by grants from the National Science Foundation, the Della Martin Fund for Discoveries in Mental Illness, and the Japan Science and Technology Agency, Core Research for Evolutional Science and Technology.


Probing the Mysterious Perceptual World of Autism

New research looks at what people with autism spectrum disorder pay attention to in the real world.

The perceptual world of a person with autism spectrum disorder (ASD) is unique. Beginning in infancy, people who have ASD observe and interpret images and social cues differently than others. Caltech researchers now have new insight into just how this occurs, research that eventually may help doctors diagnose, and more effectively treat, the various forms of the disorder. The work is detailed in a study published in the October 22 issue of the journal Neuron.

Symptoms of ASD include impaired social interaction, compromised communication skills, restricted interests, and repetitive behaviors. Research suggests that some of these behaviors are influenced by how an individual with ASD senses, attends to, and perceives the world.

The new study investigated how visual input is interpreted in the brain of someone with ASD. In particular, it examined the validity of long-standing assumptions about the condition, including the belief that those with ASD often miss facial cues, contributing to their inability to respond appropriately in social situations.

"Among other findings, our work shows that the story is not as simple as saying 'people with ASD don't look normally at faces.' They don't look at most things in a typical way," says Ralph Adolphs, the Bren Professor of Psychology and Neuroscience and professor of biology, in whose lab the study was done. Indeed, the researchers found that people with ASD attend more to nonsocial images, to simple edges and patterns in those images, than to the faces of people.

To reach these determinations, Adolphs and his lab teamed up with Qi Zhao, an assistant professor of electrical and computer engineering at the National University of Singapore and the senior author on the paper, who had developed a detailed method for quantifying where viewers direct their attention within an image. The researchers showed 700 images to 39 subjects. Twenty of the subjects were high-functioning individuals with ASD, and 19 were control, or "neurotypical," subjects without ASD. The two groups were matched for age, race, gender, educational level, and IQ. Each subject viewed each image for three seconds while an eye-tracking device recorded their attention patterns on objects depicted in the images.
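As a rough illustration of how such attention patterns can be quantified, one can score each recorded gaze fixation against labeled objects in an image. This sketch is not the authors' analysis pipeline; the data layout and region names are invented for the example.

```python
from collections import defaultdict

def fixation_shares(fixations, regions):
    """fixations: list of (x, y, duration_ms) gaze samples from an eye tracker.
    regions: dict mapping an object label (e.g. 'face') to a bounding box
    (x0, y0, x1, y1) in image coordinates. Returns the fraction of total
    fixation time spent on each labeled object."""
    time_on = defaultdict(float)
    total = 0.0
    for x, y, duration in fixations:
        total += duration
        for label, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                time_on[label] += duration
                break                      # count each fixation once
    return {label: t / total for label, t in time_on.items()} if total else {}

# Hypothetical example: a subject who dwells on the center of the image
# rather than on the face region.
regions = {"face": (100, 40, 180, 140), "center": (250, 150, 390, 330)}
gaze = [(130, 90, 300), (300, 200, 900), (320, 260, 800), (50, 400, 200)]
print(fixation_shares(gaze, regions))      # {'face': 0.136, 'center': 0.773}
```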

Unlike the abstract representations of single objects or faces that have been commonly used in such studies, the images that Adolphs and his team presented contained combinations of more than 5,500 real-world elements—common objects like people, trees, and furniture as well as less common items like knives and flames—in natural settings, mimicking the scenes that a person might observe in day-to-day life.

"Complex images of natural scenes were a big part of this unique approach," says first-author Shuo Wang (PhD '14), a postdoctoral fellow at Caltech. The images were shown to subjects in a rich semantic context, "which simply means showing a scene that makes sense," he explains. "I could make an equally complex scene with Photoshop by combining some random objects such as a beach ball, a hamburger, a Frisbee, a forest, and a plane, but that grouping of objects doesn't have a meaning—there is no story demonstrated. Having objects that are related in a natural way and that show something meaningful provides the semantic context. It is a real-world approach."

In addition to validating previous studies that showed, for example, that individuals with ASD are less drawn to faces than control subjects, the new study found that these subjects were strongly attracted to the center of images, regardless of the content placed there. Similarly, they tended to focus their gaze on objects that stood out—for example, due to differences in color and contrast—rather than on faces. Take, for example, one image from the study showing two people talking, with one facing the camera and the other facing away so that only the back of that person's head is visible. Control subjects concentrated on the visible face, whereas ASD subjects attended equally to the face and to the back of the other person's head.

"The study is probably most useful for informing diagnosis," Adolphs says. "Autism is many things. Our study is one initial step in trying to discover what kinds of different autisms there actually are. The next step is to see if all people with ASD show the kind of pattern we found. There are probably differences between individual people with ASD, and those differences could relate to differences in diagnosis, for instance, revealing subtypes of autism. Once we have identified those subtypes, we can begin to ask if different kinds of treatment might be best for each kind of subtype."

Adolphs plans to continue this type of research using functional magnetic resonance imaging scans to track the brain activity of people with ASD while they are viewing images in laboratory settings similar to what was used in this study.

The paper, "Atypical Visual Saliency in Autism Spectrum Disorder Quantified through Model-Based Eye Tracking," was coauthored by Shuo Wang and Ralph Adolphs at Caltech; Ming Jiang and Qi Zhao from the National University of Singapore; Xavier Morin Duchesne and Daniel P. Kennedy of Indiana University, Bloomington; and Elizabeth A. Laugeson from UCLA.

The research was supported by a postdoctoral fellowship from the Autism Science Foundation, a Fonds de Recherche du Québec en Nature et Technologies predoctoral fellowship, a National Institutes of Health Grant and National Alliance for Research on Schizophrenia and Depression Young Investigator Grant, a grant from the National Institute of Mental Health to the Caltech Conte Center for the Neurobiology of Social Decision Making, a grant from the Simons Foundation Autism Research Initiative, and Singapore's Defense Innovative Research Program and the Singapore Ministry of Education's Academic Research Fund Tier 2.


Cells Rhythmically Regulate Their Genes

Even in a calm, unchanging environment, cells are not static. Among other actions, cells activate and then deactivate some types of transcription factors—proteins that control the expression of genes—in a series of unpredictable and intermittent pulses. Since discovering this pulsing phenomenon, scientists have wondered what functions it could provide for cells.

Now, a new study from Caltech researchers shows that pulsing enables two such proteins to interact in a rhythmic fashion that controls genes. Specifically, as the pulses of the two transcription factors move in and out of sync with each other, expression of their target gene falls and rises. These rhythms of activation, the researchers say, may also underlie core processes in the cells of organisms from across the kingdoms of life.

"The way transcription factor pulses sync up with one another in time could play an important role in allowing cells to process information, communicate with other cells, and respond to stress," says paper coauthor Michael Elowitz, a professor of biology and biological engineering and an investigator with the Howard Hughes Medical Institute.

The research, led by Caltech postdoctoral scholar Yihan Lin, appears in the October 15 issue of Nature. Other Caltech authors of the paper are Assistant Professor of Chemistry Long Cai; Chang Ho Sohn, a staff scientist in the Cai lab; and Elowitz's former graduate student Chiraj K. Dalal (PhD '10), now at UC San Francisco.

Cai, Dalal, and Elowitz first reported a functional role for transcription factor pulsing in 2008. Since then, researchers worldwide have been steadily uncovering similar surges of protein activity across diverse cell types and genetic systems.

Realizing that many different factors are pulsing in the same cell even in unchanging conditions, the Caltech scientists began to wonder if cells might adjust the relative timing of these pulses to enable a novel sort of time-based regulation. To find out, they set up time-lapse movies to follow two pulsing proteins and a target gene in real time in individual yeast cells.  


A three-color movie of cells responding to two different stresses (as indicated). Green corresponds to the Msn2 protein, red to the Mig1 protein, and blue to an RNA-binding protein used to report gene expression. The white circle highlights the cell of interest. (Credit: Michael Elowitz and Yihan Lin/Caltech)

The team tagged two central transcription factors named Msn2 and Mig1 with green and red fluorescent proteins, respectively. When the transcription factors are activated, they move into the nucleus, where they influence gene expression. This movement—as well as the activation of the factors—can be visualized because the fluorescent markers concentrate within the small volume of the nucleus, causing it to glow brightly, either green, red, or both. The color choice for the fluorescent tags was symbolic: Msn2 serves as an activator, and Mig1 as a repressor. "Msn2, the green factor, steps on the gas and turns up gene expression, while Mig1, the red factor, hits the brakes," says Elowitz.

When the scientists stressed the yeast cells by adding heat, for example, or restricting food, the pulses of Msn2 and Mig1 changed their timing with respect to one another, with more or less frequent periods of overlap between their pulses, depending upon the stressing stimulus.

Generally, when the two transcription factors pulsed in synchrony, the repressor blocked the ability of the activator to turn on genes. "It's like someone simultaneously pumping the gas and brake pedals in a car over and over again," says Elowitz.

But when they were off-beat, with the activator pulsing without the repressor, gene expression increased. "When the cell alternates between the brake and the gas—the Msn2 transcription factor in this case—the car can move," says Elowitz. As a result of these stress-altered rhythms, the cells successfully produced more (or fewer) copies of certain proteins that helped the yeast cope with the unpleasant situation.
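A toy simulation can make the gas-and-brake picture concrete. The sketch below is not the authors' model; the pulse probabilities and lengths are arbitrary assumptions. It simply shows that an activator pulsing in sync with a repressor drives no expression, while the same activator pulsing independently of the repressor does.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100_000  # time steps

def pulse_train(n_steps, pulse_prob=0.02, pulse_len=10):
    """Random binary pulse train: True while a factor is in the nucleus."""
    train = np.zeros(n_steps, dtype=bool)
    for start in np.flatnonzero(rng.random(n_steps) < pulse_prob):
        train[start:start + pulse_len] = True
    return train

msn2 = pulse_train(T)            # activator ("gas") pulses
mig1_sync = msn2.copy()          # repressor ("brake") pulsing in sync
mig1_async = pulse_train(T)      # repressor pulsing on its own schedule

def expression(activator, repressor):
    """Fraction of time the target gene is on: activator present, repressor absent."""
    return np.mean(activator & ~repressor)

print("synchronized pulses:", expression(msn2, mig1_sync))    # 0.0
print("independent pulses:", expression(msn2, mig1_async))    # > 0
```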

Previously, researchers thought that the relative concentrations of multiple transcription factors in the nucleus determine how they regulate a common target gene—a phenomenon known as combinatorial regulation. But the new study suggests that the relative timing of the transcription factors' pulses may be just as important as their concentrations.

"Most genes in the cell are regulated by several transcription factors in a combinatorial fashion, as parts of a complex network," says Cai. "What we're now seeing is a new mode of regulation that controls the pulse timing of transcription factors, and this could be critical to understanding the combinatorial regulation in genetic networks."

"There appears to be a layer of time-based regulation in the cell that, because it can only be observed with movies of individual cells, is still largely unexplored," says Lin. "We look forward to learning more about this intriguing and underappreciated form of gene regulation."    

In future research, the scientists will try to understand how prevalent this newfound mode of time-based regulation is in a variety of cell types and will examine its involvement in gene regulation systems. In the context of synthetic biology—the harnessing and modification of biological systems for human technological applications—the researchers also hope to develop methods to control such pulsing to program new cellular behaviors.

The paper is titled "Combinatorial gene regulation by modulation of relative pulse timing." The work was funded by the National Institutes of Health, the National Science Foundation, the Defense Advanced Research Projects Agency, and by the Gordon and Betty Moore Foundation through a grant to the Caltech Programmable Molecular Technology Initiative. 


Understanding Olfaction: An Interview with Elizabeth Hong

You walk by a bakery, smell the scent of fresh cookies, and are immediately reminded of baking with your grandmother as a child. The seemingly simple act of learning to associate a smell with a good or bad outcome is actually quite a complicated behavior—one that can begin as a single synapse, or junction, where a signal is passed between two neurons in the brain.

Assistant Professor of Neuroscience Betty Hong is interested in how animals sense cues in their environment, process that information in the brain, and then use that information to guide behaviors. To study the processing of information from synapse to behavior, her work focuses on olfaction—or chemical sensing via smell—in fruit flies.

Hong, who received her bachelor's degree from Caltech in 2002 and her doctorate from Harvard in 2009, came from a postdoctoral position at Harvard Medical School to join the Caltech faculty in June. We spoke with her recently about her work, her life outside the laboratory, and why she is looking forward to being back at Caltech.

 

How did you initially become interested in your field?

It's rather circuitous. I was initially drawn to neuroscience because I was interested in disease. I had family who passed away from Alzheimer's disease, and it's clear that with the current demographic of our country, diseases associated with aging—like Alzheimer's—are going to have a large impact on society in the next 20 to 30 years. Working at the Children's Hospital Boston in graduate school, I also became increasingly interested in understanding the rise of neurodevelopmental disorders like autism.

I really wanted to understand the mechanistic basis for neurological disease. And then it became clear to me that part of the problem of trying to understand neurological disorders was that we really had no idea how the brain is supposed to work. If you were a mechanic who didn't know how cars work, how could you fix a broken car? That led me to study increasingly more basic mechanisms of how the brain functions.  

 

Why did you decide to focus your research on olfaction?

Although we humans have evolved to move away from olfaction—humans and primates are very visual—the rest of the animal kingdom relies heavily on olfaction for its daily survival and functions. Even the lowliest microbe relies on chemical sensing to navigate its way through the environment. We study olfaction in an invertebrate model—the fruit fly Drosophila. We do that for a couple of reasons. One is that it has a very small brain, so its circuits are very compact, and that small size and numerical simplicity lets us get a global overview of what's happening—a view that you could never get if you're looking at a big circuit, like a mouse brain or a human brain.

The other reason is that there are versatile genetic tools and new technologies that have allowed us to make high-resolution electrical and optical recordings of neural activity in the brains of fruit flies. That very significant technical hurdle had to be crossed in order to make it a worthwhile experimental model. With electrophysiological access to the brain, and genetic tools that allow you to manipulate the circuits, you can watch brain activity as it's happening and ask what happens to neural activity when you tweak the properties of the system in specific ways. And the fly also has a robust and flexible set of behaviors that you can relate to all of this. 

 

What are some of the behaviors that you are interested in studying?

We're very interested in understanding how flies can associate an odor with a pleasant or unpleasant outcome. So, in the same way that you might associate wonderful baking smells with something from your childhood, flies can learn to arbitrarily associate odors with different outcomes. And to know "when I smell this odor, I should run away," or "based on what happened to me the last time I smelled this odor, this might be an indicator of food"—that's actually a fairly sophisticated behavior that is a basic building block for more complex higher-order cognitive tasks that emerge in vertebrates.

There are many animals that are inflexibly wired. In other words, they smell something, and through evolution, their circuits have evolved to tell them to move toward it or go away from it. Even if they are in an unusual environment, they can't flexibly alter that behavior. The ability to flexibly adapt our behavior to new and unfamiliar environments was a key transition in the evolution of the nervous system.

 

You are also a Caltech alum. What drew you back as a faculty member?

Yes, it seems like such a long time ago, but I was an undergraduate here—a biology major in Page House—from 1998 to 2002. I was also a SURF student with [Professor of Biology] Bruce Hay and later with David Baltimore [president emeritus and Robert Andrews Millikan Professor of Biology]. It's kind of wild to have as your colleagues people who were your mentors a decade ago, but I think the main reason I chose Caltech was the community of scholars here—on the level of faculty, undergraduate students, graduate students, and postdocs—that I will be able to interact with. In the end, you mainly just want to be with smart, motivated people who want to use science to make a difference in the world. And I think that encapsulates what Caltech does.

 

Do you have any interests or hobbies that are outside of the lab?

I used to play horn in the wind ensemble and orchestra, including the time when I was here as an undergraduate. But these days, any time that I'm not in the office, I'm with my two young kids. Right now, we're really excited about exploring all the fun and exciting things to do outdoors in Southern California. We've done a lot of hiking and exploring the natural beauty here. The kids have gotten into fishing lately, so our latest thing has been scoping out the best places to fish. I would love to hear from members of the community what their favorite spots are!


NIH Announces Second Round of BRAIN Funding

On October 1, the National Institutes of Health (NIH) announced its second round of funding under President Obama's Brain Research through Advancing Innovative Neurotechnologies—or BRAIN—Initiative. The new awards totaled more than $38 million. Included among the 67 funded projects are two led by Caltech researchers.

"Development of a scalable methodology for imaging neuropeptide release in the brain" — U01-MH109147

David Anderson, Seymour Benzer Professor of Biology; Investigator, Howard Hughes Medical Institute

Anderson and his colleagues aim to develop and test a new technology for visualizing the release of neuropeptides from neurons. Neuropeptides are a class of signaling molecules, known as neuromodulators, that communicate information between neurons and control fundamental behavioral processes such as eating and mood. These chemical signals can fundamentally alter neural computations in brain circuits but have been difficult to study with existing methods.

Anderson also received an NIH BRAIN grant in 2014 for developing a detailed, publicly available database characterizing the genetic, physiological, and morphological features of the various cell types in the brain that are involved in circuits controlling sensations and emotions.

"Tracing brain circuits by transneuronal control of transcription" — R21-EY026432

Carlos Lois, research professor of biology

Elizabeth Hong, assistant professor of neuroscience

Kai Zinn, professor of biology

Neurons in the brain are wired together like a complex electronic circuit, but exactly what that circuit looks like is unknown. This project aims to map the brain's wiring diagram using a newly developed genetic system that allows researchers to trace the connections between neurons. Identifying how neurons in brain circuits are connected to one another is crucial to understanding how the brain processes information involved in perception, movement, and emotion. Recent research indicates that aberrant neuronal wiring may be the cause of several neurodevelopmental disorders, including autism and schizophrenia, further underscoring the importance of identifying the wiring diagrams of brain circuits.

Lois also received an NIH BRAIN grant in 2014 to investigate the genetic mechanisms responsible for the evolution of the brain by comparing the genomic and biophysical properties of neurons across different mammalian species.
