Future Electronics May Depend on Lasers, Not Quartz

Nearly all electronics require devices called oscillators that create precise frequencies—frequencies used to keep time in wristwatches or to transmit reliable signals to radios. For nearly 100 years, these oscillators have relied upon quartz crystals to provide a frequency reference, much like a tuning fork is used as a reference to tune a piano. However, future high-end navigation systems, radar systems, and even possibly tomorrow's consumer electronics will require references beyond the performance of quartz.

Now, researchers in the laboratory of Kerry Vahala, the Ted and Ginger Jenkins Professor of Information Science and Technology and Applied Physics at Caltech, have developed a method to stabilize microwave signals in the gigahertz range (billions of cycles per second) using a pair of laser beams as the reference, in lieu of a crystal.

Quartz crystals "tune" oscillators by vibrating at relatively low frequencies—those at or below the megahertz range, or millions of cycles per second, like radio waves. Quartz is so good at stabilizing these low frequencies that, years ago, researchers developed a technique called electrical frequency division, which converts higher-frequency microwave signals into lower-frequency signals that can then be stabilized with quartz.

The new technique, which Vahala and his colleagues have dubbed electro-optical frequency division, builds on the method of optical frequency division, developed at the National Institute of Standards and Technology more than a decade ago. "Our new method reverses the architecture used in standard crystal-stabilized microwave oscillators—the 'quartz' reference is replaced by optical signals much higher in frequency than the microwave signal to be stabilized," Vahala says.

Jiang Li—a Kavli Nanoscience Institute postdoctoral scholar at Caltech and one of two lead authors on the paper, along with graduate student Xu Yi—likens the method to a gear chain on a bicycle that translates pedaling motion from a small, fast-moving gear into the motion of a much larger wheel. "Electrical frequency dividers used widely in electronics can work at frequencies no higher than 50 to 100 GHz. Our new architecture is a hybrid electro-optical 'gear chain' that stabilizes a common microwave electrical oscillator with optical references at much higher frequencies in the range of terahertz or trillions of cycles per second," Li says.  
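To put rough numbers on the gear-chain analogy, frequency division is simple arithmetic; the values below are purely illustrative, not the ones from the experiment. A further benefit of dividing a reference by a factor N is that the divided signal's phase noise power drops by a factor of N squared.

```python
# Illustrative arithmetic for frequency division; these numbers are
# hypothetical, not taken from the Caltech experiment.
optical_reference_hz = 1.0e12   # 1 THz spacing between the two laser lines
division_factor = 100           # overall electro-optical division ratio

# The stabilized microwave output inherits the optical reference, divided down:
microwave_hz = optical_reference_hz / division_factor
print(microwave_hz / 1e9, "GHz")  # 10.0 GHz
```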

The optical reference used by the researchers is a laser that, to the naked eye, looks like a tiny disk. At only 6 mm in diameter, the device is very small, making it particularly useful in compact photonics devices—electronic-like devices powered by photons instead of electrons, says Scott Diddams, a physicist and project leader at the National Institute of Standards and Technology and a coauthor on the study.

"There are always tradeoffs between the highest performance, the smallest size, and the best ease of integration. But even in this first demonstration, these optical oscillators have many advantages; they are on par with, and in some cases even better than, what is available with widespread electronic technology," Vahala says.

The new technique is described in a paper that will be published in the journal Science on July 18. Other authors on the paper include Hansuek Lee, a visiting associate at Caltech. The work was sponsored by DARPA's ORCHID and PULSE programs; the Caltech Institute for Quantum Information and Matter (IQIM), an NSF Physics Frontiers Center with support from the Gordon and Betty Moore Foundation; and the Caltech Kavli NanoScience Institute.

Frederick B. Thompson



Frederick Burtis Thompson, professor of applied philosophy and computer science, emeritus, passed away on May 27, 2014. The research that Thompson began in the 1960s helped pave the way for today's "expert systems," such as Watson, IBM's Jeopardy!-winning supercomputer, and the interactive databases used in the medical profession. His work provided quick and easy access to the information stored in such systems by teaching the computer to understand human language, rather than forcing the casual user to learn a programming language.

Indeed, Caltech's Engineering & Science magazine reported in 1981 that "Thompson predicts that within a decade a typical professional [by which he meant plumbers as well as doctors] will carry a pocket computer capable of communication in natural language."

"Natural language," otherwise known as everyday English, is rife with ambiguity. As Thompson noted in that same article, "Surgical reports, for instance, usually end with the statement that 'the patient left the operating room in good condition.' While doctors would understand that the phrase refers to the person's condition, some of us might imagine the poor patient wielding a broom to clean up."

Thompson cut through these ambiguities by paring "natural" English down to "formal" sublanguages that applied only to finite bodies of knowledge. While a typical native-born English speaker knows the meanings of 20,000 to 50,000 words, Thompson realized that very few of these words are actually used in any given situation. Instead, we constantly shift between sublanguages—sometimes from minute to minute—as we interact with other people.

Thompson's computer-compatible sublanguages had vocabularies of a few thousand words—some of which might be associated with pictures, audio files, or even video clips—and a simple grammar with a few dozen rules. In the plumber's case, this language might contain the names and functions of pipe fittings, vendors' catalogs, maps of the city's water and sewer systems, sets of architectural drawings, and the building code. So, for example, a plumber at a job site could type "I need a ¾ to ½ brass elbow at 315 South Hill Avenue," and, after some back-and-forth to clarify the details (such as threaded versus soldered, or a 90-degree elbow versus a 45), the computer would place the order and give the plumber directions to the store.
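A restricted domain lets a very small grammar carry the whole understanding burden. As a loose sketch of the sublanguage idea (the pattern and field names below are invented for illustration, not taken from Thompson's system):

```python
import re

# A toy "sublanguage" with a single grammar rule, loosely inspired by the
# plumbing example in the text. Real systems like REL used full grammars.
ORDER_RULE = re.compile(
    r"i need a (?P<size>[\w/ ]+ to [\w/ ]+) (?P<material>\w+) (?P<part>\w+)",
    re.IGNORECASE,
)

def parse_order(sentence: str):
    """Return a structured order if the sentence fits the grammar rule."""
    match = ORDER_RULE.search(sentence)
    return match.groupdict() if match else None

print(parse_order("I need a 3/4 to 1/2 brass elbow"))
# {'size': '3/4 to 1/2', 'material': 'brass', 'part': 'elbow'}
```

Sentences outside the sublanguage simply fail to parse, which is what would trigger the clarifying back-and-forth described above.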

Born on July 26, 1922, Thompson served in the Army and worked at Douglas Aircraft during World War II before earning bachelor's and master's degrees in mathematics at UCLA in 1946 and 1947, respectively. He then moved to UC Berkeley to work with logician Alfred Tarski, whose mathematical definitions of "truth" in formal languages would set the course of Thompson's later career.

On receiving his PhD in 1951, Thompson joined the RAND (Research ANd Development) Corporation, a "think tank" created within Douglas Aircraft during the war and subsequently spun off as an independent organization. It was the dawn of the computer age—UNIVAC, the first commercial general-purpose electronic data-processing system, went on sale that same year. Unlike previous machines built to perform specific calculations, UNIVAC ran programs written by its users. Initially, these programs were limited to simple statistical analyses; the first UNIVAC, for example, was bought by the U.S. Census Bureau. Thompson pioneered a process called "discrete event simulation," which modeled complex phenomena by breaking them down into sequences of simple actions that happened in a specified order, both within each sequence and in relation to actions in other, parallel sequences.
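The core of discrete event simulation survives essentially unchanged in modern code. A minimal sketch (in present-day Python, obviously not RAND's original implementation): events wait in a time-ordered queue, and processing one event may schedule others, which is how parallel sequences of actions interleave:

```python
import heapq

def run_simulation(initial_events):
    """Process (time, name, action) events in time order.

    Each action is a callable that receives the current time and returns
    a list of follow-up events to schedule, modeling parallel sequences.
    """
    queue = list(initial_events)
    heapq.heapify(queue)
    log = []
    while queue:
        time, name, action = heapq.heappop(queue)
        log.append((time, name))
        for new_event in action(time):
            heapq.heappush(queue, new_event)
    return log

# Two parallel sequences: a truck departs and later arrives; a crane loads.
def departure(t):
    return [(t + 5, "truck arrives", lambda t2: [])]

events = [(0, "truck departs", departure),
          (2, "crane loads", lambda t: [])]
print(run_simulation(events))
# [(0, 'truck departs'), (2, 'crane loads'), (5, 'truck arrives')]
```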

Thompson also helped model a thermonuclear attack on America's major cities in order to help devise an emergency services plan. According to Philip Neches (BS '73, MS '77, PhD '83), a Caltech trustee and one of Thompson's students, "When the team developed their answer, Fred was in tears: the destruction would be so devastating that no services would survive, even if a few people did. . . . This kind of hard-headed analysis eventually led policy makers to a simple conclusion: the only way to win a nuclear war is to never have one." Refined versions of these models were used in 2010 to optimize the deployment of medical teams in the wake of the magnitude-7.0 Haiti earthquake, according to Neches. "The models treated the doctors and supplies as the bombs, and calculated the number of people affected," he explains. "Life has its ironies, and Fred would be the first to appreciate them."

In 1957, Thompson joined General Electric Corporation's computer department. By 1960 he was working at GE's TEMPO (TEchnical Military Planning Operation) in Santa Barbara, where his natural-language research began. "Fred's first effort to teach English to a computer was a system called DEACON [for Direct English Access and CONtrol], developed in the early 1960s," says Neches.

Thompson arrived at Caltech in 1965 with a joint professorship in engineering and the humanities. "He advised the computer club as a canny way to recruit a small but dedicated cadre of students to work with him," Neches recalls. In 1969, Thompson began a lifelong collaboration with Bozena Dostert, a senior research fellow in linguistics who died in 2002. The collaboration was personal as well as professional; their wedding was the second marriage for each.

Although Thompson's and Dostert's work was grounded in linguistic theory, they moved beyond the traditional classification of words into parts of speech to incorporate an operational approach similar to computer languages such as FORTRAN. And thus they created REL, for Rapidly Extensible Language. REL's data structure was based on "objects" that not only described an item or action but allowed the user to specify the interval for which the description applied. For example:

                        Object: Mary Ann Summers
                        Attribute: driver's license
                        Value: yes
                        Start time: 1964
                        End time: current

"This foreshadowed today's semantic web representations," according to Peter Szolovits (BS '70, PhD '75), another of Thompson's students.
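A rough modern rendering of such a record (the class and field names below are invented for illustration; REL's actual syntax differed):

```python
from dataclasses import dataclass

@dataclass
class TemporalFact:
    """A REL-style object: one attribute value, valid over a time interval."""
    obj: str
    attribute: str
    value: str
    start: int
    end: int          # a sentinel such as 9999 can stand in for "current"

    def holds_at(self, year: int) -> bool:
        """Did this description apply in the given year?"""
        return self.start <= year <= self.end

fact = TemporalFact("Mary Ann Summers", "driver's license", "yes", 1964, 9999)
print(fact.holds_at(1970))  # True
print(fact.holds_at(1950))  # False
```

The validity interval is the point of Szolovits's comparison: each fact carries its own time window, much as statements in today's semantic-web stores can be annotated with temporal context.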

In a uniquely experimental approach, the Thompsons tested REL on complex optimization problems such as figuring out how to load a fleet of freighters—making sure the combined volumes of the assorted cargoes didn't exceed the capacities of the holds, distributing the weights evenly fore and aft, planning the most efficient itineraries, and so forth. Volunteers worked through various strategies by typing questions and commands into the computer. The records of these human-computer interactions were compared to transcripts of control sessions in which pairs of students attacked the same problem over a stack of paperwork face-to-face or by communicating with each other from separate locations via teletype machines. Statistical analysis of hundreds of hours' worth of seemingly unstructured dialogues teased out hidden patterns. These patterns included a five-to-one ratio between complete sentences—which had a remarkably invariant average length of seven words—and three-word sentence fragments. Similar patterns are heard today in the clipped cadences of the countdown to a rocket launch.

The "extensible" in REL referred to the ease with which new knowledge bases—vocabulary lists and the relationships between their entries—could be added. In the 1980s, the Thompsons extended REL to POL, for Problem Oriented Language, which could work out the meanings of words not in its vocabulary and cope with such human frailties as poor spelling, bad grammar, and errant punctuation—all on a high-end desktop computer at a time when other natural-language processors ran on room-sized mainframe machines.

"Fred taught both the most theoretical and the most practical computer science courses at the Institute long before Caltech had a formal computer science department. In his theory class, students proved the equivalence of a computable function to a recursive language to a Turing machine. In his data analysis class, students got their first appreciation of the growing power of the computer to handle volumes of data in novel and interesting ways," Neches says. "Fred and his students pioneered the arena of 'Big Data' more than 50 years ahead of the pack." Thompson co-founded Caltech's official computer science program along with professors Carver Mead (BS '56, MS '57, PhD '60) and Ivan Sutherland (MS '60) in 1976.

Adds Remy Sanouillet (MS '82, PhD '94), Thompson's last graduate student, "In terms of vision, Fred 'invented' the Internet well before Al Gore did. He saw, really saw, that we would be asking computers questions that could only be answered by fetching pieces of information stored on servers all over the world, putting the pieces together, and presenting the result in a universally comprehensible format that we now call HTML."

Thompson was a member of the scientific honorary society Sigma Xi, the Association for Symbolic Logic, and the Association for Computing Machinery. He wrote or coauthored more than 40 unclassified papers—and an unknown number of classified ones.

Thompson is survived by his first wife, Margaret Schnell Thompson, and his third wife, Carmen Edmond-Thompson; two children by his first marriage, Mary Ann Thompson Arildsen and Scott Thompson; and four grandchildren.

Plans for a celebration of Thompson's life are pending.

Douglas Smith

DOE Awards $15 Million to Caltech's Solar Energy Research

The United States Department of Energy (DOE) announced on Wednesday that it will award $15.2 million to Caltech's Light-Material Interactions in Energy Conversion (LMI) program, one of 32 Energy Frontier Research Centers (EFRCs) nationwide that will receive a combined $100 million over the next four years to pursue innovative energy research.

The LMI-EFRC is directed by Harry Atwater, the Howard Hughes Professor of Applied Physics and Materials Science, and is a collaborative partnership of researchers in photonics (the generation, manipulation, and detection of light) at Caltech, Lawrence Berkeley National Laboratory, the University of Illinois at Urbana-Champaign, Stanford University, and Harvard University.

The DOE received more than 200 proposals for EFRCs. Caltech is among 22 centers whose initial funding, granted in 2009, is being extended for another four years. During its first funding period, among other accomplishments, the LMI-EFRC fabricated complex three-dimensional photonic nanostructures and light absorbers; created a solar cell with world-record-breaking efficiency; and developed the printing-based mechanical assembly of microscale solar cells.

"In recent years the solar energy landscape has been fundamentally altered with the recent growth of a large worldwide photovoltaics industry," says Atwater. "The most important area for basic research advances is now in enumerating the scientific principles and methods for achieving the highest conversion efficiencies. There is a new era emerging in which the science of nanoscale light management plays a critical role in enabling energy conversion to surpass traditional limits. This is where the Light-Material Interactions EFRC has focused its effort and is making advances."

LMI-EFRC will be using its new DOE award to address opportunities for high-efficiency solar energy conversion, with a goal of making scientific discoveries that will enable utilization of the entire visible and infrared solar resource.

"We are proud of the accomplishments of Professor Atwater, his colleagues, postdocs, and students in the Light-Material Interactions effort and are gratified that this effort will go beyond its great accomplishments to date through the renewed funding from the DOE," says Peter Schröder, deputy chair of the Division of Engineering and Applied Science and the Shaler Arthur Hanisch Professor of Computer Science and Applied and Computational Mathematics. "Harry exemplifies the best tradition of engineering at Caltech, creating the interface between fundamental science advances and their realization through engineering for the benefit of society at large."

According to the United States Department of Energy, "transforming the way we generate, supply, transmit, store, and use energy will be one of the defining challenges for America and the globe in the 21st century. At its heart, the challenge is a scientific one. Important as they are, incremental advances in current energy technologies will not be sufficient. History has demonstrated that radically new technologies arise from disruptive advances at the science frontiers. The Energy Frontier Research Centers program aims to accelerate such transformative discovery." 

Energy Secretary Ernest Moniz, in announcing the awards, said, "Today, we are mobilizing some of our most talented scientists to join forces and pursue the discoveries and breakthroughs that will lay the foundation for our nation's energy future."

Cynthia Eller