Caltech

DOLCIT Seminar

Friday, June 21, 2019
12:00pm to 1:00pm
Annenberg 213
Unsupervised Learning of Image Correspondences in Medical Image Analysis
Adrian Dalca, Faculty of Radiology, Harvard Medical School, and Research Affiliate, CSAIL, Massachusetts Institute of Technology. His research focuses on machine learning in medical images, medicine, and photography.

Image alignment, or registration, is fundamental to many tasks in medical image analysis, computer vision, and computational anatomy. Classical image registration methods have undergone decades of technical development, but are often prohibitively slow since they solve an optimization problem for each 3D image pair. In this talk, I will introduce new models that leverage learning paradigms to enable deformable medical image registration orders of magnitude faster than traditional methods.
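
As a rough illustration of how such a model can be trained without ground-truth correspondences (a generic sketch, not the speaker's actual implementation), the loss below combines an image-similarity term, computed after warping the moving image with the network's predicted displacement field, with a smoothness penalty on that field. It is written in PyTorch for a 2D image pair; the shapes and names are assumptions introduced for the example.

import torch
import torch.nn.functional as F

def warp(moving, flow):
    """Warp an image batch (N, C, H, W) with a dense displacement
    field `flow` (N, 2, H, W), given in pixels, via bilinear resampling."""
    n, _, h, w = moving.shape
    # Base sampling grid in the normalized [-1, 1] coordinates
    # expected by grid_sample (last dim ordered as x, y).
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij")
    grid = torch.stack((xs, ys), dim=-1).unsqueeze(0).expand(n, -1, -1, -1)
    # Convert pixel displacements to normalized offsets and add them.
    offset = torch.stack(
        (flow[:, 0] * 2 / max(w - 1, 1), flow[:, 1] * 2 / max(h - 1, 1)),
        dim=-1)
    return F.grid_sample(moving, grid + offset, align_corners=True)

def registration_loss(moving, fixed, flow, lam=0.01):
    """Unsupervised loss: image similarity (here MSE) after warping,
    plus a penalty on spatial gradients of the displacement field
    that encourages smooth deformations."""
    sim = F.mse_loss(warp(moving, flow), fixed)
    dx = (flow[:, :, :, 1:] - flow[:, :, :, :-1]).pow(2).mean()
    dy = (flow[:, :, 1:, :] - flow[:, :, :-1, :]).pow(2).mean()
    return sim + lam * (dx + dy)

A 3D version would use volumetric tensors and a three-channel flow, and the probabilistic formulations described in the talk additionally model the deformation as a latent variable rather than a plain network output.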

We start by building a connection between classical and learning-based methods. I will introduce probabilistic generative models and a resulting unsupervised learning-based inference strategy that draws on insights from classical registration methods and makes use of recent developments in convolutional neural networks (CNNs). We demonstrate registration accuracy comparable to state-of-the-art 3D image registration, while operating orders of magnitude faster in practice. I will discuss new insights for this class of models, including amortized optimization, leveraging image segmentation maps during training to dramatically improve the state of the art, and the use of these models in limited-data settings. Moreover, I will talk about a recent result in which we exploit these models to enable dramatically improved single-shot image segmentation. Finally, building on these models, I will introduce a new learning framework for constructing deformable templates, which play a fundamental role in these analyses. This learning approach to template construction can yield a new class of on-demand conditional templates, enabling new kinds of analysis.
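
One common way segmentation maps can serve as auxiliary supervision during training, consistent with the idea mentioned above though not necessarily the speaker's exact formulation, is to warp the moving image's segmentation with the same predicted deformation field and penalize overlap mismatch with the fixed segmentation via a soft Dice loss. The sketch below continues the PyTorch style of the earlier example; all names are illustrative.

import torch

def soft_dice_loss(warped_seg, fixed_seg, eps=1e-6):
    """Auxiliary loss: one minus the soft Dice overlap between a warped
    one-hot segmentation (N, K, H, W) and the fixed one. Segmentations
    are needed only at training time; the trained network still
    registers raw image pairs at test time."""
    dims = (2, 3)
    inter = (warped_seg * fixed_seg).sum(dims)
    denom = warped_seg.sum(dims) + fixed_seg.sum(dims)
    dice = (2 * inter + eps) / (denom + eps)
    return 1 - dice.mean()

This term would simply be added, with a weight, to the unsupervised registration loss sketched earlier.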

For more information, please contact Diane Goodfellow by email at [email protected].