An Anatomically Curated Fiber Clustering White Matter Atlas for Consistent White Matter Tract Parcellation across the Lifespan
An Immersive Virtual Reality Environment for Diagnostic Imaging
Inter-site and Inter-scanner Diffusion MRI Data Harmonization
The Open Anatomy Browser: A Collaborative Web-Based Viewer for Interoperable Anatomy Atlases
Unsupervised Discovery of Emphysema Subtypes in a Large Clinical Cohort
Identifying Shared Brain Networks in Individuals by Decoupling Functional and Anatomical Variability
Supra-Threshold Fiber Cluster Statistics for Data-Driven Whole Brain Tractography Analysis
Free Water Modeling of Peritumoral Edema using Multi-fiber Tractography
Estimation of Bounded and Unbounded Trajectories in Diffusion MRI
Principal Gradient of Macroscale Cortical Organization
Evolution of a Simultaneous Segmentation and Atlas Registration
Multi-modality MRI-based Atlas of the Brain
Intracranial Fluid Redistribution
Corticospinal Tract Modeling for Neurosurgical Planning by Tracking through Regions of Peritumoral Edema and Crossing Fibers
Automated White Matter Fiber Tract Identification in Patients with Brain Tumors
State-space Models of Mental Processes from fMRI
Robust Initialization of Active Shape Models for Lung Segmentation in CT Scans: A Feature-Based Atlas Approach
Tractography-driven Groupwise Multi-Scale Parcellation of the Cortex
Gray Matter Alterations in Early Aging
Statistical Shape Analysis: From Landmarks to Diffeomorphisms
A Generative Probabilistic Model and Discriminative Extensions for Brain Lesion Segmentation
Joint Modeling of Imaging and Genetic Variability
MR-Ultrasound Fusion for Neurosurgery
Diffusion MRI and Tumor Heterogeneity
SlicerDMRI: Open Source Diffusion MRI Software for Brain Cancer Research

Neuroimage Analysis Center

The Neuroimage Analysis Center is a research and technology center with the mission of advancing the role of neuroimaging in health care. The ability to access huge cohorts of patient medical records and radiology data, the emergence of ever more detailed imaging modalities, and the availability of unprecedented computer processing power mark the possibility of a new era in neuroimaging, disease understanding, and patient treatment. We are excited to present a national resource center with the goal of finding new ways of extracting disease characteristics from advanced imaging and computation, and of making these methods available to the larger medical community through a proven methodology of world-class research, open-source software, and extensive collaboration.

Our Sponsor

NIBIB

The NAC is a Biomedical Technology Resource Center supported by the National Institute of Biomedical Imaging and Bioengineering (NIBIB) under grant P41 EB015902. Through December 2011 it was supported by the National Center for Research Resources (NCRR) under grant P41 RR13218.

Contact the Center Directors

Carl-Fredrik Westin, PhD
Laboratory of Mathematics in Imaging
Brigham and Women's Hospital
1249 Boylston St., Room 240
Boston, MA 02215
Phone: +1 617 525-6209
E-mail: westin at bwh.harvard.edu

Ron Kikinis, MD
Surgical Planning Laboratory 
Brigham and Women's Hospital 
75 Francis St, L1 Room 050
Boston, MA 02115
Phone: +1 617 732-7389
E-mail: kikinis at bwh.harvard.edu

Recent Publications

  • Magnotta VA, Friedman L. Measurement of Signal-to-Noise and Contrast-to-Noise in the fBIRN Multicenter Imaging Study. J Digit Imaging. 2006;19(2):140–7.
    The ability to analyze and merge data across sites, vendors, and field strengths depends on one's ability to acquire images with the same image quality, including image smoothness, signal-to-noise ratio (SNR), and contrast-to-noise ratio (CNR). SNR can be used to compare different magnetic resonance scanners as a measure of comparability between the systems. This study examines SNR and CNR in structural fast spin-echo T2-weighted scans acquired in five individuals across ten sites that are part of the Functional Imaging Research of Schizophrenia Testbed Biomedical Informatics Research Network (fBIRN). Different manufacturers, field strengths, gradient coils, and RF coils were used at these sites. The SNR of gray matter was fairly uniform (41.3-43.3) across scanners at 1.5 T. The higher field scanners produced images with significantly higher SNR values (44.5-108.7 at 3 T and 50.8 at 4 T). Similar results were obtained for CNR measurements between gray/white matter at 1.5 T (9.5-10.2), again increasing at higher fields (10.1-28.9 at 3 T and 10.9 at 4 T). (A minimal sketch of such an SNR/CNR computation appears after this list.)
  • Angenent S, Pichon E, Tannenbaum A. Mathematical Methods in Medical Image Processing. Bull New Ser Am Math Soc. 2006;43:365–396.
    In this paper, we describe some central mathematical problems in medical imaging. The subject has been undergoing rapid changes driven by better hardware and software. Much of the software is based on novel methods utilizing geometric partial differential equations in conjunction with standard signal/image processing techniques as well as computer graphics facilitating man/machine interactions. As part of this enterprise, researchers have been trying to base biomedical engineering principles on rigorous mathematical foundations for the development of software methods to be integrated into complete therapy delivery systems. These systems support the more effective delivery of many image-guided procedures such as radiation therapy, biopsy, and minimally invasive surgery. We will show how mathematics may impact some of the main problems in this area, including image enhancement, registration, and segmentation.
  • Michailovich OV, Tannenbaum A. Despeckling of medical ultrasound images. IEEE Trans Ultrason Ferroelectr Freq Control. 2006;53(1):64–78.
    Speckle noise is an inherent property of medical ultrasound imaging, and it generally tends to reduce the image resolution and contrast, thereby reducing the diagnostic value of this imaging modality. As a result, speckle noise reduction is an important prerequisite whenever ultrasound imaging is used for tissue characterization. Among the many methods that have been proposed to perform this task, there exists a class of approaches that use a multiplicative model of speckled image formation and take advantage of the logarithmic transformation to convert multiplicative speckle noise into additive noise. The common assumption made in most such studies is that the samples of the additive noise are mutually uncorrelated and obey a Gaussian distribution. The present study shows conceptually and experimentally that this assumption is oversimplified and unnatural, and that it may lead to inadequate performance of speckle reduction methods. The study introduces a simple preprocessing procedure that modifies the acquired radio-frequency images (without affecting the anatomical information they contain) so that the noise in the log-transformation domain becomes very close in its behavior to white Gaussian noise. As a result, the preprocessing allows filtering methods that assume white Gaussian noise to perform in nearly optimal conditions. The study evaluates the performance of three different nonlinear filters (wavelet denoising, total variation filtering, and anisotropic diffusion) and demonstrates that, in all these cases, the proposed preprocessing significantly improves the quality of the resultant images. Our numerical tests include a series of computer-simulated and in vivo experiments. (A minimal log-domain filtering sketch appears after this list.)
  • Talos IF, Mian AZ, Zou KH, Hsu L, Goldberg-Zimring D, Haker S, Bhagwat JG, Mulkern RV. Magnetic resonance and the human brain: anatomy, function and metabolism. Cell Mol Life Sci. 2006;63(10):1106–24.
    The introduction and development, over the last three decades, of magnetic resonance (MR) imaging and MR spectroscopy technology for in vivo studies of the human brain represents a truly remarkable achievement, with enormous scientific and clinical ramifications. These effectively non-invasive techniques allow for studies of the anatomy, the function, and the metabolism of the living human brain. They have allowed for new understanding of how the healthy brain works and have provided insights into the mechanisms underlying multiple disease processes that affect the brain. Different MR techniques have been developed for studying anatomy, function, and metabolism. The primary focus of this review is to describe these different methodologies and to briefly review how they are being employed to more fully appreciate the intricacies of the organ that most distinctly differentiates the human species from other animal forms on earth.
  • Learned-Miller EG. Data driven image models through continuous joint alignment. IEEE Trans Pattern Anal Mach Intell. 2006;28(2):236–50.
    This paper presents a family of techniques that we call congealing for modeling image classes from data. The idea is to start with a set of images and make them appear as similar as possible by removing variability along the known axes of variation. This technique can be used to eliminate "nuisance" variables such as affine deformations from handwritten digits or unwanted bias fields from magnetic resonance images. In addition to separating and modeling the latent images (i.e., the images without the nuisance variables), we can model the nuisance variables themselves, leading to factorized generative image models. When nuisance variable distributions are shared between classes, one can share the knowledge learned in one task with another task, leading to efficient learning. We demonstrate this process by building a handwritten digit classifier from just a single example of each class. In addition to applications in handwritten character recognition, we describe in detail the application of bias removal from magnetic resonance images. Unlike previous methods, we use a separate, nonparametric model for the intensity values at each pixel. This allows us to leverage the data from the MR images of different patients to remove bias from each other. Only very weak assumptions are made about the distributions of intensity values in the images. In addition to the digit and MR applications, we discuss a number of other uses of congealing and describe experiments about the robustness and consistency of the method. (A toy, translation-only congealing sketch appears after this list.)
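
The SNR and CNR figures quoted in the fBIRN study above are simple ratio statistics computed from region-of-interest means and a noise estimate. The following is a minimal sketch of one common way to compute them, assuming NumPy arrays and pre-computed gray-matter, white-matter, and background masks; the paper's exact ROI definitions and noise model may differ.

```python
import numpy as np

def snr_cnr(image, gm_mask, wm_mask, background_mask):
    """Illustrative SNR/CNR estimates from boolean ROI masks.

    SNR is taken as the mean gray-matter signal over the standard
    deviation of the background, and CNR as the gray/white mean
    difference over the same noise estimate. These are common textbook
    definitions, not necessarily the exact fBIRN protocol.
    """
    noise_sd = image[background_mask].std()
    gm_mean = image[gm_mask].mean()
    wm_mean = image[wm_mask].mean()
    return gm_mean / noise_sd, abs(gm_mean - wm_mean) / noise_sd

# Synthetic check: GM at 100, WM at 80, noise sd 2.5 -> SNR ~40, CNR ~8.
rng = np.random.default_rng(0)
img = rng.normal(0.0, 2.5, (64, 64))
gm = np.zeros((64, 64), dtype=bool); gm[:, :16] = True
wm = np.zeros((64, 64), dtype=bool); wm[:, 16:32] = True
bg = np.zeros((64, 64), dtype=bool); bg[:, 48:] = True
img[gm] += 100.0
img[wm] += 80.0
print(snr_cnr(img, gm, wm, bg))
```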
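
The key observation in the despeckling paper above is that a multiplicative speckle model becomes additive after a logarithmic transformation, after which standard additive-noise filters can be applied (the paper additionally preprocesses the radio-frequency data so that the log-domain noise is close to white and Gaussian). The sketch below shows only the basic log-domain filtering pipeline, with Gaussian smoothing standing in for the wavelet, total variation, and anisotropic diffusion filters evaluated in the paper; the envelope input, the epsilon offset, and the filter choice are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def despeckle_log_domain(envelope, sigma=1.5, eps=1e-6):
    """Toy log-domain despeckling of a non-negative envelope image.

    The log transform turns multiplicative speckle into (approximately)
    additive noise; the smoother below is a stand-in for any filter
    designed for additive noise.
    """
    log_img = np.log(envelope + eps)                # multiplicative -> additive
    log_filtered = gaussian_filter(log_img, sigma)  # additive-noise filter
    return np.exp(log_filtered) - eps               # back to the intensity domain
```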
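
Congealing, as described in the last entry, aligns a set of images jointly by choosing a transformation for each image so that the stack of values observed at each pixel location becomes as consistent (low entropy) as possible. The sketch below is a deliberately minimal, translation-only variant that uses the summed pixelwise variance of the stack as a surrogate for the nonparametric entropy criterion in the paper; a faithful implementation would optimize affine parameters against an entropy objective.

```python
import numpy as np

def congeal_translations(images, max_iters=10, max_shift=5):
    """Toy congealing: jointly align images by integer translations.

    Greedy coordinate descent: each image's (dy, dx) shift is updated in
    turn to minimize the summed pixelwise variance of the aligned stack.
    Returns the list of shifts found for each image.
    """
    images = [np.asarray(im, dtype=float) for im in images]
    shifts = [(0, 0)] * len(images)

    def stack_cost(shifted):
        # Sum over pixels of the variance across the image stack.
        return np.var(np.stack(shifted, axis=0), axis=0).sum()

    for _ in range(max_iters):
        improved = False
        shifted = [np.roll(im, s, axis=(0, 1)) for im, s in zip(images, shifts)]
        for i, im in enumerate(images):
            best_shift, best_cost = shifts[i], stack_cost(shifted)
            for dy in range(-max_shift, max_shift + 1):
                for dx in range(-max_shift, max_shift + 1):
                    trial = shifted.copy()
                    trial[i] = np.roll(im, (dy, dx), axis=(0, 1))
                    cost = stack_cost(trial)
                    if cost < best_cost:
                        best_shift, best_cost = (dy, dx), cost
            if best_shift != shifts[i]:
                shifts[i] = best_shift
                shifted[i] = np.roll(im, best_shift, axis=(0, 1))
                improved = True
        if not improved:
            break
    return shifts
```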