Triplanar ensemble U-Net model for white matter hyperintensities segmentation on MR images

Vaanathi Sundaresan, Giovanna Zamboni, Peter M. Rothwell, Mark Jenkinson, Ludovica Griffanti

Research output: Contribution to journal › Article › peer-review


White matter hyperintensities (WMHs) have been associated with various cerebrovascular and neurodegenerative diseases. Reliable quantification of WMHs is essential for understanding their clinical impact in normal and pathological populations. Automated segmentation of WMHs is highly challenging due to the heterogeneity of WMH characteristics between deep and periventricular white matter, the presence of artefacts, and differences in the pathology and demographics of populations. In this work, we propose an ensemble triplanar network that combines the predictions from three different planes of brain MR images to provide an accurate WMH segmentation. The network also incorporates anatomical information regarding WMH spatial distribution into its loss functions, both to improve the efficiency of segmentation and to overcome the contrast variations between deep and periventricular WMHs. We evaluated our method on 5 datasets, of which 3 are part of a publicly available dataset (training data for the MICCAI WMH Segmentation Challenge 2017 - MWSC 2017) consisting of subjects from three different cohorts. On evaluating our method separately in the deep and periventricular regions, we observed robust and comparable performance in both. Our method performed better than most existing methods, including FSL BIANCA, and on par with the top-ranking deep learning method of MWSC 2017.
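The core triplanar-ensemble idea described above can be sketched as a voxel-wise combination of three probability maps, one produced by a 2D model run along each anatomical plane (axial, sagittal, coronal). The sketch below is an illustrative assumption, not the authors' exact implementation: the function name, the simple mean as the combination rule, and the fixed 0.5 threshold are all hypothetical choices.

```python
import numpy as np

def ensemble_predictions(prob_axial, prob_sagittal, prob_coronal, threshold=0.5):
    """Combine three planar WMH probability maps into one binary segmentation.

    Each input is a 3D array of per-voxel probabilities in [0, 1], produced by
    a 2D network applied slice-by-slice along one plane and re-stacked into
    the original volume orientation. This sketch averages the three maps
    voxel-wise and thresholds the result; the actual paper may use a
    different combination rule.
    """
    # Stack the three maps on a new leading axis, then average voxel-wise.
    mean_prob = np.mean(np.stack([prob_axial, prob_sagittal, prob_coronal]), axis=0)
    # Threshold the mean probability to obtain a binary WMH mask.
    return (mean_prob >= threshold).astype(np.uint8)

# Toy usage: three constant probability volumes for a 2x2x2 image.
pa = np.full((2, 2, 2), 0.9)
ps = np.full((2, 2, 2), 0.6)
pc = np.full((2, 2, 2), 0.2)
seg = ensemble_predictions(pa, ps, pc)  # mean = 0.566..., so every voxel is 1
```

Averaging probabilities (rather than majority-voting hard labels) retains each plane's confidence, so a strong detection in one plane can outvote two weak misses.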

Original language: English
Publication status: Published or Issued - 26 Jul 2020
Externally published: Yes


Keywords
  • Deep learning
  • MRI
  • Segmentation
  • U-Nets
  • White matter hyperintensities

ASJC Scopus subject areas

  • Biochemistry, Genetics and Molecular Biology(all)
  • Agricultural and Biological Sciences(all)
  • Immunology and Microbiology(all)
  • Neuroscience(all)
  • Pharmacology, Toxicology and Pharmaceutics(all)
