Interpretable Segmentation of Amyloid-β Stained Whole Slide Images of Brain Tissue

Published: Nov. 16, 2020, 2:02 a.m.

Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.11.13.381871v1?rss=1

Authors: Lai, Z., Guo, R., Xu, W., Hu, Z., Mifflin, K., Dugger, B. N., Cheung, S.-c., Chuah, C.-N.

Abstract: Neurodegenerative disease pathologies have been reported in both grey matter (GM) and white matter (WM) with different density distributions; an automated method for separating GM and WM would therefore be extremely advantageous for neuropathologic deep phenotyping. Standard segmentation methods typically rely on manual annotation, in which a trained researcher traces the delineation of GM/WM in ultra-high-resolution Whole Slide Images (WSIs). This approach is time-consuming and subjective, preventing the analysis of large numbers of WSIs in a scalable way. In this paper, we propose an automated segmentation pipeline that combines a Convolutional Neural Network (CNN) module for segmenting GM/WM regions with a post-processing module that removes tissue artifacts and residues and generates XML annotations viewable in Aperio ImageScope. First, we investigate two baseline models for medical image segmentation: FCN and U-Net. We then propose a new patch-based approach, ResNet-Patch, to classify GM, WM, and background regions. In addition, we integrate a Neural Conditional Random Field (NCRF) module, ResNet-NCRF, to model and incorporate the spatial correlations among neighboring patches. Although their mechanisms differ greatly, both U-Net and ResNet-Patch/ResNet-NCRF achieve an Intersection over Union (IoU) of more than 90% for GM and more than 80% for WM, with ResNet-Patch achieving 1% higher IoU than U-Net and lower variance across WSIs. ResNet-NCRF further improves the IoU for WM by 3% compared to ResNet-Patch before post-processing. We also apply gradient-weighted class activation mapping (Grad-CAM) to interpret the segmentation masks and provide clinical explanations and insights.
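The patch-based idea and the IoU metric described in the abstract can be illustrated with a short sketch. The code below is a minimal, hypothetical example, not the authors' implementation: a stock torchvision ResNet-18 stands in for the paper's backbone, the patch size, class indices, and the `wsi_tile`/`gm_wm_mask` arrays are assumptions, and a real pipeline would add training, stride/overlap handling, the NCRF module, and post-processing.

```python
# Minimal sketch of patch-based GM/WM/background classification and per-class IoU.
# Assumptions: 224x224 non-overlapping patches; class indices 0=background, 1=GM, 2=WM;
# torchvision's resnet18 stands in for the paper's ResNet backbone.
import numpy as np
import torch
import torch.nn as nn
from torchvision.models import resnet18

PATCH = 224
NUM_CLASSES = 3  # background, grey matter, white matter

def build_patch_classifier() -> nn.Module:
    """ResNet-style classifier over patches (ResNet-Patch-like idea)."""
    model = resnet18(weights=None)
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)
    return model

@torch.no_grad()
def segment_tile(model: nn.Module, wsi_tile: np.ndarray) -> np.ndarray:
    """Slide a non-overlapping patch grid over an RGB tile (H, W, 3)
    and return a coarse label map with one class per patch."""
    model.eval()
    h, w, _ = wsi_tile.shape
    rows, cols = h // PATCH, w // PATCH
    labels = np.zeros((rows, cols), dtype=np.int64)
    for r in range(rows):
        for c in range(cols):
            patch = wsi_tile[r * PATCH:(r + 1) * PATCH, c * PATCH:(c + 1) * PATCH]
            x = torch.from_numpy(patch).permute(2, 0, 1).float().unsqueeze(0) / 255.0
            labels[r, c] = model(x).argmax(dim=1).item()
    return labels

def iou_per_class(pred: np.ndarray, target: np.ndarray, cls: int) -> float:
    """Intersection over Union for one class, the metric reported in the abstract."""
    p, t = pred == cls, target == cls
    union = np.logical_or(p, t).sum()
    return float(np.logical_and(p, t).sum() / union) if union else float("nan")

if __name__ == "__main__":
    model = build_patch_classifier()
    # Placeholder tile and ground truth; in practice these come from the WSI and annotations.
    wsi_tile = np.random.randint(0, 256, (PATCH * 4, PATCH * 4, 3), dtype=np.uint8)
    pred = segment_tile(model, wsi_tile)
    gm_wm_mask = np.random.randint(0, NUM_CLASSES, pred.shape)
    print("GM IoU:", iou_per_class(pred, gm_wm_mask, 1))
    print("WM IoU:", iou_per_class(pred, gm_wm_mask, 2))
```

Per-class IoU is the ratio of the intersection to the union of the predicted and ground-truth regions for that class, which is how the GM (>90%) and WM (>80%) figures quoted in the abstract are defined.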