Deep Texture Representations as a Universal Encoder for Pan-cancer Histology

Published: July 29, 2020, 3:01 a.m.

Link to bioRxiv paper: http://biorxiv.org/cgi/content/short/2020.07.28.224253v1?rss=1

Authors: Komura, D., Kawabe, A., Fukuta, K., Sano, K., Umezaki, T., Koda, H., Suzuki, R., Tominaga, K., Konishi, H., Nishida, S., Furuya, G., Katoh, H., Ushiku, T., Fukayama, M., Ishikawa, S.

Abstract: Cancer histological images contain rich biological and clinical information, but their quantitative representation is problematic and has prevented the direct comparison and accumulation of large-scale datasets. Here we show that deep texture representations (DTRs), produced by a bilinear convolutional neural network, express cancer morphology well in an unsupervised manner and work as a universal encoder for cancer histology. DTRs are useful for content-based image retrieval, enabling quick retrieval of histologically similar images from optimally area-selected datasets of 7,175 cases from The Cancer Genome Atlas. Through comprehensive comparison with driver and clinically actionable gene mutations, we successfully predicted 309 combinations of genomic features and cancer types from hematoxylin and eosin-stained images with high accuracy (AUC > 0.70 and q < 0.02). Because DTRs can be deployed on accessible devices such as smartphones, DTR-based encoding of cancer histology has a potentially strong impact on the global equalization of cancer diagnosis and targeted therapies.

Copyright belongs to the original authors. Visit the link for more information.
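As a rough illustration of the bilinear pooling that underlies deep texture representations, the sketch below encodes a histology tile as an orderless texture descriptor and ranks a small gallery by cosine similarity for content-based retrieval. The abstract does not specify the backbone or pipeline, so the VGG-16 features, 224x224 input size, and signed-sqrt/L2 normalization here are assumptions for illustration only, not the authors' exact method.

```python
# Minimal sketch of a deep texture representation (DTR) via bilinear pooling.
# Assumptions (not stated in the abstract): VGG-16 conv features as backbone,
# 224x224 input tiles, signed-sqrt + L2 normalization of the pooled descriptor.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

backbone = models.vgg16(pretrained=True).features.eval()  # conv layers only

preprocess = T.Compose([
    T.Resize((224, 224)),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def dtr(image_path: str) -> torch.Tensor:
    """Encode one H&E tile as an orderless bilinear (texture) descriptor."""
    x = preprocess(Image.open(image_path).convert("RGB")).unsqueeze(0)
    fmap = backbone(x)                        # (1, C, H, W) conv feature map
    c, h, w = fmap.shape[1:]
    f = fmap.view(c, h * w)                   # drop spatial layout
    gram = f @ f.t() / (h * w)                # (C, C) bilinear / Gram pooling
    v = gram.flatten()
    v = torch.sign(v) * torch.sqrt(v.abs())   # signed square-root scaling
    return v / (v.norm() + 1e-12)             # L2 normalization

# Content-based retrieval: descriptors are unit-normalized, so cosine
# similarity reduces to a dot product (file names here are placeholders).
# query = dtr("query_tile.png")
# gallery = torch.stack([dtr(p) for p in ["tile_a.png", "tile_b.png"]])
# ranking = (gallery @ query).argsort(descending=True)
```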