Mammogram classification and abnormality detection from nonlocal labels using deep multiple instance neural network
Mammography is the most common modality for screening and early detection of breast cancer. The emergence of machine learning, and deep learning methods in particular, aims to help radiologists reach higher sensitivity and specificity. Yet typical supervised machine learning methods require radiological images with findings annotated within the image. Producing such annotations is a tedious task that is often out of reach, given the high cost and limited availability of expert radiologists. We describe a computer-aided detection and diagnosis system based on weakly supervised learning, in which mammogram (MG) images are tagged only at a global level, without local annotations. Our work addresses MG classification and detection of abnormal findings through a novel deep learning framework built on the multiple instance learning (MIL) paradigm. The proposed method processes the MG image at full resolution with a deep MIL convolutional neural network. This approach allows us to classify the whole MG according to a severity score and to localize the source of abnormality at full resolution, while training on a weakly labeled data set. The key hallmark of our approach is the automatic discovery of the discriminating patches in the mammograms via MIL. We validate the proposed method on two mammogram data sets, a large multi-center MG cohort and the publicly available INbreast, in two different scenarios. We present promising classification and detection results, comparable to those of a recent supervised method trained on a fully annotated data set. As the volume and complexity of healthcare data continue to increase, such an approach may have a profound impact on patient care in many applications.
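To make the MIL idea concrete, the following is a minimal sketch, not the paper's implementation: an image is tiled into patch "instances", each instance is scored, and the bag-level (whole-image) label is the maximum over instance scores, which also points at the most suspicious region. The patch size, stride, threshold, and the mean-intensity scorer are all illustrative assumptions; in the paper's setting a deep convolutional network would produce the per-patch scores.

```python
import numpy as np

def extract_patches(image, patch, stride):
    """Tile a 2-D image into overlapping patches (the MIL 'instances')."""
    H, W = image.shape
    patches, coords = [], []
    for y in range(0, H - patch + 1, stride):
        for x in range(0, W - patch + 1, stride):
            patches.append(image[y:y + patch, x:x + patch])
            coords.append((y, x))
    return np.stack(patches), coords

def patch_scores(patches):
    """Toy stand-in for a CNN: score each patch by its mean intensity.
    (Hypothetical scorer for illustration only.)"""
    return patches.mean(axis=(1, 2))

def mil_classify(image, patch=16, stride=8, threshold=0.5):
    """Bag-level prediction: max over instance scores (MIL max pooling).
    Returns the global label, the top patch's location, and its score,
    so a positive decision is traceable to a region of the image."""
    patches, coords = extract_patches(image, patch, stride)
    scores = patch_scores(patches)
    k = int(np.argmax(scores))
    return bool(scores[k] > threshold), coords[k], float(scores[k])
```

The max aggregation encodes the MIL assumption that a mammogram is abnormal if at least one patch is abnormal, which is what lets a model trained only on global labels localize findings.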