RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging.

Title: RoS-KD: A Robust Stochastic Knowledge Distillation Approach for Noisy Medical Imaging
Publication Type: Journal Article
Year of Publication: 2022
Authors: Jaiswal A, Ashutosh K, Rousseau JF, Peng Y, Wang Z, Ding Y
Journal: Proc IEEE Int Conf Data Min
Volume: 2022
Pagination: 981-986
Date Published: 2022 Nov-Dec
ISSN: 1550-4786
Abstract

AI-powered medical imaging has recently attracted enormous attention due to its ability to provide fast-paced healthcare diagnoses. However, it usually suffers from a lack of high-quality datasets due to high annotation cost, inter-observer variability, human annotator error, and errors in computer-generated labels. Deep learning models trained on noisily labelled datasets are sensitive to the noise type and generalize poorly to unseen samples. To address this challenge, we propose a Robust Stochastic Knowledge Distillation (RoS-KD) framework which mimics the notion of learning a topic from multiple sources to deter the learning of noisy information. More specifically, RoS-KD learns a smooth, well-informed, and robust student manifold by distilling knowledge from multiple teachers trained on overlapping subsets of the training data. Our extensive experiments on popular medical imaging classification tasks (cardiopulmonary disease and lesion classification) using real-world datasets show the performance benefit of RoS-KD, its ability to distill knowledge from several popular large networks (ResNet-50, DenseNet-121, MobileNet-V2) into a comparatively small network, and its robustness to adversarial attacks (PGD, FGSM). More specifically, RoS-KD achieves >2% and >4% improvement in F1-score for the lesion classification and cardiopulmonary disease classification tasks, respectively, when the underlying student is ResNet-18, against a recent competitive knowledge distillation baseline. Additionally, on the cardiopulmonary disease classification task, RoS-KD outperforms most of the SOTA baselines with a ~1% gain in AUC score.
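
For readers who want a concrete picture of the multi-teacher distillation idea described in the abstract, below is a minimal PyTorch sketch of a generic formulation: each teacher (trained on an overlapping subset of the data) contributes a softened prediction, and the student is fit to a weighted combination of the hard labels and the averaged teacher targets. The function name, temperature, weighting, and simple averaging of teacher outputs are illustrative assumptions, not the paper's exact RoS-KD objective.

```python
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          temperature=4.0, alpha=0.5):
    """Hard-label cross-entropy plus KL divergence to the averaged
    soft targets of several teachers.

    Illustrative only: the temperature, alpha, and the plain averaging
    of teacher outputs are assumptions, not the RoS-KD objective.
    """
    # Supervision from the (possibly noisy) ground-truth labels.
    ce_loss = F.cross_entropy(student_logits, labels)

    # Softened probability distributions from each teacher, averaged.
    soft_targets = torch.stack(
        [F.softmax(t / temperature, dim=1) for t in teacher_logits_list]
    ).mean(dim=0)

    # KL divergence between the student's softened prediction and the
    # averaged teacher targets, scaled by T^2 as is standard in KD.
    kd_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        soft_targets,
        reduction="batchmean",
    ) * (temperature ** 2)

    return alpha * ce_loss + (1.0 - alpha) * kd_loss


# Example: three teachers (e.g. ResNet-50, DenseNet-121, MobileNet-V2)
# distilled into one student on a batch of 8 images and 5 classes.
if __name__ == "__main__":
    student_logits = torch.randn(8, 5)
    teacher_logits = [torch.randn(8, 5) for _ in range(3)]
    labels = torch.randint(0, 5, (8,))
    loss = multi_teacher_kd_loss(student_logits, teacher_logits, labels)
    print(loss.item())
```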

DOI: 10.1109/icdm54844.2022.00118
Alternate Journal: Proc IEEE Int Conf Data Min
PubMed ID: 37038389
PubMed Central ID: PMC10082964
Grant List: R00 LM013001 / LM / NLM NIH HHS / United States