Publication | Open Access
An annotation-free whole-slide training approach to pathological classification of lung cancer types using deep learning
Citations: 215
References: 41
Year: 2021
Deep learning for digital pathology is hindered by the high spatial resolution of whole‑slide images, and most studies rely on laborious patch‑based methods that require detailed free‑hand contouring of image patches. The study aims to eliminate the need for patch‑level annotations by training neural networks on entire whole‑slide images using only slide‑level diagnoses. The approach uses a unified memory mechanism to enable training on entire whole‑slide images despite accelerator memory limits. Experiments on 9,662 lung cancer whole‑slide images show the method achieves AUROC of 0.9594 for adenocarcinoma and 0.9414 for squamous cell carcinoma, outperforming multiple‑instance learning and providing strong localization of small lesions via class activation mapping.
Abstract
Deep learning for digital pathology is hindered by the extremely high spatial resolution of whole-slide images (WSIs). Most studies have employed patch-based methods, which often require detailed annotation of image patches. This typically involves laborious free-hand contouring on WSIs. To alleviate the burden of such contouring and obtain benefits from scaling up training with numerous WSIs, we develop a method for training neural networks on entire WSIs using only slide-level diagnoses. Our method leverages the unified memory mechanism to overcome the memory constraint of compute accelerators. Experiments conducted on a data set of 9662 lung cancer WSIs reveal that the proposed method achieves areas under the receiver operating characteristic curve of 0.9594 and 0.9414 for adenocarcinoma and squamous cell carcinoma classification on the testing set, respectively. Furthermore, the method demonstrates higher classification performance than multiple-instance learning as well as strong localization results for small lesions through class activation mapping.
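The reported figures (0.9594 and 0.9414) are areas under the receiver operating characteristic curve computed from one prediction score per slide against the slide-level diagnosis. As a hedged illustration of how such a metric is evaluated (not the authors' code), AUROC can be computed in pure Python via the rank-sum (Mann-Whitney U) formulation, where `labels` and `scores` below are hypothetical per-slide values:

```python
def auroc(labels, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney U) formulation."""
    # Assign 1-based ranks by score, averaging ranks within tie groups
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg_rank
        i = j + 1
    # U statistic of the positive class, normalized to [0, 1]
    pos_ranks = [r for r, y in zip(ranks, labels) if y == 1]
    n_pos, n_neg = len(pos_ranks), len(labels) - len(pos_ranks)
    return (sum(pos_ranks) - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Toy slide-level evaluation: one score per slide, binary label per cancer type
labels = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]
print(auroc(labels, scores))  # 0.75
```

In a one-versus-rest setting such as the paper's, this would be run once per class (e.g. adenocarcinoma vs. rest, squamous cell carcinoma vs. rest) to produce the per-class AUROC values.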