AI for Cancer Prognostication
Background. Recent advances in artificial intelligence, deep learning in particular, have shown
remarkable progress in many fields ranging from speech recognition to computer vision. Within radiology,
deep learning has the potential to automatically learn and quantify radiographic characteristics of
underlying tissues. In this study, we investigated the clinical utility of convolutional neural networks
(CNNs) in quantifying the radiographic phenotype of non-small cell lung cancer (NSCLC) tumors in
computed tomography (CT) data.
Methods and Findings. We performed an integrative analysis on 7 independent cohorts across 5
institutions totaling 1213 NSCLC patients. We identified and independently validated prognostic
signatures using 3D CNNs for patients treated with radiotherapy (n=777). We then employed a transfer learning approach to achieve the same for patients treated with surgery (n=404). We found that CNNs have strong prognostic power in predicting 2-year overall survival for patients treated with radiotherapy (AUC=0.70, p=1.13×10⁻⁷) and surgery (AUC=0.71, p=3.02×10⁻⁴). The CNNs outperformed state-of-the-art machine learning models that rely on engineered features, as well as standard clinical imaging parameters, namely tumor volume and maximum diameter. We also demonstrated the CNNs' high robustness against test-retest (ICC=0.91) and inter-reader (Spearman's rank-order correlation=0.88) variations. To gain a
better understanding of the characteristics captured by CNNs, we identified the regions contributing most to the predictions, highlighting the importance of tumor-surrounding tissue and tumor location in patient stratification. We also identified the biological basis of the captured phenotypes, linking them directly to cell-cycle and transcriptional processes.
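The evaluation above rests on standard statistics: the 2-year survival AUC is equivalent to the Mann-Whitney rank statistic over (non-survivor, survivor) score pairs, and inter-reader robustness is Spearman's rank correlation between predictions derived from two readers' seed points. A minimal sketch on toy data (the function name `auc_2yr_survival` and all values below are illustrative, not from the study's cohorts):

```python
import numpy as np
from scipy.stats import spearmanr

def auc_2yr_survival(scores, events):
    """AUC for a binary 2-year survival endpoint, computed via the
    Mann-Whitney rank formulation: the fraction of (event, non-event)
    pairs the model ranks correctly, counting ties as half-correct."""
    scores = np.asarray(scores, dtype=float)
    events = np.asarray(events, dtype=bool)
    pos, neg = scores[events], scores[~events]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy network scores and 2-year mortality labels (illustrative only)
scores = [0.9, 0.8, 0.3, 0.2, 0.7, 0.1]
died_within_2yr = [1, 1, 0, 0, 1, 0]
print(auc_2yr_survival(scores, died_within_2yr))  # perfect separation -> 1.0

# Inter-reader robustness: Spearman rank correlation between
# predictions obtained from two readers' seed-point annotations
reader_a = [0.90, 0.10, 0.40, 0.75, 0.55]
reader_b = [0.85, 0.15, 0.35, 0.80, 0.50]
rho, p = spearmanr(reader_a, reader_b)
print(rho)  # identical rank ordering -> 1.0
```

Because both measures depend only on rank order, they are insensitive to monotone rescaling of the network's output scores, which is one reason rank-based metrics suit cross-cohort validation.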
Conclusion. Our results highlight the improved performance of deep learning over its traditional
counterparts and its robustness against variability. These results argue for the integration of deep
learning approaches into clinical practice, given their ability to predict clinically relevant tumor characteristics
noninvasively from standard-of-care medical images with single-click seed point annotations.