Segmentation-Free Outcome Prediction from Head and Neck Cancer PET/CT Images: Deep Learning-Based Feature Extraction from Multi-Angle Maximum Intensity Projections (MA-MIPs)

Affiliations
  • 1Department of Radiology, University of British Columbia, Vancouver, BC V5Z 1M9, Canada.
  • 2Department of Integrative Oncology, BC Cancer Research Institute, Vancouver, BC V5Z 1L3, Canada.
  • 3Department of Cardiology, University Hospital Bern, CH-3010 Bern, Switzerland.
  • 4Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, CH-1211 Geneva, Switzerland.
  • 5Department of Physics & Astronomy, University of British Columbia, Vancouver, BC V6T 1Z1, Canada.
  • 6Department of Biomedical Engineering, University of British Columbia, Vancouver, BC V6T 1Z3, Canada.

Abstract

We introduce a simple yet effective segmentation-free approach for survival analysis of head and neck cancer (HNC) patients from PET/CT images. By harnessing deep learning-based feature extraction and multi-angle maximum intensity projections (MA-MIPs) of fluorodeoxyglucose positron emission tomography (FDG-PET) images, our proposed method eliminates the need for manual segmentation of regions of interest (ROIs) such as primary tumors and involved lymph nodes. Instead, a state-of-the-art object detection model is trained on the CT images to automatically crop the head and neck anatomical region, rather than only the lesions or involved lymph nodes, from the PET volumes. A pre-trained deep convolutional neural network backbone is then used to extract deep features from MA-MIPs obtained from 72 multi-angle axial rotations of the cropped PET volumes. The deep features extracted from these multiple projection views of the PET volumes are aggregated, fused, and used to perform recurrence-free survival analysis on a cohort of 489 HNC patients. The proposed approach outperforms the best-performing method on the target dataset for the task of recurrence-free survival analysis. By circumventing manual delineation of the malignancies on the FDG PET/CT images, our approach removes the dependency on subjective interpretation and greatly enhances the reproducibility of the proposed survival analysis method. The code for this work is publicly released.
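To illustrate the MA-MIP step described above, the following is a minimal sketch (not the authors' released code) of how 72 evenly spaced axial-rotation maximum intensity projections could be generated from an already-cropped PET volume using NumPy and SciPy; the function name, array shapes, and the dummy input are assumptions for illustration only.

```python
import numpy as np
from scipy import ndimage

def multi_angle_mips(pet_volume, num_angles=72):
    """Generate maximum intensity projections (MIPs) of a cropped PET volume
    at evenly spaced rotations about the axial (superior-inferior) axis.

    pet_volume: 3-D array shaped (z, y, x), already cropped to the
    head-and-neck region (shapes and axis conventions are assumptions).
    Returns an array of shape (num_angles, z, x) holding one 2-D MIP per angle.
    """
    mips = []
    for k in range(num_angles):
        angle = k * 360.0 / num_angles  # 72 steps -> 5 degrees apart
        # Rotate within the axial (y, x) plane, keeping the original shape.
        rotated = ndimage.rotate(
            pet_volume, angle, axes=(1, 2), reshape=False, order=1
        )
        # Project along the anterior-posterior axis (axis 1 here).
        mips.append(rotated.max(axis=1))
    return np.stack(mips)

# Example with a dummy cropped PET volume of 128 axial slices.
pet = np.random.rand(128, 96, 96).astype(np.float32)
ma_mips = multi_angle_mips(pet)  # shape (72, 128, 96)
# Each MIP would then be passed through a pre-trained CNN backbone and the
# resulting per-view feature vectors aggregated (e.g. averaged) per patient
# before being fed to the recurrence-free survival model.
```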