Cross-modality synthesis from CT to PET using FCN and GAN networks for improved automated lesion detection

Avi Ben-Cohen*, Eyal Klang, Stephen P. Raskin, Shelly Soffer, Simona Ben-Haim, Eli Konen, Michal Marianne Amitai, Hayit Greenspan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

135 Scopus citations

Abstract

In this work we present a novel system for the generation of virtual PET images from CT scans. We combine a fully convolutional network (FCN) with a conditional generative adversarial network (GAN) to generate simulated PET data from given input CT data. The synthesized PET can be used for false-positive reduction in lesion detection solutions. Clinically, such solutions may enable lesion detection and drug treatment evaluation in a CT-only environment, thus reducing the need for the more expensive and radioactive PET/CT scan. Our dataset includes 60 PET/CT scans from Sheba Medical Center. We used 23 scans for training and 37 for testing. Different schemes to achieve the synthesized output were qualitatively compared. Quantitative evaluation was conducted using existing lesion detection software, combining the synthesized PET as a false-positive reduction layer for the detection of malignant lesions in the liver. Current results are promising, showing a 28% reduction in the average number of false positives per case, from 2.9 to 2.1. The suggested solution is comprehensive and can be expanded to additional body organs and to different modalities.
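The abstract does not give the paper's implementation details, but the general recipe it names is well established: train a conditional-GAN generator with an adversarial term plus a pixelwise reconstruction term, then use the synthesized PET intensity to reject CT-detected lesion candidates. The sketch below illustrates that recipe only; all function names, the loss weight `lam`, and the uptake threshold are assumptions, not values from the paper.

```python
import numpy as np

# Illustrative sketch only: the FCN/conditional-GAN architecture,
# loss weights, and thresholds are not specified in the abstract,
# so everything below is an assumed, generic instance of the idea.

def generator_loss(d_fake_logits, fake_pet, real_pet, lam=100.0):
    """Typical conditional-GAN generator objective for image synthesis:
    a non-saturating adversarial term plus a lambda-weighted L1
    reconstruction term between synthesized and reference PET."""
    # Adversarial part: -log(sigmoid(D(G(ct)))) = log(1 + exp(-logit)).
    adv = np.mean(np.log1p(np.exp(-np.asarray(d_fake_logits, dtype=float))))
    # Reconstruction part: mean absolute error against the real PET.
    l1 = np.mean(np.abs(np.asarray(fake_pet, dtype=float)
                        - np.asarray(real_pet, dtype=float)))
    return adv + lam * l1

def filter_candidates(candidate_masks, synth_pet, uptake_threshold=2.5):
    """False-positive reduction layer: keep only candidate lesion masks
    whose mean synthesized-PET intensity exceeds a threshold
    (the threshold value here is arbitrary, for illustration)."""
    return [mask for mask in candidate_masks
            if synth_pet[mask].mean() > uptake_threshold]
```

As a sanity check of the loss, a discriminator logit of 0 with a perfect reconstruction gives only the adversarial term, log(2); candidates whose mean synthesized uptake falls below the threshold are dropped by the filter.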

Original language: English
Pages (from-to): 186-194
Number of pages: 9
Journal: Engineering Applications of Artificial Intelligence
Volume: 78
DOIs
State: Published - Feb 2019

Funding

Funders: Israel Science Foundation (funder number 1918/16)

Keywords

• CT
• Deep learning
• GAN
• Image synthesis
• Liver lesion
• PET
