Feature aggregation in perceptual loss for ultra low-dose (ULD) CT denoising

Michael Green, Edith M. Marom, Eli Konen, Nahum Kiryati, Arnaldo Mayer

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Lung cancer CT screening programs are continuously reducing patient exposure to radiation at the expense of image quality. State-of-the-art denoising algorithms are instrumental in preserving the diagnostic value of these images. In this work, a novel neural denoising scheme is proposed for ULD chest CT. The proposed method aggregates multi-scale features that provide rich information for the computation of a perceptual loss. The loss is further optimized for chest CT data by using denoising auto-encoders on real CT images to build the feature-extracting network, instead of using an existing network trained on natural images. The proposed method was validated on co-registered pairs of real ULD and normal-dose scans and compared favorably with published state-of-the-art denoising networks, both qualitatively and quantitatively.
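The core idea described above can be sketched in a few lines: features of the denoised output and the reference image are extracted at several scales, and the perceptual loss aggregates the feature distances across all scales. The sketch below is a minimal illustration, not the authors' implementation: the real method uses an encoder trained as a denoising auto-encoder on CT data, whereas here a plain average-pooling pyramid stands in as a hypothetical feature extractor.

```python
import numpy as np

def avg_pool2(x):
    """2x2 average pooling; a stand-in for one downsampling stage of an encoder."""
    h, w = (x.shape[0] // 2) * 2, (x.shape[1] // 2) * 2
    x = x[:h, :w]
    return 0.25 * (x[0::2, 0::2] + x[1::2, 0::2] + x[0::2, 1::2] + x[1::2, 1::2])

def multi_scale_features(img, n_scales=3):
    """Hypothetical multi-scale feature extractor.

    In the paper this role is played by a network trained as a denoising
    auto-encoder on real CT images; here a simple image pyramid is used
    purely to illustrate the aggregation structure.
    """
    feats = [img]
    for _ in range(n_scales - 1):
        feats.append(avg_pool2(feats[-1]))
    return feats

def aggregated_perceptual_loss(denoised, target, n_scales=3):
    """Aggregate mean-squared feature distances over all scales into one loss."""
    feats_d = multi_scale_features(denoised, n_scales)
    feats_t = multi_scale_features(target, n_scales)
    return sum(float(np.mean((a - b) ** 2)) for a, b in zip(feats_d, feats_t))
```

With a learned feature extractor in place of the pooling pyramid, the same aggregation would compare semantically richer representations of the denoised and normal-dose images.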

Original language: English
Title of host publication: ISBI 2019 - 2019 IEEE International Symposium on Biomedical Imaging
Publisher: IEEE Computer Society
Pages: 1635-1638
Number of pages: 4
ISBN (Electronic): 9781538636411
DOIs
State: Published - Apr 2019
Event: 16th IEEE International Symposium on Biomedical Imaging, ISBI 2019 - Venice, Italy
Duration: 8 Apr 2019 - 11 Apr 2019

Publication series

Name: Proceedings - International Symposium on Biomedical Imaging
Volume: 2019-April
ISSN (Print): 1945-7928
ISSN (Electronic): 1945-8452

Conference

Conference: 16th IEEE International Symposium on Biomedical Imaging, ISBI 2019
Country/Territory: Italy
City: Venice
Period: 8/04/19 - 11/04/19

Keywords

  • Convolutional neural networks
  • Feature aggregation
  • Image denoising
  • Perceptual loss
  • Ultra-low-dose CT
