Super-Resolution via Image-Adapted Denoising CNNs: Incorporating External and Internal Learning

Tom Tirer*, Raja Giryes

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


While deep neural networks achieve state-of-the-art results in image super-resolution (SR) with a fixed, known acquisition process (e.g., a bicubic downscaling kernel), they suffer a large performance drop when the real observation model differs from the one used in training. Recently, two different techniques were suggested to mitigate this deficiency, i.e., to enjoy the advantages of deep learning without being restricted by the training phase. The first follows the plug-and-play (P&P) approach, which solves general inverse problems (e.g., SR) by using Gaussian denoisers to handle the prior term in model-based optimization schemes. The second builds on the internal recurrence of information inside a single image, and trains a super-resolver network at test time on examples synthesized from the low-resolution image. Our letter combines these two independent strategies: we enjoy the impressive generalization capabilities of deep learning captured by the first, and further improve it through internal learning at test time. First, we apply a recent P&P strategy to SR. Then, we show how it can be made image-adaptive at test time. The resulting technique outperforms the above two strategies on popular datasets and gives better results than other state-of-the-art methods in practical cases where the observation model is inexact or unknown in advance.
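The P&P idea mentioned in the abstract can be illustrated with a minimal sketch: alternate a data-consistency step against the (assumed) downscaling model with a denoising step that plays the role of the prior. Everything here is a hypothetical stand-in, not the paper's method: the box-average downscaler, the 3x3 box-filter "denoiser" (in place of a trained CNN), and the back-projection update are illustrative choices only.

```python
import numpy as np

def downscale(x, s):
    """Toy observation model: s-fold box (average) downscaling."""
    h, w = x.shape
    return x.reshape(h // s, s, w // s, s).mean(axis=(1, 3))

def upscale(y, s):
    """Nearest-neighbor upsampling, a crude pseudo-inverse of box downscaling."""
    return np.kron(y, np.ones((s, s)))

def denoise(x):
    """Stand-in prior: a 3x3 box filter where a CNN denoiser would be plugged in."""
    h, w = x.shape
    p = np.pad(x, 1, mode="edge")
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def pnp_sr(y, s, iters=10):
    """Alternate a denoising (prior) step with a back-projection
    (data-consistency) step so that downscale(x, s) matches the observation y."""
    x = upscale(y, s)                # naive initialization from the LR image
    for _ in range(iters):
        x = denoise(x)               # prior handled by the plugged-in denoiser
        r = downscale(x, s) - y      # residual w.r.t. the LR observation
        x = x - upscale(r, s)        # back-project: restores downscale(x) == y
    return x
```

Because block-averaging a nearest-neighbor upsampled image recovers it exactly, each back-projection step leaves the estimate perfectly consistent with the observed low-resolution image; in the paper's setting, the test-time adaptation would additionally fine-tune the denoiser on the input image itself.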

Original language: English
Article number: 8727404
Pages (from-to): 1080-1084
Number of pages: 5
Journal: IEEE Signal Processing Letters
Issue number: 7
State: Published - Jul 2019


Keywords:
  • Deep learning
  • denoising neural network
  • image super-resolution
  • internal learning
  • plug-and-play


