Concepedia

Publication | Open Access

Beyond a Gaussian Denoiser: Residual Learning of Deep CNN for Image Denoising

8.5K Citations · 45 References · 2017

TLDR

Discriminative model learning for image denoising has attracted considerable attention due to its favorable performance. The authors design DnCNN, a deep feed-forward CNN that combines residual learning with batch normalization to speed up training and boost denoising performance. By implicitly removing the latent clean image in its hidden layers, a single DnCNN model can handle blind Gaussian denoising (unknown noise level) as well as related restoration tasks. Experiments show that DnCNN is effective across several general denoising tasks and runs efficiently on GPUs.

Abstract

Discriminative model learning for image denoising has recently been attracting considerable attention due to its favorable denoising performance. In this paper, we take one step forward by investigating the construction of feed-forward denoising convolutional neural networks (DnCNNs) to embrace the progress in very deep architecture, learning algorithm, and regularization method into image denoising. Specifically, residual learning and batch normalization are utilized to speed up the training process as well as boost the denoising performance. Different from the existing discriminative denoising models, which usually train a specific model for additive white Gaussian noise (AWGN) at a certain noise level, our DnCNN model is able to handle Gaussian denoising with unknown noise level (i.e., blind Gaussian denoising). With the residual learning strategy, DnCNN implicitly removes the latent clean image in the hidden layers. This property motivates us to train a single DnCNN model to tackle several general image denoising tasks such as Gaussian denoising, single image super-resolution, and JPEG image deblocking. Our extensive experiments demonstrate that our DnCNN model not only exhibits high effectiveness in several general image denoising tasks, but can also be efficiently implemented by benefiting from GPU computing.
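The residual learning strategy described in the abstract can be summarized by a simple relation: given a noisy observation y = x + v (clean image x plus AWGN v), the network R is trained to predict the residual v rather than x, so the clean estimate is recovered as x̂ = y − R(y). A minimal sketch of this relation, assuming an oracle residual predictor in place of the deep CNN (all names here are illustrative, not from the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Latent clean image patch x and AWGN v give the noisy observation y = x + v.
x = rng.random((8, 8))                     # clean patch
sigma = 0.1                                # noise level
v = sigma * rng.standard_normal((8, 8))    # additive white Gaussian noise
y = x + v                                  # noisy input to the denoiser

# DnCNN trains a mapping R(y) ≈ v (the noise), not x. Here an oracle
# stands in for the trained network purely to show the recovery step.
def R(y_noisy):
    return y_noisy - x                     # oracle residual = v

# Residual learning: subtract the predicted noise from the input.
x_hat = y - R(y)
```

The training loss is then simply the mean squared error between R(y) and the true residual y − x, which is what lets one model cover a range of noise levels: the target is always "whatever separates the input from the clean image," whether that gap comes from AWGN, downsampling, or JPEG compression.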

