Concepedia

Publication | Open Access

Image Restoration Using Very Deep Convolutional Encoder-Decoder Networks with Symmetric Skip Connections

Citations: 1.1K

References: 7

Year: 2016

TLDR

The authors propose a very deep fully convolutional encoder‑decoder network with symmetric skip connections for image restoration tasks such as denoising and super‑resolution. The architecture stacks convolutional feature‑extracting layers and deconvolutional reconstruction layers, using skip connections to propagate gradients and image details, enabling efficient training and high‑quality restoration. Experiments show the single model handles multiple noise levels and outperforms all previously reported state‑of‑the‑art methods.

Abstract

In this paper, we propose a very deep fully convolutional encoding-decoding framework for image restoration tasks such as denoising and super-resolution. The network is composed of multiple layers of convolution and de-convolution operators, learning end-to-end mappings from corrupted images to the original ones. The convolutional layers act as the feature extractor, which captures the abstraction of image contents while eliminating noise/corruption. De-convolutional layers are then used to recover the image details. We propose to symmetrically link convolutional and de-convolutional layers with skip-layer connections, with which the training converges much faster and attains a higher-quality local optimum. First, the skip connections allow the signal to be back-propagated to bottom layers directly, and thus tackle the problem of vanishing gradients, making training of deep networks easier and consequently yielding restoration performance gains. Second, these skip connections pass image details from convolutional layers to de-convolutional layers, which is beneficial for recovering the original image. Significantly, with this large capacity, we can handle different levels of noise using a single model. Experimental results show that our network achieves better performance than all previously reported state-of-the-art methods.
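The topology described in the abstract — a stack of convolutional layers mirrored by a stack of de-convolutional layers, with additive skip connections between symmetric pairs — can be sketched compactly. The sketch below is illustrative only, not the authors' implementation: it uses single-channel 3×3 convolutions with random weights in place of trained (de)convolution stacks, and the function name `rednet_like` and the `skip_step` parameter are hypothetical stand-ins for the paper's configuration.

```python
import numpy as np

def conv3x3(x, w):
    # 'same'-padded single-channel 3x3 convolution (illustrative, unoptimized)
    p = np.pad(x, 1)
    out = np.zeros_like(x)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(p[i:i + 3, j:j + 3] * w)
    return out

def rednet_like(x, depth=6, skip_step=2, rng=None):
    """Encoder-decoder sketch with symmetric additive skip connections.

    Encoder: `depth` conv layers extracting features (and suppressing noise).
    Decoder: `depth` layers recovering details (plain convs stand in for
    de-convolutions here). Encoder layer i feeds its output to the mirrored
    decoder layer every `skip_step` layers (a hypothetical spacing).
    """
    rng = rng or np.random.default_rng(0)
    relu = lambda z: np.maximum(z, 0.0)

    enc_outs, h = [], x
    for _ in range(depth):                      # encoder pass
        h = relu(conv3x3(h, rng.normal(scale=0.1, size=(3, 3))))
        enc_outs.append(h)

    for i in range(depth):                      # decoder pass
        h = conv3x3(h, rng.normal(scale=0.1, size=(3, 3)))
        mirror = depth - 1 - i                  # symmetric encoder partner
        if mirror % skip_step == 0:
            h = h + enc_outs[mirror]            # skip: pass details + gradients
        h = relu(h)
    return h
```

The additive skips give the decoder direct access to encoder feature maps (recovering details) and, during training, give gradients a short path back to the bottom layers — the two benefits the abstract attributes to the symmetric links.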

