Publication | Open Access

Image restoration using very deep convolutional encoder-decoder networks with symmetric skip connections

592 Citations · 27 References · Year: 2016

TLDR

The paper proposes a very deep fully convolutional encoder-decoder network with symmetric skip connections for image restoration tasks such as denoising and super-resolution. The architecture stacks convolutional feature-extracting layers with deconvolutional reconstruction layers, and uses symmetric skip connections to propagate signals and image details, thereby easing training and improving restoration quality. Experimental results show that a single model can handle multiple noise levels and outperform recent state-of-the-art methods.

Abstract

In this paper, we propose a very deep fully convolutional encoder-decoder framework for image restoration tasks such as denoising and super-resolution. The network is composed of multiple layers of convolution and deconvolution operators, learning end-to-end mappings from corrupted images to the original ones. The convolutional layers act as a feature extractor, capturing abstractions of image content while suppressing noise and corruption. Deconvolutional layers are then used to recover the image details. We propose to symmetrically link convolutional and deconvolutional layers with skip connections, with which training converges much faster and attains a higher-quality local optimum. First, the skip connections allow the signal to be back-propagated directly to bottom layers, which tackles the vanishing-gradient problem, makes training deep networks easier, and consequently yields restoration performance gains. Second, these skip connections pass image details from convolutional layers to deconvolutional layers, which is beneficial for recovering the original image. Notably, owing to its large capacity, a single model can handle different levels of noise. Experimental results show that our network achieves better performance than recent state-of-the-art methods.
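
As a concrete illustration of the architecture the abstract describes, below is a minimal PyTorch sketch of a symmetric conv/deconv network with element-wise skip connections. This is a sketch under assumptions, not the authors' implementation: the class name, depth (num_pairs), channel width, and kernel size are illustrative choices, and the paper's deepest models are far deeper (up to 30 layers, with skips placed every few layers).

import torch
import torch.nn as nn


class SymmetricSkipNet(nn.Module):
    """Minimal sketch of a symmetric conv/deconv encoder-decoder with
    element-wise skip connections. Depth, width, and kernel size are
    illustrative assumptions, not the paper's exact configuration."""

    def __init__(self, in_channels=1, features=64, num_pairs=5):
        super().__init__()
        k, p = 3, 1  # stride-1, padding-1 layers preserve spatial size
        self.convs = nn.ModuleList(
            nn.Conv2d(in_channels if i == 0 else features, features,
                      k, padding=p)
            for i in range(num_pairs))
        self.deconvs = nn.ModuleList(
            nn.ConvTranspose2d(features,
                               in_channels if i == num_pairs - 1 else features,
                               k, padding=p)
            for i in range(num_pairs))
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # Assumes images normalized to [0, 1], so the ReLUs are harmless.
        skips = [x]  # the input itself feeds the final (outermost) skip
        out = x
        for conv in self.convs:        # encoder: extract features
            out = self.relu(conv(out))
            skips.append(out)
        skips.pop()                    # deepest feature map is not reused
        for deconv in self.deconvs:    # decoder: reconstruct details
            # symmetric skip: element-wise sum with the mirrored
            # encoder feature map, applied before the nonlinearity
            out = self.relu(deconv(out) + skips.pop())
        return out


# Usage: shapes are preserved end to end, so the network maps a noisy
# image directly to its restored counterpart.
net = SymmetricSkipNet()
noisy = torch.rand(1, 1, 64, 64)   # stand-in for a noisy patch
restored = net(noisy)              # same shape: (1, 1, 64, 64)

Note that the outermost skip adds the input directly to the output, so the decoder effectively learns a residual correction, which is consistent with the abstract's claim that skip connections ease the training of very deep stacks.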
