Publication | Closed Access
Enhanced Autoencoders With Attention-Embedded Degradation Learning for Unsupervised Hyperspectral Image Super-Resolution
Citations: 158
References: 77
Year: 2023
Keywords: Convolutional Neural Network, Engineering, Machine Learning, Unmixing-based Networks, Autoencoders, Attention-embedded Degradation Learning, Super-resolution Imaging, Image Analysis, Data Science, Single-image Super-resolution, Computational Imaging, Video Super-resolution, Video Transformer, Machine Vision, Input Modalities, Feature Learning, Computer Science, Medical Image Computing, Deep Learning, Computer Vision, Hyperspectral Imaging, EU2ADL Network, Enhanced Autoencoders
Recently, unmixing-based networks have shown significant potential in the unsupervised multispectral-aided hyperspectral image super-resolution (MS-aided HS-SR) task. Nevertheless, the representation ability of unsupervised networks and the design of loss functions have not been fully explored, leaving large room for improvement. To this end, we propose an enhanced unmixing-inspired unsupervised network with attention-embedded degradation learning, EU2ADL for short, to realize MS-aided HS-SR. First, two coupled autoencoders serve as the backbone of the EU2ADL network, simultaneously decomposing the input modalities into abundances and corresponding endmembers; the encoder consists of a spatial-spectral two-stream subnetwork for modality-salient representation learning and a parameter-shared one-stream subnetwork for modality-interacted representation enhancement. More importantly, a hybrid model-constrained loss containing a perceptual abundance term and a degradation-guided term is introduced to further suppress latent distortions. Since the hybrid loss is built on the degradation model, we additionally present an attention-embedded degradation learning network to adaptively estimate the unknown degradation parameters. Extensive experimental results on four datasets demonstrate the effectiveness of the proposed method compared with state-of-the-art approaches.
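As a rough illustration of the relations such a degradation-guided loss rests on, the sketch below writes a latent high-resolution HSI as abundances times endmembers (the linear mixing model the coupled autoencoders recover) and degrades it spectrally and spatially into the two observed inputs. All sizes, the random spectral response, and the block-averaging downsampler are illustrative assumptions, not details of EU2ADL:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: P endmembers, B_hs HSI bands, B_ms MSI bands,
# an H x W high-res grid, and a spatial downsampling ratio r.
P, B_hs, B_ms, H, W, r = 5, 30, 4, 8, 8, 2

# Endmember spectra and nonnegative, sum-to-one abundances -- the factors
# an unmixing-based autoencoder is trained to produce.
E = rng.random((P, B_hs))
A = rng.random((H * W, P))
A /= A.sum(axis=1, keepdims=True)      # abundance sum-to-one constraint

Z = A @ E                              # latent high-res HSI, shape (H*W, B_hs)

# Spectral degradation: a response matrix R maps HSI bands to MSI bands.
R = rng.random((B_hs, B_ms))
R /= R.sum(axis=0, keepdims=True)      # each MSI band integrates to 1
Y_ms = Z @ R                           # observed high-res MSI

# Spatial degradation: average r x r blocks as a simple blur+downsample
# stand-in, giving the observed low-res HSI.
Zc = Z.reshape(H, W, B_hs)
Y_hs = Zc.reshape(H // r, r, W // r, r, B_hs).mean(axis=(1, 3))

# A degradation-guided loss term penalizes the mismatch between the
# observations and the degraded reconstruction; here it is zero by
# construction because Z was generated from the model itself.
res_spec = np.abs(Y_ms - Z @ R).max()
```

In training, R and the spatial blur are unknown, which is exactly why the paper pairs this loss with an attention-embedded network that estimates the degradation parameters.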