Publication | Closed Access
Truncation Cross Entropy Loss for Remote Sensing Image Captioning
Citations: 99
References: 47
Year: 2020
Keywords: Structured Prediction, Engineering, Machine Learning, Natural Language Processing, Multimodal LLM, Image Analysis, Text-to-image Retrieval, Data Science, Pattern Recognition, Cross Entropy, Machine Translation, Remote Sensing Image, Machine Vision, Feature Learning, Vision Language Model, Deep Learning, Image Captioning, Computer Vision, Automatic Annotation
Remote sensing image captioning relies on encoder‑decoder CNN‑LSTM models, yet cross‑entropy training forces target words to probability one, causing overfitting due to synonym variability. This work investigates the overfitting induced by cross‑entropy loss in RSIC and introduces a truncation cross‑entropy loss to mitigate it. The authors evaluate the new loss through extensive experiments on UCM‑captions, Sydney‑captions, and RSICD datasets. The truncation cross‑entropy loss achieves state‑of‑the‑art results on Sydney‑captions and RSICD and competitive performance on UCM‑captions, demonstrating its benefit for RSIC.
Recently, remote sensing image captioning (RSIC) has drawn increasing attention. In this field, encoder-decoder-based methods have become the mainstream due to their excellent performance. In the encoder-decoder framework, a convolutional neural network (CNN) encodes a remote sensing image into a semantic feature vector, and a sequence model such as a long short-term memory (LSTM) network subsequently generates a content-related caption from that feature vector. During traditional training, the probability of the target word at each time step is forcibly optimized toward 1 by the cross entropy (CE) loss. However, because of the variability and ambiguity of possible image captions, the target word could be replaced by other words such as its synonyms, and such an optimization strategy therefore leads to overfitting of the network. In this article, we explore the overfitting phenomenon in RSIC caused by the CE loss and propose a new truncation cross entropy (TCE) loss that alleviates the overfitting problem. To verify the effectiveness of the proposed approach, extensive comparison experiments are performed on three public RSIC data sets: UCM-captions, Sydney-captions, and RSICD. The state-of-the-art results on Sydney-captions and RSICD and the competitive results on UCM-captions achieved by the TCE loss demonstrate that the proposed method is beneficial to RSIC.