Concepedia

Abstract

Synthetic Aperture Radar (SAR) can penetrate a certain depth of dry sand, offering a unique opportunity for sub-surface imaging, while optical sensors record information on surface sand features in arid and semi-arid environments. Fusing radar and optical data produces higher-quality images from which the details of structural attributes in these landscapes can be interpreted and mapped. The present study compared different fusion options to examine the effects of data fusion on object recognition. The study area is a sand-buried region of the Eastern Sahara. Image fusion using the Principal Component (PC) and Intensity-Hue-Saturation (IHS) transforms yielded better results than the arithmetic (i.e. Synthetic Variable Ratio (SVR) and Brovey) and wavelet-based methods. The study also discusses the utility of data fusion for surface and sub-surface feature imaging and mapping in desert environments.

Acknowledgements

The first author acknowledges the Japan Society for the Promotion of Science (JSPS) for granting a post-doctoral fellowship. The authors would like to thank JSPS for a Grant-in-Aid for Scientific Research (No. 1907023) and a Young Scientist Award (No. 19686025), the Venture Business Laboratory, Chiba University, for its 10th Research Grant, and the National Institute of Information and Communication Technology (NICT) for an International Research Collaboration Research Grant. Landsat ETM+ satellite data and SIR-C images were downloaded from the glcf.umiacs.umd.edu/index.shtml website and from the United States Geological Survey (USGS), respectively. Finally, the contributions of two anonymous referees are acknowledged; their comments and suggestions greatly improved the overall quality of the manuscript.
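Of the fusion methods compared in the study, the Brovey transform is the simplest to illustrate: each multispectral band is rescaled by the ratio of the high-resolution band (here, for example, a SAR intensity image) to the sum of the multispectral bands. The following is a minimal sketch in Python/NumPy, not the study's implementation; the function name and the assumption of co-registered, equal-sized float arrays are illustrative.

```python
import numpy as np

def brovey_fusion(ms_bands, high_res):
    """Brovey transform fusion (illustrative sketch).

    ms_bands : sequence of co-registered multispectral bands (2-D arrays)
    high_res : co-registered high-resolution band, e.g. SAR intensity

    Each fused band is ms_band * high_res / sum(ms_bands), which injects
    the spatial detail of the high-resolution band while preserving the
    relative spectral proportions of the multispectral bands.
    """
    ms = np.stack(ms_bands).astype(float)
    total = ms.sum(axis=0)
    total[total == 0] = 1e-12  # guard against division by zero over dark pixels
    return [band * high_res / total for band in ms]

# Toy usage: three constant 2x2 "optical" bands and a constant "SAR" band.
r = np.full((2, 2), 1.0)
g = np.full((2, 2), 2.0)
b = np.full((2, 2), 3.0)
sar = np.full((2, 2), 6.0)
fused = brovey_fusion([r, g, b], sar)
```

Because the toy SAR band equals the band sum (6.0), the fused bands here reproduce the inputs exactly; with real data the ratio modulates each band by the high-resolution spatial detail.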

