Abstract:
Multimodal change detection (CD) is a practical but challenging task in remote sensing. Multimodal CD allows images acquired by different sensors to be compared, which is important for disaster damage assessment. In this study, a multimodal information exchange network (MIENet) is proposed that addresses multimodal image heterogeneity without relying on existing feature space transformation methods. The network incorporates an information exchange module (IEM) in the feature extraction layers, which uses an attention mechanism to exchange information between modalities. Through this module, the features of each modality learn and perceive the contextual information of the other, achieving a degree of domain adaptation between modality domains. Extensive experiments on the California dataset demonstrate that MIENet outperforms existing feature space transformation methods in recognizing changed areas in multimodal images. Compared with other methods applied to this dataset, MIENet achieves the highest F1 score and mIoU, 90.35% and 89.69%, respectively. These results validate the effectiveness of the proposed MIENet for multimodal change detection. © 2024 Copyright held by the owner/author(s).
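To illustrate the idea of cross-modal information exchange described above, the following is a minimal sketch in PyTorch of what an attention-based exchange module between two modality feature maps could look like. The layer sizes, the use of bidirectional multi-head cross-attention, and the residual/normalization arrangement are assumptions for illustration only; the paper's actual IEM design is not reproduced here.

import torch
import torch.nn as nn

class InformationExchangeModule(nn.Module):
    """Hypothetical cross-modal exchange: each modality attends to the other."""
    def __init__(self, channels, num_heads=4):
        super().__init__()
        # One cross-attention block per direction (A->B context, B->A context).
        self.attn_a = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.attn_b = nn.MultiheadAttention(channels, num_heads, batch_first=True)
        self.norm_a = nn.LayerNorm(channels)
        self.norm_b = nn.LayerNorm(channels)

    def forward(self, feat_a, feat_b):
        # feat_a, feat_b: (B, C, H, W) feature maps from the two modalities.
        b, c, h, w = feat_a.shape
        tok_a = feat_a.flatten(2).transpose(1, 2)  # (B, H*W, C) token sequences
        tok_b = feat_b.flatten(2).transpose(1, 2)
        # Modality A queries modality B's contextual features, and vice versa.
        exch_a, _ = self.attn_a(tok_a, tok_b, tok_b)
        exch_b, _ = self.attn_b(tok_b, tok_a, tok_a)
        # Residual fusion keeps each modality's own features while injecting
        # context from the other modality.
        tok_a = self.norm_a(tok_a + exch_a)
        tok_b = self.norm_b(tok_b + exch_b)
        out_a = tok_a.transpose(1, 2).reshape(b, c, h, w)
        out_b = tok_b.transpose(1, 2).reshape(b, c, h, w)
        return out_a, out_b

In this sketch, the exchanged features would replace the per-modality features inside the feature extraction backbone, so that subsequent layers see representations already partially adapted across domains.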
Year: 2024
Page: 98-103
Language: English