Abstract:
This paper addresses the challenge of hyperspectral anomaly detection (HAD) in noisy environments, where traditional methods often assume noise-free data or treat denoising and detection as separate processes. Hyperspectral images (HSIs) are prone to various noise types, such as Gaussian and salt-and-pepper noise, which degrade image quality and hinder accurate anomaly detection. In this paper, a unified model is proposed that integrates denoising and anomaly detection within the same network structure, combining autoencoder and Transformer architectures. In the overall workflow, the network is first trained as a denoising model, and the same network is then trained as the HAD model. Specifically, a Transformer is embedded into the hidden layer of the autoencoder to exploit both pixel-level spectral and spatial information for distinguishing background from anomalies, while its reconstruction capability is used for denoising. Additionally, a local multi-scale feature extraction module is introduced to capture anomaly features across different spatial scales, working alongside the Transformer to explore both global and local feature information. Furthermore, the self-attention mechanism is enhanced with a normalized dot product, which improves the capture of global information and mitigates noise interference while accommodating multi-scale features. Extensive experiments on six datasets with varying levels of Gaussian and salt-and-pepper noise show that the method achieves excellent detection performance in noisy environments compared with other state-of-the-art detectors. © 2025 Informa UK Limited, trading as Taylor & Francis Group.
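The abstract notes that the self-attention mechanism is enhanced with a normalized dot product to better handle noise interference. The PyTorch sketch below is only an illustration of one common way such normalization is done, by L2-normalizing queries and keys (cosine-style scores) with a learnable temperature; the module name, head count, and scaling scheme are assumptions for illustration and are not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NormalizedDotProductAttention(nn.Module):
    """Illustrative self-attention with L2-normalized queries and keys.

    A minimal sketch of a "normalized dot product" attention; the paper's
    exact formulation may differ.
    """

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        assert dim % num_heads == 0
        self.num_heads = num_heads
        self.head_dim = dim // num_heads
        self.qkv = nn.Linear(dim, dim * 3, bias=False)
        self.proj = nn.Linear(dim, dim)
        # learnable temperature replaces the usual 1/sqrt(d) scaling
        self.temperature = nn.Parameter(torch.ones(num_heads, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, tokens, dim); tokens could be pixels or spectral patches
        b, n, d = x.shape
        qkv = self.qkv(x).reshape(b, n, 3, self.num_heads, self.head_dim)
        q, k, v = qkv.permute(2, 0, 3, 1, 4)  # each: (b, heads, n, head_dim)

        # normalize q and k so attention scores are bounded cosine similarities,
        # damping the influence of high-magnitude (noisy) tokens
        q = F.normalize(q, dim=-1)
        k = F.normalize(k, dim=-1)

        attn = (q @ k.transpose(-2, -1)) * self.temperature
        attn = attn.softmax(dim=-1)
        out = (attn @ v).transpose(1, 2).reshape(b, n, d)
        return self.proj(out)


if __name__ == "__main__":
    x = torch.randn(2, 64, 128)  # (batch, tokens, feature dim)
    print(NormalizedDotProductAttention(dim=128)(x).shape)  # torch.Size([2, 64, 128])
```

In this kind of design, bounding the raw dot products keeps a few high-energy (often noise-corrupted) tokens from dominating the softmax, which is consistent with the robustness motivation described in the abstract.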
Source:
International Journal of Remote Sensing
ISSN: 0143-1161
Year: 2025
Impact Factor: 3.000 (JCR@2023)