Indexed by:
Abstract:
Spectral Reconstruction (SR) has attracted considerable attention in recent years. Several studies have proposed Transformer-based spectral reconstruction techniques; however, optimizing Transformer performance remains a vital issue. Furthermore, how to simultaneously exploit the spatial and spectral information of hyperspectral images (HSIs) and fuse features of different resolutions at different hierarchical levels merits further investigation. To address these issues, this paper proposes the Patch Clustering-Based Spectral-Transformer with Multi-level Fusion Network (PCFM). Specifically, to capture interactions between pixels with similar long-distance spectral information, a Patch-based Clustering Spectral Self-Attention (PCSS) mechanism is developed: after patching, each patch focuses on capturing local features, and clustering reduces the computational complexity of processing large images. Furthermore, to fully account for the relationships between different clusters in PCSS, a Deal Conv Feed Attention (DCFA) module is designed, which adjusts the weights of the input data by computing inter-cluster correlations so that the model can better understand and handle these complex interactions. In addition, to address multi-scale fusion, and unlike previous methods, a Hierarchical Optimization Feature Fusion (HOFF) mechanism is introduced to merge features generated at different hierarchical levels. Finally, a Channel Relation Module (CRM) is employed to further bridge the gap between the reconstructed HSIs and their ground truth. Extensive experimental results on the Pavia, Sandiga and Chikusei datasets indicate that this method outperforms state-of-the-art methods in both quantitative metrics and visual quality.
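The core idea behind PCSS — patch the image, cluster the patches, then apply spectral self-attention within each cluster — can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: all function names, shapes, and the untrained identity projections are assumptions, and a tiny k-means on per-patch mean spectra stands in for whatever clustering the paper actually uses.

```python
import numpy as np

def extract_patches(img, p):
    """Split an (H, W, C) image into non-overlapping p x p patches,
    returning (N, p*p, C) with N = (H//p)*(W//p)."""
    H, W, C = img.shape
    img = img[:H - H % p, :W - W % p]
    h, w = img.shape[0] // p, img.shape[1] // p
    patches = img.reshape(h, p, w, p, C).transpose(0, 2, 1, 3, 4)
    return patches.reshape(h * w, p * p, C)

def kmeans(feats, k, iters=10, seed=0):
    """Tiny k-means over patch descriptors; returns a cluster label per patch."""
    rng = np.random.default_rng(seed)
    centers = feats[rng.choice(len(feats), k, replace=False)]
    for _ in range(iters):
        dists = ((feats[:, None] - centers[None]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = feats[labels == j].mean(0)
    return labels

def spectral_attention(tokens):
    """Self-attention along the spectral (channel) axis for one cluster.
    tokens: (T, C) pixels; the attention map is (C, C), so cost grows with
    channels, not with the number of pixels in the cluster."""
    q = k = v = tokens                      # untrained sketch: identity projections
    attn = q.T @ k / np.sqrt(q.shape[0])    # (C, C) spectral affinity
    attn = np.exp(attn - attn.max(1, keepdims=True))
    attn /= attn.sum(1, keepdims=True)      # row-wise softmax
    return v @ attn.T                       # (T, C) reweighted spectra

def pcss(img, p=4, k=3):
    """Hypothetical PCSS pipeline: patch -> cluster -> per-cluster attention."""
    patches = extract_patches(img, p)       # (N, p*p, C)
    desc = patches.mean(1)                  # per-patch mean spectrum as descriptor
    labels = kmeans(desc, k)
    out = np.empty_like(patches)
    for j in range(k):
        idx = np.where(labels == j)[0]
        if idx.size == 0:
            continue
        toks = patches[idx].reshape(-1, patches.shape[-1])
        out[idx] = spectral_attention(toks).reshape(idx.size, p * p, -1)
    return out, labels
```

Attending within clusters of spectrally similar patches, rather than over the full image, is what keeps the attention cost manageable for large hyperspectral inputs while still letting distant but similar pixels interact.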
Keyword:
Reprint Address:
Version:
Source: NEUROCOMPUTING
ISSN: 0925-2312
Year: 2025
Volume: 626
Impact Factor: 5.500 (JCR@2023)
CAS Journal Grade: 2
Cited Count:
SCOPUS Cited Count:
ESI Highly Cited Papers on the List: 0
WanFang Cited Count:
Chinese Cited Count: