Abstract:
Underwater image enhancement is a highly challenging task that must contend with complex environmental degradation factors such as light attenuation and color cast. Stable color restoration and precise texture recovery are key to improving enhancement results. However, existing methods generally lack in-depth modeling of color and texture information and fail to fuse these two core visual components efficiently, which significantly limits overall enhancement performance. To address this, we propose an innovative Dual-Attention Fusion Net (DuAF). On a global scale, DuAF introduces explicit semantic-consistency constraints to model color precisely by reconstructing the pixel intensity distribution, enhancing sensitivity to color features and capturing real pixel gradient changes, thereby effectively addressing complex color distortion. On a local scale, DuAF dynamically adjusts the perception window and combines optimized attention weights with positional deviations to deeply model texture information, significantly improving the restoration of texture details. Overall, DuAF substantially improves the stability of color restoration and the clarity of texture details in complex degraded scenes, providing an efficient and comprehensive solution for underwater image enhancement. Our project is publicly available at https://github.com/HuShuteng/DuAF. © 2025 Elsevier B.V.
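The abstract describes a dual-branch design: a global branch that models color from pixel-intensity statistics and a local branch that applies window-based attention to texture before the two are fused. As a purely illustrative aid, the sketch below shows what such a dual-attention fusion block could look like; every module name, the window size, the mean/std statistics, and the 1x1-convolution fusion are assumptions made here for illustration and are not taken from the authors' released code (see the GitHub link in the abstract).

```python
# Illustrative sketch only: a generic dual-branch attention fusion block with a
# global (color/intensity) branch and a local (windowed texture) branch fused by
# a 1x1 convolution. Names and design choices are hypothetical, not DuAF's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GlobalColorAttention(nn.Module):
    """Channel attention driven by global pixel-intensity statistics (assumed)."""
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels * 2, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Summarize each channel's intensity distribution by its mean and std.
        mean = x.mean(dim=(2, 3))
        std = x.std(dim=(2, 3))
        weights = self.fc(torch.cat([mean, std], dim=1))        # (B, C)
        return x * weights.unsqueeze(-1).unsqueeze(-1)


class LocalTextureAttention(nn.Module):
    """Window-based self-attention over local patches (window size assumed)."""
    def __init__(self, channels: int, window: int = 8):
        super().__init__()
        self.window = window
        self.qkv = nn.Conv2d(channels, channels * 3, kernel_size=1)
        self.proj = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        ws = self.window
        pad_h, pad_w = (-h) % ws, (-w) % ws
        x_p = F.pad(x, (0, pad_w, 0, pad_h))
        hp, wp = x_p.shape[2], x_p.shape[3]
        q, k, v = self.qkv(x_p).chunk(3, dim=1)

        def to_windows(t: torch.Tensor) -> torch.Tensor:
            # (B, C, Hp, Wp) -> (B * num_windows, ws*ws, C)
            t = t.view(b, c, hp // ws, ws, wp // ws, ws)
            return t.permute(0, 2, 4, 3, 5, 1).reshape(-1, ws * ws, c)

        qw, kw, vw = map(to_windows, (q, k, v))
        attn = torch.softmax(qw @ kw.transpose(1, 2) / c ** 0.5, dim=-1)
        out = attn @ vw                                          # (B*nw, ws*ws, C)
        out = out.view(b, hp // ws, wp // ws, ws, ws, c)
        out = out.permute(0, 5, 1, 3, 2, 4).reshape(b, c, hp, wp)
        return self.proj(out)[:, :, :h, :w]                     # crop the padding


class DualAttentionFusion(nn.Module):
    """Fuses the global (color) and local (texture) branches with a 1x1 conv."""
    def __init__(self, channels: int = 32):
        super().__init__()
        self.global_branch = GlobalColorAttention(channels)
        self.local_branch = LocalTextureAttention(channels)
        self.fuse = nn.Conv2d(channels * 2, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        g = self.global_branch(x)
        l = self.local_branch(x)
        return self.fuse(torch.cat([g, l], dim=1)) + x           # residual fusion


if __name__ == "__main__":
    feat = torch.randn(1, 32, 64, 96)                 # toy feature map
    print(DualAttentionFusion(32)(feat).shape)        # torch.Size([1, 32, 64, 96])
```

The residual connection and concatenation-plus-1x1 fusion are common defaults for combining parallel attention branches; the paper's actual fusion strategy may differ and should be checked against the released repository.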
Source: Information Fusion
ISSN: 1566-2535
Year: 2026
Volume: 127
Impact Factor: 14.800 (JCR@2023)