Abstract:
Electricity theft causes substantial economic losses and safety hazards. While the widespread adoption of advanced metering infrastructure has significantly reduced electricity theft, perpetrators continue to find ways to exploit the system, employing increasingly covert and intricate methods. To address this ongoing challenge, this paper proposes an attention-optimized convolutional ensemble model driven by multi-temporal-granularity features for enhanced accuracy and robustness in electricity theft detection (ETD). For comprehensive feature extraction across diverse temporal scales, the proposed framework integrates two specialized feature extraction modules. The first module, a squeeze-and-excitation network-optimized temporal convolutional network, selectively focuses on informative temporal features within the electricity consumption data. The second module, a dual-dimensional attention-enhanced deep residual network composed of residual blocks embedded with the convolutional block attention module, enables the model to learn informative spatial and temporal features concurrently. The features from the two modules are then fused and classified by a fully connected layer. To validate the effectiveness of the proposed ETD method, simulation experiments were conducted on the publicly available dataset from the State Grid Corporation of China. The experimental results show that the attention-optimized model significantly improves ETD performance. Compared to other ETD models, the proposed model performs excellently across various indicators under different training-set ratios and sample-imbalance scenarios, demonstrating good generalization and robustness. Additionally, the model was deployed on a Raspberry Pi edge computing device to further verify its feasibility in practical engineering applications. © 2025 Elsevier Ltd
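The two-branch design described in the abstract (an SE-gated temporal convolutional branch, a CBAM-augmented residual branch, and a fully connected fusion classifier) can be illustrated with a minimal PyTorch sketch. All layer widths, kernel sizes, the 1034-day sequence length, and the weekly 2-D reshaping below are illustrative assumptions, not the authors' published configuration.

```python
# Minimal sketch of a two-branch ETD classifier: SE-gated temporal convolutions
# plus a CBAM-augmented residual block, fused by a fully connected layer.
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-excitation: reweight channels using global temporal context."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels), nn.Sigmoid())

    def forward(self, x):                        # x: (B, C, L)
        w = self.fc(x.mean(dim=-1))              # squeeze over the time axis
        return x * w.unsqueeze(-1)               # channel-wise excitation gate


class CBAM(nn.Module):
    """Convolutional block attention: channel gate followed by spatial gate."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.channel_fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(),
            nn.Linear(channels // reduction, channels))
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):                        # x: (B, C, H, W)
        avg = self.channel_fc(x.mean(dim=(2, 3)))
        mx = self.channel_fc(x.amax(dim=(2, 3)))
        x = x * torch.sigmoid(avg + mx)[:, :, None, None]
        s = torch.cat([x.mean(dim=1, keepdim=True),
                       x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.spatial_conv(s))


class ResCBAMBlock(nn.Module):
    """Residual block whose convolutional output is refined by CBAM before the skip add."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1))
        self.cbam = CBAM(channels)

    def forward(self, x):
        return torch.relu(x + self.cbam(self.body(x)))


class TheftDetector(nn.Module):
    def __init__(self, n_classes=2):
        super().__init__()
        # Branch 1: dilated temporal convolutions gated by SE blocks (1-D view).
        self.tcn = nn.Sequential(
            nn.Conv1d(1, 32, 3, padding=1, dilation=1), nn.ReLU(), SEBlock(32),
            nn.Conv1d(32, 32, 3, padding=2, dilation=2), nn.ReLU(), SEBlock(32),
            nn.AdaptiveAvgPool1d(1))
        # Branch 2: residual 2-D convolutions with CBAM over a weekly reshaping.
        self.conv2d = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            ResCBAMBlock(32),
            nn.AdaptiveAvgPool2d(1))
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                        # x: (B, n_days) daily readings
        t = self.tcn(x.unsqueeze(1)).flatten(1)                  # temporal features
        weekly = x[:, : (x.size(1) // 7) * 7].reshape(x.size(0), 1, -1, 7)
        s = self.conv2d(weekly).flatten(1)                        # spatio-temporal features
        return self.classifier(torch.cat([t, s], dim=1))          # fused features -> logits


logits = TheftDetector()(torch.randn(8, 1034))   # toy batch: 8 customers, 1034 days
print(logits.shape)                              # torch.Size([8, 2])
```

The weekly 7-column reshaping is one common way to expose day-of-week periodicity to a 2-D convolutional branch; the paper's actual multi-temporal-granularity construction may differ.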
Source: Engineering Applications of Artificial Intelligence
ISSN: 0952-1976
Year: 2025
Volume: 152
Impact Factor (JCR@2023): 7.500
ESI Highly Cited Papers on the List: 0