Journal of East China Normal University (Natural Science) ›› 2025, Vol. 2025 ›› Issue (6): 19-28. doi: 10.3969/j.issn.1000-5641.2025.06.003


Electricity theft detection based on transfer learning and attention hybrid neural network

Lishen CHEN1, Peng PU2, Jianghai QIAN1,3,*()   

  1. College of Mathematics and Physics, Shanghai University of Electric Power, Shanghai 200090, China
    2. School of Data Science and Engineering, East China Normal University, Shanghai 200062, China
    3. Engineering Research Center of Software and Hardware Co-design Technology and Application of the Ministry of Education, East China Normal University, Shanghai 200062, China
  • Received: 2024-01-29 Online: 2025-11-25 Published: 2025-11-29
  • Contact: Jianghai QIAN E-mail: qianjianghai@shiep.edu.cn

Abstract:

This study addresses several shortcomings of existing electricity theft detection methods: models built solely on one-dimensional electricity load time-series data often suffer from low detection accuracy, while directly applying computer vision models to two-dimensional images of the load series demands extensive training parameters and a large number of training samples. To overcome these challenges, a novel electricity theft detection method is proposed, based on a hybrid neural network that combines transfer learning and attention mechanisms. Transfer learning reduces the training demands of the ConvNeXt model and significantly improves its performance. In addition, a bi-directional long short-term memory (BiLSTM) network is integrated to complement the fine-tuned ConvNeXt model by extracting global nonlinear features from the one-dimensional load time series. Furthermore, SimAM and multi-head self-attention (MHSA) mechanisms are incorporated to strengthen the feature representation capability of the hybrid model. Experiments on the State Grid of China public dataset show that the proposed model outperforms other deep learning classification models on the AUC, MAP@100, MAP@200, and $F_1$ metrics; for example, $F_1$ improves by 9.1% over that obtained with the t-LeNet algorithm.
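To make the parameter-free SimAM attention mentioned in the abstract concrete, the following NumPy sketch re-weights each neuron of a feature map by a sigmoid of its inverse energy, following the standard SimAM formulation. This is an illustrative sketch only: the function name `simam` and the regularizer default `lam` are our own choices, and the paper's actual implementation details are not given in the abstract.

```python
import numpy as np

def simam(x, lam=1e-4):
    """Parameter-free SimAM attention over a (C, H, W) feature map.

    Each neuron is gated by a sigmoid of its inverse energy, so neurons
    that deviate strongly from their channel's mean are amplified.
    """
    mu = x.mean(axis=(1, 2), keepdims=True)    # per-channel mean
    var = x.var(axis=(1, 2), keepdims=True)    # per-channel variance
    # Inverse energy: large when a neuron stands out from the channel mean.
    e_inv = (x - mu) ** 2 / (4.0 * (var + lam)) + 0.5
    gate = 1.0 / (1.0 + np.exp(-e_inv))        # sigmoid gating in (0, 1)
    return x * gate
```

Because the gate is strictly between 0 and 1, the output preserves the shape and sign of the input while scaling each activation by its estimated importance; no trainable parameters are introduced, which is the main appeal of SimAM over channel- or spatial-attention modules.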

Key words: transfer learning, ConvNeXt, bi-directional long short-term memory, attention mechanism, hybrid neural network
