Journal of East China Normal University (Natural Science), 2023, Vol. 2023, Issue (5): 77-89. doi: 10.3969/j.issn.1000-5641.2023.05.007

• System for Learning from Data •

Privacy-preserving cloud-end collaborative training

Xiangyun GAO1, Dan MENG2, Mingkai LUO2,3, Jun WANG2, Liping ZHANG1, Chao KONG1,4,*

1. School of Computer and Information, Anhui Polytechnic University, Wuhu, Anhui 241000, China
    2. OPPO Research Institute, Shenzhen, Guangdong 518000, China
    3. College of Electronic and Information Engineering, Tongji University, Shanghai 201804, China
    4. Reconfigurable and Intelligent Computing Laboratory, Anhui Polytechnic University, Wuhu, Anhui 241000, China
  • Received: 2023-06-30 Online: 2023-09-25 Published: 2023-09-15
  • Contact: Chao KONG E-mail: kongchao@ahpu.edu.cn

Abstract:

China enjoys advantages of scale and diversity in its data resources, and mobile internet applications generate massive amounts of data across diverse scenarios. Recommender systems can extract valuable information from these data, thereby mitigating the problem of information overload. Most existing research on recommender systems has focused on centralized architectures, in which data are gathered and models are trained in the cloud. However, as data security and privacy protection issues become increasingly prominent, collecting user data has become increasingly difficult, rendering centralized recommendation methods infeasible. This study focuses on privacy-preserving, decentralized cloud-end collaborative training for personalized recommender systems. To fully exploit the complementary strengths of end devices and cloud servers while addressing privacy and security concerns, a cloud-end collaborative training method named FedMNN (federated machine learning and mobile neural network) is proposed for recommender systems, built on federated machine learning (FedML) and the mobile neural network (MNN) framework. The method consists of three parts. First, cloud-side models implemented in various deep learning frameworks are converted into framework-agnostic MNN models for on-device training, using the ONNX (open neural network exchange) intermediate representation and the MNN model conversion tool. Second, the cloud server dispatches the model to end-side devices, each of which initializes the model, trains on locally held data, computes the loss, and back-propagates gradients. Finally, the end-side models are fed back to the cloud server for aggregation and updating, after which the updated cloud model can be deployed to end-side devices on demand, achieving cloud-end collaboration. Experiments on benchmark tasks show that the power consumption of FedMNN is 32% to 51% lower than that of the FLTFlite (Flower and TensorFlow Lite) framework. Experimental results on DSSM (deep structured semantic model) and Wide & Deep recommendation models further demonstrate the effectiveness of the proposed cloud-end collaborative training method.
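To make the first step concrete, the following is a minimal sketch of the cloud-side conversion pipeline, assuming a small PyTorch model as a stand-in for one tower of a DSSM-style network (the model class, file names, and bizCode value are illustrative, not from the paper); the trailing comment shows an MNNConvert invocation in the style of MNN's documented converter usage.

import torch
import torch.nn as nn

class TinyTower(nn.Module):
    # Stand-in for one tower of a DSSM-style model (hypothetical).
    def __init__(self, in_dim: int = 64, out_dim: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 128),
            nn.ReLU(),
            nn.Linear(128, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

model = TinyTower().eval()
dummy = torch.randn(1, 64)

# Export the cloud model to the ONNX intermediate representation.
torch.onnx.export(model, dummy, "tower.onnx",
                  input_names=["features"], output_names=["embedding"])

# The ONNX file is then converted to MNN's on-device format, e.g. (shell
# command, per MNN's documented converter usage; "fedmnn" is a placeholder):
#   MNNConvert -f ONNX --modelFile tower.onnx --MNNModel tower.mnn --bizCode fedmnn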
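The abstract does not specify the aggregation rule used in the third step; the sketch below assumes plain FedAvg-style weighted averaging over per-client parameter dictionaries, with all names hypothetical.

from typing import Dict, List
import numpy as np

def fedavg(client_weights: List[Dict[str, np.ndarray]],
           client_sizes: List[int]) -> Dict[str, np.ndarray]:
    # Weight each client's parameters by its local sample count and sum.
    total = float(sum(client_sizes))
    return {
        name: sum(w[name] * (n / total)
                  for w, n in zip(client_weights, client_sizes))
        for name in client_weights[0]
    }

# Usage with two hypothetical clients reporting locally trained parameters.
w1 = {"fc.weight": np.ones((2, 2)), "fc.bias": np.zeros(2)}
w2 = {"fc.weight": np.zeros((2, 2)), "fc.bias": np.ones(2)}
global_w = fedavg([w1, w2], client_sizes=[300, 100])
print(global_w["fc.weight"])  # 0.75 everywhere: 300/(300+100) weighting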

Key words: privacy protection, federated learning, machine learning, cloud-end collaborative training
