Journal of East China Normal University (Natural Science) ›› 2023, Vol. 2023 ›› Issue (5): 110-121. doi: 10.3969/j.issn.1000-5641.2023.05.010

• Data Learning Systems •

Heterogeneous coding-based federated learning

Hongwei SHI1, Daocheng HONG2,*, Lianmin SHI3,4, Yingyao YANG3

  1. School of Information Engineering, Suqian University, Suqian, Jiangsu 223800, China
    2. Shanghai Institute of AI for Education & School of Computer Science and Technology, East China Normal University, Shanghai 200062, China
    3. School of Computer Science and Technology, Soochow University, Suzhou, Jiangsu 215008, China
    4. Key Laboratory of Cognitive Computing and Intelligent Information Processing of Fujian Education Institutions, Wuyi University, Wuyishan, Fujian 354300, China
  • Received: 2023-07-19  Accepted: 2023-07-19  Online: 2023-09-25  Published: 2023-09-20
  • Contact: Daocheng HONG  E-mail: shwtongxin@squ.edu.cn; hongdc@dase.ecnu.edu.cn
  • About the author: Hongwei SHI, male, doctoral student and associate professor; research interests: intelligent instruments. E-mail: shwtongxin@squ.edu.cn
  • Funding:
    National Natural Science Foundation of China (61977025); Key Research and Development Program (Modern Agriculture) of Jiangsu Province, 2021 (BE2021354); Suqian Municipal Project, 2020 (Z2020133); Suqian Modern Agriculture Project, 2021 (L202109); Open Project Fund of the Key Laboratory of Fujian Province Universities (KLCCIIP2021201); Suzhou Science and Technology Program (SNG201908)



Abstract:

In heterogeneous federated learning systems, edge devices range from personal computers to embedded devices, and resource-constrained devices, i.e., stragglers, reduce the training efficiency of the whole system. This paper proposes a heterogeneous coding-based federated learning (HCFL) framework to ① improve system training efficiency by speeding up heterogeneous federated learning (FL) training in scenarios with multiple stragglers, and ② provide a certain level of data privacy protection. HCFL designs scheduling strategies from both the client and the server perspectives so that the computation of multiple straggler models can be accelerated in general environments; in addition, a linear coded computing (LCC) scheme is designed to protect the data distributed with offloaded tasks. Experimental results show that HCFL reduces training time by 89.85% when the performance gap between devices in heterogeneous FL is large.
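The linear-coded offloading idea behind LCC can be illustrated with a minimal sketch, assuming the task a straggler offloads is a linear operation (here, partial products X @ W computed with NumPy) and using a random real-valued generator matrix; the names encode_blocks and decode and all parameters are illustrative assumptions, not the paper's actual HCFL/LCC construction. Because the helpers' computation is linear in the data, any k of the n coded results can be decoded, and helpers only ever see linear mixtures of the raw data blocks.

```python
import numpy as np

def encode_blocks(X, k, n, rng):
    """Split X row-wise into k blocks and emit n coded blocks, each a
    random linear mixture of the originals (k must divide X's rows)."""
    B = np.stack(np.split(X, k, axis=0))       # (k, rows, d)
    G = rng.standard_normal((n, k))            # generator matrix, kept by the data owner
    coded = np.einsum('nk,krd->nrd', G, B)     # helpers receive only these mixtures
    return coded, G

def decode(partial_results, returned_ids, G):
    """Recover the k uncoded results from any k returned coded results;
    valid because the helpers' task (X_i @ W) is linear in X_i."""
    Gk = G[returned_ids, :]                    # k x k; invertible w.p. 1 for Gaussian G
    Y = np.stack(partial_results)              # (k, rows, out_dim)
    return np.einsum('jk,krd->jrd', np.linalg.inv(Gk), Y)

# Toy run: a straggler offloads X @ W to n helper devices.
rng = np.random.default_rng(0)
k, n = 3, 5                                    # any 3 of 5 responses suffice
X = rng.standard_normal((12, 4))               # local data blocks to protect
W = rng.standard_normal((4, 2))                # shared model weights
coded, G = encode_blocks(X, k, n, rng)
helper_out = [c @ W for c in coded]            # computed remotely on coded data
returned = [0, 2, 4]                           # two helpers never responded
decoded = decode([helper_out[i] for i in returned], returned, G)
assert np.allclose(np.vstack(decoded), X @ W)  # matches the uncoded result
```

With n = 5 coded tasks and k = 3 required responses, the straggler tolerates up to two slow or unreachable helpers, which mirrors the straggler-mitigation goal described in the abstract.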

Key words: federated learning, linear coding, heterogeneous system, scheduling algorithm

CLC number: