Graduate students published a paper in IEEE Transactions on Wireless Communications
Date: 29/11/2025  Article: Wang Chuxi  Photo: Provided by the interviewee

 

Recently, a research team led by ZJUI Assist. Prof. Yang Hao has made a series of significant advances in analog over-the-air federated learning for edge intelligence networks. The results have been formally published in IEEE Transactions on Wireless Communications, a top-tier international journal in wireless communications (CAS Tier 1, Impact Factor 10.7). The paper’s authors include Zhu Jiaqi, a 2023 master’s student in Electronic Information; corresponding author ZJUI Assist. Prof. Yang Hao; and co-authors Prof. Bikramjit Das of the Singapore University of Technology and Design (SUTD), Prof. Xie Yong of Nanjing University of Posts and Telecommunications (NJUPT), and Prof. Nikolaos Pappas of Linköping University (LiU), Sweden.

 


 

Federated learning enables collaborative model training across multiple clients while preserving data privacy. However, its performance is often constrained by limited communication resources, particularly in systems that support a large number of clients. To address this challenge, integrating over-the-air computation into the training process has emerged as a promising way to alleviate the communication bottleneck. By transmitting intermediate parameters as analog signals rather than digital ones, the system can support significantly more clients in each communication round. This improvement, however, comes at the cost of channel-induced distortions, such as fading and noise, which corrupt the aggregated global parameters.
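The mechanism can be illustrated with a short, self-contained simulation. The sketch below is not the paper’s system model; it assumes a simple flat Rayleigh-fading channel, unit transmit power, and synthetic gradients, purely to show how simultaneous analog transmission yields a single superposed, fading- and noise-distorted aggregate at the server.

```python
import numpy as np

rng = np.random.default_rng(0)

num_clients = 50   # K clients transmitting in the same round
dim = 1000         # gradient dimension
noise_std = 0.5    # receiver noise level (illustrative)

# Synthetic local gradients: a shared "true" gradient plus client-specific noise.
true_grad = rng.normal(size=dim)
local_grads = true_grad + rng.normal(size=(num_clients, dim))

# Ideal (distortion-free) aggregation, as orthogonal digital transmission would give.
ideal_avg = local_grads.mean(axis=0)

# Over-the-air analog aggregation: all clients transmit simultaneously, so the
# channel itself sums the waveforms. The server observes one superposition,
# distorted by per-client small-scale fading and by thermal noise.
fading = rng.rayleigh(scale=1.0, size=(num_clients, 1))
noise = rng.normal(scale=noise_std, size=dim)
received = (fading * local_grads).sum(axis=0) + noise

# Rescale by the client count and the mean fading gain to estimate the average.
mean_gain = np.sqrt(np.pi / 2)  # mean of a unit-scale Rayleigh variable
noisy_avg = received / (num_clients * mean_gain)

distortion = np.linalg.norm(noisy_avg - ideal_avg) / np.linalg.norm(ideal_avg)
print(f"relative distortion of the aggregated gradient: {distortion:.3f}")
```

In this toy setting, the ideal average would require one orthogonal channel use per client, whereas the over-the-air estimate is formed from a single shared channel use per gradient entry, which is where the communication savings come from.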

 


 

To elucidate these effects, the paper develops a theoretical framework that quantifies how channel distortions and client scaling affect model performance. The analysis reveals three key advantages of scaling up the number of participating clients. First, privacy is enhanced: the mutual information between a client’s local gradient and the server’s aggregated gradient diminishes, effectively reducing privacy leakage. Second, channel fading is mitigated through the channel hardening effect, which eliminates the impact of small-scale fading on the noisy global gradient. Third, convergence is improved, because the effective thermal noise and gradient estimation errors shrink, which benefits the convergence rate.
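A rough Monte Carlo sketch, under the same simplified Rayleigh-fading assumptions as the example above (not the paper’s analysis), illustrates the second and third effects: as the number of clients K grows, the sum of fading gains concentrates around its mean (channel hardening) and the receiver noise is divided by an ever-larger K, so the noisy over-the-air estimate approaches the ideal average.

```python
import numpy as np

rng = np.random.default_rng(1)
dim = 1000
noise_std = 0.5
mean_gain = np.sqrt(np.pi / 2)  # mean of a unit-scale Rayleigh fading gain

for num_clients in (10, 100, 1000, 10000):
    true_grad = rng.normal(size=dim)
    local_grads = true_grad + rng.normal(size=(num_clients, dim))
    ideal_avg = local_grads.mean(axis=0)

    # Over-the-air superposition with per-client fading plus receiver noise.
    fading = rng.rayleigh(scale=1.0, size=(num_clients, 1))
    noise = rng.normal(scale=noise_std, size=dim)
    received = (fading * local_grads).sum(axis=0) + noise

    # Channel hardening: (1/K)·sum(h_k) concentrates around mean_gain as K grows,
    # so scaling by the average gain alone suffices; the noise term also shrinks
    # because it is divided by the growing client count K.
    estimate = received / (num_clients * mean_gain)

    err = np.linalg.norm(estimate - ideal_avg) / np.linalg.norm(ideal_avg)
    print(f"K = {num_clients:6d}   relative aggregation error = {err:.3f}")
```

In this toy setting the printed error falls roughly as 1/√K, matching the intuition that a larger client population averages out both fading and noise.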


 

These findings solidify over-the-air model training as a viable approach for federated learning in networks with a large number of clients. In further experiments, the team also found that as the system scales up, the over-the-air federated learning system becomes significantly more resilient to malicious attacks, and its robustness when second-order optimization methods are employed improves as well, supporting the stable operation of the technology in complex application environments.

Through theoretical analysis and experimental validation, this research comprehensively reveals the core advantages and operating mechanisms of over-the-air federated learning in large-scale client scenarios. It offers new technical insights and theoretical foundations for breaking through the communication bottleneck of federated learning and improving system performance, and it is of great significance for advancing the large-scale deployment and application of federated learning technology.

 
