Decentralized Reduced Rank Regression for Response Partition
Host: 周叶青
Posted: 2024-05-16

Title: Decentralized Reduced Rank Regression for Response Partition

Speaker: 毛晓军 (Associate Professor, Shanghai Jiao Tong University)

Venue: Room 108, 致远楼 (Zhiyuan Building)

Time: May 22, 2024, 10:00-11:30

Abstract:Distributed learning in decentralized networks has been extensively studied and applied in various machine-learning scenarios. However, previous research primarily focused on data partitioning based on samples. In this paper, we address the less explored scenario of response partition, where different components of the response vector are collected and stored across multiple nodes in a multi-agent network. To mitigate the information loss resulting from response partitioning, we use the Reduced Rank Regression (RRR) model to establish connections between the response components. Subsequently, we formulate an optimization problem that involves both local and global parameters within the framework of matrix factorization, capturing both inter-node and intra-node correlations. To solve this problem efficiently, we propose an algorithm based on Decentralized Gradient Descent with Gradient Tracking (DGGT), which incorporates an additional step for local estimation. The theoretical analysis yields non-asymptotic error bounds for both estimation error and consensus error. As the number of iterations tends to infinity, the statistical error rate converges to the optimal performance achieved in the centralized case. Furthermore, we validate the effectiveness of our method through simulations and real-world applications. The numerical results not only align with our theoretical findings but also demonstrate the superiority of our approach over local reduced-rank regression methods.
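To make the core algorithmic idea concrete, the following is a minimal sketch of decentralized gradient descent with gradient tracking on a toy decentralized least-squares problem. It illustrates only the generic DGGT mechanism the abstract builds on; the talk's actual method additionally involves a reduced-rank factorization, a local-estimation step, and the response-partition setting, none of which are shown here. The network size, ring topology, mixing weights, and step size below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Toy setup: each node holds its own samples of the same regression problem.
rng = np.random.default_rng(0)
n_nodes, d = 4, 5
X = rng.standard_normal((n_nodes, 40, d))          # local design matrices
beta_true = rng.standard_normal(d)
y = np.stack([X[i] @ beta_true for i in range(n_nodes)])  # noiseless responses

# Doubly stochastic mixing matrix for a ring network (assumed topology).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    W[i, i] = 0.5
    W[i, (i - 1) % n_nodes] = 0.25
    W[i, (i + 1) % n_nodes] = 0.25

def local_grad(i, b):
    """Gradient of node i's local least-squares loss."""
    return X[i].T @ (X[i] @ b - y[i]) / len(y[i])

beta = np.zeros((n_nodes, d))                      # local parameter copies
g = np.array([local_grad(i, beta[i]) for i in range(n_nodes)])  # trackers
step = 0.1
for _ in range(300):
    beta_new = W @ beta - step * g                 # mix with neighbors, descend
    # Gradient tracking: trackers mix, then absorb each node's gradient change,
    # so g[i] follows the network-average gradient over time.
    g = W @ g + np.array([local_grad(i, beta_new[i]) - local_grad(i, beta[i])
                          for i in range(n_nodes)])
    beta = beta_new

consensus_err = np.max(np.abs(beta - beta.mean(axis=0)))
est_err = np.linalg.norm(beta.mean(axis=0) - beta_true)
```

Because the trackers converge to the average gradient across nodes, every local copy reaches consensus on the minimizer of the global loss rather than of its own local loss, which is the property the abstract's consensus-error and estimation-error bounds formalize.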

All are welcome to attend!