Lower bound for Byzantine stochastic convex optimization on heterogeneous data (December 13)
Speaker: Qiankun Shi   Posted: December 11, 2023, 15:26

Title: Lower bound for Byzantine stochastic convex optimization on heterogeneous data

Speaker: Qiankun Shi (PhD student)

Affiliation: Sun Yat-sen University

Time: December 13, 10:30

Tencent Meeting ID: 735-218-576

Password: 1213

Abstract: We lower bound the complexity of finding near-stationary points (with gradient norm at most ε + ε_T) using stochastic first-order methods. First, we prove that in Byzantine stochastic optimization problems there exists a non-vanishing error ε_T > 0 whenever the data is heterogeneous. We then divide the vanishing error ε into three parts, caused by Byzantine workers, stochasticity, and insufficient iterations, and analyse their worst-case complexities respectively. To show that the lower bound is tight, we design a stochastic Nesterov accelerated gradient method for Byzantine strongly convex problems and a recursive regularization method for Byzantine convex problems. We establish the oracle complexities of our methods, which match the corresponding lower bounds, implying that the lower bounds are tight and the proposed methods are optimal.
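The abstract does not spell out the algorithms; purely as an illustration of the setting, the sketch below runs a Nesterov-style accelerated gradient step on top of a coordinate-wise median aggregator (a standard Byzantine-robust choice, not necessarily the one used in the talk). The toy quadratic objectives, the noise level, the sign-flip attack, and all constants are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def worker_gradient(x, a, b, byzantine=False):
    """Stochastic gradient of the local quadratic f_i(x) = 0.5 * ||a * x - b||^2
    (elementwise products), with additive noise. A Byzantine worker returns an
    arbitrary corrupted vector (here: sign-flipped and scaled)."""
    g = a * (a * x - b) + 0.1 * rng.standard_normal(x.shape)
    return -10.0 * g if byzantine else g

# Toy heterogeneous data: each worker i holds its own (a_i, b_i); worker 0 is Byzantine.
d, n_workers = 5, 7
a = rng.uniform(0.5, 1.5, size=(n_workers, d))
b = rng.standard_normal((n_workers, d))

x, x_prev = np.zeros(d), np.zeros(d)
lr, momentum = 0.1, 0.9

for t in range(500):
    y = x + momentum * (x - x_prev)          # Nesterov extrapolation point
    grads = [worker_gradient(y, a[i], b[i], byzantine=(i == 0))
             for i in range(n_workers)]
    g = np.median(np.stack(grads), axis=0)   # coordinate-wise median: robust aggregation
    x_prev, x = x, y - lr * g                # accelerated gradient step

# With heterogeneous (a_i, b_i), the honest workers' average gradient does not
# vanish at the limit point of the robust aggregator.
honest = np.mean([a[i] * (a[i] * x - b[i]) for i in range(1, n_workers)], axis=0)
print("norm of honest workers' average gradient:", np.linalg.norm(honest))
```

Because the workers' objectives differ, even a perfectly robust aggregator cannot drive the honest workers' average gradient norm to zero; this residual is the non-vanishing error ε_T that the abstract refers to.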

Bio: Qiankun Shi is a first-year PhD student in the School of Computer Science and Engineering at Sun Yat-sen University, advised by Prof. Qing Ling. He is also advised by Prof. Xiao Wang at Pengcheng Laboratory. His research interests focus on distributed optimization and constrained optimization. He received his Master's degree from ShanghaiTech University in 2023 under the supervision of Prof. Hao Wang, and completed his undergraduate studies at Sichuan University from 2015 to 2020.