ADMM-based Bilevel Descent Aggregation Algorithm for Sparse Hyperparameter Selection (February 7)
Speaker: 丁彦昀 (Ding Yanyun)   Posted: February 5, 2026, 11:02

Title: ADMM-based Bilevel Descent Aggregation Algorithm for Sparse Hyperparameter Selection

Speaker: 丁彦昀 (Ding Yanyun)

Affiliation: Shenzhen Polytechnic University (深圳职业技术大学)

Time: February 7, 2026, 15:00

Venue: Room 302, Block C, South Building, Jiuzhang Hall (九章学堂), Longzihu Campus


Abstract: This paper proposes a novel bilevel optimization framework for a class of nonsmooth convex sparse optimization problems, addressing a key limitation of existing methods: the commonly required Lower-Level Singleton (LLS) assumption. The framework integrates the Alternating Direction Method of Multipliers (ADMM), which efficiently solves the lower-level problem, with a Bilevel Descent Aggregation (BDA) scheme that explores the hyperparameter space at the upper level. A primary contribution is a new convergence analysis proving that the proposed ADMM-BDA algorithm achieves global convergence under significantly relaxed conditions, moving beyond the restrictive LLS assumption. Numerical experiments on synthetic and real-world data demonstrate the superior effectiveness and robustness of ADMM-BDA compared to state-of-the-art methods, particularly when the lower-level problem involves elastic-net penalized statistical models.
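To make the setting concrete, the sketch below applies standard ADMM to the kind of elastic-net lower-level problem the abstract mentions: min_x 0.5||Ax − b||² + λ₁||x||₁ + 0.5·λ₂||x||². This is only an illustrative sketch of the lower-level subproblem with an assumed splitting (x = z); it is not the paper's ADMM-BDA algorithm, and the function name and parameters are hypothetical.

```python
import numpy as np

def admm_elastic_net(A, b, lam1, lam2, rho=1.0, iters=200):
    """Illustrative ADMM for the elastic-net problem
        min_x 0.5*||A x - b||^2 + lam1*||x||_1 + 0.5*lam2*||x||^2,
    using the splitting x = z (a sketch, not the paper's ADMM-BDA)."""
    n = A.shape[1]
    AtA = A.T @ A
    Atb = A.T @ b
    # Factor (A^T A + (lam2 + rho) I) once; it is reused every iteration.
    L = np.linalg.cholesky(AtA + (lam2 + rho) * np.eye(n))
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    for _ in range(iters):
        # x-update: ridge-regularized least-squares solve via the Cholesky factor.
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: soft-thresholding (proximal operator of the l1 term).
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam1 / rho, 0.0)
        # Dual ascent on the scaled multiplier.
        u = u + x - z
    return z
```

In a bilevel hyperparameter-selection setting, an upper-level routine would repeatedly call a solver like this for candidate (λ₁, λ₂) pairs; the paper's BDA scheme aggregates upper- and lower-level descent information instead of treating the solver as a black box.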


About the speaker: 丁彦昀 (Ding Yanyun), Ph.D., is a Lecturer who joined Shenzhen Polytechnic University in 2023. Dr. Ding's research focuses on optimization theory and methods, with more than ten papers published in journals including iScience (a Cell Press journal), Optim. Methods Softw., J. Math. Imaging Vis., and J. Nonlinear Var. Anal.