Title: Convex and Nonconvex Risk-Based Linear Regression at Scale
Speaker: 吴灿 (Postdoctoral Fellow)
Affiliation: The Hong Kong Polytechnic University
Time: January 7, 10:00 / January 5, 15:00
Venue: Room 301, Jiuzhang Academy (九章学堂), Longzihu Campus, Henan University; School of Mathematics and Statistics (北研)
Abstract: The Value-at-Risk (VaR) and the Conditional Value-at-Risk (CVaR) are two popular risk measures used to hedge against the uncertainty of data. In this talk, we will provide a computational toolbox for solving high-dimensional sparse linear regression problems under either the VaR or the CVaR measure, the former nonconvex and the latter convex. We address the convex CVaR linear regression problem by adopting a semismooth Newton-based proximal augmented Lagrangian method. The matrix structures of the Newton systems are carefully explored to reduce the computational cost per iteration. The method is further embedded in a majorization-minimization algorithm as a subroutine to tackle the nonconvex VaR-based regression problem. We shall also discuss an adaptive sieving strategy to iteratively guess and adjust the effective problem dimension, which is particularly useful when a solution path associated with a sequence of tuning parameters is needed. Extensive numerical experiments on both synthetic and real data will be provided to demonstrate the effectiveness of our proposed methods.
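For readers less familiar with the two risk measures, a minimal illustrative sketch of the regression problems in question is given below, assuming absolute residuals $r_i(\beta)=|y_i-x_i^\top\beta|$ for data $(x_i,y_i)_{i=1}^n$, an $\ell_1$ regularizer with parameter $\lambda\ge 0$, and confidence level $\alpha\in(0,1)$; this particular choice of loss and regularizer is an assumption, and the exact model treated in the talk may differ.
\[
\min_{\beta\in\mathbb{R}^p}\ \mathrm{CVaR}_\alpha\bigl(r_1(\beta),\dots,r_n(\beta)\bigr)+\lambda\|\beta\|_1
\qquad\text{and}\qquad
\min_{\beta\in\mathbb{R}^p}\ \mathrm{VaR}_\alpha\bigl(r_1(\beta),\dots,r_n(\beta)\bigr)+\lambda\|\beta\|_1,
\]
where, for the empirical distribution of the residuals and $k=\lceil(1-\alpha)n\rceil$,
\[
\mathrm{VaR}_\alpha(r)\ \approx\ r_{(k)}\ \ (\text{the $k$-th largest residual, up to the usual rounding and tie conventions}),
\qquad
\mathrm{CVaR}_\alpha(r)\ =\ \min_{t\in\mathbb{R}}\Bigl\{\,t+\tfrac{1}{(1-\alpha)n}\sum_{i=1}^{n}\max(r_i-t,\,0)\Bigr\}.
\]
The CVaR objective is convex in $\beta$, whereas the VaR objective is not: writing $T_k(r)$ for the sum of the $k$ largest residuals, one has $r_{(k)}=T_k(r)-T_{k-1}(r)$, a difference of two convex functions of $\beta$. One standard way to obtain convex subproblems, which may or may not coincide with the construction used in the talk, is to majorize the concave part $-T_{k-1}$ by its linearization at the current iterate, so that each majorization-minimization step reduces to a convex CVaR-type problem of the first kind.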
Bio: 吴灿 is a postdoctoral fellow at The Hong Kong Polytechnic University. Dr. Wu received a bachelor's degree from Henan Institute of Science and Technology in July 2015, a master's degree from Henan University in July 2018, and a Ph.D. from South China Normal University in June 2023, served as a research assistant at The Hong Kong Polytechnic University from September 2020 to September 2023, and has been conducting postdoctoral research there since September 2023. Dr. Wu's research interests center on sparse optimization algorithms and their applications in high-dimensional statistics and machine learning, with a number of papers published in academic journals such as INFORMS Journal on Computing.