Some Recent Advances in Optimization for Machine Learning


Speaker


Luo Luo

Fudan University

Time

Tuesday, December 30, 2025

14:00-15:00

Venue


Conference Room 602



Abstract


This talk considers optimization theory and algorithms for machine learning, addressing challenges in distributed optimization, second-order optimization, minimax problems, and nonconvex nonsmooth problems. We first focus on distributed finite-sum optimization, proposing a sampling strategy that allows different nodes to use different batch sizes at each iteration. This yields near-optimal communication and computation complexities with respect to the global smoothness. We then consider second-order methods for minimization and minimax optimization, introducing block Broyden updates, the squared technique, and partial quasi-Newton methods with fast local superlinear convergence rates. We further study constrained nonconvex nonsmooth optimization by introducing the notion of a generalized Goldstein stationary point, which characterizes the convergence of stochastic algorithms for minimizing Lipschitz continuous functions. We also apply this idea to the general nonsmooth nonconvex-concave minimax problem.
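
To make the distributed finite-sum setting concrete, the sketch below shows one communication round in which every node computes a stochastic gradient with its own batch size before the server averages them. This is a minimal toy with plain gradient averaging on a least-squares objective, not the speaker's sampling strategy or complexity-optimal method; the names local_gradient and distributed_step and all parameter choices are illustrative assumptions.

import numpy as np

def local_gradient(w, X, y, batch_size, rng):
    # stochastic least-squares gradient on one node's local shard
    idx = rng.choice(X.shape[0], size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    return Xb.T @ (Xb @ w - yb) / batch_size

def distributed_step(w, shards, batch_sizes, lr, rng):
    # one round: each node uses its own batch size, then the server
    # averages the local gradients (a single communication)
    grads = [local_gradient(w, X, y, b, rng)
             for (X, y), b in zip(shards, batch_sizes)]
    return w - lr * np.mean(grads, axis=0)

# toy run: four nodes with heterogeneous batch sizes
rng = np.random.default_rng(0)
shards = [(rng.standard_normal((100, 5)), rng.standard_normal(100))
          for _ in range(4)]
w = np.zeros(5)
for _ in range(50):
    w = distributed_step(w, shards, batch_sizes=[4, 8, 16, 32],
                         lr=0.05, rng=rng)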

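For background on the quasi-Newton component, the classical rank-one Broyden update, which block Broyden updates generalize to several directions at once, refreshes an approximation $B_k$ of a Jacobian (or Hessian) as

\[
B_{k+1} \;=\; B_k + \frac{(y_k - B_k s_k)\, s_k^{\top}}{s_k^{\top} s_k},
\qquad
s_k = x_{k+1} - x_k, \quad y_k = g(x_{k+1}) - g(x_k),
\]

which enforces the secant condition $B_{k+1} s_k = y_k$. The block form presented in the talk and its superlinear rate analysis are not reproduced here.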

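As background for the nonsmooth part, for a Lipschitz continuous $f$ the standard (unconstrained) Goldstein subdifferential and the associated $(\delta, \epsilon)$-stationarity condition read

\[
\partial_{\delta} f(x) \;=\; \mathrm{conv}\Bigl(\textstyle\bigcup_{y \in \mathbb{B}_{\delta}(x)} \partial f(y)\Bigr),
\qquad
\min_{g \in \partial_{\delta} f(x)} \|g\| \;\le\; \epsilon,
\]

where $\partial f$ is the Clarke subdifferential and $\mathbb{B}_{\delta}(x)$ the ball of radius $\delta$ around $x$. The generalized notion introduced in the talk extends this condition to the constrained setting; its precise form is not given here.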


Biography


Luo Luo is a young associate researcher at the School of Data Science, Fudan University. He received his Ph.D. in Computer Science from Shanghai Jiao Tong University in 2019 and subsequently worked as a postdoctoral fellow in the Department of Mathematics at the Hong Kong University of Science and Technology. His research interests include machine learning, optimization theory, and matrix computation. His work has been published in journals and conferences such as JMLR, ICML, NeurIPS, KDD, and COLT, and has received the KDD Best Research Paper runner-up award and the COLT Best Student Paper award. He has led Young Scientists Fund and General Program projects of the National Natural Science Foundation of China, and serves as an area chair for ICML, NeurIPS, and ICLR.


