
Beyond Second Order Methods for Nonconvex Optimization

Source: 08-26


Time: Thursday, 16:00-17:00, Aug 28, 2025

Venue: B627, Shuangqing Complex Building

Organizer: Jin-Peng Liu (刘锦鹏)

Speaker: Kate Wenqi Zhu

Kate Wenqi Zhu

Ph.D. Student, University of Oxford

Kate Wenqi Zhu is a fourth-year Ph.D. student in Applied Mathematics at the University of Oxford, under the supervision of Professor Coralia Cartis, and is fully funded by the CIMDA–Oxford Studentship. Her research focuses on leveraging higher-order information for efficient nonconvex optimization, with interests spanning computational complexity analysis, tensor approximation, sum-of-squares techniques, implementable high-order subproblem solvers, and adaptive regularization methods. She completed both her undergraduate and first master's degrees in Mathematics at Oxford, followed by an M.Sc. in Mathematical Modelling and Scientific Computing, supervised by Professor Yuji Nakatsukasa.


Abstract

Traditionally, first-order gradient-based techniques, such as stochastic gradient descent (SGD), and second-order methods, such as the Newton method, have dominated the field of optimization. In recent years, high-order tensor methods with regularization for nonconvex optimization have garnered significant research interest. These methods offer superior local convergence rates, improved worst-case evaluation complexity, enhanced insights into data geometry through higher-order information, and better parallelization compared to SGD.

The most critical challenge in implementing a $p$th-order method ($p \geq 3$) lies in efficiently minimizing the $p$th-order subproblem, which typically consists of a $p$th-degree multivariate Taylor polynomial combined with a $(p+1)$th-order regularization term. In this talk, we address these research gaps by characterizing the local and global optimality of the subproblem and investigating its potential NP-hardness. We then introduce and discuss a series of provably convergent and efficient algorithms for minimizing the regularized subproblem both locally and globally, including the Quadratic Quartic Regularization method (QQR), the Cubic Quartic Regularization method (CQR), and the Sums-of-Squares Convex Taylor method (SoS-C). More broadly, our research adopts an AI-integrated approach, using the mathematical reasoning capabilities of large language models (LLMs) to verify the nonnegativity of multivariate polynomials, a problem closely related to Hilbert's Seventeenth Problem and to the challenge of globally minimizing such subproblems.
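To make the subproblem concrete, the following is a minimal one-dimensional sketch, in Python, of the $p = 3$ regularized model: a cubic Taylor polynomial plus a quartic regularization term. This is an illustrative toy under assumed coefficients, not the QQR, CQR, or SoS-C solvers discussed in the talk; the function names and the grid-search minimizer are hypothetical choices for demonstration only.

```python
def model(s, f, g, H, T, sigma):
    """Third-order Taylor model with (p+1)th-order regularization, p = 3:
    m(s) = f + g*s + (1/2)*H*s^2 + (1/6)*T*s^3 + (sigma/4)*|s|^4."""
    return f + g * s + 0.5 * H * s**2 + (1.0 / 6.0) * T * s**3 + (sigma / 4.0) * abs(s)**4

def minimize_subproblem(f, g, H, T, sigma, lo=-10.0, hi=10.0, n=200001):
    """Approximate global minimizer of the 1-D model by dense grid search.
    The quartic regularizer dominates for large |s|, so the global minimum
    lies in a bounded interval even when the Taylor part is nonconvex."""
    best_s, best_m = 0.0, model(0.0, f, g, H, T, sigma)
    for i in range(n):
        s = lo + (hi - lo) * i / (n - 1)
        v = model(s, f, g, H, T, sigma)
        if v < best_m:
            best_s, best_m = s, v
    return best_s, best_m

# Example coefficients (assumed): negative curvature H < 0 makes the
# quadratic part unbounded below, yet the regularized model has a
# well-defined global minimizer.
s_star, m_star = minimize_subproblem(f=0.0, g=1.0, H=-2.0, T=0.5, sigma=1.0)
```

In higher dimensions this global structure is exactly what becomes hard: dense search is infeasible, which is why dedicated subproblem solvers such as those presented in the talk are needed.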
