Abstract
Efficient sampling from complex non-log-concave distributions is a cornerstone of statistical computing and machine learning, yet it is challenged by stringent isoperimetric conditions and high computational costs. Traditional methods often suffer from exponential gradient complexity, limiting their applicability to a broad array of problems. In this talk, we present RS-DMC, a novel advancement in Diffusion-based Monte Carlo (DMC) algorithms that transcends these limitations by introducing a recursive score estimation technique. By dissecting the diffusion process into manageable segments, RS-DMC addresses sampling through interconnected mean estimation and sampling subproblems, each efficiently solvable under strongly log-concave distributions. This methodological innovation not only significantly reduces the gradient complexity to a quasi-polynomial dependency on the error tolerance, but also establishes a new benchmark in sampling efficiency for distributions without an isoperimetry condition, which is commonly required in previous works. This talk will delve into the algorithm's design and the rigorous proof details that underpin our theoretical framework, and will introduce some new directions that can be explored accordingly.
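To fix intuition for the diffusion-based sampling paradigm the abstract refers to (not the RS-DMC algorithm itself), the sketch below runs a reverse-time diffusion for a 1D Gaussian target, where the score of the diffused marginal is known in closed form. All names and parameters are illustrative assumptions; in DMC-type methods the score must instead be estimated, which is where RS-DMC's recursive segmentation enters.

```python
import numpy as np

# Toy illustration (assumed setup, not the speaker's RS-DMC algorithm):
# forward Ornstein-Uhlenbeck noising dX = -X dt + sqrt(2) dW drives the
# target N(mu, s^2) toward N(0, 1); we then sample by integrating the
# reverse-time SDE, using the analytically known score of the marginal.

rng = np.random.default_rng(0)
mu, s = 2.0, 0.5                 # target distribution N(mu, s^2)
T, n_steps, n_particles = 5.0, 500, 20000
dt = T / n_steps

def score(x, t):
    # Exact score of the OU-diffused marginal p_t for a Gaussian target:
    # X_t ~ N(mu e^{-t}, s^2 e^{-2t} + 1 - e^{-2t}).
    m_t = mu * np.exp(-t)
    v_t = s**2 * np.exp(-2 * t) + 1.0 - np.exp(-2 * t)
    return -(x - m_t) / v_t

# Start from the stationary law N(0, 1) and run Euler-Maruyama on the
# reverse SDE dY = [Y + 2 * score(Y, T - s)] ds + sqrt(2) dW.
y = rng.standard_normal(n_particles)
for k in range(n_steps):
    t = T - k * dt               # current (reverse) diffusion time
    drift = y + 2.0 * score(y, t)
    y = y + drift * dt + np.sqrt(2 * dt) * rng.standard_normal(n_particles)

print(y.mean(), y.std())         # should be close to mu = 2.0 and s = 0.5
```

For non-log-concave targets no closed-form score exists; the cost of estimating it accurately is exactly the gradient-complexity bottleneck that the talk addresses.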
Time
Monday, Apr. 15, 15:30--16:30
Speaker
Difan Zou is an assistant professor in the Department of Computer Science and the Institute of Data Science at HKU. He received his PhD from the Department of Computer Science at the University of California, Los Angeles (UCLA).
His research interests are broadly in machine learning, deep learning theory, graph learning, and interdisciplinary research between AI and other subjects. His research has been published in top-tier machine learning conferences (ICML, NeurIPS, COLT, ICLR) and journals (IEEE Trans., Nature Comm., PNAS, etc.). He serves as an area chair/senior PC member for NeurIPS and AAAI, and as a PC member for ICML, ICLR, COLT, etc.
Room
Room 308, School of Information Management
