
School and Institute of Mathematics, 2025 Academic Activity Series (Session 087): Associate Professor Zhennan Zhou, Westlake University

Published: 2025-07-09

Course Title: Computational Methods for Stochastic Systems: SDEs, Sampling, Stochastic Gradients, and Variance Reduction

Lecturer: Associate Professor Zhennan Zhou, Westlake University

Course Dates: July 21–23, 2025, 14:00–16:00

Venue: Room 209, Zhengxin Building, Jilin University, Changchun; Zoom ID: 904 645 6677, Passcode: 2025


Abstract:

This three-part lecture series provides a concise introduction to the essential theory and numerical methods for simulating stochastic systems, particularly those described by Stochastic Differential Equations (SDEs). These systems are fundamental in modeling phenomena with inherent randomness across various fields, including physics, engineering, finance, biology, and data science.


Lecture 1, Fundamentals of Stochastic Differential Equations and Sampling, introduces the mathematical foundation of stochastic systems and their application in sampling complex distributions. We cover Brownian motion, Itô calculus (stochastic integration and Itô's lemma), and the definition of SDEs. The Euler-Maruyama scheme is derived, and its accuracy is analyzed through the concepts of consistency, stability, and convergence. We then explore the connection between SDEs and stationary distributions via the Fokker-Planck equation and ergodicity, focusing on sampling the Boltzmann distribution using Langevin dynamics. Numerical methods such as the Unadjusted Langevin Algorithm (ULA) and splitting schemes (e.g., BAOAB) are introduced, with applications in Bayesian inference and statistical mechanics.
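To make the Euler-Maruyama/ULA connection concrete, here is a minimal Python sketch, not drawn from the course materials: it discretizes the overdamped Langevin SDE dX_t = -∇U(X_t) dt + √2 dW_t so that the long-run samples approximate the Boltzmann distribution π(x) ∝ exp(-U(x)). The double-well potential, step size, and run length are illustrative assumptions.

```python
import numpy as np

# Unadjusted Langevin Algorithm (ULA): the Euler-Maruyama discretization of
# the overdamped Langevin SDE
#     dX_t = -U'(X_t) dt + sqrt(2) dW_t,
# whose stationary law is the Boltzmann distribution pi(x) ~ exp(-U(x)).

def grad_U(x):
    # Gradient of the illustrative double-well potential U(x) = (x**2 - 1)**2.
    return 4.0 * x * (x**2 - 1.0)

def ula(x0=0.0, h=1e-3, n_steps=200_000, seed=0):
    rng = np.random.default_rng(seed)
    x = x0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        # One Euler-Maruyama step: deterministic drift plus sqrt(2h) Gaussian noise.
        x = x - h * grad_U(x) + np.sqrt(2.0 * h) * rng.standard_normal()
        samples[k] = x
    return samples

if __name__ == "__main__":
    xs = ula()
    # For small h, the empirical distribution of xs approximates pi,
    # up to the O(h) discretization bias inherent to ULA.
    print("sample mean:", xs.mean(), "sample variance:", xs.var())
```

A BAOAB-type splitting for underdamped Langevin dynamics would replace this single update with an ordered sequence of momentum half-kicks, position half-drifts, and an exact Ornstein-Uhlenbeck step for the momentum, but the driver loop is identical in spirit.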


Lecture 2, Computational Challenges and Stochastic Gradients, addresses the computational challenges in large-scale stochastic problems, especially those involving gradients from large datasets. We introduce Stochastic Gradient (SG) approximations and the Stochastic Gradient Langevin Dynamics (SGLD) algorithm for efficient Bayesian sampling. The differences between using stochastic gradients for sampling (SGLD) versus optimization (SGD) are highlighted, along with an overview of related techniques like the Random Batch Method for interacting particle systems.
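As an illustration of the sampling-versus-optimization contrast, the sketch below implements SGLD for a toy Bayesian model; the Gaussian prior, Gaussian likelihood, synthetic dataset, and step size are all assumptions made here for illustration, not part of the course. Deleting the injected Gaussian noise turns the same update into plain SGD.

```python
import numpy as np

# Stochastic Gradient Langevin Dynamics (SGLD) for a toy Bayesian model:
# posterior pi(theta) ~ exp(-U(theta)), with
#     U(theta) = theta**2 / 2 + sum_i (theta - y_i)**2 / 2
# (standard Gaussian prior plus unit-variance Gaussian likelihood).

rng = np.random.default_rng(1)
N = 10_000
data = rng.normal(loc=2.0, scale=1.0, size=N)   # synthetic observations y_i

def minibatch_grad_U(theta, batch_size=100):
    # Unbiased stochastic gradient: prior term plus a rescaled minibatch sum.
    batch = rng.choice(data, size=batch_size, replace=False)
    return theta + (N / batch_size) * np.sum(theta - batch)

def sgld(theta0=0.0, h=1e-6, n_steps=50_000):
    theta = theta0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        g = minibatch_grad_U(theta)
        # SGLD update: an SGD step plus Gaussian noise of variance 2h.
        # Omitting the noise term gives plain SGD, which converges to the
        # posterior mode instead of sampling the posterior.
        theta = theta - h * g + np.sqrt(2.0 * h) * rng.standard_normal()
        samples[k] = theta
    return samples

if __name__ == "__main__":
    thetas = sgld()
    print("posterior mean estimate:", thetas[10_000:].mean())
```

The N / batch_size rescaling keeps the minibatch gradient unbiased for the full-data gradient, which is what justifies plugging it into the Langevin update.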


Lecture 3, Variance Reduction Techniques for Efficient Simulations, focuses on improving the efficiency of stochastic simulations through variance reduction techniques. We motivate the need for reducing statistical noise in Monte Carlo and stochastic gradient methods. Classical variance reduction principles (e.g., Control Variates) and modern techniques (e.g., SVRG, SAGA) are reviewed, with a focus on their application in Langevin dynamics (e.g., SVRG-LD). Theoretical insights and practical considerations are discussed.
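The control-variate idea behind SVRG-LD fits in a few lines. The sketch below, reusing the toy Gaussian model from the previous example (again with assumed parameters), replaces the raw minibatch gradient with an SVRG estimator anchored at a periodically refreshed reference point: the estimator remains unbiased while its fluctuation shrinks as the iterate approaches the anchor.

```python
import numpy as np

# SVRG-LD sketch: Langevin dynamics with an SVRG control-variate gradient.
# Toy model as before: U(theta) = theta**2/2 + sum_i (theta - y_i)**2/2.

rng = np.random.default_rng(2)
N = 10_000
data = rng.normal(loc=2.0, scale=1.0, size=N)

def full_grad(theta):
    # Exact full-data gradient of U; recomputed only once per epoch.
    return theta + np.sum(theta - data)

def svrg_ld(theta0=0.0, h=1e-6, n_epochs=50, epoch_len=1_000, batch_size=10):
    theta = theta0
    samples = []
    for _ in range(n_epochs):
        # Anchor point: store theta and its exact full-data gradient.
        anchor, g_anchor = theta, full_grad(theta)
        for _ in range(epoch_len):
            idx = rng.integers(0, N, size=batch_size)
            # Control-variate estimator: minibatch gradient difference between
            # theta and the anchor, plus the anchor's exact full gradient.
            diff = (N / batch_size) * np.sum(
                (theta - data[idx]) - (anchor - data[idx])
            ) + (theta - anchor)
            g = g_anchor + diff
            theta = theta - h * g + np.sqrt(2.0 * h) * rng.standard_normal()
            samples.append(theta)
    return np.array(samples)

if __name__ == "__main__":
    thetas = svrg_ld()
    print("posterior mean estimate:", thetas[len(thetas) // 2:].mean())
```

In this toy model the per-example gradients are linear in theta, so the control variate cancels the minibatch noise exactly; for general potentials the estimator only reduces the variance, most strongly near the anchor, which is why the anchor is refreshed every epoch.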


Overall, this series aims to provide participants with a solid understanding of computational methods for stochastic simulation, bridging foundational SDE theory with practical algorithms relevant to current research.


About the Lecturer

Zhennan Zhou is an Associate Professor at Westlake University and a recipient of a national-level young talent program. He received his bachelor's degree from the School of Mathematics, Jilin University, in 2009, and his Ph.D. from the Department of Mathematics, University of Wisconsin-Madison, in 2014. From 2014 to 2017 he was an Assistant Research Professor at Duke University, and from 2017 to 2023 an Assistant Professor at the Beijing International Center for Mathematical Research, Peking University. In March 2024 he joined Westlake University full-time as a Distinguished Research Fellow at the Institute for Theoretical Sciences. His research applies methods from applied analysis, numerical computation, and stochastic simulation to mathematical problems arising in the natural sciences. He has published papers in journals including Math. Comp., SIAM J. Math. Anal., SIAM J. Sci. Comput., and J. Comput. Phys.


Course QQ group QR code: