
Course: Computational Methods for Stochastic Systems: SDEs, Sampling, Stochastic Gradients, and Variance Reduction

Posted: 2025-07-09

Location: Zhengxin Building 209, Jilin University, Changchun

Zoom ID: 904 645 6677, Password: 2025

Time: 14:00–16:00, July 21–23, 2025

Instructor: Zhou Zhennan (Associate Professor, Westlake University)

Abstract:

This three-part lecture series provides a concise introduction to the essential theory and numerical methods for simulating stochastic systems, particularly those described by Stochastic Differential Equations (SDEs). These systems are fundamental in modeling phenomena with inherent randomness across various fields, including physics, engineering, finance, biology, and data science.


Lecture 1, Fundamentals of Stochastic Differential Equations and Sampling, introduces the mathematical foundations of stochastic systems and their application to sampling complex distributions. We cover Brownian motion, Itô calculus (stochastic integration and Itô's lemma), and the definition of SDEs. The Euler-Maruyama scheme is derived, and its accuracy is analyzed via the concepts of consistency, stability, and convergence. We then explore the connection between SDEs and stationary distributions via the Fokker-Planck equation and ergodicity, focusing on sampling the Boltzmann distribution using Langevin dynamics. Numerical methods such as the Unadjusted Langevin Algorithm (ULA) and splitting schemes (e.g., BAOAB) are introduced, with applications in Bayesian inference and statistical mechanics.
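To make this concrete, the following is a minimal Python sketch, with illustrative parameter choices not taken from the lectures, of the Unadjusted Langevin Algorithm: the Euler-Maruyama discretization of overdamped Langevin dynamics dX_t = -∇U(X_t) dt + sqrt(2) dW_t, whose stationary law is the Boltzmann distribution π(x) ∝ exp(-U(x)), here for a one-dimensional double-well potential.

import numpy as np

# Illustrative sketch: Euler-Maruyama applied to overdamped Langevin
# dynamics, i.e. the Unadjusted Langevin Algorithm (ULA), targeting
# pi(x) ∝ exp(-U(x)) for the double-well U(x) = (x^2 - 1)^2 / 4.

def grad_U(x):
    # grad of U(x) = (x^2 - 1)^2 / 4
    return x**3 - x

rng = np.random.default_rng(0)
eta = 1e-3            # step size (hypothetical choice)
n_steps = 200_000
x = 0.0
samples = np.empty(n_steps)

for k in range(n_steps):
    x = x - eta * grad_U(x) + np.sqrt(2.0 * eta) * rng.standard_normal()
    samples[k] = x

# After discarding a burn-in, the histogram of `samples` approximates
# pi(x) ∝ exp(-U(x)); note that ULA carries an O(eta) bias in the
# stationary law, which motivates the convergence analysis in the lecture.
print(samples[n_steps // 2:].mean(), samples[n_steps // 2:].var())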


Lecture 2, Computational Challenges and Stochastic Gradients, addresses the computational challenges of large-scale stochastic problems, especially those whose gradients are computed over large datasets. We introduce stochastic gradient (SG) approximations and the Stochastic Gradient Langevin Dynamics (SGLD) algorithm for efficient Bayesian sampling. The differences between using stochastic gradients for sampling (SGLD) and for optimization (SGD) are highlighted, along with an overview of related techniques such as the Random Batch Method for interacting particle systems.
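The following minimal Python sketch, on a synthetic Gaussian-mean model (an assumption for illustration, not an example from the lectures), shows the core SGLD update: the full-data gradient of the log-posterior is replaced by a minibatch estimate rescaled by N divided by the batch size, and Gaussian noise with variance equal to the step size is injected so that the iterates sample the posterior rather than collapse to a point estimate.

import numpy as np

# Illustrative sketch (assumed model: y_i ~ N(theta, 1) with prior
# theta ~ N(0, 10)): Stochastic Gradient Langevin Dynamics (SGLD).

rng = np.random.default_rng(1)
N = 10_000
data = rng.normal(2.0, 1.0, size=N)     # synthetic observations

def grad_log_prior(theta):
    return -theta / 10.0                 # grad of log N(0, 10)

def grad_log_lik(theta, batch):
    return np.sum(batch - theta)         # grad of sum_i log N(y_i | theta, 1)

eps = 1e-4                               # step size (hypothetical choice)
batch_size = 100
theta = 0.0
chain = []

for k in range(5_000):
    batch = data[rng.integers(0, N, size=batch_size)]
    # minibatch gradient, rescaled to estimate the full-data gradient
    grad = grad_log_prior(theta) + (N / batch_size) * grad_log_lik(theta, batch)
    theta += 0.5 * eps * grad + np.sqrt(eps) * rng.standard_normal()
    chain.append(theta)

# After burn-in the chain approximates the posterior over theta; dropping
# the noise term recovers SGD, which converges to the MAP point instead.
print(np.mean(chain[1_000:]))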


Lecture 3, Variance Reduction Techniques for Efficient Simulations, focuses on improving the efficiency of stochastic simulations through variance reduction. We motivate the need to reduce statistical noise in Monte Carlo and stochastic gradient methods. Classical variance reduction principles (e.g., control variates) and modern techniques (e.g., SVRG, SAGA) are reviewed, with a focus on their application to Langevin dynamics (e.g., SVRG-LD). Theoretical insights and practical considerations are discussed.
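As an illustration, here is a minimal Python sketch of the SVRG-style gradient estimator that underlies SVRG-LD (the problem setup and all parameters are hypothetical): a full gradient is computed at a periodically refreshed snapshot, and each step corrects a minibatch gradient by the snapshot's gradient on the same minibatch, so the snapshot term acts as a control variate that keeps the estimator unbiased while shrinking its variance as the iterate approaches the snapshot.

import numpy as np

# Illustrative sketch of a variance-reduced (SVRG-style) gradient estimator
# inside a Langevin update, for the quadratic potential
# U(theta) = sum_i (theta - y_i)^2 / 2 on synthetic data.

rng = np.random.default_rng(2)
N = 10_000
data = rng.normal(2.0, 1.0, size=N)

def grad_U_i(theta, batch):
    # per-example potential gradients for U_i(theta) = (theta - y_i)^2 / 2
    return theta - batch

eps, batch_size, epoch_len = 1e-4, 100, 200
theta = 0.0
snapshot = theta
full_grad = np.sum(grad_U_i(snapshot, data))   # full gradient at snapshot

for k in range(2_000):
    if k % epoch_len == 0:                     # refresh the snapshot each epoch
        snapshot = theta
        full_grad = np.sum(grad_U_i(snapshot, data))
    batch = data[rng.integers(0, N, size=batch_size)]
    # control-variate correction: unbiased, with variance that vanishes
    # as theta approaches the snapshot
    correction = np.sum(grad_U_i(theta, batch) - grad_U_i(snapshot, batch))
    grad_est = full_grad + (N / batch_size) * correction
    theta += -0.5 * eps * grad_est + np.sqrt(eps) * rng.standard_normal()

print(theta)   # near the minimizer of U, i.e. the sample mean of the data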


Overall, this series aims to provide participants with a solid understanding of computational methods for stochastic simulation, bridging foundational SDE theory with practical algorithms relevant to current research.


About the Instructor


Zhennan Zhou is an Associate Professor at Westlake University and a recipient of a national young-talent program. He received his Bachelor's degree from the School of Mathematics at Jilin University in 2009 and his Ph.D. from the Department of Mathematics at the University of Wisconsin-Madison, USA, in 2014. From 2014 to 2017, he was an Assistant Research Professor at Duke University. From 2017 to 2023, he was an Assistant Professor at the Beijing International Center for Mathematical Research, Peking University. In March 2024, he joined Westlake University full-time as a Distinguished Research Fellow at the Institute for Theoretical Sciences. His research uses applied analysis, numerical computation, and stochastic simulation to study mathematical problems arising in the natural sciences. He has published numerous papers in journals such as Math. Comp., SIAM J. Math. Anal., SIAM J. Sci. Comput., and J. Comput. Phys.



QR code for the course QQ group: