Generative Models for Structured Distributional Shift: Robustness, Causal Interventions, and Uncertainty Quantification

Speaker: Prof. Yao Xie

Time: 14:00 (GMT), March 18, 2026.


Abstract

Existing generative models, including diffusion and flow-based methods, are typically used for data imitation: they learn a reference data distribution from training samples and generate new samples from that distribution. In this talk, Prof. Yao Xie takes a broader view and uses generative models as a framework for representing distributional shift when the change is governed by an underlying model, constraint, or geometry rather than being arbitrary. She first considers robust optimization, in which generative parameterizations provide tractable models of worst-case distributions and support decision-making via minimax optimization in Wasserstein space. She then turns to causal settings, where interventions and counterfactuals induce structured shifts in the data-generating distribution, and shows how generative models can represent these shifts in both static and time-series problems. Finally, she discusses how the same perspective supports uncertainty quantification under shift, using flow-based conformal prediction for multivariate time series as an example. More broadly, this viewpoint also applies to settings in which distributions evolve through strategic interactions, including recent work on high-dimensional mean-field games. Taken together, these examples suggest that generative modeling is useful not only for sampling from observed data but also for modeling how distributions change and for supporting reliable inference, prediction, and decision-making under such changes.


Our Speaker

Yao Xie is the Coca-Cola Foundation Chair and Professor in the H. Milton Stewart School of Industrial and Systems Engineering at the Georgia Institute of Technology, where she also serves as Associate Director of the Machine Learning Center (ML@GT). She received her Ph.D. in Electrical Engineering, with a minor in Mathematics, from Stanford University. Her research lies at the intersection of statistics, machine learning, and optimization, with a focus on statistically sound and computationally efficient methods for high-dimensional, sequential, and spatio-temporal data. She is a member of the 2026 cohort of the National Academies’ New Voices in Sciences, Engineering, and Medicine program and an IEEE Information Theory Society Distinguished Lecturer for 2026–2027. Her honors include the NSF CAREER Award, the INFORMS Gaver Early Career Award for Excellence in Operations Research, and the C.W.S. Woodroofe Award. She serves as an Associate Editor for multiple journals, including Operations Research, IEEE Transactions on Information Theory, and the Journal of the American Statistical Association (Theory and Methods), and as an Area Chair for NeurIPS, ICML, and ICLR.

