Dissertation Title: Number Theoretic Algorithms for Elliptic Curves
Hunter Johnson, PhD in Mathematics
Advisor: C. Laskowski
Dissertation Title: Definable Families of Finite VC Dimension
Nicholas Long, PhD in Mathematics
Advisor: M. Boyle
Dissertation Title: Involutions of Shifts of Finite Type: Fixed Point Shifts, Orbit Quotients, and the Dimension Representation
Jane Long, PhD in Mathematics
Advisor: J. Schafer
Dissertation Title: The Cohomology of the Affine Group over Fp2 and of PSL(3, Fp)
Yabing Mai, PhD in Mathematical Statistics
Advisor: E. Slud
Dissertation Title: Comparing Survival Distributions in the Presence of Dependent Censoring: Asymptotic Validity and Bias Correction of the Logrank Test
Tinghui Yu, PhD in Mathematical Statistics
Advisor: A. Kagan
Dissertation Title: Estimation Theory of a Location Parameter in Small Samples
Abstract: Diffusion models are a prevailing generative AI approach. They use a score function to characterize a complex data distribution and its evolution toward an easy-to-sample distribution. This talk will report progress on two topics, both closely tied to the origins of the score function.
The first topic, which will take up most of the talk, concerns quantifying the generation accuracy of diffusion models. The importance of this problem has already produced a rich and substantial literature; however, most existing theoretical investigations assume that an epsilon-accurate score function is given by an oracle, and focus only on the inference process of the diffusion model. I will instead describe a first quantitative understanding of the full generative modeling protocol, including both score training (optimization) and inference (sampling). The resulting full error analysis elucidates (again, but this time theoretically) how to design the training and inference processes for effective generation.
The second topic is no longer about generative modeling, but about sampling. The goal is to leverage the fact that diffusion models handle multimodal distributions well, and to extrapolate this to the holy-grail problem of efficiently sampling from a multimodal density. There, one must rethink how to obtain the score function, since data samples are no longer available; one instead has an unnormalized density. A new sampler that is insensitive to metastability, comes with a performance guarantee, and does not even require a continuous density will be presented.
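As a minimal illustration of the role the score function plays (this sketch is not from the talk, and the parameters and target distribution are hypothetical): when the score s(x) = ∇ log p(x) of a density is known, unadjusted Langevin dynamics can drive samples from an easy starting distribution toward p. For a 1-D Gaussian N(mu, sigma^2) the score is analytic, which makes the idea easy to demonstrate:

```python
import numpy as np

MU, SIGMA = 3.0, 1.0  # hypothetical target: N(3, 1)

def score(x):
    """Score of the target Gaussian: d/dx log p(x) = -(x - mu) / sigma^2."""
    return -(x - MU) / SIGMA**2

def langevin_sample(n_samples=2000, n_steps=500, step=0.01, seed=0):
    """Unadjusted Langevin algorithm:
    x <- x + step * score(x) + sqrt(2 * step) * noise,
    starting from an easy distribution (standard normal)."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_samples)
    for _ in range(n_steps):
        x = x + step * score(x) + np.sqrt(2 * step) * rng.standard_normal(n_samples)
    return x

samples = langevin_sample()
print(samples.mean(), samples.std())  # both should be close to mu = 3 and sigma = 1
```

For multimodal targets the analytic score is rarely available, and plain Langevin dynamics suffers from the metastability the abstract mentions; the talk's samplers address precisely that gap.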
4176 Campus Drive - William E. Kirwan Hall
College Park, MD 20742-4015
P: 301.405.5047 | F: 301.314.0827