Abstract: Rare and extreme events such as hurricanes, energy grid blackouts, dam breaks, earthquakes, and pandemics are infrequent but have severe consequences. Because estimates of the probability of such events can inform strategies that mitigate their effects, scientists must develop methods to study the tail of their distribution. However, estimating small probabilities is difficult, particularly when complex dynamics and high-dimensional random variables are involved. In this talk, I will discuss our proposed method for the accurate estimation of rare event or failure probabilities for expensive-to-evaluate numerical models in high dimensions, and its application to rare event control. The proposed approach combines ideas from large deviation theory and adaptive importance sampling. The importance sampler uses a cross-entropy method to find an optimal Gaussian biasing distribution, and reuses all samples generated throughout the process both for estimating the target probability and for updating the biasing distributions. Large deviation theory is used to find a good initial biasing distribution by solving an optimization problem, and additionally to identify a low-dimensional subspace that is most informative of the rare event probability. We compare the method with a state-of-the-art cross-entropy-based importance sampling scheme on examples including a tsunami problem.
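To illustrate the cross-entropy importance sampling idea mentioned in the abstract, here is a minimal one-dimensional sketch (not the speaker's method, which adds large-deviation initialization, sample reuse, and subspace identification). It estimates the rare-event probability P(X >= 4) for a standard normal X by adaptively shifting the mean of a Gaussian biasing density toward the rare set; for simplicity only the mean is updated, and the variance is held at 1.

```python
import numpy as np

rng = np.random.default_rng(0)

def ce_importance_sampling(threshold=4.0, n=2000, rho=0.1, iters=20):
    """Toy cross-entropy adaptive importance sampling in 1-D.

    Estimates p = P(X >= threshold) for X ~ N(0, 1) by moving the mean
    of a Gaussian biasing density toward the rare set.
    """
    mu = 0.0  # biasing mean (variance kept at 1 for simplicity)
    for _ in range(iters):
        x = rng.normal(mu, 1.0, n)
        # Intermediate level: best rho-fraction of samples, capped at the target
        level = min(threshold, np.quantile(x, 1 - rho))
        mu = x[x >= level].mean()  # cross-entropy update of the biasing mean
        if level >= threshold:
            break
    # Final unbiased estimate with likelihood-ratio weights phi(x) / phi_mu(x)
    x = rng.normal(mu, 1.0, n)
    w = np.exp(-0.5 * x**2 + 0.5 * (x - mu) ** 2)
    return np.mean((x >= threshold) * w)

p_hat = ce_importance_sampling()  # exact value: 1 - Phi(4), about 3.17e-5
```

A plain Monte Carlo estimate with the same budget would typically see no samples at all in the failure region, which is why the biasing step is essential.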
Abstract: This talk shows the use of parameter continuation techniques to characterize intermediate-term dynamics due to the presence of small Brownian noise near normally-hyperbolic, transversally stable periodic orbits and quasiperiodic invariant tori found in the deterministic limit. The proposed formulation relies on adjoint boundary-value problems for constructing continuous families of transversal hyperplanes that are invariant under the linearized deterministic flow, and covariance boundary-value problems for describing Gaussian distributions of intersections of stochastic trajectories with these hyperplanes. Analytical and numerical results, including validation with the help of the continuation package COCO, show excellent agreement with stochastic time integration for problems with either autonomous or time-periodic drift terms.
Abstract: Recent pandemics have highlighted critical factors that need to be addressed in the modern study of epidemic dynamics, such as the role of human behavior, economics, biosurveillance, and information sharing. Epidemics are complex adaptive systems: individuals make behavioral decisions in response to epidemic dynamics, and those decisions in turn reshape the progression of the contagion. In this talk, I will show that incorporating adaptive human behavior into epidemiological models can lead to unexpected results. Specifically, I will discuss scenarios in which individuals’ risk perceptions can increase or decrease the final epidemic size, producing a hysteresis-like effect.
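A toy illustration of the behavior-epidemic feedback described above (a hypothetical model for intuition only, not the speaker's): an SIR system in which the transmission rate drops as perceived risk, proxied here by current prevalence, rises. Even this crude coupling changes the final epidemic size.

```python
import numpy as np

def final_epidemic_size(k, beta0=0.4, gamma=0.1, days=500, dt=0.1):
    """Toy SIR model with prevalence-dependent behavior.

    The effective transmission rate beta0 * exp(-k * I) decreases as
    perceived risk (current prevalence I) rises; k = 0 recovers the
    classical SIR model. Integrated with a simple Euler scheme.
    """
    S, I, R = 0.999, 0.001, 0.0
    for _ in range(int(days / dt)):
        beta = beta0 * np.exp(-k * I)  # risk-driven contact reduction
        dS = -beta * S * I
        dI = beta * S * I - gamma * I
        S, I, R = S + dt * dS, I + dt * dI, R + dt * gamma * I
    return R  # fraction ever infected by the end of the horizon

size_no_behavior = final_epidemic_size(k=0.0)    # classical SIR, R0 = 4
size_with_behavior = final_epidemic_size(k=20.0)  # strong risk response
```

In this sketch the behavioral response only shrinks the final size; the hysteresis-like effects in the talk arise from richer decision dynamics than this one-parameter feedback.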
Abstract: In recent work I developed the Knight model rank of a countable structure M, inspired by a construction by Julia Knight called "Knight's model". The Knight model rank identifies when the structure's automorphism group Aut(M) involves S_\infty and thus has "maximal classification strength" in the sense of invariant descriptive set theory. We develop this rank further to show that given a bound on the Knight model rank of M, we can bound the cardinality of any potential Scott sentence for a model extending M. The talk will include a brief review of Scott sentences and their potential versions, as well as some historical context behind Knight's original construction that significantly motivated this work.
Abstract: Quantum Singular Value Transformation (QSVT) is one of the most important developments in quantum algorithms in the past decade. At the heart of QSVT is an innovative polynomial representation called Quantum Signal Processing (QSP), which can encode a target polynomial of definite parity using the product of a sequence of parameterized SU(2) matrices. Given a target polynomial, the corresponding parameters are called phase factors. In the past few years, there has been significant progress in designing and analyzing algorithms for finding phase factors, which can be viewed as a highly nonlinear optimization problem. In this talk, we argue that nonlinear Fourier analysis (NLFA) provides a natural framework for understanding QSP, as first observed by Thiele et al. Based on NLFA, we develop a Riemann--Hilbert--Weiss (RHW) algorithm to evaluate phase factors. To the best of our knowledge, this is the first provably numerically stable algorithm for almost all functions that admit a QSP representation. We will also discuss the impact of QSP on NLFA, which may lead to surprising progress in algorithms for inverse nonlinear Fourier transformations.
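The QSP product structure mentioned in the abstract can be written down in a few lines. The sketch below (one common convention; others differ by reflections) builds U(x) as alternating signal rotations W(x) and phase rotations e^{i phi Z}, and checks the known fact that all-zero phase factors encode the Chebyshev polynomial T_d(x) in the top-left matrix entry. It illustrates only the encoding, not the RHW algorithm for finding phase factors.

```python
import numpy as np

def qsp_unitary(phases, x):
    """Build the QSP product U(x) = e^{i phi_0 Z} * prod_j [W(x) e^{i phi_j Z}].

    W(x) = e^{i arccos(x) X} is the signal operator; the top-left entry
    of U encodes a degree-d polynomial in x, d = len(phases) - 1.
    """
    s = np.sqrt(1.0 - x * x)
    W = np.array([[x, 1j * s], [1j * s, x]])  # SU(2) signal rotation

    def Z(phi):  # signal-processing rotation e^{i phi Z}
        return np.array([[np.exp(1j * phi), 0], [0, np.exp(-1j * phi)]])

    U = Z(phases[0])
    for phi in phases[1:]:
        U = U @ W @ Z(phi)
    return U

# Sanity check: with all-zero phases, U(x) = W(x)^d, whose top-left entry
# is cos(d * arccos(x)) = T_d(x), the degree-d Chebyshev polynomial.
d, x = 5, 0.3
poly_val = qsp_unitary(np.zeros(d + 1), x)[0, 0].real
cheb_val = np.cos(d * np.arccos(x))  # T_5(0.3)
```

Finding the phases that encode a *given* target polynomial inverts this map, which is the highly nonlinear optimization problem the talk addresses via nonlinear Fourier analysis.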
Abstract: Several different invariants can be attached to an isolated hypersurface singularity using motivic integration. In the Euler characteristic limit, these invariants are all related in a straightforward way, but the relationships between the motivic versions are more difficult to understand. We will discuss the case of plane curves in detail, including connections to knot Floer homology, Hilbert schemes, and the Igusa zeta function. Based on joint work with Oblomkov and Wyss.
Abstract: Many two-dimensional random growth models, including first-passage and last-passage percolation, are conjectured to fall within the Kardar–Parisi–Zhang (KPZ) universality class under mild assumptions on the underlying noise. In recent years, researchers have focused on a subset of exactly solvable models, where these conjectures can be rigorously verified. This talk discusses a specific line of research that focuses on advancing the understanding of both exactly solvable and non-solvable KPZ models using general probabilistic methods, such as percolation and coupling.
Abstract: The spine graph construction is an explicit geometric construction that can be used to show moduli spaces of hyperbolic surfaces with boundary are homeomorphic to certain moduli spaces of ‘metric ribbon graphs’. In this talk, I will explain why these homeomorphisms roughly preserve the geometry of these moduli spaces, in particular with respect to the Weil-Petersson and Kontsevich volume forms. Furthermore, I will explain how this result can be used to obtain a convergence-in-mean result on critical exponent random variables.
Abstract: State-of-the-art machine learning algorithms implicitly learn useful structure from raw, unstructured data. How this is possible is somewhat mysterious, given that most unstructured problems are ill-posed and lack the typical identifying restrictions necessary to consistently recover low-dimensional structure. Moreover, even when recovery is possible, standard nonparametric theory demands astronomical sample sizes due to the curse of dimensionality. So how is it that these algorithms are capable of learning at all?
To shed light on this intriguing question, we will discuss our recent progress towards understanding structure learning in nonparametric models. We will show how classical algorithms can provably recover latent graphical structure in general families under smoothness (i.e. H\"older) conditions, and then extend this to even broader classes of models. Along the way, we will also discuss how it is possible to circumvent the curse of dimensionality in structured models, even if this structure may not be known in advance. As a special case, our results provide a complete resolution to the problem of nonparametric estimation of high-dimensional graphical models.