Special Lecture Archives for Fall 2018 to Spring 2019


Neuromorphic Artificial Intelligence: From Mathematical Foundations of Deep Learning to 'Cortex-on-a-Chip'

When: Fri, September 21, 2018 - 11:00am
Where: 1146 A.V. Williams Building
Speaker: Distinguished University Professor John S. Baras (ECE/ISR) - http://www.isr.umd.edu/faculty/baras
Abstract: Deep Learning and Artificial Intelligence have attracted enormous attention recently. The race to design and manufacture “brain-like” computers is on and several companies have produced various such chips. Yet, the current state of affairs is very unsatisfactory and ad hoc. We describe a mathematical framework we have developed that provides a hierarchical architecture for learning and cognition. The architecture combines a wavelet preprocessor, a group invariant feature extractor and a hierarchical (layered) learning algorithm. There are two feedback loops, one back from the learning output to the feature extractor and one all the way back to the wavelet preprocessor. We show that the scheme can incorporate not only typical metric distances but also non-metric dissimilarity measures such as Bregman divergences. The learning module incorporates two universal learning algorithms in their hierarchical tree-structured form, both due to Kohonen: Learning Vector Quantization (LVQ) for supervised learning and the Self-Organizing Map (SOM) for unsupervised learning. We demonstrate the superior performance of the resulting algorithms and architecture on a variety of practical problems, including speaker and sound identification; simultaneous determination of sound direction of arrival, speaker, and vowel ID; and face recognition. We demonstrate how the underlying mathematics can be used to provide systematic models for design, analysis and evaluation of deep neural networks. We describe current work and plans on mixed signal (digital and analog) micro-electronic implementations that mimic architectural abstractions of the cortex of higher-level animals and humans, for sound and vision perception and cognition. The resulting architecture is non-von Neumann (i.e., computing and memory are not separated in hardware) and neuromorphic. We call the resulting chip class “Cortex-on-a-Chip.”
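
As a rough orientation to the two Kohonen algorithms named in the abstract, here is a minimal numpy sketch of the flat (non-hierarchical) SOM and LVQ1 update rules on toy data; the tree-structured variants, the wavelet preprocessor, the group-invariant features, and the feedback loops described in the talk are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def som_train(X, n_units=10, epochs=20, lr=0.5, sigma=2.0):
    """Kohonen Self-Organizing Map on a 1-D lattice of prototypes (unsupervised)."""
    W = X[rng.choice(len(X), n_units, replace=False)].copy()
    grid = np.arange(n_units)
    for t in range(epochs):
        a = lr * (1 - t / epochs)              # decaying learning rate
        s = sigma * (1 - t / epochs) + 1e-3    # decaying neighborhood width
        for x in rng.permutation(X):
            bmu = np.argmin(np.linalg.norm(W - x, axis=1))  # best-matching unit
            h = np.exp(-(grid - bmu) ** 2 / (2 * s ** 2))   # lattice neighborhood
            W += a * h[:, None] * (x - W)                   # pull prototypes toward x
    return W

def lvq1_train(X, y, prototypes, labels, epochs=20, lr=0.05):
    """Kohonen LVQ1: move the nearest prototype toward x if labels agree, away otherwise."""
    W = prototypes.copy()
    for t in range(epochs):
        a = lr * (1 - t / epochs)
        for x, yi in zip(X, y):
            k = np.argmin(np.linalg.norm(W - x, axis=1))
            W[k] += a * (x - W[k]) if labels[k] == yi else -a * (x - W[k])
    return W

# toy two-class data
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
codebook = som_train(X)                                    # unsupervised prototypes
proto_labels = (np.linalg.norm(codebook - 4, axis=1) <
                np.linalg.norm(codebook, axis=1)).astype(int)
codebook = lvq1_train(X, y, codebook, proto_labels)        # supervised fine-tuning
```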

(Special Applied Math Colloquium, Special time): Atomistic Simulation of Crystalline Defects [A Numerical Analysis Perspective]

When: Tue, October 9, 2018 - 11:00am
Where: Kirwan Hall 3206
Speaker: Christoph Ortner (University of Warwick) - http://homepages.warwick.ac.uk/staff/C.Ortner/
Abstract: A common problem of atomistic materials modelling is to determine properties of crystalline defects, such as structure, energetics, and mobility, from which meso-scopic material properties or coarse-grained models can be derived (e.g., Kinetic Monte-Carlo, Discrete Dislocation Dynamics, Griffith-type fracture laws). In this talk I will focus on one of the most basic tasks, computing the equilibrium configuration of a crystalline defect, but will also comment on free energy and transition rate computations. A wide range of numerical strategies, including the classical supercell method (periodic boundary conditions) or flexible boundary conditions (discrete BEM), but also more recent developments such as atomistic/continuum and QM/MM hybrid schemes, can be interpreted as Galerkin discretisations with variational crimes for an infinite-dimensional nonlinear variational problem. This point of view is effective in studying the structure of exact solutions, identifying approximation parameters, deriving rigorous error bounds, and optimising and constructing novel schemes with superior error/cost ratio. Time permitting, I will also discuss how this framework can be used to analyse model errors in interatomic potentials and how this can feed back into the development of new interatomic potentials by machine learning techniques.
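
To make the supercell setting concrete, here is a toy Python sketch (assuming a 2-D square lattice and a Lennard-Jones pair potential, far simpler than the systems in the talk) that relaxes a vacancy defect under periodic boundary conditions with scipy's optimizer.

```python
import numpy as np
from scipy.optimize import minimize

# Toy supercell calculation: relax a vacancy in a 2-D Lennard-Jones crystal
# under periodic boundary conditions (a stand-in for the methods in the talk).
N, a0 = 8, 1.12                 # 8x8 square lattice, near-equilibrium LJ spacing
L = N * a0                      # supercell side length
X = np.array([[i * a0, j * a0] for i in range(N) for j in range(N)])
X = np.delete(X, N * N // 2 + N // 2, axis=0)   # remove one atom -> vacancy defect

def energy(flat):
    pos = flat.reshape(-1, 2)
    d = pos[:, None, :] - pos[None, :, :]
    d -= L * np.round(d / L)                    # minimum-image convention (PBC)
    r2 = (d ** 2).sum(-1)
    iu = np.triu_indices(len(pos), k=1)         # each pair counted once
    r6 = (1.0 / r2[iu]) ** 3
    return np.sum(4.0 * (r6 ** 2 - r6))         # Lennard-Jones pair energy

res = minimize(energy, X.ravel(), method="L-BFGS-B")
print("relaxed defect-cell energy:", res.fun)
```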

Discovering Governing Laws of Interaction in Heterogeneous Agents Dynamics from Observation

When: Tue, October 9, 2018 - 4:00pm
Where: Kirwan Hall 1310
Speaker: Ming Zhong (Johns Hopkins University) -
Abstract: Inferring the laws of interaction of particles and agents in complex dynamical systems from observational data is a fundamental challenge in a wide variety of disciplines. We start from data consisting of trajectories of interacting agents, which is in many cases abundant, and propose a non-parametric statistical learning approach to extract the governing laws of interaction. We demonstrate the effectiveness of our learning approach both by providing theoretical guarantees and by testing the approach on a variety of prototypical systems in various disciplines, with homogeneous and heterogeneous agent systems, ranging from fundamental physical interactions between particles to systems-level interactions such as social influence on people's opinions, prey-predator dynamics, flocking and swarming, and cell dynamics.
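
To fix ideas, here is a minimal numpy sketch of the general recipe for a first-order homogeneous system (a simplification of the talk's setting, with an assumed radial interaction law and a piecewise-constant basis): simulate trajectories, then recover the interaction kernel by least squares.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy model:  dx_i/dt = (1/N) * sum_j phi(|x_j - x_i|) * (x_j - x_i)
phi_true = lambda r: np.exp(-r)          # "unknown" interaction law to recover
N, d, steps, dt = 20, 2, 200, 0.01

def rhs(X, phi):
    D = X[:, None, :] - X[None, :, :]            # D[i, j] = x_i - x_j
    R = np.linalg.norm(D, axis=-1) + 1e-12
    return (phi(R)[..., None] * (-D)).sum(axis=1) / N

# simulate trajectories (the "observations")
X = rng.normal(size=(N, d))
traj, vel = [], []
for _ in range(steps):
    V = rhs(X, phi_true)
    traj.append(X.copy()); vel.append(V.copy())
    X = X + dt * V

# non-parametric estimate: phi piecewise constant on distance bins, fit by least squares
bins = np.linspace(0, 4, 21)
A, b = [], []
for X, V in zip(traj, vel):
    D = X[:, None, :] - X[None, :, :]
    R = np.linalg.norm(D, axis=-1)
    idx = np.clip(np.digitize(R, bins) - 1, 0, len(bins) - 2)
    for i in range(N):
        row = np.zeros((len(bins) - 1, d))
        for j in range(N):
            if i != j:
                row[idx[i, j]] += (X[j] - X[i]) / N
        A.append(row.T); b.append(V[i])
A = np.vstack(A); b = np.concatenate(b)
phi_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
print("estimated phi per distance bin:", phi_hat)
```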

Women in Math: Jen Rezeppa

When: Thu, November 1, 2018 - 3:30pm
Where: Kirwan Hall 3206
Speaker: Jennifer Rezeppa (University of MD (2000)) -
Abstract: Jen Rezeppa has 10+ years of experience in Silicon Valley working for and leading teams at top companies including Apple, GoPro during the IPO, and presently at Tesla. She earned a BS in Mathematics and over the course of her career, her UMCP degree has led to opportunities in teaching, reliability engineering, and most recently, demand planning and channel planning. Let’s learn about Jen’s career progression and her thoughts on leadership for students and new graduates as they explore how a math degree can lay the foundation for their professional goals.

Climate Change and the Wind-driven Ocean Circulation

When: Fri, December 7, 2018 - 2:00pm
Where: ATL 4301
Speaker: Professor Michael Ghil (UCLA and Ecole Normale Supérieure) - https://dept.atmos.ucla.edu/tcd
Abstract: https://docs.google.com/document/d/1ziUed_5e_SelGmPrwZc85xhCdyTmG95eDzR-A60Dxbo/edit?usp=sharing

Visualization Through Mathematica Final Presentations

When: Sat, December 15, 2018 - 1:30pm
Where: Kirwan Hall 3206
Speaker: Ajeet Gary / Math299m -
Abstract: The talented students of Math299m - "Visualization Through Mathematica" will be presenting their final projects where they use Mathematica to model and investigate a diverse array of topics, including fractional derivatives, gravitation, seismology, machine learning, musical harmonies, financial models and more.

Control of multi-agent systems and mean-field limits

When: Tue, February 19, 2019 - 12:00pm
Where: Kirwan Hall 3206
Speaker: Benedetto Piccoli (Rutgers University–Camden) -
Abstract: Control of multi-agent systems has applications in many different domains (including traffic, biology and others) and can be addressed at many different scales. At the microscopic scale, sparsity is a desired property for applicability, while passing to mean-field limits poses mathematical challenges, as controls may become singular.
We first show some recent results at the microscopic scale and on rigorously passing to the limit. Then we introduce a new concept of differential equation for measures, which appears to be a promising framework to deal with these problems, and, finally, show applications to traffic.
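
As a small illustration of sparsity at the microscopic scale, the following numpy sketch controls a first-order consensus system by actuating only a few agents at each time; the model and feedback law are illustrative assumptions, not the constructions of the talk.

```python
import numpy as np

# Toy sparse control at the microscopic scale: a first-order consensus system
# in which only the m agents farthest from the target are actuated at each step
# (an illustrative sparse feedback, not the talk's control law).
rng = np.random.default_rng(2)
N, m, dt, target = 50, 3, 0.05, 2.0
x = rng.normal(0.0, 1.0, N)

for _ in range(1500):
    drift = x.mean() - x                       # uncontrolled attraction to the mean
    u = np.zeros(N)
    worst = np.argsort(-np.abs(x - target))[:m]
    u[worst] = np.clip(target - x[worst], -1.0, 1.0)   # bounded control on m agents
    x = x + dt * (drift + u)

print("spread:", x.std(), " mean distance to target:", np.abs(x - target).mean())
```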

What is Data Science?

When: Tue, March 12, 2019 - 4:00pm
Where: Edward St. John Learning and Teaching Center, Room 0202, 4131 Campus Dr, College Park, MD 20742, USA
Speaker: Professor David Donoho (Stanford University) -

No equations, no variables, no parameters, no space, no time: Data and the modeling of complex systems

When: Tue, March 12, 2019 - 5:30pm
Where: Edward St. John Learning and Teaching Center, Room 0202, 4131 Campus Dr, College Park, MD 20742, USA
Speaker: Professor Yannis G. Kevrekidis - https://datascienceday.math.umd.edu/home/keynote-address

Abstract: Obtaining predictive dynamical equations from data lies at the heart of science and engineering modeling, and is the linchpin of our technology. In mathematical modeling one typically progresses from observations of the world (and some serious thinking!) first to equations for a model, and then to the analysis of the model to make predictions.

Good mathematical models give good predictions (and inaccurate ones do not) - but the computational tools for analyzing them are the same: algorithms that are typically based on closed form equations.

While the skeleton of the process remains the same, today we witness the development of mathematical techniques that operate directly on observations -data-, and appear to circumvent the serious thinking that goes into selecting variables and parameters and deriving accurate equations. The process then may appear to the user a little like making predictions by "looking in a crystal ball". Yet the "serious thinking" is still there and uses the same -and some new- mathematics: it goes into building algorithms that jump directly from data to the analysis of the model (which is now not available in closed form) so as to make predictions. Our work here presents a couple of efforts that illustrate this "new" path from data to predictions. It really is the same old path, but it is travelled by new means.
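
As one concrete (and much simplified) instance of going straight from data to predictions, the sketch below fits a black-box surrogate for an unknown vector field from observed trajectory data and then integrates the surrogate to predict; the hidden system, the regressor, and all parameters are assumptions for illustration, and the talk's actual methodology is considerably richer.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# "No equations given" workflow: observe trajectories of an unknown system,
# fit a surrogate vector field f_hat(x) ~ dx/dt from the data, then integrate
# f_hat to make predictions (a toy stand-in for the methods of the talk).
rng = np.random.default_rng(3)

def true_rhs(x):                      # hidden from the "modeler": Van der Pol oscillator
    return np.array([x[1], (1 - x[0] ** 2) * x[1] - x[0]])

dt = 0.01
X, dX = [], []
for _ in range(20):                   # several short trajectories serve as "data"
    x = rng.uniform(-2, 2, 2)
    for _ in range(200):
        v = true_rhs(x)
        X.append(x.copy()); dX.append(v)
        x = x + dt * v

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
model.fit(np.array(X), np.array(dX))  # learn x -> dx/dt directly from observations

# predict: integrate the learned surrogate from a new initial condition
x = np.array([1.0, 0.0])
for _ in range(500):
    x = x + dt * model.predict(x.reshape(1, -1))[0]
print("surrogate prediction after 5 time units:", x)
```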

Hamiltonian density of states from free probability theory: Anderson model, Floquet systems, and Quantum Spin Chains

When: Tue, April 2, 2019 - 2:00pm
Where: Toll Physics Building, room 2205
Speaker: Ramis Movassagh (IBM) - https://researcher.watson.ibm.com/researcher/view.php?person=us-ramis
Abstract: Suppose the eigenvalue distributions of two matrices M_1 and M_2 are known. What is the eigenvalue distribution of the sum M_1+M_2? This problem has a rich pure mathematics history dating back to H. Weyl (1912) with many applications in various fields. Free probability theory (FPT) answers this question under certain conditions, which often involve some degree of randomness (disorder). We will describe FPT and show examples of its powers for approximating physical quantities such as the density of states of the Anderson model, quantum spin chains, and gapped vs. gapless phases of some Floquet systems. These physical quantities are often hard to compute exactly. Nevertheless, using FPT and other ideas from random matrix theory, excellent approximations can be obtained. Besides the applications presented, we believe the techniques will find new applications in fresh contexts.
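
A minimal numerical illustration of the flavor of such approximations (not the talk's actual derivations): for a 1-D Anderson Hamiltonian H = K + V (hopping plus diagonal disorder), the spectrum of K + U V U* with U a Haar-random unitary approximates the free convolution of the two spectra, since conjugating by a random unitary puts the two terms in generic ("free") position.

```python
import numpy as np

# Compare the density of states (DOS) of a 1-D Anderson model H = K + V with the
# "free" approximation obtained by rotating the disorder into generic position.
rng = np.random.default_rng(4)
n, W = 400, 2.0

K = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)   # nearest-neighbor hopping
V = np.diag(rng.uniform(-W, W, n))                              # diagonal disorder

exact = np.linalg.eigvalsh(K + V)

# Haar-random unitary via QR of a complex Ginibre matrix (phases fixed for Haar measure)
Z = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2)
Q, R = np.linalg.qr(Z)
Q = Q * (np.diag(R) / np.abs(np.diag(R)))
free = np.linalg.eigvalsh(K + Q @ V @ Q.conj().T)               # free-position spectrum

hist_exact, edges = np.histogram(exact, bins=40, density=True)
hist_free, _ = np.histogram(free, bins=edges, density=True)
print("max pointwise DOS difference:", np.abs(hist_exact - hist_free).max())
```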

Supercritical Entanglement: counter-examples to the area law for quantum matter

When: Wed, April 3, 2019 - 11:00am
Where: Physical Sciences Complex, room 3150
Speaker: Ramis Movassagh (IBM) - https://researcher.watson.ibm.com/researcher/view.php?person=us-ramis
Abstract: We discuss basic notions of quantum entanglement relevant for the study of quantum matter. Entanglement is simultaneously a blessing for quantum computation and a curse for classical simulation of matter. In recent years, there has been a surge of activities in proposing exactly solvable quantum spin chains with the surprisingly high amount of ground state entanglement entropies--beyond what one expects from critical systems describable by conformal field theories (i.e., super-logarithmic violations of the area law). We will introduce entanglement and discuss these models. We prove that the ground state entanglement entropy is \sqrt(n) and in some cases even extensive (i.e., ~n) despite the underlying Hamiltonian being: 1. Local 2. Having a unique ground state and 3. Being translationally invariant in the bulk. These models have rich connections with combinatorics, random walks, and universality of Brownian excursions. Lastly, we develop techniques that enable proving the gap of these models. As a consequence, the gap scaling of 1/n^c with c>1 that we prove rules out the possibility of these models having a relativistic conformal field theory description. Time permitting we will discuss more recent developments in this direction.
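
For readers unfamiliar with the quantity being bounded, here is a small numpy sketch that computes the half-chain entanglement entropy of a spin-chain ground state by exact diagonalization and a Schmidt (SVD) decomposition; the chain used (a critical transverse-field Ising model) is a standard toy example, not one of the talk's models.

```python
import numpy as np

# Entanglement entropy of a half-chain cut: diagonalize a small transverse-field
# Ising Hamiltonian, reshape the ground state into a matrix over the two halves,
# and take the entropy of the squared singular values (Schmidt weights).
n, g = 10, 1.0                        # 10 spins, critical transverse field
sx = np.array([[0, 1], [1, 0]]); sz = np.array([[1, 0], [0, -1]]); I2 = np.eye(2)

def op(site_ops):
    """Tensor product of one single-site operator per site."""
    M = np.array([[1.0]])
    for o in site_ops:
        M = np.kron(M, o)
    return M

H = np.zeros((2 ** n, 2 ** n))
for i in range(n - 1):                # -sum_i Z_i Z_{i+1} (open chain)
    ops = [I2] * n; ops[i] = sz; ops[i + 1] = sz
    H -= op(ops)
for i in range(n):                    # -g * sum_i X_i
    ops = [I2] * n; ops[i] = sx
    H -= g * op(ops)

w, v = np.linalg.eigh(H)
psi = v[:, 0]                                        # ground state
A = psi.reshape(2 ** (n // 2), 2 ** (n - n // 2))    # bipartition into two halves
s = np.linalg.svd(A, compute_uv=False)
p = s ** 2
S = -np.sum(p * np.log(p + 1e-30))                   # von Neumann entropy (nats)
print("half-chain entanglement entropy:", S)
```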

Quantifying Gerrymandering: A Mathematician Goes to Court

When: Thu, April 11, 2019 - 4:00pm
Where: John S. Toll Physics Building, Room 1412, 4150 Campus Dr, College Park, MD 20740, USA

Speaker: Jonathan Christopher Mattingly
Abstract: In October 2017, I found myself testifying for hours in a Federal court. I had not been arrested. Rather, I was attempting to quantify gerrymandering using mathematical analysis. I was intrigued by the surprising results of the 2012 election and wondered whether these results were really surprising. The analysis hinged on probing the geopolitical structure of North Carolina using a Markov Chain Monte Carlo algorithm. In this talk, I will describe the mathematical ideas involved in our analysis. The talk will be accessible and, hopefully, interesting to all, including undergraduates. In fact, this project began as a sequence of undergraduate research projects, which undergraduates continue to be involved with to this day.
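
A toy sketch of the kind of Markov chain exploration of districting plans mentioned above (illustrative only, not the court analysis): random boundary flips on a small grid of precincts, rejecting moves that break district contiguity or population balance, with the seat count tallied across the sampled ensemble.

```python
import numpy as np
from collections import deque

# Toy Markov chain over districting plans on a 6x6 grid of precincts with 3 districts.
rng = np.random.default_rng(5)
n, k = 6, 3
district = np.repeat(np.arange(k), n * n // k).reshape(n, n)   # initial striped plan
votes_A = rng.uniform(0.3, 0.7, (n, n))                        # fixed precinct vote shares

def neighbors(i, j):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        if 0 <= i + di < n and 0 <= j + dj < n:
            yield i + di, j + dj

def contiguous(d, label):
    """Breadth-first search check that all cells with this label form one piece."""
    cells = list(zip(*np.where(d == label)))
    seen, q = {cells[0]}, deque([cells[0]])
    while q:
        i, j = q.popleft()
        for nb in neighbors(i, j):
            if d[nb] == label and nb not in seen:
                seen.add(nb); q.append(nb)
    return len(seen) == len(cells)

seats = []
for _ in range(20000):
    i, j = rng.integers(n), rng.integers(n)
    cand = [d for d in (district[nb] for nb in neighbors(i, j)) if d != district[i, j]]
    if not cand:
        continue
    new = district.copy()
    new[i, j] = cand[rng.integers(len(cand))]
    sizes = np.bincount(new.ravel(), minlength=k)
    if sizes.min() < n * n // k - 3 or not contiguous(new, district[i, j]):
        continue                      # reject unbalanced or disconnected plans
    district = new                    # accept (plain random walk, no score function)
    wins = sum(votes_A[district == d].mean() > 0.5 for d in range(k))
    seats.append(wins)

print("distribution of seats won by party A:", np.bincount(seats, minlength=k + 1))
```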

In situ data processing of computational scientific data using the ADIOS framework

When: Tue, April 23, 2019 - 11:00am
Where: Mid-Atlantic Crossroads, 5825 University Research Ct, Suite 2500, College Park, MD 20740
Speaker: Scott Klasky (Oak Ridge National Laboratory) -

Abstract: The USA Exascale Computing Project (ECP) is focused on accelerating the delivery of a capable exascale computing ecosystem that delivers 50 times more computational science and data analytic application power than is possible with DOE HPC systems such as Titan (ORNL) and Sequoia (LLNL). As next-generation applications and experiments grow in concurrency and in complexity, the data produced often grows to extreme levels, limiting scientific knowledge discovery. In my presentation, I will talk about the new set of applications and experiments which push the edge of scientific data processing and simulation. I will present some of the exciting new research in this area to cope with this tsunami of data, along with the challenges of implementing these techniques effectively on next-generation computer architectures. In my presentation I will also focus on the ADIOS framework (https://www.olcf.ornl.gov/center-projects/adios/), a next-generation framework to ingest, reduce, and move data on HPC systems and over the WAN to other computational resources. I will also focus on in situ data processing infrastructure and next-generation data compression algorithms.
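
For orientation, here is a minimal sketch of the "reduce, then write and read back" step using the adios2 Python high-level bindings; the decimation step is a placeholder for real in situ reduction, and the exact call signatures vary across ADIOS2 versions, so treat this as an assumption-laden illustration rather than the project's own usage.

```python
import numpy as np
import adios2   # ADIOS2 Python bindings; high-level API assumed (signatures vary by version)

# Toy in situ-style step: reduce a large field before writing it out with ADIOS.
field = np.random.rand(1024, 1024)
reduced = field[::4, ::4].copy()      # crude decimation as a stand-in for real reduction

# write one step of the reduced field to a BP file (serial case: count == shape, offset 0)
with adios2.open("reduced.bp", "w") as fh:
    fh.write("field", reduced, list(reduced.shape), [0, 0], list(reduced.shape))

# read it back step by step
with adios2.open("reduced.bp", "r") as fh:
    for step in fh:
        data = step.read("field")
print("read back array of shape:", data.shape)
```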

STAT/JPSM Seminar: An Overview of Statistical Machine Learning Techniques with Applications

When: Tue, April 30, 2019 - 3:30pm
Where: Kirwan Hall 1313
Speaker: Dr. Amita Pal (Indian Statistical Institute) -
Abstract: Statistical Machine Learning involves an algorithmic approach, derived from statistical models, for solving certain problems that arise in the domain of Artificial Intelligence and that can be implemented on computers. Machine learning algorithms build a mathematical model of sample data, known as "training data", in order to make predictions or decisions. Depending on whether the training data is labeled or unlabeled, a variety of supervised or unsupervised Statistical Machine Learning methods are available. An overview of the most widely used ones will be provided in this talk, and applications to the problems of automatic speaker recognition (ASR) and content-based image retrieval (CBIR) will be briefly described.
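
As a pocket-sized illustration of the supervised/unsupervised split described above (using scikit-learn on synthetic data rather than the speech or image-retrieval problems of the talk):

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

# Synthetic 2-D data with 3 groups, used two ways.
X, y = make_blobs(n_samples=300, centers=3, random_state=0)

# Supervised: labels available -> train a classifier and measure accuracy.
Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(Xtr, ytr)
print("classifier accuracy:", clf.score(Xte, yte))

# Unsupervised: labels withheld -> cluster the same points.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster sizes:", np.bincount(km.labels_))
```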

Spectral methods in network analysis (continued)

When: Wed, May 1, 2019 - 2:00pm
Where: Kirwan Hall 3206
Speaker: David Bindel (Cornell University) - http://www.cs.cornell.edu/~bindel/
Abstract: Linear algebra methods play a central role in modern methods for large-scale network analysis. The same approach underlies many of these methods. First, one tells a story that associates the network with a matrix, either as the generator of a linear time-invariant dynamical process on the graph or as a quadratic form used to measure some quantity of interest. Then, one uses the eigenvalues and eigenvectors of the matrix to reason about the properties of the dynamical system or quadratic form, and from there to understand the network. We describe some of the most well-known spectral network analysis methods for tasks such as bisection and partitioning, clustering and community detection, and ranking and centrality. These methods largely depend only on a few eigenvalues and eigenvectors, but we will also describe some methods that require a more global perspective, including methods that we have developed for local spectral clustering and for graph analysis via spectral densities.
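
As a small concrete instance of the recipe above (graph, then matrix, then eigenvectors, then structure), the following numpy sketch performs spectral bisection of a random graph with two planted communities using the Fiedler vector of the graph Laplacian; it illustrates only the simplest of the methods surveyed.

```python
import numpy as np

# Spectral bisection: build the graph Laplacian L = D - A, take the eigenvector
# for the second-smallest eigenvalue (the Fiedler vector), split vertices by sign.
rng = np.random.default_rng(6)
n = 100                                   # 50 vertices per planted community
p_in, p_out = 0.2, 0.02                   # edge probabilities within / across communities

A = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        p = p_in if (i < n // 2) == (j < n // 2) else p_out
        A[i, j] = A[j, i] = rng.random() < p

L = np.diag(A.sum(axis=1)) - A            # combinatorial graph Laplacian
w, v = np.linalg.eigh(L)
fiedler = v[:, 1]                         # eigenvector of the second-smallest eigenvalue
part = fiedler > 0

truth = np.arange(n) < n // 2
agree = max((part == truth).mean(), (part != truth).mean())
print("fraction of vertices correctly recovered:", agree)
```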