Norbert Wiener Center Archives for Academic Year 2018


Organizational meeting

When: Tue, August 29, 2017 - 2:00pm
Where: Kirwan Hall 1311


Super-resolution without minimum separation assumptions

When: Tue, September 12, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Weilin Li (UMD) -


Distributed Noise Shaping of Signal Quantization

When: Tue, September 19, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Kung-Ching Lin (UMD) -
Abstract: Quantization theory is an integral part of signal processing, focusing on the discrete nature of data storage in electronic devices and its effects. It stems from the need to manage the errors arising from the numerous physical constraints imposed during both sampling and reconstruction. This talk will give a quick overview of the development of this theory and discuss a specific quantization scheme: distributed noise shaping. This scheme is robust against the usual physical constraints and has a near-optimal error decay rate, making it a favorable choice.
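
To make the noise-shaping idea concrete, here is a minimal sketch of the classical first-order Sigma-Delta scheme, of which distributed noise shaping is a generalization; the signal, alphabet, and averaging filter below are illustrative choices, not parameters from the talk.

    # First-order Sigma-Delta quantization: the basic noise-shaping recursion
    #   u_n = u_{n-1} + y_n - q_n
    # keeps the state u bounded, so the quantization error telescopes away
    # under low-pass reconstruction.
    import numpy as np

    def sigma_delta(y, levels):
        """Quantize samples y to the given alphabet with first-order noise shaping."""
        u, q = 0.0, np.empty_like(y)
        for n, yn in enumerate(y):
            q[n] = levels[np.argmin(np.abs(levels - (u + yn)))]  # greedy rounding
            u = u + yn - q[n]                                    # bounded state
        return q

    t = np.arange(512) / 512
    y = 0.5 * np.sin(2 * np.pi * 3 * t)            # oversampled toy signal
    q = sigma_delta(y, np.linspace(-1, 1, 5))      # 5-level alphabet

    # Low-pass reconstruction: the averaged error shrinks as the filter length grows
    kernel = np.ones(32) / 32
    print(np.max(np.abs(np.convolve(y - q, kernel, mode="valid"))))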

Frames -- two case studies: ambiguity and uncertainty

When: Tue, September 26, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: John Benedetto (UMD) -


Measures with locally finite support and spectrum

When: Tue, October 3, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Chenzhi Zhao (UMD) -


Multiscale analysis of data and functions in high dimension

When: Tue, October 17, 2017 - 2:00pm
Where: Kirwan Hall 1308
Speaker: Stefano Vigogna (JHU) -


Topological Integral Transforms with Applications to Sensing

When: Tue, October 24, 2017 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Robert Ghrist (UPenn) -
Abstract: This talk will outline a topological approach to constructing novel integral transforms. These transforms rest on a blend of combinatorial and homological methods, fused into an integration theory with respect to Euler characteristic. Using this as a starting point, several integral transforms will be described, with applications given to problems of data aggregation over networks of sensors.
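
As a toy, hedged illustration of integration with respect to Euler characteristic in the sensing setting (the grid, the rectangular target supports, and the function names below are hypothetical, not from the talk): if each point of the domain records how many targets cover it, and every target support is compact and convex, then the Euler integral of that count field returns the number of targets, regardless of overlaps.

    # Euler-calculus target count on a pixel grid: for a nonnegative integer field h,
    #   integral of h d(chi) = sum_{s >= 1} chi({h >= s}),
    # and for h = sum of indicators of compact convex supports this equals the
    # number of targets, since each convex support has Euler characteristic 1.
    import numpy as np

    def euler_char(mask):
        """Euler characteristic V - E + F of a union of closed unit pixels."""
        verts, edges, faces = set(), set(), 0
        for i, j in zip(*np.nonzero(mask)):
            faces += 1
            verts |= {(i, j), (i + 1, j), (i, j + 1), (i + 1, j + 1)}
            edges |= {((i, j), (i + 1, j)), ((i, j), (i, j + 1)),
                      ((i + 1, j), (i + 1, j + 1)), ((i, j + 1), (i + 1, j + 1))}
        return len(verts) - len(edges) + faces

    def euler_integral(h):
        return sum(euler_char(h >= s) for s in range(1, int(h.max()) + 1))

    # Three overlapping rectangular supports on a 60 x 60 grid (made-up data)
    h = np.zeros((60, 60), dtype=int)
    for r0, r1, c0, c1 in [(5, 25, 5, 25), (15, 40, 20, 45), (30, 55, 10, 30)]:
        h[r0:r1, c0:c1] += 1

    print(euler_integral(h))   # 3: the number of targets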

Image Space Embeddings for Data Visualization and Processing

When: Tue, October 31, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Nate Strawn (Georgetown) -
Abstract: We propose several extensions of PCA to isometrically embed arbitrary Euclidean datasets into high-dimensional spaces of images. In particular, this procedure produces a "visually coherent" icon for each data point. Such embeddings provide an interesting tool for Exploratory Data Analysis, and also allow us to apply mature techniques from Image Processing and Computer Vision to arbitrary datasets. We discuss theory, algorithms, and applications to Dictionary Learning, Deep Learning, and Topological Data Analysis.
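
A hedged toy version of the underlying idea, not the authors' construction: any map defined by an orthonormal family of "basis images" embeds R^d isometrically into a space of images, and the resulting image can serve as a visual icon for a data point. The dimensions, the helper to_icon, and the random basis below are made up for illustration.

    # Toy isometric embedding of Euclidean data into image space (illustration only).
    import numpy as np

    rng = np.random.default_rng(0)
    d, side = 8, 16                         # data dimension and icon side length (arbitrary)
    p = side * side

    # A matrix with orthonormal columns is an isometry from R^d into R^p,
    # here interpreted as the space of side x side grayscale images.
    B, _ = np.linalg.qr(rng.standard_normal((p, d)))

    def to_icon(x):
        return (B @ x).reshape(side, side)

    X = rng.standard_normal((100, d))       # an arbitrary Euclidean dataset
    icons = np.stack([to_icon(x) for x in X])

    # Pairwise distances are preserved exactly, so image-domain tools can be
    # applied to the icons without distorting the original geometry.
    i, j = 3, 7
    print(np.linalg.norm(X[i] - X[j]), np.linalg.norm(icons[i] - icons[j]))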

Optimal coherence from finite group actions

When: Tue, November 14, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Joey Iverson (UMD) -
Abstract: In applications such as compressed sensing and quantum information theory, it is critically important to find examples of frames whose vectors are spread apart in the sense of having wide angles between them, as measured by the coherence. This is an old problem, going back at least to the work of van Lint and Seidel in the 1960s, and it remains an active and challenging area of research today. In this talk we will present a new recipe for converting transitive actions of finite groups into tight frames, many of which have optimal coherence. The main idea is to use an association scheme as a kind of converter to pass from the discrete world of permutation groups into the continuous setting of frames. This process is easy to implement in a program like GAP. We will present several examples of optimally coherent frames produced in this way, including the first infinite family of equiangular tight frames with Heisenberg symmetry. (These are not SIC-POVMs, but they appear to be related.)
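
For concreteness, the coherence of a unit-norm frame and the Welch lower bound against which it is judged can be computed directly; the random frame below is only a placeholder, not one of the group-based constructions from the talk.

    # Coherence of unit-norm vectors f_1,...,f_n in C^d:  mu = max_{i != j} |<f_i, f_j>|.
    # The Welch bound sqrt((n - d) / (d (n - 1))) is a lower bound on mu,
    # attained exactly by equiangular tight frames.
    import numpy as np

    def coherence(F):
        """Columns of F are unit vectors; return the largest off-diagonal Gram entry."""
        G = np.abs(F.conj().T @ F)
        np.fill_diagonal(G, 0.0)
        return G.max()

    def welch_bound(d, n):
        return np.sqrt((n - d) / (d * (n - 1)))

    d, n = 4, 8
    rng = np.random.default_rng(1)
    F = rng.standard_normal((d, n)) + 1j * rng.standard_normal((d, n))
    F /= np.linalg.norm(F, axis=0)          # normalize the columns

    print(coherence(F), welch_bound(d, n))  # a random frame sits well above the bound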

Parsimonious Online Learning with Kernels and Connections with Deep Learning

When: Tue, November 28, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Ethan Stump (ARL) -


Unsupervised Geometric Learning: Theory and Applications

When: Tue, December 5, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: James Murphy (JHU) -
Abstract: Machine learning and data science are revolutionizing how humans gain knowledge. Algorithmic tools are making breakthroughs in virtually all scientific areas including computer vision, precision medicine, and geoscience. Despite their empirical successes, machine learning methods are often poorly understood theoretically. We present a mathematical framework for unsupervised learning of data based on geometry. By considering data-dependent metrics on high-dimensional and noisy data, intrinsically low-dimensional structures in the data are revealed. Our algorithms enjoy robust performance guarantees for accuracy and parameter dependence that surpass known results in the case that the intrinsic dimension of the data is small relative to the ambient dimension. In particular, our algorithms are provably robust to large amounts of noise. Our methods of proof combine percolation theory, manifold learning, and spectral graph analysis. Beyond performance guarantees, we present efficient implementations of our algorithms that scale quasilinearly in the number of datapoints, demonstrating the applicability of our methods to the "big data" regime. The proposed algorithms are validated on a variety of synthetic and real datasets, and applications to hyperspectral data and other remotely sensed images will be discussed at length.
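
As a generic sketch of the spectral-graph ingredient of such methods (a minimal illustration under simplifying assumptions, not the algorithms analyzed in the talk): build a similarity graph from a data-dependent metric, take the bottom eigenvectors of the normalized graph Laplacian as low-dimensional coordinates, and hand those to a standard clustering routine. The bandwidth sigma, embedding dimension k, and toy data are arbitrary choices.

    # Minimal spectral-graph pipeline: Gaussian similarity graph -> normalized
    # Laplacian -> bottom eigenvectors as intrinsic coordinates for clustering.
    import numpy as np

    def spectral_embedding(X, sigma, k):
        D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)    # pairwise squared distances
        W = np.exp(-D2 / (2 * sigma ** 2))                      # data-dependent similarity
        np.fill_diagonal(W, 0.0)
        deg = W.sum(axis=1)
        L = np.eye(len(X)) - W / np.sqrt(np.outer(deg, deg))    # I - D^{-1/2} W D^{-1/2}
        vals, vecs = np.linalg.eigh(L)
        return vecs[:, :k]                                      # k smallest eigenvectors

    # Two noisy clusters in ambient dimension 10 (toy data)
    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0.0, 0.3, (50, 10)), rng.normal(2.0, 0.3, (50, 10))])
    Y = spectral_embedding(X, sigma=1.0, k=2)
    print(Y.shape)   # (100, 2): coordinates in which the two clusters separate cleanly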

Organizational meeting

When: Tue, January 30, 2018 - 2:00pm
Where: Kirwan Hall 3206


FFT 2018

When: Thu, February 15, 2018 - 9:00am
Where: Kirwan Hall 3206
Speaker: FFT 2018 - http://www.norbertwiener.umd.edu/FFT/2018/schedule.html
Abstract: See the list of speakers and abstracts at http://www.norbertwiener.umd.edu/FFT/2018/schedule.html
Main lectures:
1) Yannis Kevrekidis (JHU) 4:15 - 5
2) Peggy Kidwell (Smithsonian) 6:40 - 7:30
3) William Johnson (TAMU), 3:45 - 4:30

FFT 2018

When: Fri, February 16, 2018 - 9:00am
Where: Kirwan Hall 3206
Speaker: FFT 2018 - http://www.norbertwiener.umd.edu/FFT/2018/schedule.html
Abstract: See the list of speakers and abstracts at http://www.norbertwiener.umd.edu/FFT/2018/schedule.html
Main lectures:
1) Yannis Kevrekidis (JHU) 4:15 - 5
2) Peggy Kidwell (Smithsonian) 6:40 - 7:30
3) William Johnson (TAMU), 3:45 - 4:30

Heat diffusion on inverse limit spaces

When: Tue, February 20, 2018 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Patricia Alonso Ruiz (U Conn) -
Abstract: Inverse (or projective) limits give rise to a wide range of spaces that can show remarkably different properties. The present talk aims to illustrate how different the (in some sense natural) diffusion processes occurring on them can be by looking at two examples: a parametric family of diamond fractals and pattern spaces of aperiodic Delone sets.

On a generalized diamond fractal, a canonical diffusion process can be constructed following a procedure proposed by Barlow and Evans for inverse limits of metric measure spaces. The associated heat semigroup has a kernel, many of whose properties have been studied by Hambly and Kumagai in the case of constant parameters. It turns out that in general it is possible to give a rather explicit expression of the heat kernel, which is in particular uniformly continuous and admits an analytic continuation.

In contrast, pattern spaces feature quite the opposite scenario. Regarded as compact metric measure spaces of a suitable type, one can introduce a diffusion process with an especially simple expression whose associated heat semigroup has no density (that is, no heat kernel) with respect to the natural measure of the space.
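
For reference, saying that the heat semigroup "has a kernel" means, in the standard sense, that it acts by integration against a density with respect to the reference measure; in LaTeX notation,

    P_t f(x) = \int_X p_t(x, y)\, f(y)\, d\mu(y), \qquad t > 0,

so "no heat kernel" in the second example means that no such density p_t exists with respect to the natural measure \mu.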