Norbert Wiener Center Archives for Academic Year 2017-2018


Organizational meeting

When: Tue, August 29, 2017 - 2:00pm
Where: Kirwan Hall 1311


Super-resolution without minimum separation assumptions

When: Tue, September 12, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Weilin Li (UMD) -


Distributed Noise Shaping of Signal Quantization

When: Tue, September 19, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Kung-Ching Lin (UMD) -
Abstract: Quantization theory is an integral part of signal processing: it concerns the discrete nature of data storage in electronic devices and its effects, and it stems from the need to manage the errors that arise from the numerous physical constraints present during both sampling and reconstruction. This talk will give a quick overview of the development of the theory and then discuss a specific quantization scheme, distributed noise shaping. This scheme is robust against the usual physical constraints and achieves a near-optimal error decay rate, making it a favorable choice.
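As a concrete point of reference (not taken from the talk), the classical first-order Sigma-Delta scheme that distributed noise shaping refines can be sketched in a few lines; the test signal, one-bit alphabet, and averaging window below are illustrative choices:

```python
import numpy as np

def sigma_delta(samples, levels):
    # First-order Sigma-Delta: quantize each sample while feeding the
    # running error u back in, so quantization noise is "shaped" away
    # from low frequencies.
    u = 0.0
    q = np.empty_like(samples)
    for i, x in enumerate(samples):
        q[i] = levels[np.argmin(np.abs(levels - (x + u)))]
        u += x - q[i]
    return q

# Oversample a smooth signal and quantize to 1 bit (levels +/-1).
t = np.linspace(0, 1, 2000)
x = 0.5 * np.sin(2 * np.pi * t)
q = sigma_delta(x, np.array([-1.0, 1.0]))

# A simple moving average recovers x with error O(1/window).
window = 50
xr = np.convolve(q, np.ones(window) / window, mode="same")
err = np.max(np.abs(x - xr)[window:-window])
print(err)  # well below the quantizer step size of 2
```

The feedback variable u stays bounded, which is exactly what makes the averaged reconstruction error decay with the oversampling window.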

Frames -- two case studies: ambiguity and uncertainty

When: Tue, September 26, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: John Benedetto (UMD) -


Measures with locally finite support and spectrum

When: Tue, October 3, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Chenzhi Zhao (UMD) -


Multiscale analysis of data and functions in high dimension

When: Tue, October 17, 2017 - 2:00pm
Where: Kirwan Hall 1308
Speaker: Stefano Vigogna (JHU) -


Topological Integral Transforms with Applications to Sensing

When: Tue, October 24, 2017 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Robert Ghrist (UPenn) -
Abstract: This talk will outline a topological approach to constructing novel integral transforms. The basis for these methods is a blend of combinatorial and homological methods fused into an integration theory with respect to Euler characteristic. Using this as a starting point, several integral transforms will be described with applications given to problems of data aggregation over networks of sensors.
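A one-dimensional toy version of integration with respect to Euler characteristic illustrates the target-counting idea behind these transforms (the sensor field and target supports below are invented for illustration): if f is a sum of indicator functions of intervals, its Euler integral recovers the number of intervals, even when they overlap.

```python
import numpy as np

def components(mask):
    # Number of connected runs of True in a 1-D boolean array;
    # each run is an interval, so its Euler characteristic is 1.
    m = mask.astype(int)
    return int(np.sum(np.diff(np.concatenate(([0], m))) == 1))

def euler_integral(f):
    # Discrete version of  integral of f dchi = sum_{s>=1} chi({f >= s})
    # for an integer-valued f >= 0.
    return sum(components(f >= s) for s in range(1, int(f.max()) + 1))

# Sensors along a line each report how many targets cover them; f is the
# sum of indicator functions of the targets' (overlapping) supports.
n = 100
f = np.zeros(n, dtype=int)
for a, b in [(10, 40), (30, 60), (70, 90)]:  # three targets
    f[a:b] += 1

print(euler_integral(f))  # 3: the Euler integral counts the targets
```

The point is that the count is recovered from the aggregated field f alone, without knowing which sensor readings came from which target.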

Image Space Embeddings for Data Visualization and Processing

When: Tue, October 31, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Nate Strawn (Georgetown) -
Abstract: We propose several extensions of PCA to isometrically embed arbitrary Euclidean datasets into high-dimensional spaces of images. In particular, this procedure produces a "visually coherent" icon for each data point. Such embeddings provide an interesting tool for Exploratory Data Analysis, and also allow us to apply mature techniques from Image Processing and Computer Vision to arbitrary datasets. We discuss theory, algorithms, and applications to Dictionary Learning, Deep Learning, and Topological Data Analysis.

Optimal coherence from finite group actions

When: Tue, November 14, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Joey Iverson (UMD) -
Abstract: In applications such as compressed sensing and quantum information theory, it is critically important to find examples of frames whose vectors are spread apart in the sense of having wide angles between them, as measured by the coherence. This is an old problem, going back at least to the work of van Lint and Seidel in the 1960s, and it remains an active and challenging area of research today. In this talk we will present a new recipe for converting transitive actions of finite groups into tight frames, many of which have optimal coherence. The main idea is to use an association scheme as a kind of converter to pass from the discrete world of permutation groups into the continuous setting of frames. This process is easy to implement in a program like GAP. We will present several examples of optimally coherent frames produced in this way, including the first infinite family of equiangular tight frames with Heisenberg symmetry. (These are not SIC-POVMs, but they appear to be related.)
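As a small numerical companion (the frame below is the standard "Mercedes-Benz" example, not one of the group-theoretic constructions from the talk), the coherence of a unit-norm frame and the Welch lower bound it is measured against can be checked directly:

```python
import numpy as np

def coherence(F):
    # F: d x N matrix whose columns are unit-norm frame vectors.
    G = np.abs(F.T @ F)     # absolute inner products between vectors
    np.fill_diagonal(G, 0)  # ignore each vector's inner product with itself
    return G.max()

# "Mercedes-Benz" frame: three unit vectors in R^2 at 120-degree angles.
angles = 2 * np.pi * np.arange(3) / 3
F = np.vstack([np.cos(angles), np.sin(angles)])

d, N = F.shape
welch = np.sqrt((N - d) / (d * (N - 1)))  # Welch lower bound on coherence
print(coherence(F), welch)  # both 0.5: this equiangular tight frame meets the bound
```

Frames meeting the Welch bound with equality are exactly the equiangular tight frames mentioned in the abstract.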

Parsimonious Online Learning with Kernels and Connections with Deep Learning

When: Tue, November 28, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: Ethan Stump (ARL) -


Unsupervised Geometric Learning: Theory and Applications

When: Tue, December 5, 2017 - 2:00pm
Where: Kirwan Hall 1311
Speaker: James Murphy (JHU) -
Abstract: Machine learning and data science are revolutionizing how humans gain knowledge. Algorithmic tools are making breakthroughs in virtually all scientific areas including computer vision, precision medicine, and geoscience. Despite their empirical successes, machine learning methods are often poorly understood theoretically. We present a mathematical framework for unsupervised learning of data based on geometry. By considering data-dependent metrics on high-dimensional and noisy data, intrinsically low-dimensional structures in the data are revealed. Our algorithms enjoy robust performance guarantees for accuracy and parameter dependence that surpass known results in the case that the intrinsic dimension of the data is small relative to the ambient dimension. In particular, our algorithms are provably robust to large amounts of noise. Our methods of proof combine percolation theory, manifold learning, and spectral graph analysis. Beyond performance guarantees, we present efficient implementations of our algorithms that scale quasilinearly in the number of datapoints, demonstrating the applicability of our methods to the "big data" regime. The proposed algorithms are validated on a variety of synthetic and real datasets, and applications to hyperspectral data and other remotely sensed images will be discussed at length.

Organizational meeting

When: Tue, January 30, 2018 - 2:00pm
Where: Kirwan Hall 3206


FFT 2018

When: Thu, February 15, 2018 - 9:00am
Where: Kirwan Hall 3206
Speaker: FFT 2018 - http://www.norbertwiener.umd.edu/FFT/2018/schedule.html
Abstract: See the list of speakers and abstracts at http://www.norbertwiener.umd.edu/FFT/2018/schedule.html
Main lectures:
1) Yannis Kevrekidis (JHU) 4:15 - 5
2) Peggy Kidwell (Smithsonian) 6:40 - 7:30
3) William Johnson (TAMU), 3:45 - 4:30

FFT 2018

When: Fri, February 16, 2018 - 9:00am
Where: Kirwan Hall 3206
Speaker: FFT 2018 - http://www.norbertwiener.umd.edu/FFT/2018/schedule.html
Abstract: See the list of speakers and abstracts at http://www.norbertwiener.umd.edu/FFT/2018/schedule.html
Main lectures:
1) Yannis Kevrekidis (JHU) 4:15 - 5
2) Peggy Kidwell (Smithsonian) 6:40 - 7:30
3) William Johnson (TAMU), 3:45 - 4:30

Heat diffusion on inverse limit spaces

When: Tue, February 20, 2018 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Patricia Alonso Ruiz (U Conn) -
Abstract: Inverse (or projective) limits give rise to a wide range of spaces that can show remarkably different properties. The present talk aims to illustrate how different the (in some sense natural) diffusion processes occurring on them can be by looking at two examples: a parametric family of diamond fractals and pattern spaces of aperiodic Delone sets.

On a generalized diamond fractal, a canonical diffusion process can be constructed following a procedure proposed by Barlow and Evans for inverse limits of metric measure spaces. The associated heat semigroup has a kernel, many of whose properties have been studied by Hambly and Kumagai in the case of constant parameters. It turns out that in general it is possible to give a rather explicit expression of the heat kernel, which is in particular uniformly continuous and admits an analytic continuation.

In contrast, pattern spaces feature quite the opposite scenario. Regarded as compact metric measure spaces of a suitable type, one can introduce a diffusion process with an especially simple expression whose associated heat semigroup has no density (that is, no heat kernel) with respect to the natural measure of the space.

Deep Geometric Learning and Data Fusion

When: Tue, March 6, 2018 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Jerry Emidih (UMD) -
Abstract: This talk will survey recent developments in machine learning with non-Euclidean data. There are many settings in which both standard Fourier techniques and modern neural network schemes fail to capture intrinsic data features. In these cases, data-dependent representations must be learned from geometric properties of the data. We will consider a few graph-based data representations and examine models for extending convolutional neural networks to geometries without the highly regularized structure of Euclidean space. We will also investigate some applications of geometric learning to genomics and neuroscience.
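To make the graph-based representations concrete, here is a minimal sketch of the graph Fourier transform underlying spectral extensions of convolutional networks (the path graph, signal, and heat-kernel filter are illustrative choices, not from the talk):

```python
import numpy as np

# Path graph on 6 nodes: adjacency matrix and combinatorial Laplacian.
n = 6
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L_mat = np.diag(A.sum(axis=1)) - A

# Graph Fourier basis: Laplacian eigenvectors, ordered by "frequency";
# the lowest eigenvalue is 0, with the constant vector as eigenvector.
eigvals, U = np.linalg.eigh(L_mat)

# "Convolution" on the graph = pointwise filtering of spectral coefficients.
x = np.array([1.0, 2.0, 1.5, -1.0, 0.5, 0.0])  # signal on the nodes
h = np.exp(-eigvals)                           # low-pass heat-kernel filter
x_smooth = U @ (h * (U.T @ x))                 # filter in the spectral domain

print(np.round(eigvals, 6))
```

Replacing the fixed Fourier basis with these data-dependent eigenvectors is what lets convolution-style operations be defined on irregular domains.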

Stable denoising with generative networks

When: Tue, April 17, 2018 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Soledad Villar (NYU) -
Abstract: It has been experimentally established that deep neural networks can be used to produce good generative models for real world data. It has also been established that such generative models can be exploited to solve classical inverse problems like compressed sensing and super resolution. In this work we focus on the classical signal processing problem of image denoising. We propose a theoretical setting that uses spherical harmonics to identify what mathematical properties of the activation functions will allow signal denoising with local methods.

Bilinear inverse problems: theory, algorithms, and applications in imaging science and signal processing

When: Tue, April 24, 2018 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Shuyang Ling (NYU) -

Subspace methods and smallest singular value of Vandermonde matrices

When: Tue, May 1, 2018 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Weilin Li (UMD) -
Abstract: This talk consists of two parts. The first part examines the following problem: Given a finite set X of cardinality S and given an integer M >= S, estimate the smallest singular value of the MxS Vandermonde matrix associated with X and M. We show how to obtain accurate lower bounds using duality and Fourier analysis. The second part connects this problem to topics in signal processing such as super-resolution and parameter estimation. The smallest singular value of such Vandermonde matrices essentially characterizes the stability to noise of a collection of algorithms called subspace methods. This is joint work with Wenjing Liao.
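A quick numerical illustration of the phenomenon (the node configurations and the value of M are illustrative choices, not from the talk): for nodes on the torus, equispaced nodes give orthogonal columns and a smallest singular value of exactly sqrt(M), while bringing two nodes closer than 1/M makes it collapse.

```python
import numpy as np

def vandermonde(nodes, M):
    # M x S matrix with entries exp(2*pi*i*m*x_s), nodes x_s on the torus [0,1).
    m = np.arange(M).reshape(-1, 1)
    return np.exp(2j * np.pi * m * np.asarray(nodes))

def smallest_sv(nodes, M):
    return np.linalg.svd(vandermonde(nodes, M), compute_uv=False)[-1]

M = 64
separated = [0.0, 0.25, 0.5, 0.75]     # separation 1/4, far above 1/M
clustered = [0.0, 0.1 / M, 0.5, 0.75]  # two nodes only 0.1/M apart

print(smallest_sv(separated, M))  # 8.0 = sqrt(M): orthogonal columns
print(smallest_sv(clustered, M))  # close to 1: the near-collision degrades it
```

Since subspace methods amplify noise by roughly the inverse of this singular value, the clustered configuration is the unstable regime.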

The Hilbert Transform and the maximal (Hardy-Littlewood) operator along variable families of non-flat curves

When: Tue, May 8, 2018 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Victor Lie (Purdue University) -


Zak transform analysis of shift-invariant subspaces

When: Thu, May 10, 2018 - 2:00pm
Where: Kirwan Hall 3206
Speaker: Joey Iverson (UMD) -
Abstract: Shift-invariant (SI) spaces play a prominent role in the study of wavelets, Gabor systems, and other group frames. Working in the setting of LCA groups, we use a variant of the Zak transform to classify SI spaces, and to simultaneously describe families of vectors whose shifts form frames for the SI spaces they generate.
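For a finite-dimensional feel for the tool (the convention Zf(j,k) = sum_m f(j+mL) e^{-2*pi*i*mk/M} on C^{LM} used below is one common normalization, assumed here), the discrete Zak transform is just a reshaped DFT and is unitary up to a constant:

```python
import numpy as np

def zak(f, L):
    # Finite Zak transform on C^{LM}:
    # Zf[j, k] = sum_{m=0}^{M-1} f[j + m*L] * exp(-2*pi*i*m*k/M).
    M = len(f) // L
    return np.fft.fft(np.asarray(f).reshape(M, L), axis=0).T  # shape (L, M)

rng = np.random.default_rng(0)
L, M = 4, 8
f = rng.standard_normal(L * M)

Z = zak(f, L)
# Up to normalization the Zak transform is unitary (a Parseval identity):
print(np.allclose(np.sum(np.abs(Z) ** 2), M * np.sum(f ** 2)))  # True
```

This unitarity is what lets properties of a shift-invariant space be read off fiberwise from the Zak domain.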