CSCAMM Archives for Fall 2020 to Spring 2021


Frequency Bias in Deep Learning

When: Wed, September 16, 2020 - 2:00pm
Where: Zoom link: https://umd.zoom.us/j/94665233234?pwd=NzZNY0cvNTV0dEZidkF6MWN5THRxUT09
Speaker: Prof. David Jacobs (Department of Computer Science and UMIACS, University of Maryland) - https://www.cs.umd.edu/~djacobs/
Abstract: Recent results have shown that highly overparameterized deep neural networks act as linear systems. Fully connected networks are equivalent to kernel methods, with a neural tangent kernel (NTK). This talk will describe our work on better understanding the properties of this kernel. We study the eigenvalues and eigenvectors of the NTK and quantify a frequency bias in neural networks that causes them to learn low-frequency functions more quickly than high-frequency ones. In fact, these eigenvectors and eigenvalues are the same as those of the well-known Laplace kernel, implying that the two kernels interpolate functions with the same smoothness properties. On a large number of datasets, we show that kernel-based classification with the NTK and the Laplace kernel performs quite similarly.
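
A minimal sketch of the frequency-bias claim (not the speakers' code): it compares the eigenvalue decay of an NTK Gram matrix with that of the Laplace kernel on points sampled on the unit circle, using the standard closed form of the infinite-width two-layer ReLU NTK; all sizes and parameters are illustrative.

    # Compare NTK and Laplace kernel Gram-matrix spectra on the unit circle.
    import numpy as np

    def ntk_relu_2layer(u):
        """NTK between unit vectors with inner product u (two-layer ReLU net)."""
        u = np.clip(u, -1.0, 1.0)
        k0 = (np.pi - np.arccos(u)) / np.pi                             # arc-cosine kernel, degree 0
        k1 = (np.sqrt(1 - u**2) + u * (np.pi - np.arccos(u))) / np.pi   # arc-cosine kernel, degree 1
        return u * k0 + k1

    def laplace(x, y, gamma=1.0):
        return np.exp(-gamma * np.linalg.norm(x - y))

    # n points on the unit circle (inputs normalized to the sphere, as in NTK theory)
    n = 200
    theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
    X = np.stack([np.cos(theta), np.sin(theta)], axis=1)

    G = X @ X.T                                   # pairwise inner products
    K_ntk = ntk_relu_2layer(G)
    K_lap = np.array([[laplace(x, y) for y in X] for x in X])

    idx = [0, 1, 3, 7, 15, 31, 63]
    for name, K in [("NTK    ", K_ntk), ("Laplace", K_lap)]:
        evals = np.sort(np.linalg.eigvalsh(K))[::-1]
        print(name, "eigenvalues at", idx, ":", np.round(evals[idx], 4))
        # Eigenvectors on the circle are Fourier modes; in both cases the
        # high-frequency modes come with small eigenvalues, so kernel regression
        # (or gradient descent) fits them much more slowly.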

Vulnerability-Aware Poisoning Mechanism for Online RL with Unknown Dynamics

When: Wed, September 30, 2020 - 2:00pm
Where: Zoom link: https://umd.zoom.us/j/94665233234?pwd=NzZNY0cvNTV0dEZidkF6MWN5THRxUT09
Speaker: Prof. Furong Huang (Department of Computer Science, University of Maryland) - http://furong-huang.com/
Abstract: Poisoning attacks, although studied extensively in supervised learning, are not well understood in reinforcement learning (RL), especially in deep RL. Prior works on poisoning RL usually either assume the attacker knows the underlying Markov Decision Process (MDP) or directly apply poisoning methods from supervised learning to RL. In this work, we build a generic poisoning framework for online RL via a comprehensive investigation of heterogeneous types and victims of poisoning attacks in RL, considering the unique challenges in RL such as data no longer being i.i.d. Without any prior knowledge of the MDP, we propose a strategic poisoning algorithm called Vulnerability-Aware Adversarial Critic Poison (VA2C-P), which works for most policy-based deep RL agents and uses a novel metric, the stability radius in RL, that measures the vulnerability of RL algorithms. Experiments on multiple deep RL agents and multiple environments show that our poisoning algorithm successfully prevents agents from learning a good policy with a limited attacking budget. Our experimental results demonstrate varying vulnerabilities of different deep RL agents in multiple environments, benefiting the understanding and application of deep RL under security-threat scenarios.
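
A minimal sketch of where a poisoner sits in an online RL loop, not the VA2C-P algorithm itself: a generic budgeted reward-poisoning wrapper, with hypothetical environment and agent interfaces named only for illustration.

    # Generic budgeted reward poisoner for an online RL training loop (sketch).
    import numpy as np

    class RewardPoisoner:
        def __init__(self, budget=5.0, eps=0.5):
            self.budget = budget      # total perturbation allowed per episode
            self.eps = eps            # maximum perturbation per step
            self.spent = 0.0

        def reset(self):
            self.spent = 0.0

        def poison(self, reward, vulnerability):
            """Perturb the reward, spending more budget on 'vulnerable' steps."""
            if self.spent >= self.budget:
                return reward
            delta = -self.eps * np.clip(vulnerability, 0.0, 1.0)   # push reward down
            delta = np.sign(delta) * min(abs(delta), self.budget - self.spent)
            self.spent += abs(delta)
            return reward + delta

    # Usage inside a (hypothetical) training loop: call poisoner.reset() at the
    # start of each episode, and at every step replace the reward with
    #   r_poisoned = poisoner.poison(r, vulnerability=agent.advantage_estimate(s, a))
    # before the agent's update sees it. The vulnerability score here is a
    # placeholder, not the stability-radius metric from the talk.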

Solutions to two conjectures in branched transport: stability and regularity of optimal paths

When: Wed, October 14, 2020 - 2:00pm
Where: Zoom link: https://umd.zoom.us/j/95489510397?pwd=M2lqOFRLNlZQWkh2ajhKSGNrSll0Zz09
Speaker: Prof. Antonio De Rosa (Department of Mathematics, University of Maryland) - https://sites.google.com/view/antonioderosa/home
Abstract: Transport models involving branched structures are employed to describe several biological, natural and supply-demand systems. The transportation cost in these models is proportional to a concave power of the intensity of the flow.
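
A minimal sketch of the branched-transport (Gilbert) energy described above, with illustrative coordinates and exponent: for a concave power of the flow intensity, a Y-shaped network with a shared trunk can beat two separate straight paths.

    # Gilbert energy: each edge carrying mass m over length L costs m**alpha * L.
    import numpy as np

    def gilbert_energy(nodes, edges, alpha=0.75):
        """nodes: dict name -> 2D coordinates; edges: list of (u, v, mass)."""
        total = 0.0
        for u, v, mass in edges:
            length = np.linalg.norm(np.asarray(nodes[u]) - np.asarray(nodes[v]))
            total += mass**alpha * length
        return total

    # Two unit masses shipped from (0, 0) and (0, 2) to (4, 1): for concave alpha,
    # merging the flows into a shared trunk is cheaper than two straight lines.
    nodes = {"s1": (0, 0), "s2": (0, 2), "b": (1, 1), "t": (4, 1)}
    y_shape = [("s1", "b", 1.0), ("s2", "b", 1.0), ("b", "t", 2.0)]
    direct = [("s1", "t", 1.0), ("s2", "t", 1.0)]
    print("Y-shaped network energy:", round(gilbert_energy(nodes, y_shape), 3))
    print("two straight paths     :", round(gilbert_energy(nodes, direct), 3))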

In this talk, we focus on the stability of optimal transports with respect to variations of the source and target measures. Stability was previously known to hold only in special regimes (supercritical concave powers degenerating with the dimension). We prove that stability holds for every cost functional that is lower semicontinuous and continuous at 0, and we provide counterexamples when these assumptions are not satisfied. This completely solves a conjecture of Bernot, Caselles and Morel.

To conclude, we prove stability for the mailing problem, too. This was completely open in the literature and allows us to obtain the regularity of the optimal networks.

Gaps and effective gaps in Floquet media

When: Wed, November 18, 2020 - 2:00pm
Where: https://umd.zoom.us/j/96192636574?pwd=Q2EvYkZGMWRPaXdzcE5WUTl0eFJVUT09
Speaker: Dr. Amir Sagiv (Applied Mathematics, Columbia University) - https://www.apam.columbia.edu/faculty/amir-sagiv
Abstract: Applying time-periodic forcing is a common technique to effectively change material properties. A well-known example is the transformation of graphene from a conductor to an insulator (a "Floquet topological insulator") by applying to it a time-dependent magnetic potential. We will see how this phenomenon is derived from certain reduced models of graphene. We will then turn to the first-principles, continuum model of graphene. There, it is not at all obvious how to derive the insulation property, or even what its mathematical expression is. We will introduce the notion of an "effective gap", or low-oscillations gap, and prove its existence in forced graphene. This new notion distinguishes a part of the energy spectrum in a quantitative way. It implies that the medium is approximately insulating for a certain class of physically likely wavepackets.

Based on joint work with M. I. Weinstein.
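
A minimal sketch of how a quasi-energy gap opens in a reduced (Dirac-point) model under circularly polarized periodic forcing, computed from the eigenvalues of the one-period propagator; the two-band Hamiltonian and all parameters are illustrative and are not the continuum model of the talk.

    # Quasi-energies of a driven Dirac point from the monodromy (one-period) matrix.
    import numpy as np

    sx = np.array([[0, 1], [1, 0]], dtype=complex)
    sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

    def quasienergies(kx, ky, A=0.5, w=5.0, steps=2000):
        """Quasi-energies of H(k, t) = kx*sx + ky*sy + A*(cos(wt)*sx + sin(wt)*sy)."""
        T = 2 * np.pi / w
        dt = T / steps
        U = np.eye(2, dtype=complex)
        for n in range(steps):
            t = (n + 0.5) * dt
            H = kx * sx + ky * sy + A * (np.cos(w * t) * sx + np.sin(w * t) * sy)
            evals, V = np.linalg.eigh(H)
            U = V @ np.diag(np.exp(-1j * evals * dt)) @ V.conj().T @ U  # exact step for the frozen H
        eps = np.angle(np.linalg.eigvals(U)) / (-T)      # quasi-energies folded into one Floquet zone
        return np.sort(eps)

    # At k = 0 the undriven Dirac cone is gapless; the periodic drive opens a gap.
    print("k = 0        :", np.round(quasienergies(0.0, 0.0), 4))
    print("k = (0.3, 0) :", np.round(quasienergies(0.3, 0.0), 4))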

Dynamic Network-level Analysis of Neural Data Underlying Behavior: Case Studies in Auditory Processing

When: Wed, December 2, 2020 - 2:00pm
Where: https://umd.zoom.us/j/99801377079?pwd=b0lFVVhwTWFNV0pXT3ZuSytiekp5Zz09
Speaker: Prof. Behtash Babadi (Department of Electrical and Computer Engineering, University of Maryland) - https://user.eng.umd.edu/~behtash/
Abstract: In this talk, I present computational methodologies for extracting dynamic neural functional networks that underlie behavior. These methods aim at capturing the sparsity, dynamicity and stochasticity of these networks, by integrating techniques from high-dimensional statistics, point processes, state-space modeling, and adaptive filtering. I demonstrate their utility using several case studies involving auditory processing, including 1) functional auditory-prefrontal interactions during attentive behavior in the ferret brain, 2) network-level signatures of decision-making in the mouse primary auditory cortex, and 3) cortical dynamics of speech processing in the human brain.
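
A minimal sketch of the dynamic-network idea, not the speaker's estimators: a sliding-window ridge regression on simulated channels whose dominant incoming influence switches halfway through the recording; the data and parameters are illustrative.

    # Toy dynamic functional-network estimate via sliding-window ridge regression.
    import numpy as np

    rng = np.random.default_rng(0)
    T, n_ch = 2000, 4
    X = rng.standard_normal((T, n_ch))
    # Channel 0 is driven by channel 1 during the first half and by channel 2 later.
    coupling = np.zeros((T, n_ch))
    coupling[: T // 2, 1] = 0.8
    coupling[T // 2 :, 2] = 0.8
    y = np.sum(coupling * X, axis=1) + 0.1 * rng.standard_normal(T)

    def windowed_ridge(X, y, win=200, lam=1e-2):
        """Per-window regression coefficients: a crude time-varying network."""
        coefs = []
        for start in range(0, len(y) - win + 1, win):
            Xw, yw = X[start : start + win], y[start : start + win]
            A = Xw.T @ Xw + lam * np.eye(X.shape[1])
            coefs.append(np.linalg.solve(A, Xw.T @ yw))
        return np.array(coefs)

    W = windowed_ridge(X, y)
    print(np.round(W, 2))   # the dominant incoming edge switches from channel 1 to channel 2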

Graph Convolutional Neural Networks: The Mystery of Generalization

When: Wed, February 3, 2021 - 2:00pm
Where: Zoom link: https://umd.zoom.us/j/96550429672?pwd=ZnJGblUxZ0xlL1lsMXZMdHl3THRnZz09
Speaker: Prof. Gitta Kutyniok (Mathematical Institute of the University of Munich) - https://www.mathematik.uni-muenchen.de/~kutyniok/
Abstract: The tremendous importance of graph-structured data, due to recommender systems or social networks, led to the introduction of graph convolutional neural networks (GCNs). These split into spatial and spectral GCNs, where in the latter case filters are defined as elementwise multiplication in the frequency domain of a graph. Since the dataset often consists of signals defined on many different graphs, the trained network should generalize to signals on graphs unseen in the training set. One instance of this problem is the transferability of a GCN, which refers to the condition that, if two graphs describe the same phenomenon, a single filter or the entire network has similar repercussions on both graphs. However, for a long time it was believed that spectral filters are not transferable.
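
A minimal sketch of a spectral graph filter as described above (elementwise multiplication in the Laplacian eigenbasis), applied to a noisy signal on a cycle graph; the low-pass filter g and the graph are illustrative choices.

    # Spectral graph convolution: filter in the graph Fourier (Laplacian eigen-) domain.
    import numpy as np

    def spectral_filter(adjacency, signal, g):
        """Apply the spectral filter g(lambda) to a signal on the graph."""
        d = adjacency.sum(axis=1)
        L = np.diag(d) - adjacency                 # combinatorial graph Laplacian
        lam, U = np.linalg.eigh(L)                 # graph Fourier basis
        return U @ (g(lam) * (U.T @ signal))       # elementwise filtering in the frequency domain

    # Cycle graph on n nodes, a noisy low-frequency signal, and a heat-kernel low-pass filter.
    n = 64
    A = np.zeros((n, n))
    for i in range(n):
        A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
    clean = np.cos(2 * np.pi * np.arange(n) / n)
    x = clean + 0.3 * np.random.default_rng(1).standard_normal(n)
    y = spectral_filter(A, x, g=lambda lam: np.exp(-2.0 * lam))
    print("noise energy before/after filtering:",
          round(float(np.var(x - clean)), 3), round(float(np.var(y - clean)), 3))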

In this talk we aim at debunking this common misconception by showing that if two graphs discretize the same continuous metric space, then a spectral filter or GCN has approximately the same repercussion on both graphs. Our analysis also accounts for large graph perturbations and allows the graphs to have completely different dimensions and topologies, only requiring that both graphs discretize the same underlying continuous space. Numerical results even indicate that spectral GCNs are superior to spatial GCNs if the dataset consists of signals defined on many different graphs.

This is joint work with R. Levie, W. Huang, L. Bucci, and M. Bronstein.

Consensus-Based Optimization Over the Sphere: Theoretical Guarantees and Applications in Machine Learning

When: Wed, February 17, 2021 - 2:00pm
Where: https://umd.zoom.us/j/96550429672?pwd=ZnJGblUxZ0xlL1lsMXZMdHl3THRnZz09
Speaker: Prof. Massimo Fornasier (Department of Mathematics, Technical University of Munich) - https://www-m15.ma.tum.de/Allgemeines/MassimoFornasier
Abstract: We introduce a new stochastic Kuramoto-Vicsek type model for global optimization of nonconvex functions on the sphere. This model belongs to the class of Consensus-Based Optimization methods. In fact, particles move on the sphere driven by a drift towards an instantaneous consensus point, computed as a convex combination of the particle locations weighted by the cost function according to Laplace's principle, which represents an approximation to a global minimizer. The dynamics is further perturbed by a random vector field to favor exploration, whose variance is a function of the distance of the particles from the consensus point. In particular, as soon as consensus is reached, the stochastic component vanishes. In the first part of the talk, we study the well-posedness of the model and rigorously derive its mean-field approximation in the large-particle limit. The main results of the second part of the talk concern the proof of convergence of the numerical scheme to global minimizers, provided the initial datum is well prepared. The proof combines the mean-field limit results with a novel asymptotic analysis and classical convergence results for numerical methods for SDEs. We present several numerical experiments, which show that the algorithm scales well with the dimension and is extremely versatile. To quantify the performance of the new approach, we show that the algorithm performs essentially as well as ad hoc state-of-the-art methods, and in some instances obtains quantifiably better results on challenging problems in signal processing and machine learning, namely the phase retrieval problem and robust subspace detection.
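
A minimal sketch of a consensus-based optimization step on the sphere, assuming an illustrative cost, discretization and parameters rather than the speaker's scheme: particles drift toward a Laplace-principle consensus point, are perturbed by noise that shrinks with the distance to that point, and are projected back onto the sphere.

    # Consensus-based optimization on the unit sphere (illustrative discretization).
    import numpy as np

    rng = np.random.default_rng(0)

    def cbo_sphere(cost, dim=10, n_particles=200, steps=500,
                   lam=1.0, sigma=0.25, beta=30.0, dt=0.05):
        X = rng.standard_normal((n_particles, dim))
        X /= np.linalg.norm(X, axis=1, keepdims=True)
        for _ in range(steps):
            w = np.exp(-beta * cost(X))                       # Laplace-principle weights
            v = (w[:, None] * X).sum(axis=0) / w.sum()        # consensus point
            v /= np.linalg.norm(v)
            diff = X - v
            noise = sigma * np.linalg.norm(diff, axis=1, keepdims=True) \
                    * rng.standard_normal(X.shape) * np.sqrt(dt)
            X = X - lam * diff * dt + noise                   # drift + exploration noise
            X /= np.linalg.norm(X, axis=1, keepdims=True)     # project back onto the sphere
        return v

    # Toy nonconvex cost on the sphere whose global minimizer is e1 = (1, 0, ..., 0).
    e1 = np.eye(10)[0]
    cost = lambda X: 1.0 - X @ e1 + 0.1 * np.sin(5 * X @ e1) ** 2
    v = cbo_sphere(cost)
    print("distance of consensus point to the minimizer:", round(float(np.linalg.norm(v - e1)), 3))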

Deep Analog-to-Digital Compression: Tasks, Structures, and Models

When: Wed, March 3, 2021 - 2:00pm
Where: https://umd.zoom.us/j/96550429672?pwd=ZnJGblUxZ0xlL1lsMXZMdHl3THRnZz09
Speaker: Prof. Yonina Eldar (Electrical Engineering, Weizmann Institute of Science) -
Abstract: The famous Shannon-Nyquist theorem has become a landmark in the development of digital signal and image processing. However, in many modern applications, signal bandwidths have increased tremendously, while acquisition capabilities have not scaled sufficiently fast. Consequently, conversion to digital has become a serious bottleneck. Furthermore, the resulting digital data requires storage, communication and processing at very high rates, which is computationally expensive and requires large amounts of power. In the context of medical imaging, sampling at high rates often translates to high radiation dosages, increased scanning times, bulky medical devices, and limited resolution.

In this talk, we present a framework for sampling and processing a large class of wideband analog signals at rates far below Nyquist in space, time and frequency, which makes it possible to dramatically reduce the number of antennas, the sampling rates and the band occupancy.

Our framework relies on exploiting signal structure and the processing task. We consider applications of these concepts to a variety of problems in communications, radar and ultrasound imaging, and show several demos of real-time sub-Nyquist prototypes, including a wireless ultrasound probe, sub-Nyquist MIMO radar, super-resolution in microscopy and ultrasound, cognitive radio, and joint radar and communication systems. We then discuss how the ideas of exploiting the task, structure and model can be used to develop interpretable, model-based deep learning methods that adapt to existing structure and are trained from small amounts of data. These networks achieve a more favorable trade-off between the increase in parameters and data and the improvement in performance, while remaining interpretable.
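
A minimal sketch of the underlying principle of exploiting structure, not the speaker's Xampling hardware: a frequency-sparse signal is recovered from far fewer samples than its Nyquist grid would require, via orthogonal matching pursuit over a DFT dictionary; all sizes are illustrative.

    # Sub-Nyquist recovery of a frequency-sparse signal by orthogonal matching pursuit.
    import numpy as np

    rng = np.random.default_rng(0)
    N, m, k = 512, 64, 3                      # Nyquist-grid length, number of samples, sparsity
    true_bins = rng.choice(N, size=k, replace=False)
    coeffs = rng.standard_normal(k) + 1j * rng.standard_normal(k)

    F = np.exp(2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N) / np.sqrt(N)
    x = F[:, true_bins] @ coeffs              # signal sparse in the DFT dictionary
    rows = np.sort(rng.choice(N, size=m, replace=False))
    y, A = x[rows], F[rows, :]                # m << N nonuniform time samples

    def omp(A, y, k):
        """Greedy support recovery: pick the best-correlated atom, refit, repeat."""
        residual, support = y.copy(), []
        for _ in range(k):
            support.append(int(np.argmax(np.abs(A.conj().T @ residual))))
            sol, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ sol
        return support, sol

    support, sol = omp(A, y, k)
    print("true frequency bins     :", sorted(true_bins.tolist()))
    print("recovered frequency bins:", sorted(support))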

A regularity method for lower bounds on the Lyapunov exponent for stochastic differential equations

When: Wed, March 24, 2021 - 2:00pm
Where: https://go.umd.edu/cscamm_spring2021
Speaker: Prof. Jacob Bedrossian (Department of Mathematics, University of Maryland) - https://www.math.umd.edu/~jbedross/
Abstract: Chaos is commonly observed in a wide variety of high-dimensional systems in physics; however, there are few mathematical tools for obtaining positivity or quantitative estimates on Lyapunov exponents for the vast majority of physical systems. In a recent joint work with Alex Blumenthal and Sam Punshon-Smith, we put forward a new method for obtaining quantitative lower bounds on the top Lyapunov exponent of stochastic differential equations (SDEs) based on the Fisher information of a particular auxiliary Markov process. As an initial application, we prove the positivity of the top Lyapunov exponent for a class of weakly dissipative, weakly forced SDEs, and this class includes the Lorenz 96 model, originally introduced as a toy model for atmospheric dynamics, with any number of unknowns greater than or equal to 7 (the original model has 40). This is the first mathematically rigorous proof of chaos (in the sense of positive Lyapunov exponents) for Lorenz 96, despite the overwhelming numerical evidence of chaos. If time permits, I will also discuss the application of the method to more complicated models, such as finite-dimensional truncations of the classical shell models of hydrodynamic turbulence, GOY and SABRA.
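
A minimal sketch of the standard tangent-linear numerical estimate of the top Lyapunov exponent of Lorenz 96, for illustration only; the talk's contribution is a rigorous lower bound via Fisher information, which this sketch does not implement. Step size and run length are illustrative.

    # Numerical top Lyapunov exponent of deterministic Lorenz 96 (N = 40, F = 8).
    import numpy as np

    def l96(x, F=8.0):
        """dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1} - x_i + F (cyclic indices)."""
        return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

    def l96_jacvec(x, v):
        """Action of the Lorenz 96 Jacobian at x on a perturbation v."""
        return ((np.roll(v, -1) - np.roll(v, 2)) * np.roll(x, 1)
                + (np.roll(x, -1) - np.roll(x, 2)) * np.roll(v, 1) - v)

    def rk4(f, x, dt):
        k1 = f(x); k2 = f(x + dt / 2 * k1); k3 = f(x + dt / 2 * k2); k4 = f(x + dt * k3)
        return x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

    rng = np.random.default_rng(0)
    N, dt, steps, transient = 40, 0.05, 40_000, 4_000
    x = 8.0 + 0.01 * rng.standard_normal(N)          # near the unstable equilibrium x_i = F
    v = rng.standard_normal(N); v /= np.linalg.norm(v)
    log_growth = 0.0
    for n in range(steps):
        v = rk4(lambda u: l96_jacvec(x, u), v, dt)   # tangent dynamics (Jacobian frozen per step)
        x = rk4(l96, x, dt)
        if (n + 1) % 10 == 0:                        # renormalize to avoid overflow
            r = np.linalg.norm(v)
            log_growth += np.log(r)
            v /= r
        if n == transient:                           # discard the transient growth
            log_growth = 0.0
    print("estimated top Lyapunov exponent:", round(log_growth / ((steps - transient) * dt), 3))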

Probing Electron Transport with Light: A Flavor of Edge States in 2D Materials

When: Wed, April 21, 2021 - 2:00pm
Where: https://go.umd.edu/cscamm_spring2021
Speaker: Prof. Dionisios Margetis (Department of Mathematics, University of Maryland) - https://www.math.umd.edu/~diom/
Abstract: Recent studies in the properties of atomically thin materials have offered valuable insights into aspects of 2D electron transport. In many nanophotonics applications, 2D materials such as doped graphene behave as Ohmic conductors and allow for the excitation of fine-scale electromagnetic waves. These surface waves are tightly confined to the 2D material and can beat the optical diffraction limit. In another kinetic regime of interest, 2D electron systems exhibit a behavior similar to that of classical fluids. These findings point to the need for predictions that may guide future experiments. A goal is to understand how distinct kinetic regimes of 2D electron systems can be reasonably probed by electromagnetic signals.
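
A minimal sketch of the tight confinement mentioned above, assuming a quasi-static Drude-sheet model with roughly graphene-like, purely illustrative parameters: the TM surface-plasmon wavevector q = 2i*eps0*omega/sigma(omega) is compared with the free-space wavelength.

    # Confinement of the TM surface plasmon on a Drude conducting sheet (quasi-static).
    import numpy as np

    eps0 = 8.854e-12            # vacuum permittivity, F/m
    hbar = 1.055e-34            # J*s
    e = 1.602e-19               # elementary charge, C
    c = 2.998e8                 # speed of light, m/s

    E_F = 0.3 * e               # Fermi level ~0.3 eV (doped graphene, illustrative)
    gamma = 1e12                # relaxation rate, 1/s (illustrative)
    D = e**2 * E_F / (np.pi * hbar**2)          # Drude weight of the sheet

    def sigma_drude(omega):
        return 1j * D / (omega + 1j * gamma)    # intraband (Drude) sheet conductivity

    omega = 2 * np.pi * 30e12                   # 30 THz
    q = 2j * eps0 * omega / sigma_drude(omega)  # quasi-static TM plasmon dispersion
    lam_p = 2 * np.pi / q.real
    lam_0 = 2 * np.pi * c / omega
    print("plasmon wavelength (um)   :", round(lam_p * 1e6, 3))
    print("free-space wavelength (um):", round(lam_0 * 1e6, 2))
    print("confinement ratio lam0/lam_p:", round(lam_0 / lam_p, 1))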

In this talk, I will discuss recent theoretical progress in describing the existence of edge modes in flat anisotropic conducting sheets. Some emphasis will be placed on an emergent topological concept for such modes. If time permits, I will also discuss recent results on the excitation of bulk modes by antennas over viscous 2D electron systems.

Modeling fracture in hydrogels

When: Wed, May 5, 2021 - 2:00pm
Where: https://go.umd.edu/cscamm_spring2021
Speaker: Prof. Maria Cameron (Department of Mathematics, University of Maryland) - https://www.math.umd.edu/~mariakc/index.html
Abstract: Modeling and analysis of fracture in various materials is a problem of great practical importance. Typically, the breaking strength and work of fracture of a real material sample are reduced by orders of magnitude by the imperfections present in it. The micro-level physics of fracture is well understood for some materials, such as glass, while for others, such as hydrogels, it has yet to be explained. The goal of this work is to design and investigate a model that sheds light on the physics of breaking hydrogel. We propose a 2D network model featuring a nonlinear stress-strain relationship and a scatter in equilibrium link lengths. We investigate this model by means of numerical simulations and demonstrate a dramatic reduction in breaking strength compared to that of the corresponding perfect network.
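
A minimal sketch of how scatter in equilibrium lengths reduces breaking strength, using a drastically simplified parallel-bundle analogue rather than the 2D network model of the talk; the force law and all parameters are illustrative.

    # Parallel bundle of nonlinear links with scattered rest lengths (strain-controlled loading).
    import numpy as np

    rng = np.random.default_rng(0)

    def peak_force(rest_lengths, lam_crit=1.5, n_steps=400):
        """Peak total force of a parallel bundle stretched to lengths L in [1, 2]."""
        peak = 0.0
        for L in np.linspace(1.0, 2.0, n_steps):
            lam = L / rest_lengths                    # stretch of each link
            alive = lam < lam_crit                    # links break at a critical stretch
            stretch = np.clip(lam - 1.0, 0.0, None)   # links are slack under compression
            force = np.sum(np.where(alive, stretch ** 3, 0.0))   # nonlinear (stiffening) law
            peak = max(peak, force)
        return peak

    n = 1000
    perfect = np.ones(n)
    scattered = rng.uniform(0.8, 1.2, size=n)         # scatter in equilibrium link lengths
    print("perfect bundle peak force   :", round(peak_force(perfect), 1))
    print("disordered bundle peak force:", round(peak_force(scattered), 1))
    # Disorder makes links fail sequentially instead of simultaneously, lowering the peak force.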