<?xml version="1.0" encoding="UTF-8" ?>
	<rss version="2.0">
		<channel><title>Norbert Wiener Center</title><link>http://www-math.umd.edu/research/seminars.html</link><description></description><item>
	<title>Exponential bases for partitions of intervals</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 27 Sep 2021 14:00:00 EDT</pubDate>
	<description><![CDATA[When: Mon, September 27, 2021 - 2:00pm<br />Where: Online<br />Speaker: David Walnut (GMU) - <br />
Abstract: For a partition of [0,1] into intervals I1,...,In we prove the existence of a partition of ℤ into Λ1,..., Λn such that the complex exponential functions with frequencies in Λk form a Riesz basis for L2(Ik), and furthermore, that for any J⊆{1,2,...,n}, the exponential functions with frequencies in ⋃j∈J Λj form a Riesz basis for L2(I) for any interval I with length |I|=Σj∈J |Ij|. The construction extends to infinite partitions of [0,1], but with size limitations on the subsets J⊆ ℤ. The construction utilizes an interesting assortment of tools from analysis, probability, and number theory.<br />
This is joint work with Shauna Revay (GMU and Novetta), and Goetz Pfander (Catholic University of Eichstaett-Ingolstadt).<br />
<br />
Zoom Link for September 27th Webinar:<br />
<br />
https://umd.zoom.us/j/91097809301?pwd=Q1hXWXpkYlhDWGFHb1BLWnNGV2lmQT09<br />
<br />
Meeting ID: 910 9780 9301<br />
<br />
Meeting Passcode: FFT927<br />]]></description>
</item>

<item>
	<title>Empirical Bayesian inference using joint sparsity</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 04 Oct 2021 14:00:00 EDT</pubDate>
	<description><![CDATA[When: Mon, October 4, 2021 - 2:00pm<br />Where: Online<br />Speaker: Anne Gelb (Dartmouth College) - <br />
Abstract: <br />
<br />
Meeting ID: 984 1143 6340<br />
<br />
Meeting Passcode: 104593<br />]]></description>
</item>

<item>
	<title>Deep Approximation via Deep Learning</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 11 Oct 2021 12:00:00 EDT</pubDate>
	<description><![CDATA[When: Mon, October 11, 2021 - 12:00pm<br />Where: Online<br />Speaker: Prof. Zuowei Shen (National University of Singapore) - <br />
Abstract: The primary task of many applications is approximating/estimating a function through samples drawn from a probability distribution on the input space. Deep approximation approximates a function by compositions of many layers of simple functions, which can be viewed as a series of nested feature extractors. The key idea of a deep learning network is to convert layers of compositions into layers of tunable parameters that can be adjusted through a learning process, so that it achieves a good approximation with respect to the input data. In this talk, we shall discuss the mathematical theory behind this new approach and the approximation rate of deep networks; we will also discuss how this new approach differs from classic approximation theory, and how this new theory can be used to understand and design deep learning networks.<br />
<br />
Zoom Link for October 11th Webinar:<br />
<br />
https://umd.zoom.us/j/99381978785?pwd=V3ozeEpMVTI2TDZwcFVKNk9FcktnQT09<br />
<br />
Meeting ID: 993 8197 8785<br />
<br />
Meeting Passcode: 513626<br />]]></description>
</item>

<item>
	<title>Wilson Statistics: Derivation, Generalization, and Applications to Cryo-EM</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 18 Oct 2021 14:00:00 EDT</pubDate>
	<description><![CDATA[When: Mon, October 18, 2021 - 2:00pm<br />Where: Online<br />Speaker: Amit Singer (Princeton) - <br />
Abstract: The power spectrum of proteins at high frequencies is remarkably well described by the flat Wilson statistics. Wilson statistics therefore plays a significant role in X-ray crystallography and more recently in cryo-EM. Specifically, modern computational methods for three-dimensional map sharpening and atomic modeling of macromolecules by single particle cryo-EM are based on Wilson statistics. In this talk we use certain results about the decay rate of the Fourier transform to provide the first rigorous mathematical derivation of Wilson statistics. The derivation pinpoints the regime of validity of Wilson statistics in terms of the size of the macromolecule. Moreover, the analysis naturally leads to generalizations of the statistics to covariance and higher order spectra. These in turn provide theoretical foundation for assumptions underlying the widespread Bayesian inference framework for three-dimensional refinement and for explaining the limitations of autocorrelation based methods in cryo-EM.<br />
<br />
Meeting ID: 936 8202 8028<br />
Meeting Passcode: 783856<br />]]></description>
</item>

<item>
	<title>Expanding Quantum Field Theory Using Affine Quantization</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 25 Oct 2021 14:00:00 EDT</pubDate>
	<description><![CDATA[When: Mon, October 25, 2021 - 2:00pm<br />Where: Online<br />Speaker: John Klauder (University of Florida) - <br />
Abstract: Quantum field theory uses canonical quantization (CQ), and often fails, e.g., $\varphi^4_4$, etc. Affine quantization (AQ) - which will be introduced - can solve a variety of problems that CQ cannot. AQ can even be used to solve certain models regarded as nonrenormalizable. The specific procedures of AQ lead to a novel Fourier transformation that illustrates how AQ can create a generous contribution to quantum field theory.<br />
<br />
Meeting ID: 952 9031 5793<br />
Meeting Passcode: 582647<br />]]></description>
</item>

<item>
	<title>Climbing the Diagonal Clifford Hierarchy</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 08 Nov 2021 14:00:00 EST</pubDate>
	<description><![CDATA[When: Mon, November 8, 2021 - 2:00pm<br />Where: Online<br />Speaker: Prof. Robert Calderbank (Duke U.) - <br />
Abstract: Quantum computers are moving out of physics labs and becoming generally programmable. In this talk, we start from quantum algorithms like magic state distillation and Shor factoring that make essential use of diagonal logical gates. The difficulty of reliably implementing these gates in some quantum error correcting code (QECC) is measured by their level in the Clifford hierarchy, a mathematical framework that was defined by Gottesman and Chuang when introducing the teleportation model of quantum computation. We describe a method of working backwards from a target logical diagonal gate at some level in the Clifford hierarchy to a quantum error correcting code (CSS code) in which the target logical can be implemented reliably.<br />
<br />
This talk describes joint work with my graduate students Jingzhen Hu and Qingzhong Liang.<br />
<br />
Meeting ID: 937 3374 5978<br />
Meeting Passcode: FFT<br />]]></description>
</item>

<item>
	<title>Design of Unbiased, Adaptive and Robust AI Systems</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 22 Nov 2021 14:00:00 EST</pubDate>
	<description><![CDATA[When: Mon, November 22, 2021 - 2:00pm<br />Where: Online<br />Speaker: Rama Chellappa (JHU) - <br />
Abstract: Over the last decade, algorithms and systems based on deep learning and other data-driven methods have contributed to the reemergence of Artificial Intelligence-based systems with applications in national security, defense, medicine, intelligent transportation, and many other domains. However, another AI winter may be lurking around the corner if challenges due to bias, domain shift and lack of robustness to adversarial attacks are not considered while designing the AI systems. In this talk, I will present our approach to bias mitigation and designing AI systems that are robust to domain shift and a variety of adversarial attacks.<br />
Zoom Link for November 22nd Webinar:<br />
<br />
https://umd.zoom.us/j/93733745978?pwd=UWVSOG5lblVvNThPS2RoVFVSWjRGZz09<br />
<br />
Meeting ID: 937 3374 5978<br />
Meeting Passcode: FFT<br />]]></description>
</item>

<item>
	<title>Complex Networks Workshop: Analysis, Numerics, and Applications</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Fri, 18 Feb 2022 09:30:00 EST</pubDate>
	<description><![CDATA[When: Fri, February 18, 2022 - 9:30am<br />Where: Kirwan Hall 3206<br />Speaker: Assorted (Various) - https://www.norbertwiener.umd.edu/Events/2022/ComplexNetworksWorkshop.html<br />
Abstract: See schedule at: https://www.norbertwiener.umd.edu/Events/2022/ComplexNetworksWorkshop.html<br />]]></description>
</item>

<item>
	<title>Complex Networks Workshop: Analysis, Numerics, and Applications</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Sat, 19 Feb 2022 09:30:00 EST</pubDate>
	<description><![CDATA[When: Sat, February 19, 2022 - 9:30am<br />Where: Kirwan Hall 3206<br />Speaker: Assorted (Various) - https://www.norbertwiener.umd.edu/Events/2022/ComplexNetworksWorkshop.html<br />
Abstract: Full schedule at:  https://www.norbertwiener.umd.edu/Events/2022/ComplexNetworksWorkshop.html<br />]]></description>
</item>

<item>
	<title>A Spline Perspective of Deep Learning</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 28 Feb 2022 14:00:00 EST</pubDate>
	<description><![CDATA[When: Mon, February 28, 2022 - 2:00pm<br />Where: Online<br />Speaker: Richard Baraniuk (Rice University) - <br />
Abstract: We study the geometry of deep learning through the lens of approximation theory via spline functions and operators. Our key result is that a large class of deep networks (DNs) can be written as a composition of continuous piecewise affine (CPA) splines, which provide a powerful portal through which to interpret and analyze their inner workings. We explore links to the classical theory of optimal classification via matched filters, the effects of data memorization, vector quantization (VQ), and K-means clustering, which open up new geometric avenues for studying how DNs organize signals in a hierarchical and multiscale fashion.<br />
<br />
Join Zoom Meeting<br />
https://umd.zoom.us/j/93733745978?pwd=UWVSOG5lblVvNThPS2RoVFVSWjRGZz09<br />
<br />
Meeting ID: 937 3374 5978<br />
Passcode: FFT<br />]]></description>
</item>

<item>
	<title>A Harnack inequality for certain degenerate/singular elliptic PDEs shaped by convex functions</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 07 Mar 2022 14:00:00 EST</pubDate>
	<description><![CDATA[When: Mon, March 7, 2022 - 2:00pm<br />Where: Online<br />Speaker: Diego Maldonado (Kansas State University) - <br />
Abstract: Certain convex functions on $R^n$ produce quasi-distances and Borel measures on $R^n$. Furthermore, their Hessians can shape degenerate/singular elliptic operators. We will describe geometric and measure-theoretic conditions on convex functions that allow one to prove a Harnack inequality for nonnegative solutions to their associated elliptic operators.<br />
<br />
Zoom Link for March 7th Webinar:<br />
<br />
https://umd.zoom.us/j/93733745978?pwd=UWVSOG5lblVvNThPS2RoVFVSWjRGZz09<br />
<br />
Meeting ID: 937 3374 5978<br />
Passcode: FFT<br />]]></description>
</item>

<item>
	<title>Passive Source Localization</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 14 Mar 2022 14:00:00 EDT</pubDate>
	<description><![CDATA[When: Mon, March 14, 2022 - 2:00pm<br />Where: Online<br />Speaker: Margaret Cheney (Colorado State University) - <br />
Abstract: This talk introduces the problem of localizing electromagnetic sources from measurements of their radiated fields at two moving sensors. Two approaches are discussed, the first based on measuring quantities known as “time difference of arrivals” and “frequency difference of arrivals”. This approach leads to some interesting geometrical problems. The second approach is a synthetic-aperture approach that also involves some unsolved problems.<br />
<br />
<br />
Zoom Link for March 14th Webinar:<br />
<br />
https://umd.zoom.us/j/93733745978?pwd=UWVSOG5lblVvNThPS2RoVFVSWjRGZz09<br />
<br />
Meeting ID: 937 3374 5978<br />
Passcode: FFT<br />]]></description>
</item>

<item>
	<title>Online nonnegative matrix factorization and applications</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 28 Mar 2022 14:00:00 EDT</pubDate>
	<description><![CDATA[When: Mon, March 28, 2022 - 2:00pm<br />Where: Online<br />Speaker: Deanna Needell (UCLA) - <br />
Abstract: Online Matrix Factorization (OMF) is a fundamental tool for dictionary learning problems, giving an approximate representation of complex data sets in terms of a reduced number of extracted features. Convergence guarantees for most of the OMF algorithms in the literature assume independence between data matrices, and the case of dependent data streams remains largely unexplored. In this talk, we present results showing that a non-convex generalization of the well-known OMF algorithm for i.i.d. data converges almost surely to the set of critical points of the expected loss function, even when the data matrices are functions of some underlying Markov chain satisfying a mild mixing condition. As the main application, by combining online non-negative matrix factorization and a recent MCMC algorithm for sampling motifs from networks, we propose a novel framework of Network Dictionary Learning that extracts 'network dictionary patches' from a given network in an online manner that encodes main features of the network. We demonstrate this technique on real-world data and discuss recent extensions and variations.<br />
<br />
Zoom Link for March 28th Webinar:<br />
<br />
https://umd.zoom.us/j/93733745978?pwd=UWVSOG5lblVvNThPS2RoVFVSWjRGZz09<br />
<br />
Meeting ID: 937 3374 5978<br />
Passcode: FFT<br />]]></description>
</item>

<item>
	<title>Data Representation Learning from a Single Pass of the Data</title>
	<link>http://www-math.umd.edu/research/seminars.html</link>
	<pubDate>Mon, 04 Apr 2022 14:00:00 EDT</pubDate>
	<description><![CDATA[When: Mon, April 4, 2022 - 2:00pm<br />Where: Online<br />Speaker: Prof. Alexander Cloninger (UCSD) - <br />
Abstract: In many applications, constructing kernel matrices or pairwise distance matrices can be prohibitively expensive.  This can be due to the expense of storing data in a streaming setting, or the expense of accessing multiple large data sets to compute some statistical distance between them. In this talk, I highlight several settings in which we can compute representations of data points (or entire point clouds) on a single pass. This includes Linearized Optimal Transport for computing Wasserstein distances between distributions and performing supervised learning, boosted kernel regression on streaming data, and streaming quantization of translation invariant kernel feature spaces. <br />
<br />
Zoom Link for April 4th Webinar:<br />
<br />
https://umd.zoom.us/j/93733745978?pwd=UWVSOG5lblVvNThPS2RoVFVSWjRGZz09<br />
<br />
Meeting ID: 937 3374 5978<br />
Passcode: FFT<br />]]></description>
</item>


	</channel>
</rss>