Probability, Analysis, and Data Science Seminar, 2018-2019

Spring Semester

 

Wednesdays, 3:10pm, Carver 290

 

Speaker

Title and Abstract

1/23/19

Hung Nguyen (ISU)

The generalized Langevin equation with power-law memory

1/30/19

2/6/19

David Herzog, ISU

Title: Ergodicity and Lyapunov functions for Langevin dynamics with singular potentials

Abstract: We discuss Langevin dynamics of N particles on R^d interacting through a singular repulsive potential, e.g., one of the well-known Lennard-Jones type, and show that the system converges to the unique invariant Gibbs measure exponentially fast in a weighted total variation distance. The proof of the result turns on an explicit construction of a Lyapunov function. In contrast to previous results for such systems, our result implies geometric convergence to equilibrium starting from an essentially optimal family of initial distributions.
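
For orientation (our notation, not necessarily the speaker's), the dynamics in question are usually written, for unit masses, as the degenerate SDE

dq_t = p_t \, dt, \qquad dp_t = -\nabla U(q_t) \, dt - \gamma p_t \, dt + \sqrt{2\gamma/\beta} \, dW_t,

where U is the (singular) interaction potential, \gamma > 0 is the friction, \beta > 0 the inverse temperature, and W a standard Brownian motion; the invariant Gibbs measure is then proportional to \exp\big(-\beta(|p|^2/2 + U(q))\big) \, dq \, dp.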

2/13/19

Hrvoje Sikic (University of Zagreb & WUSTL)

Title: Wavelets and low pass filters

Abstract: Given a principal shift-invariant space V and the dyadic dilation D, we consider when the space V is contained in D(V). We completely resolve this problem via an emphasis on generalized filter studies. We show that the entire generalized MRA theory is a natural consequence of this approach, with a detailed classification of all the special cases of what we term Pre-GMRA structures. Special attention is devoted to the analysis of the general form of a filter associated with the space V. Our theory splits into two subcases, based on the filter properties with respect to dyadic orbits; we distinguish the "full-orbit" case and the "non-full-orbit" case. In both cases we introduce new Tauberian conditions which provide complete characterizations of "usable" filters. This approach further splits into the analysis of low frequencies versus high frequencies. There is a fundamental new result here which shows that, based on the "ergodic properties" of the generating function, the two frequency regimes exhibit radically different behavior; low frequencies allow completely localized adjustments, while high frequencies can only be treated in a global sense. Various known results, such as the Smith-Barnwell condition, the Cohen condition and its generalizations, and the Lawton condition and its generalizations, are extracted naturally from our general approach. This is joint work with P. Luthy, F. Soria, G. Weiss, and E. N. Wilson.
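
For readers unfamiliar with the classical conditions named above: with the torus identified with [0,1), the Smith-Barnwell (quadrature mirror filter) condition on a low-pass filter m reads

|m(\xi)|^2 + |m(\xi + 1/2)|^2 = 1 \quad \text{for a.e. } \xi,

and the Cohen and Lawton conditions are further classical criteria for such a filter to generate an orthonormal wavelet; the general approach described in the abstract recovers these as special cases.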

2/20/19

David Herzog (ISU)

Title: Ergodicity and Lyapunov functions for Langevin dynamics with singular potentials

2/27/19

Krishna Athreya (ISU)

Title: What is Brownian motion on R_+?

Abstract: An easy and quick answer is that it is a Gaussian process on R_+ with prescribed mean and covariance. Does such a process exist? It does indeed. For this one needs Kolmogorov's consistency theorem. One problem with this construction is that the sample space is too big while the sigma algebra is very small. Wiener gave an alternative construction with sample space the set of all continuous functions on R_+; the resulting paths are almost surely nowhere differentiable. We shall present these results, along with some ergodic theorems similar to Birkhoff's but with denominator \sqrt{t}, and a conditional limit theorem for the past and the future given that the present lies in a bounded open set.
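
For reference, the prescribed mean and covariance are the standard ones: standard Brownian motion on R_+ is the centered Gaussian process (B_t)_{t \ge 0} with

E[B_t] = 0, \qquad \mathrm{Cov}(B_s, B_t) = \min(s, t),

and the existence question above asks whether a process with these finite-dimensional distributions can be realized on a suitable sample space.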

3/6/19

3/13/19

John Jasper (SDSU)

Title: Equiangular tight frames: difference sets and their generalizations

Abstract: Several applications in signal processing require packings of lines through the origin of a finite-dimensional Hilbert space with the property that the smallest interior angle between any pair of lines is as large as possible. Packings that achieve equality in the Welch bound are known as equiangular tight frames (ETFs). The central problem in the study of ETFs is the question of existence, that is, given integers n and d, does there exist an ETF with n lines in d-dimensional space? To tackle this problem the primary approach has been to develop new constructions of ETFs. One of the oldest general constructions uses harmonic analysis on finite abelian groups, together with special subsets of the group called difference sets. In this talk we will discuss two recently discovered constructions and how they were directly inspired by the difference set construction. In the first we generalize abelian groups to association schemes, and find the proper analogue of difference sets in this setting. In the second we ignore the algebra to find the underlying combinatorics behind several known types of difference sets.
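
For context (our statement, not part of the abstract): the Welch bound asserts that any n unit vectors x_1, ..., x_n in a d-dimensional Hilbert space, with n \ge d, satisfy

\max_{i \ne j} |\langle x_i, x_j \rangle| \ge \sqrt{\frac{n-d}{d(n-1)}},

and an equiangular tight frame is precisely a configuration attaining equality.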

3/27/19

Eric Weber (ISU)

Conjugate Phase Retrieval in the Paley-Wiener Space

4/3/19

Eric Weber (ISU)

Conjugate Phase Retrieval in the Paley-Wiener Space

4/10/19


4/17/19

Praneeth Narayanamurthy (ECPE, ISU)

Title: Subspace Learning from Bad Data -- Part 1: (dynamic) Robust PCA

Abstract: Principal Components Analysis (PCA), a.k.a. subspace learning, is one of the most widely used dimension reduction techniques. It is a preprocessing step for a large number of data analysis applications. The underlying assumption for PCA is that the true data ("signal") sequence lies close to a low-dimensional subspace of the ambient space. For long sequences, one can also consider a dynamic setting in which the signal subspace is allowed to change with time. This talk will discuss two problems that involve PCA or subspace learning from "bad" data. Here the term "bad" can mean one of several things: the signals could have missing entries and/or outliers, or the observed data could be a nonlinear function of the signals. The former is the well-known Robust PCA problem and is posed as decomposing a given data matrix into the sum of a low-rank matrix and a sparse matrix. In a recent body of work, we have studied its dynamic version (Robust Subspace Tracking) and have introduced the first provably correct and practically usable online solution framework for it, which we call Recursive Projected Compressive Sensing (ReProCS). The latter is a problem we have recently studied for the specific setting in which the measurements are phaseless (magnitude-only) linear projections of each column of the unknown low-rank matrix. We call this Phaseless PCA. For this problem, too, we provide a simple alternating-minimization solution and guarantee that its sample complexity is significantly better than that of single (unstructured) phase retrieval solutions. Applications in video analytics and imaging are discussed.
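
As a rough illustration of the low-rank-plus-sparse decomposition described above, the following Python sketch (a generic alternating heuristic; the function name and thresholding rule are ours, and this is not the ReProCS or Phaseless PCA algorithms from the talk) splits a data matrix M into a low-rank part L and a sparse outlier part S:

import numpy as np

def lowrank_plus_sparse(M, rank, thresh, n_iter=50):
    # Heuristic split M ~ L + S with L of rank at most `rank` and S sparse.
    # Alternates a truncated SVD for L with hard-thresholding of the
    # residual for S.  Illustrative only.
    S = np.zeros_like(M, dtype=float)
    for _ in range(n_iter):
        # Low-rank step: best rank-`rank` approximation of the
        # outlier-corrected data M - S.
        U, sig, Vt = np.linalg.svd(M - S, full_matrices=False)
        L = (U[:, :rank] * sig[:rank]) @ Vt[:rank, :]
        # Sparse step: residual entries with large magnitude are treated
        # as outliers and absorbed into S.
        R = M - L
        S = np.where(np.abs(R) > thresh, R, 0.0)
    return L, S

In the video application mentioned in the abstract, the columns of M would be vectorized frames, with L modeling the (approximately low-rank) background and S the sparse moving foreground.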

4/24/19

Nate Harding

5/1/19

Fernando Charro (Wayne State)

Title: The Aleksandrov-Bakelman-Pucci Maximum Principle revisited. Applications

Abstract: The Aleksandrov-Bakelman-Pucci maximum principle is a crucial tool in Caffarelli's proof of the Harnack inequality for fully nonlinear uniformly elliptic equations, and the doorstep to regularity theory. In this talk we will prove the Aleksandrov-Bakelman-Pucci estimate for a class of nonlinear, possibly degenerate, elliptic and parabolic equations that include the p-Laplacian and the Mean Curvature Flow, among others. This is joint work with Roberto Argiolas and Ireneo Peral. Time permitting, we will present a proof, due to X. Cabré, of the classical isoperimetric inequality with the best constant using the ABP method.
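
For orientation (our statement, in its classical linear form rather than the degenerate setting of the talk): if u \in C^2(\Omega) \cap C(\bar{\Omega}) satisfies a^{ij} D_{ij} u \ge f in a bounded domain \Omega \subset R^n with (a^{ij}) positive definite, then

\sup_{\Omega} u \le \sup_{\partial\Omega} u + C(n) \, \mathrm{diam}(\Omega) \, \big\| f^- / (\det(a^{ij}))^{1/n} \big\|_{L^n(\Gamma^+)},

where \Gamma^+ is the upper contact set of u. The talk extends estimates of this type to degenerate operators such as the p-Laplacian.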


Fall Semester

 

Wednesdays, 3:10pm, Carver 290

 

Speaker

Title and Abstract

8/22/18

Organizational Meeting

8/29/18

Jennifer Newman (ISU)

Title: What’s in a Picture: Steganography and Digital Image Forensics

Abstract: This is an introductory talk on steganography, detection of steganography (steganalysis), and the CSAFE Project StegoAppDB.

9/5/18

Caleb Camrud, Evan Camrud, and Lee Przybylski (ISU)

Stability of the Kaczmarz Reconstruction for Stationary Sequences

9/12/18

Cathy O'Neil, Miller Lecture
"The Dark Side of Big Data"
9/11/18, 7:00pm, Great Hall, MU

9/19/18

Alex Neal-Riasanovsky (ISU)

Title: There's Probably Analysis and Data Science in a Graphon

Abstract: Graph limits (graphons) are an analytic version of the combinatorial object known as a graph. Born out of the Theory Group of Microsoft Research in Redmond, Washington in 2003, and motivated by long-standing trends in extremal combinatorics, data analytics, probability theory, and computer science, graphons have since spawned several new tools and unified old ones under a common theme. In this talk, we survey some recent results and applications.
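
For reference (our phrasing, not the speaker's): a graphon is a symmetric measurable function W : [0,1]^2 \to [0,1], and a finite graph G on vertices {1, ..., n} corresponds to the step graphon taking the value 1 on the square [(i-1)/n, i/n) \times [(j-1)/n, j/n) exactly when ij is an edge of G, and 0 otherwise.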

9/26/18

10/3/18

Eric Weber (ISU)

Neural Networks and Ridgelet Transforms

10/10/18

Krishna Athreya (ISU)

Title: What is standard Brownian motion? Construction and some basic properties.

10/17/18

10/24/18

Tim McNicholl (ISU)

10/31/18

INFAS at UNL, 11/3/18

11/7/18

Joey Iverson (ISU)

11/14/18

Tom Needham (OSU)

Title: Gromov-Monge Metrics and Distance Distributions

Abstract: Applications in data analysis and computer vision often require a registration between objects; that is, a map from one object to another with minimal distortion of geometry. We give a flexible notion of object comparison which captures this idea by defining a metric on the space of all metric measure spaces (metric spaces endowed with probability measures). The metric, called Gromov-Monge distance, is defined by blending ideas from the theory of optimal transport with the Gromov-Hausdorff construction. We show that this distance has polynomial-time computable lower bounds defined in terms of classical invariants of metric measure spaces called distance distributions. Using tools from topological data analysis, we provide rigorous results on the effectiveness of these lower bounds when restricted to simple classes of mm-spaces such as metric graphs or plane curves. This is joint work with Facundo Mémoli.
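
For reference (our notation): the distance distribution of a metric measure space (X, d_X, \mu_X) is the push-forward of \mu_X \otimes \mu_X under the map (x, x') \mapsto d_X(x, x'), i.e., the law of the distance between two independent \mu_X-distributed points; it is invariant under measure-preserving isometries, which is why it yields computable lower bounds for metrics of Gromov-Hausdorff or Gromov-Monge type.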

11/28/18

Ananda Weerasinghe (ISU)

Optimal admission policies for matching queues

12/5/18

Ananda Weerasinghe (ISU)

Optimal admission policies for matching queues


For more information contact:

 

Eric Weber; 454 Carver Hall; 294-8151; E-mail esweber at iastate dot edu

 

David Herzog; 474 Carver Hall; 294-6408; E-mail dherzog at iastate dot edu