Posted on Saturday, November 2, 2019
The NESS Colloquium series will continue on Thursday, November 14th at Boston
University with a talk by Professor David Dunson from Duke University. This
colloquium is open to the general public and should be accessible to those
interested in statistics at all levels. The event is sponsored by the New
England Statistical Society (NESS) and the Department of Mathematics and
Statistics at Boston University.
Time: Thursday, November 14, 2019
Starting at 4 pm (refreshments served beginning at 3:25 pm)
Location: College of General Studies Building
871 Commonwealth Avenue, Boston, MA, 02215
Room CGS 129
Professor David Dunson is the Arts and Sciences Professor of Statistical Science at
Duke University. He has received numerous awards, including the COPSS Presidents'
Award, and is a Fellow of the Institute of Mathematical Statistics and the
American Statistical Association.
Professor Dunson’s research spans numerous areas of statistics with a focus on
scalable procedures with provable guarantees that can be applied to complex
data structures. His work has had broad impact within the statistics community
and in many other fields including biomedical research, genomics, ecology,
criminal justice, and neuroscience.
Learning & Exploiting Low-Dimensional Structure in High-Dimensional Data
This talk will focus on the problem of learning low-dimensional geometric
structure in high-dimensional data. We allow the lower-dimensional subspace to
be non-linear. There are a variety of algorithms available for “manifold
learning” and non-linear dimensionality reduction, most of which rely on locally
linear approximations and do not provide a likelihood-based approach to
inference. We propose a new class of simple geometric dictionaries for
characterizing the subspace, along with a simple optimization algorithm and
a model-based approach to inference. We provide strong theoretical support, in
the form of tight bounds on covering numbers, showing the advantages of our
approach relative to locally linear dictionaries. These advantages carry over
to practical performance in a variety of settings including manifold learning,
manifold de-noising, data visualization (providing a competitor to the popular
t-SNE), classification (providing a competitor to deep neural networks that
requires fewer training examples), and geodesic distance estimation. We
additionally provide a Bayesian nonparametric methodology for inference, using
a new class of kernels, which is shown to outperform current methods, such as
mixtures of multivariate Gaussians.
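The abstract contrasts the proposed geometric dictionaries with the locally linear approximations that most manifold-learning algorithms rely on. As a point of reference only, here is a minimal sketch of that locally linear baseline applied to manifold de-noising: each point of a noisy circle is projected onto a one-dimensional affine subspace fit by PCA to its nearest neighbors. The function name and the parameters k and d are illustrative assumptions; this is not the method presented in the talk.

```python
# A minimal sketch of manifold de-noising via locally linear
# approximation (local PCA), the baseline the abstract contrasts
# with geometric dictionaries. All names and parameter choices here
# are illustrative assumptions, not the talk's method.
import numpy as np

rng = np.random.default_rng(0)

# Noisy samples from a 1-D manifold (the unit circle) embedded in R^2.
theta = rng.uniform(0, 2 * np.pi, 400)
X = np.column_stack([np.cos(theta), np.sin(theta)])
X += 0.08 * rng.standard_normal(X.shape)

def local_pca_denoise(X, k=20, d=1):
    """Project each point onto the d-dimensional affine subspace
    fit by PCA to its k nearest neighbors."""
    out = np.empty_like(X)
    for i, x in enumerate(X):
        # k nearest neighbors by Euclidean distance (includes the point itself).
        idx = np.argsort(np.linalg.norm(X - x, axis=1))[:k]
        nbrs = X[idx]
        mu = nbrs.mean(axis=0)
        # Top-d principal directions of the centered neighborhood.
        _, _, Vt = np.linalg.svd(nbrs - mu, full_matrices=False)
        V = Vt[:d].T                        # (ambient_dim, d) orthonormal basis
        out[i] = mu + V @ (V.T @ (x - mu))  # orthogonal projection onto the fit
    return out

X_denoised = local_pca_denoise(X)
# The projected points should lie closer to the unit circle.
print("mean |r - 1| before:", np.abs(np.linalg.norm(X, axis=1) - 1).mean())
print("mean |r - 1| after: ", np.abs(np.linalg.norm(X_denoised, axis=1) - 1).mean())
```

On this toy example the projection should pull points noticeably closer to the circle; the curved dictionaries described in the talk aim to improve on such flat local fits, particularly where the manifold's curvature is high.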