Students in the Master's programs in Statistics:
Questions about specific course work should be directed to the
advising team (email address: firstname.lastname@example.org) or to your assigned faculty advisor.
Please go to this page to set up an appointment
with me or other advisors. Students in the dual Master's program are
welcome to sign up for an appointment with me.
- Office hours in Winter 2020: 10--11:15am Thursdays, WH 461, or by email appointment
Prospective PhD students: Thank you for your interest. Admissions decisions are
made by the graduate admissions committee; please see
this link for further information. My apologies if I am unable to respond to your inquiry
due to the large volume of such emails.
- Nonparametric Bayesian statistics
- Machine learning and optimization
- Hierarchical, mixture and graphical models
- Spatiotemporal and functional data analysis
- Stochastic, variational and geometric methods in statistical inference
Synopsis: Statistical inference is the computational process of turning data into statistics, prediction
and understanding. I work with richly structured data, such as those
extracted from texts, images and other complex data sources.
I am particularly interested in a
field of statistics known as Bayesian nonparametrics, which provides
a fertile and powerful mathematical framework for developing
computational and statistical modeling ideas. The spirit of
Bayesian nonparametric statistics is to enable inferential procedures
in which both the statistical model
and the computational complexity can adapt to increasingly large and
complex data in a graceful and effective way.
In this framework, stochastic processes and random measures, along with
latent variable models such as mixture, hierarchical and graphical
models, figure prominently. My students and I seek to understand
the connections between statistical inference and the theory of optimal transport
that arise in the learning of complex hierarchical models.
My motivation for all this came originally from an early interest in
machine learning, which continues to be an area of major activity.
A primary focus in our machine learning research is to develop more effective
inference algorithms using variational, stochastic and geometric viewpoints.
Former students and postdocs
- Mikhail Yurochkin PhD 2018;
Research scientist, IBM Research, Cambridge, MA and MIT-IBM Watson AI Lab
- Nhat Ho PhD 2017;
Postdoctoral fellow, University of California, Berkeley
- Hossein Keshavarz PhD 2017;
Postdoctoral fellow at the IMA, University of Minnesota,
now Senior Researcher at General Motors, MI
- Zhaoshi Meng PhD 2014;
Senior Researcher, Vicarious, CA
- Arash Ali Amini, Postdoctoral fellow 2011--2014;
Assistant Professor, University of California, Los Angeles
- Vijay Manikandan Janakiraman PhD 2013;
Research Scientist, NASA Ames Research Center
- Jian Tang, Visiting PhD student from Peking University 2011--2013;
Assistant Professor, Université de Montréal
- Kohinoor Dasgupta, PhD 2012;
Senior Biostatistician, Novartis India
- Cen Guo, PhD 2012;
Data Scientist, Uber, CA
Undergraduate honors thesis advisees
- Ziyi Song (AMDP)
- Sijun Zhang (AMDP)
- Jawad Mroueh, MS 2019
- Bopeng Li, MS 2012; in Statistics PhD program, University of Michigan
- 2019--2020: Jingyi Jia (graduate student at UM Statistics)
- 2018--2019: Yingsi Jian (graduate student at Harvard), Jiayue Lu (graduate student
at Univ of Southern California)
- 2017--2018: Jiahui Ji (graduate student at UM Biostatistics), Zui Chen (graduate student
at Parsons School of Design, NYC)
- Giuseppe Di Benedetto, Visiting PhD student from Oxford University
- Federico Camerlenghi, Postdoctoral visiting scholar, April--May 2016;
Assistant Professor, University of Milano-Bicocca
- Hyun-Chul Kim, Visiting Scholar 2010--2011; Research Professor, Yonsei University, Korea
Selected talk slides
Parameter estimation and interpretability in Bayesian mixture models.
Keynote talk, 12th Bayesian Nonparametrics Conference, Oxford, June 2019.
Elements of data science.
Summer School on Data Science,
Vietnam Institute for Advanced Study in Mathematics,
Hanoi and Ho Chi Minh, May 2017.
Multi-level clustering with contexts via hierarchical nonparametric Bayesian inference.
Biostatistics Seminar, University of Michigan, October 2016.
Singularity structures and parameter estimation in finite mixture models.
Workshop on Empirical Likelihood Methodology,
National University of Singapore, June 2016.
Topic modeling with more confidence: a theory and some algorithms.
Keynote talk, Pacific-Asia Knowledge Discovery and Data Mining Conference, Ho Chi Minh, May 2015.
Borrowing strength in hierarchical Bayes: convergence of the Dirichlet base measure.
9th Bayesian Nonparametrics Conference, Amsterdam, June 2013.
Convergence of latent mixing measures in finite and infinite mixture models.
Bayesian Nonparametrics Workshop at ICERM, Providence, September 2012.
Clustering problems, mixture models and Bayesian nonparametrics.
VIASM Summer School, Hanoi, July 2012.
[Additional notes]
Message-passing sequential detection of multiple change points in networks.
IEEE Symposium on Information Theory, Boston.
Inference of functional clusters from non-functional data.
Midwest Statistics Research Colloquium, Madison, March 2012.
Dirichlet labeling and hierarchical processes for clustering functional data.
IMS-China Conference, Xi'an, July 2011.
Decentralized decision making with spatially distributed data.
AI Seminar, University of Michigan, Oct 2009.
Surrogate loss functions, divergences and decentralized detection.
Thesis Talk, UC Berkeley, May 2007.
Anomaly and sequential detection with time series data.
Tutorial lectures given at Berkeley, 2006.