Since its introduction and application to the Central Limit Theorem in 1972, Stein’s method has offered a novel way of evaluating the quality of distributional approximations through its use of characterizing equations. The method can often produce not only asymptotic information on the error made when approximating a complicated distribution by a simpler one, but also rates of convergence and, in some cases, finite sample bounds with computable constants. In addition, Stein’s method can often be applied in the presence of complicated dependence, generalizing, in the case of Normal approximation, the classical Berry-Esseen theorem in a number of directions. The characterizing equation approach to distributional approximation is not specific to the normal, and Stein’s method has successfully been applied to dozens of distributions, both classical and less well known.
Following on its success in distributional approximation, the techniques developed for the method have found useful connections to Gaussian inequalities and Malliavin calculus, concentration of measure inequalities, and high dimensional statistics.

The course will cover the fundamentals of Stein’s method, starting with distributional approximation for the Normal and Poisson to illustrate the construction of the Stein equation and the derivation of the properties of its solution. A number of coupling methods for use in the Stein equation will be presented. In addition to the basic case of independence, a sampling of potential applications involving the Poisson and Normal includes sequence matching, the birthday problem and random graphs, hierarchical sequences, cone measure projections, the combinatorial central limit theorem, simple random sampling, geometric coverage processes, character ratios, the anti-voter model and the lightbulb process; non-normal examples include the Curie-Weiss model from physics and the time complexity of the Quickselect algorithm.
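To fix notation (standard, and developed at length in the recommended text), the characterizing equations for the Normal and Poisson, and the resulting Stein equation, take the following form:

```latex
% Stein's characterization of the standard normal:
Z \sim \mathcal{N}(0,1)
  \iff \mathbb{E}\bigl[f'(Z) - Z f(Z)\bigr] = 0
  \quad \text{for all absolutely continuous } f
  \text{ with } \mathbb{E}|f'(Z)| < \infty.
% For a test function h, the normal Stein equation is
f'(x) - x f(x) = h(x) - \mathbb{E}\,h(Z),
% so that, for any W, evaluating both sides at W and taking expectations gives
\mathbb{E}\,h(W) - \mathbb{E}\,h(Z) = \mathbb{E}\bigl[f'(W) - W f(W)\bigr],
% reducing the approximation error to an expression that couplings can handle.
% The Chen-Stein analogue for the Poisson:
X \sim \mathrm{Poisson}(\lambda)
  \iff \mathbb{E}\bigl[\lambda f(X+1) - X f(X)\bigr] = 0
  \quad \text{for all bounded } f.
```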

The course will also include the derivation of concentration inequalities using Stein type couplings, with applications to permutations and random regular graphs, and applications to high dimensional statistics, drawn from the study of the recovery threshold in high dimensional regression and its relation to Gaussian inequalities, and the relaxation of Gaussian conditions in high dimensional single index models, such as in one bit compressed sensing and shrinkage estimation.
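For orientation, the size biased couplings recurring below (in the concentration results and in the survey “Size bias for one and all”) rest on a single characterizing identity: for $X \ge 0$ with $0 < \mathbb{E}X < \infty$, a variable $X^s$ has the $X$-size biased distribution when

```latex
\mathbb{E}\bigl[X f(X)\bigr] = \mathbb{E}X \,\mathbb{E}\bigl[f(X^s)\bigr]
\quad \text{for all } f \text{ with } \mathbb{E}\bigl|X f(X)\bigr| < \infty,
```

and the couplings in question construct $X^s$ on the same space as $X$ so that the difference $X^s - X$ is small or controlled.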

Input from course members will help determine the selection of topics from the wide range of choices available. Students will be evaluated on the basis of course participation and a final project presentation.

Schedule: Mondays and Wednesdays, 8:40–10:00 AM, to be held in KAP 414

Recommended Text:

Normal Approximation by Stein’s Method
Chen, L., Goldstein, L., and Shao, Q.M.
Springer Verlag, 2010 [Springer Link]

Supplemental Reference:

Fundamentals of Stein’s Method
Ross, N.
Probability Surveys, (2011) vol 8, pp. 210–293

Sampling of additional books and articles of interest:

Poisson Approximation

Poisson Approximation
Barbour, A.D., Holst, L., and Janson, S.
Oxford Science Publications, 1992
MR1163825 (93g:60043)

Two Moments Suffice for Poisson Approximations: The Chen-Stein Method
Arratia, R., Goldstein, L., and Gordon, L.
The Annals of Probability, Vol. 17, No. 1. (Jan., 1989), pp. 9-25
MR0972770 (90b:60021)

Poisson Approximation and the Chen-Stein Method
Arratia, R., Goldstein, L., and Gordon, L.
Statistical Science, Vol. 5, No. 4. (Nov., 1990), pp. 403-424.
MR1092983 (92e:62036)

Total Variation Distance for Poisson Subset Numbers
Goldstein, L, and Reinert, G.
Annals of Combinatorics (2006), vol 10, pp. 333–341

Normal Approximation

A Probabilistic Proof of the Lindeberg-Feller Central Limit Theorem
Goldstein, L.
American Mathematical Monthly (2009), vol 116, pp. 45–60 [pdf]

Bounds on the Constant in the Mean Central Limit Theorem
Goldstein, L.
Annals of Probability (2010), vol 38, pp. 1672-1689.

Spin glasses and Stein’s method
Chatterjee, S.
Probab. Theory Related Fields  148  (2010),  no. 3-4, 567–600

On coupling constructions and rates in the CLT for dependent summands with applications to the anti-voter model and weighted U-statistics
Rinott, Y. and Rotar, V.
Annals of Applied Probability (1997), vol 7, pp. 1080–1105

Multivariate Normal Approximation by Stein’s Method and Size Bias Couplings
Goldstein, L. and Rinott, Y.

Multivariate Normal Approximation by Stein’s Method: The Concentration Inequality Approach
Chen, L.H.Y., and Fang, X.

Fluctuations of Eigenvalues and Second Order Poincaré Inequalities
Chatterjee, S.

Stein’s method on Wiener chaos
Nourdin I, and Peccati, G.

Other Distributional Approximations

Non asymptotic distributional bounds for the Dickman approximation of the running time of the Quickselect algorithm
Goldstein, L.
Electronic Journal of Probability, (2018) vol. 23, pp. 1–13 DOI: 10.1214/18-EJP227

Dickman approximation in simulation, summations and perpetuities
Bhattacharjee, C. and Goldstein, L.
Bernoulli, (2019) vol 25, No. 4A, pp. 2758–2792

Non normal approximation by Stein’s method of exchangeable pairs with application to the Curie-Weiss model
Chatterjee, S., Shao, Q.M.
Ann. Appl. Probab. 21 (2011), no. 2, 464–483

Degree asymptotics with rates for preferential attachment random graphs
Peköz, E., Röllin, A., and Ross, N.

Concentration Inequalities

Stein’s method for concentration inequalities
Chatterjee, S.

Applications of size biased couplings for concentration of measures
Ghosh, S., and Goldstein, L.
Electronic Communications in Probability (2011), vol 16, pp. 70-83.

Bounded size biased couplings, log concave distributions and concentration of measure for occupancy models
Bartroff, J., Goldstein, L. and Işlak, Ü.
Bernoulli, (2018) vol  24, No. 4B, pp. 3283-3317.

Concentration inequalities via zero bias couplings
Goldstein, L. and Işlak, Ü.
Statistics and Probability Letters, (2014), vol 86, pp. 17-23
DOI: 10.1016/j.spl.2013.12.001

Size biased couplings and the spectral gap for random regular graphs
Cook, N., Goldstein, L. and Johnson, T.
Annals of Probability, (2018), vol 46, No.1, 72-125

Statistics and Machine Learning Applications

Non-Gaussian Observations in Nonlinear Compressed Sensing via Stein Discrepancies
Goldstein, L. and Wei, X.
Information and Inference: A Journal of the IMA, (2019) vol 8.1, pp. 125-159. iay006.

Gaussian Phase Transitions and Conic Intrinsic Volumes:  Steining the Steiner Formula
Goldstein, L., Nourdin, I. and Peccati, G.
Annals of Applied Probability (2017), vol 27, pp. 1-47

A Kernelized Stein Discrepancy for Goodness-of-fit Tests and Model Evaluation
Liu, Q., Lee, J., and Jordan, M.

Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm
Liu, Q. and Wang, D.
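A minimal numerical sketch of the kernelized Stein discrepancy of Liu, Lee and Jordan, for a one-dimensional standard normal target with a Gaussian kernel; the function and variable names here are illustrative, not taken from any of the papers above:

```python
import numpy as np

def ksd_sq(x, score, h=1.0):
    """V-statistic estimate of the squared kernelized Stein discrepancy
    between the sample x and the density with score function `score`,
    using the Gaussian kernel k(x, y) = exp(-(x - y)^2 / (2h)).  1-D only."""
    x = np.asarray(x, dtype=float)
    d = x[:, None] - x[None, :]           # pairwise differences x_i - x_j
    k = np.exp(-d**2 / (2 * h))           # kernel matrix
    dkx = -d / h * k                      # partial of k in its first argument
    dky = d / h * k                       # partial of k in its second argument
    dkxy = (1.0 / h - d**2 / h**2) * k    # mixed second partial of k
    s = score(x)                          # score evaluated at each sample point
    # Stein kernel: u(x,y) = s(x)s(y)k + s(x) d_y k + s(y) d_x k + d_x d_y k
    u = s[:, None] * s[None, :] * k + s[:, None] * dky + s[None, :] * dkx + dkxy
    return u.mean()

rng = np.random.default_rng(0)
score = lambda x: -x                      # score of N(0,1): (log density)'
good = rng.standard_normal(500)           # sample matching the target
bad = good + 1.5                          # sample shifted away from the target
```

Since the Stein kernel is positive semidefinite, the estimate is nonnegative; it stays near zero for the matching sample and comes out markedly larger for the shifted one, which is the contrast the goodness-of-fit tests in the reference exploit.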

Relaxing the Gaussian assumption in Shrinkage and SURE in high dimension
Fathi, M., Goldstein, L., Reinert, G. and Saumard, A.


Additional material of potential interest

Exchangeable pairs from switchings
Johnson, T.

On a connection between Stein characterizations and Fisher information
Ley, C., and Swan, Y.

Estimation of the mean of a multivariate normal distribution
Stein, C.
Annals of Statistics (1981), vol 9, pp. 1135–1151
MR0630098 (83a:62080)

Size Biased Distributions

Size bias for one and all
Arratia, R., Goldstein, L. and Kochman, F.
Probability Surveys, (2019) vol 16, pp. 1-61