Lecture Notes on Statistics and Information Theory
These lecture notes span information theory, statistics, and measure theory. They cover fundamental concepts such as definitions, chain rules, data processing inequalities, and divergences, and extend to optimal procedures, Le Cam's and Fano's inequalities, and operational results such as entropy and source coding. They also focus on exponential families and statistical modeling, fitting procedures, and lower bounds for testing parameters, as well as sub-Gaussian and sub-exponential random variables, martingale methods, and uniformity, covering topics such as Kullback-Leibler divergence, PAC-Bayes bounds, interactive data analysis, and error bounds.
Author: John Duchi
464 Pages