Information Theory and its applications in theory of computation
This set of lecture notes by Venkatesan Guruswami
and Mahdi Cheraghchi addresses the intersection of information theory and
theoretical computer science. Core topics covered in the notes include entropy,
Kraft's inequality, the source coding theorem, conditional entropy, and mutual
information. It also covers KL-divergence, Chernoff bounds, the data processing
inequality, and Fano's inequality. Key concepts include the asymptotic
equipartition property (AEP) and universal source coding via the Lempel-Ziv
algorithm, together with a proof of its optimality. The notes then treat
discrete channels and channel capacity, the Noisy Channel Coding Theorem, and
the construction of capacity-achieving codes by concatenation and by polar
codes. Additional topics include Bregman's theorem, Shearer's Lemma, graph
entropy, and applications to optimal lower bounds for set disjointness. The
text offers a wide-ranging view of how the basic principles of information
theory shed light on the design of algorithms and the establishment of bounds
on the complexity of problems in theoretical computer science.
Author(s): Venkatesan Guruswami and Mahdi Cheraghchi
NA Pages