The PDF covers the following topics related to Information Theory: foundations (probability, uncertainty, information); entropies, and why they are measures of information; the source coding theorem; prefix, variable-length, and fixed-length codes; channel types, properties, noise, and channel capacity; continuous information, density, and the noisy channel coding theorem; Fourier series, convergence, and orthogonal representation; useful Fourier theorems and transform pairs; sampling and aliasing; the discrete Fourier transform and Fast Fourier Transform algorithms; the quantised degrees of freedom in a continuous signal; the Gabor-Heisenberg-Weyl uncertainty relation; and Kolmogorov complexity.
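As a quick illustration of the entropy measure these notes open with, the Shannon entropy of a discrete distribution is H(X) = -sum_x p(x) log2 p(x). The following minimal Python sketch is my own illustration, not code from the PDF:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)),
    skipping zero-probability outcomes (0 * log 0 is taken as 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```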
This book explains the basics of thermodynamics, including thermodynamic potentials, microcanonical and canonical distributions, and evolution in phase space; the inevitability of irreversibility; the basics of information theory; applications of information theory; the new second law of thermodynamics; and quantum information.
This PDF covers the following topics related to Information Theory: Introduction; Entropy, Relative Entropy, and Mutual Information; the Asymptotic Equipartition Property; Communication and Channel Capacity; the Method of Types; Conditional and Joint Typicality; Lossy Compression and Rate-Distortion Theory; and Joint Source-Channel Coding.
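To make the entropy and mutual information topics concrete, here is a short Python sketch (my own illustration, not code from the PDF) computing I(X;Y) = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) ) from a joint pmf:

```python
import math

def mutual_information(joint):
    """I(X;Y) in bits for a joint pmf given as a 2-D list of p(x, y)."""
    px = [sum(row) for row in joint]            # marginal of X (row sums)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (column sums)
    return sum(pxy * math.log2(pxy / (px[i] * py[j]))
               for i, row in enumerate(joint)
               for j, pxy in enumerate(row) if pxy > 0)

# Perfectly correlated bits share 1 bit of information; independent bits share none.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))     # 1.0
print(mutual_information([[0.25, 0.25], [0.25, 0.25]])) # 0.0
```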
The PDF covers the following topics related to Information Theory: Information Theory for Data Communications and Processing; On the Information Bottleneck Problems: Models, Connections, Applications and Information Theoretic Views; Variational Information Bottleneck for Unsupervised Clustering: Deep Gaussian Mixture Embedding; Asymptotic Rate-Distortion Analysis of Symmetric Remote Gaussian Source Coding: Centralized Encoding vs. Distributed Encoding; Non-Orthogonal eMBB-URLLC Radio Access for Cloud Radio Access Networks with Analog Fronthauling; Robust Baseband Compression Against Congestion in Packet-Based Fronthaul Networks Using Multiple Description Coding; Amplitude Constrained MIMO Channels: Properties of Optimal Input Distributions and Bounds on the Capacity; Quasi-Concavity for Gaussian Multicast Relay Channels; Gaussian Multiple Access Channels with One-Bit Quantizer at the Receiver; Efficient Algorithms for Coded Multicasting in Heterogeneous Caching Networks; Cross-Entropy Method for Content Placement and User Association in Cache-Enabled Coordinated Ultra-Dense Networks; and Symmetry, Outer Bounds, and Code Constructions: A Computer-Aided Investigation on the Fundamental Limits of Caching.
This note covers the following topics: entropy; Kraft's inequality; the source coding theorem; conditional entropy; mutual information; KL-divergence and its connections; KL-divergence and Chernoff bounds; the data-processing and Fano's inequalities; the asymptotic equipartition property; universal source coding: the Lempel-Ziv algorithm and a proof of its optimality; source coding via typical sets and universality; joint typicality and the joint AEP; discrete channels and channel capacity; a proof of the noisy channel coding theorem; constructing capacity-achieving codes via concatenation; polarization; Arikan's recursive construction of a polarizing invertible transformation; polar code construction; Bregman's theorem; Shearer's Lemma and applications; source coding and graph entropy; monotone formula lower bounds via graph entropy; the optimal set disjointness lower bound and applications; compression of arbitrary communication protocols; and parallel repetition of 2-prover 1-round games.
Author(s): Venkatesan
Guruswami and Mahdi Cheraghchi
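As a small worked example of one of the note's opening topics: Kraft's inequality states that a binary prefix code with codeword lengths l_1, ..., l_n exists if and only if sum(2 ** -l_i) <= 1. The sketch below is my own illustration, not code from the note:

```python
def satisfies_kraft(lengths):
    """Kraft's inequality for a binary alphabet: a prefix code with these
    codeword lengths exists iff sum(2 ** -l) <= 1."""
    return sum(2 ** -l for l in lengths) <= 1

print(satisfies_kraft([1, 2, 3, 3]))  # True: e.g. codewords 0, 10, 110, 111
print(satisfies_kraft([1, 1, 2]))     # False: no binary prefix code fits
```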
This note explains the following topics: Shearer's Lemma; entropy; relative entropy; hypothesis testing; total variation distance and Pinsker's inequality; stability in Shearer's Lemma; communication complexity; set disjointness; direct sums in communication complexity and internal information complexity; data structure lower bounds via communication complexity; the algorithmic Lovász Local Lemma; the Parallel Repetition Theorem; and graph entropy and sorting.
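As a numerical illustration of one relation covered here (a sketch of my own, not code from the note), Pinsker's inequality bounds total variation distance by KL divergence: TV(P, Q) <= sqrt(KL(P || Q) / 2), with KL measured in nats:

```python
import math

def total_variation(p, q):
    """TV(P, Q) = (1/2) * sum |p_i - q_i|."""
    return 0.5 * sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl_divergence(p, q):
    """KL(P || Q) in nats: sum p_i * ln(p_i / q_i), skipping terms with p_i = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Pinsker's inequality: TV(P, Q) <= sqrt(KL(P || Q) / 2)
p, q = [0.7, 0.3], [0.5, 0.5]
tv, bound = total_variation(p, q), math.sqrt(kl_divergence(p, q) / 2)
print(tv, bound, tv <= bound)  # 0.2  ~0.203  True
```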