This PDF covers the following topics related to Information Theory: foundations (probability, uncertainty, and information); entropies defined, and why they are measures of information; the source coding theorem; prefix, variable-length, and fixed-length codes; channel types, properties, noise, and channel capacity; continuous information, density, and the noisy channel coding theorem; Fourier series, convergence, and orthogonal representation; useful Fourier theorems and transform pairs; sampling and aliasing; the discrete Fourier transform and Fast Fourier Transform algorithms; the quantised degrees of freedom in a continuous signal; the Gabor-Heisenberg-Weyl uncertainty relation; and Kolmogorov complexity.
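None of the following code appears in the PDF itself; it is only a minimal Python sketch of the first topic listed above, entropy as a measure of information, with the function name and example distributions chosen here for illustration.

    import math

    def entropy(probs):
        """Shannon entropy in bits of a discrete distribution given as probabilities."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin is maximally uncertain (1 bit); a biased coin carries less per toss.
    print(entropy([0.5, 0.5]))   # 1.0
    print(entropy([0.9, 0.1]))   # about 0.47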
This note explains the following topics: uncertainty and information; efficient coding of information; stationary processes and Markov chains; coding for noisy transmission; complements to efficient coding of information; and error-correcting codes and cryptography.
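As a hedged illustration of the error-correcting-codes topic (the note's own constructions are not reproduced here), the Python sketch below encodes and decodes one codeword of a (7,4) Hamming code; this particular generator/parity-check convention is an assumption made for the example.

    import numpy as np

    # Systematic generator and parity-check matrices of a (7,4) Hamming code
    # (a convention chosen for this example, not taken from the note).
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])
    H = np.array([[1, 1, 0, 1, 1, 0, 0],
                  [1, 0, 1, 1, 0, 1, 0],
                  [0, 1, 1, 1, 0, 0, 1]])

    def encode(message4):
        """Map 4 message bits to a 7-bit codeword."""
        return (np.asarray(message4) @ G) % 2

    def decode(received7):
        """Correct at most one flipped bit, then return the 4 message bits."""
        received7 = np.asarray(received7).copy()
        syndrome = (H @ received7) % 2
        for i in range(7):                          # a nonzero syndrome equals the
            if np.array_equal(H[:, i], syndrome):   # column of H at the error position
                received7[i] ^= 1
                break
        return received7[:4]

    codeword = encode([1, 0, 1, 1])
    codeword[2] ^= 1                  # flip one bit to simulate channel noise
    print(decode(codeword))           # [1 0 1 1] recovered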
This book explains the basics of thermodynamics, including thermodynamic potentials, microcanonical and canonical distributions, and evolution in phase space; the inevitability of irreversibility; the basics of information theory; applications of information theory; the new second law of thermodynamics; and quantum information.
This PDF covers the following topics related to Information Theory: information measures, lossless data compression, binary hypothesis testing, channel coding, lossy data compression, and advanced topics.
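As a quick, self-contained illustration of the channel-coding topic (this snippet is not drawn from the PDF), the Python sketch below evaluates the standard capacity formula C = 1 - H2(p) of a binary symmetric channel with crossover probability p.

    import math

    def binary_entropy(p):
        """H2(p) in bits; 0 log 0 is taken as 0."""
        if p in (0.0, 1.0):
            return 0.0
        return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

    def bsc_capacity(crossover):
        """Capacity of a binary symmetric channel with the given crossover probability."""
        return 1.0 - binary_entropy(crossover)

    print(bsc_capacity(0.11))   # about 0.5 bits per channel use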
This note will explore the basic concepts of information theory. It is highly recommended for students planning to delve into the fields of communications, data compression, and statistical signal processing. Topics covered include: entropy and mutual information; chain rules and inequalities; data processing; Fano's inequality; the asymptotic equipartition property; entropy rate; source coding and the Kraft inequality; optimal code length and the roof code; Huffman codes; Shannon-Fano-Elias and arithmetic codes; maximum entropy; channel capacity; the channel coding theorem; differential entropy; the Gaussian channel; parallel Gaussian channels and water-filling; and quantization and rate-distortion.
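The note's own material is not reproduced here; as a rough Python sketch of the Huffman-codes topic, the following builds a prefix code from symbol frequencies (the helper name huffman_code and the example string are invented for illustration).

    import heapq
    from collections import Counter

    def huffman_code(symbol_freqs):
        """Build a prefix code from {symbol: frequency}; returns {symbol: bitstring}."""
        # Each heap entry: (total frequency, tie-breaker, {symbol: code-so-far}).
        heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(symbol_freqs.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)
            f2, _, right = heapq.heappop(heap)
            # Prepend 0 to the codes in one subtree and 1 to the other, then merge.
            merged = {s: "0" + c for s, c in left.items()}
            merged.update({s: "1" + c for s, c in right.items()})
            heapq.heappush(heap, (f1 + f2, counter, merged))
            counter += 1
        return heap[0][2]

    freqs = Counter("abracadabra")
    print(huffman_code(freqs))   # 'a', the most frequent symbol, gets the shortest codeword

Consistent with the source coding results listed above, the expected length of the resulting code lies within one bit of the entropy of the frequency distribution.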