The PDF covers the following topics related to Information Theory:

- Foundations: probability, uncertainty, information
- Entropies defined, and why they are measures of information
- Source coding theorem; prefix, variable-, and fixed-length codes
- Channel types, properties, noise, and channel capacity
- Continuous information; density; noisy channel coding theorem
- Fourier series, convergence, orthogonal representation
- Useful Fourier theorems; transform pairs
- Sampling; aliasing
- Discrete Fourier transform; Fast Fourier Transform algorithms
- The quantised degrees-of-freedom in a continuous signal
- Gabor-Heisenberg-Weyl uncertainty relation
- Kolmogorov complexity
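As a small illustration of the first topics (entropy as a measure of information), here is a minimal Python sketch, not taken from the notes themselves, that computes the Shannon entropy of a discrete probability distribution in bits:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit of uncertainty; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # about 0.469
```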
Author(s): J G Daugman
Pages: 75