
An Introduction to Information Theory and Applications

This note explains the following topics: uncertainty and information, efficient coding of information, stationary processes and Markov chains, coding for noisy transmission, complements to efficient coding of information, and error-correcting codes and cryptography.

293 Pages
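
As a rough illustration of the uncertainty-and-information and Markov-chain themes listed above (a sketch of my own with an arbitrary two-state transition matrix, not material from the book), the following Python snippet computes the Shannon entropy of a distribution and the entropy rate of a Markov chain:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # treat 0 * log 0 as 0
    return float(-np.sum(p * np.log2(p)))

print(entropy([0.5, 0.5]))             # 1.0 bit: a fair coin is maximally uncertain
print(entropy([0.9, 0.1]))             # ~0.469 bits: a biased coin is more predictable

# Entropy rate of a two-state Markov chain with transition matrix P:
# rate = sum_i pi_i * H(P[i, :]), where pi is the stationary distribution.
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])
evals, evecs = np.linalg.eig(P.T)      # stationary pi is a left eigenvector of P for eigenvalue 1
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi = pi / pi.sum()
rate = sum(pi[i] * entropy(P[i]) for i in range(len(pi)))
print(pi, rate)                        # pi ~ [0.8, 0.2], rate ~ 0.57 bits per step
```
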
Similar Books
Lecture Notes on Statistics and Information Theory

This lecture note navigates through information theory, statistics, and measure theory. It covers fundamental concepts such as definitions, chain rules, data processing inequalities, and divergences, and extends to optimal procedures, Le Cam's and Fano's inequalities, and operational results such as entropy and source coding. It also focuses on exponential families and statistical modeling, fitting procedures, lower bounds on testing parameters, sub-Gaussian and sub-exponential random variables, martingale methods, and uniformity, covering topics such as Kullback-Leibler divergence, PAC-Bayes bounds, interactive data analysis, and error bounds.

464 Pages
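
Two of the topics listed above, Kullback-Leibler divergence and divergence-based inequalities, can be made concrete with a toy computation. The sketch below uses my own choice of Bernoulli distributions and checks Pinsker's inequality numerically; it is an illustration, not code from the notes:

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def total_variation(p, q):
    """Total variation distance: half the L1 distance between p and q."""
    return 0.5 * float(np.abs(np.asarray(p, float) - np.asarray(q, float)).sum())

p, q = [0.7, 0.3], [0.5, 0.5]
d, tv = kl(p, q), total_variation(p, q)
# Pinsker's inequality: TV(p, q) <= sqrt(D(p || q) / 2)
print(d, tv, np.sqrt(d / 2))           # ~0.082, 0.2, ~0.203 -- the bound holds
```
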
Basics of information theory

This book explains the basics of thermodynamics, including thermodynamic potentials, microcanonical and canonical distributions, and evolution in phase space; the inevitability of irreversibility; the basics of information theory; applications of information theory; a new second law of thermodynamics; and quantum information.

165 Pages
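
As a loose illustration of the bridge between statistical mechanics and information theory described above, the sketch below builds a canonical (Boltzmann) distribution for an arbitrary three-level system and evaluates its Gibbs/Shannon entropy; the energies, temperatures, and units (k_B = 1) are my own illustrative choices, not an example from the book:

```python
import numpy as np

def canonical_distribution(energies, beta):
    """Boltzmann weights p_i proportional to exp(-beta * E_i), normalized."""
    w = np.exp(-beta * np.asarray(energies, float))
    return w / w.sum()

def gibbs_entropy(p):
    """Gibbs/Shannon entropy S = -sum_i p_i ln p_i (in units of k_B)."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

energies = [0.0, 1.0, 2.0]                 # toy three-level system
for beta in (0.1, 1.0, 10.0):              # high temperature to low temperature
    p = canonical_distribution(energies, beta)
    print(beta, p.round(3), round(gibbs_entropy(p), 3))
# Entropy falls from ~ln(3) toward 0 as the system freezes into its ground state.
```
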
Information Theory in Computer Science

This note explains the following topics: Shearer's lemma, entropy, relative entropy, hypothesis testing, total variation distance and Pinsker's lemma, stability in Shearer's lemma, communication complexity, set disjointness, direct sum in communication complexity and internal information complexity, data structure lower bounds via communication complexity, the algorithmic Lovász local lemma, the parallel repetition theorem, and graph entropy and sorting.

NA Pages
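
One of the topics above, Shearer's lemma, admits a quick numerical sanity check. The sketch below verifies the three-pair cover bound H(X,Y,Z) <= (H(X,Y) + H(X,Z) + H(Y,Z)) / 2 on a random joint distribution of my own choosing; it is not an example from the notes:

```python
import numpy as np

def marginal_entropy(joint, axes):
    """Entropy (bits) of the marginal of `joint` on the given axes."""
    drop = tuple(i for i in range(joint.ndim) if i not in axes)
    m = joint.sum(axis=drop)
    m = m[m > 0]
    return float(-np.sum(m * np.log2(m)))

# An arbitrary joint distribution over three binary variables.
rng = np.random.default_rng(0)
joint = rng.random((2, 2, 2))
joint /= joint.sum()

h_xyz = marginal_entropy(joint, (0, 1, 2))
bound = 0.5 * sum(marginal_entropy(joint, pair) for pair in [(0, 1), (0, 2), (1, 2)])
# Each coordinate is covered twice by the three pairs, hence the factor 1/2.
print(h_xyz, bound, h_xyz <= bound + 1e-12)   # the bound always holds
```
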
Information Theory by Yao Xie

This note will explore the basic concepts of information theory. It is highly recommended for students planning to delve into the fields of communications, data compression, and statistical signal processing. Topics covered include: entropy and mutual information, chain rules and inequalities, data processing, Fano's inequality, the asymptotic equipartition property, entropy rate, source coding and the Kraft inequality, optimal code length and roof code, Huffman codes, Shannon-Fano-Elias and arithmetic codes, maximum entropy, channel capacity, the channel coding theorem, differential entropy, the Gaussian channel, parallel Gaussian channels and water-filling, and quantization and rate-distortion.

NA Pages
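
The source-coding topics listed above (Kraft inequality, optimal code length, Huffman codes) can be illustrated with a short sketch. The code below is a toy of my own, not material from the note; the source distribution is dyadic, so the Huffman average code length matches the entropy exactly:

```python
import heapq
from math import log2

def huffman_code(probs):
    """Build a binary Huffman code; `probs` maps symbol -> probability."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)      # merge the two least likely groups
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
avg_len = sum(p * len(code[s]) for s, p in probs.items())
entropy = -sum(p * log2(p) for p in probs.values())
print(code)              # e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}
print(avg_len, entropy)  # 1.75 1.75 -- dyadic probabilities meet the entropy bound
```
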
Information Theory, Inference, and Learning Algorithms (David J.C. MacKay)

Currently this section contains no detailed description for the book; it will be updated soon.

NA Pages
