Information Theory and Coding

Welcome to the Information Theory and Coding course (CE-40676, fall 2021). If you are interested, please sign up on the course's Quera page. Lectures will be held on Saturdays and Mondays, 10:30–12:30, in class CE-202.

Course Objectives

Information theory was introduced by Claude Shannon in a seminal paper in 1948. Its first aim was to formalize and quantify the transmission of information over communication channels. Information theory has since been applied in many other fields, including statistics, theoretical computer science, game theory, and machine learning.

In this course, we will introduce the basic concepts of information theory and review some of its main applications, including source coding and channel capacity.

Main References

The main part of the course is based on the following book:

  • Thomas M. Cover and Joy A. Thomas, “Elements of Information Theory,” Wiley Series in Telecommunications and Signal Processing, 2nd Edition, July 18, 2006.

We may also use the following great book by MacKay:

  • David J. C. MacKay, “Information Theory, Inference and Learning Algorithms,” Cambridge University Press, 1st Edition, October 6, 2003.

Fall 2020 Lectures

Below you can find the lectures and notes of the course for the fall 2020 semester (both the lectures and the notes are in Persian).

  • Lec 01, Note 01: Course Introduction, Review of Probability
  • Lec 02, Note 02: Review of Probability
  • Lec 03, Note 03: Review of Probability, Introduction to Entropy
  • Lec 04, Note 04: Relative Entropy, Mutual Information
  • Lec 05, Note 05: Convex Functions, Properties of Information Measures
  • Lec 06, Note 06: Properties of Information Measures
  • Lec 07, Note 07: Sufficient Statistics, Fano's Inequality, Counting Primes using Entropy
  • Lec 08, Note 08: Shearer's Lemma, Introduction to Source Coding
  • Lec 09, Note 09: Kraft Inequality, Optimal Instantaneous Source Codes: Lower Bound
  • Lec 10, Note 10: Optimal Instantaneous Source Codes: Lower Bound, One-Shot vs. Multi-Shot Coding, Kraft Inequality for Uniquely Decodable Codes
  • Lec 11, Note 11: Huffman Code
  • Lec 12, Note 12: Optimality of Huffman Code, Introduction to Typical Sequences
  • Lec 13, Note 13: Typical Set and Typical Sequences, Proof of the Source Coding Theorem using Typical Sequences
  • Lec 14, Note 14: Review of Stochastic Processes, Entropy Rate of Stochastic Processes
  • Lec 15, Note 15: Introduction to Channel Capacity, Data Transmission over a Noisy Channel, Discrete Memoryless Channel, Information Capacity of a Channel
  • Lec 16, Note 16: Some Examples of Channel Capacity Computation, Properties of Channel Capacity, An Intuition about Channel Capacity
  • Lec 17, Note 17 (99-09-08): Some Definitions, Statement of the Noisy Channel Coding Theorem, Converse Proof of the Channel Coding Theorem
  • Lec 18, Note 18 (99-09-10): The Achievability Proof of the Channel Coding Theorem
  • Lec 19, Note 19 (99-09-15): The Achievability Proof of the Channel Coding Theorem
  • Lec 20, Note 20 (99-09-17): Optimal Decoder, Hamming Distance, Optimal Decoder for the BSC, Minimum Distance of a Code
  • Lec 21, Note 21 (99-09-22): Some Definitions (Group, Field, Vector Space), Linear Codes
  • Lec 22, Note 22 (99-09-24): Linear Codes, Generator and Parity-Check Matrix of a Linear Code, Some Examples of Linear Codes
  • Lec 23, Note 23 (99-10-01): Tanner Graph, LDPC Codes, Differential Entropy, Relation between Differential Entropy and Discrete Entropy
  • Lec 24, Note 24 (99-10-06): AEP for Continuous Random Variables, Joint and Conditional Differential Entropy, Relative Entropy and Mutual Information, Maximization Property of the Gaussian Distribution for Differential Entropy