Information Theory and Coding

Welcome to the Information Theory and Coding course (CE-40676, Fall 2021). If you are interested, please sign up on the Quera page of the course. Lectures are held on Saturdays and Mondays, 10:30–12:30, in room CE-202.

Course Objectives

Information theory was introduced by Claude Shannon in a seminal 1948 paper. Its original aim was to formalize and quantify the transmission of information over communication channels. It has since been applied in many other fields, including statistics, theoretical computer science, game theory, and machine learning.

In this course, we will introduce the basic concepts of information theory and review some of its main applications, including source coding and channel capacity.
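As a first taste of the quantities studied in the course, here is a minimal Python sketch (not part of the course materials; the function name and example distributions are our own illustration) that computes the Shannon entropy H(X) = -sum_x p(x) log2 p(x) of a discrete distribution, in bits:

    import math

    def entropy(probs):
        # Shannon entropy H(X) = -sum_x p(x) * log2(p(x)), measured in bits.
        # Zero-probability terms contribute nothing, by the convention 0 * log 0 = 0.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(entropy([0.9, 0.1]))   # biased coin: about 0.47 bits
    print(entropy([0.25] * 4))   # uniform over 4 symbols: 2.0 bits

A fair coin attains the maximum of 1 bit for a binary alphabet; any bias lowers the entropy, which is exactly what makes biased sources compressible.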

Main References

The main part of the course is based on the following book:

  • Thomas M. Cover and Joy A. Thomas, “Elements of Information Theory,” Wiley Series in Telecommunications and Signal Processing, 2nd Edition, July 18, 2006.

We may also use the following excellent book by MacKay:

  • David J. C. MacKay, “Information Theory, Inference and Learning Algorithms,” Cambridge University Press, 1st Edition, October 6, 2003.

Fall 2020 Lectures

The lectures and notes of the course for the fall 2020 semester can be found below (both the lectures and the notes are in Persian). Dates are given in the Iranian calendar, with the corresponding Gregorian date in parentheses.

#  | Date                | Subject                                                                  | Video  | Note
1  | 99-06-31 (20-09-21) | Course Introduction, Review of Probability                               | Lec 01 | Note 01
2  | 99-07-05 (20-09-26) | Review of Probability                                                    | Lec 02 | Note 02
3  | 99-07-07 (20-09-28) | Review of Probability, Introduction to Entropy                           | Lec 03 | Note 03
4  | 99-07-12 (20-10-03) | Relative Entropy, Mutual Information                                     | Lec 04 | Note 04
5  | 99-07-14 (20-10-05) | Convex Functions, Properties of Information Measures                     | Lec 05 | Note 05
6  | 99-07-19 (20-10-10) | Properties of Information Measures                                       | Lec 06 | Note 06
7  | 99-07-21 (20-10-12) | Sufficient Statistics, Fano’s Inequality, Counting Primes using Entropy  | Lec 07 | Note 07
8  | 99-07-28 (20-10-19) | Shearer’s Lemma, Introduction to Source Coding                           | Lec 08 | Note 08
9  | 99-08-03 (20-10-24) | Kraft Inequality, Optimal Instantaneous Source Codes: Lower Bound        | Lec 09 | Note 09
10 | 99-08-05 (20-10-26) | Optimal Instantaneous Source Codes: Lower Bound, One-Shot vs. Multi-Shot Coding, Kraft Inequality for Uniquely Decodable Codes | Lec 10 | Note 10
11 | 99-08-10 (20-10-31) | Huffman Code                                                             | Lec 11 | Note 11
12 | 99-08-19 (20-11-09) | Optimality of Huffman Code, Introduction to Typical Sequences            | Lec 12 | Note 12
13 | 99-08-24 (20-11-14) | Typical Set and Typical Sequences, Proof of the Source Coding Theorem using Typical Sequences | Lec 13 | Note 13
14 | 99-08-26 (20-11-16) | Review of Stochastic Processes, Entropy Rate of Stochastic Processes     | Lec 14 | Note 14
15 | 99-09-01 (20-11-21) | Introduction to Channel Capacity, Data Transmission over a Noisy Channel, Discrete Memoryless Channel, Information Capacity of a Channel | Lec 15 | Note 15
16 | 99-09-03 (20-11-23) | Some Examples of Channel Capacity Computation, Properties of Channel Capacity, An Intuition about Channel Capacity | Lec 16 | Note 16
17 | 99-09-08 (20-11-28) | Some Definitions, Statement of the Noisy Channel Coding Theorem, Converse Proof of the Channel Coding Theorem | Lec 17 | Note 17
18 | 99-09-10 (20-11-30) | The Achievability Proof of the Channel Coding Theorem                    | Lec 18 | Note 18
19 | 99-09-15 (20-12-05) | The Achievability Proof of the Channel Coding Theorem                    | Lec 19 | Note 19
20 | 99-09-17 (20-12-07) | Optimal Decoder, Hamming Distance, Optimal Decoder for the BSC, Minimum Distance of a Code | Lec 20 | Note 20
21 | 99-09-22 (20-12-12) | Some Definitions (Group, Field, Vector Space), Linear Codes              | Lec 21 | Note 21
22 | 99-09-24 (20-12-14) | Linear Codes, Generator and Parity Check Matrices of a Linear Code, Some Examples of Linear Codes | Lec 22 | Note 22
23 | 99-10-01 (20-12-21) | Tanner Graph, LDPC Codes, Differential Entropy, Relation between Differential Entropy and Discrete Entropy | Lec 23 | Note 23
24 | 99-10-06 (20-12-26) | AEP for Continuous Random Variables, Joint and Conditional Differential Entropy, Relative Entropy and Mutual Information, Maximization Property of the Gaussian Distribution for Differential Entropy | Lec 24 | Note 24