Welcome to the Information Theory and Coding course (CE-40676, Fall 2021). If you are interested, please sign up on the Quera page of the course. Lectures are held on Saturdays and Mondays, 10:30–12:30, in room CE-202.
Course Objectives
Information theory was introduced by Claude Shannon in a seminal paper in 1948. Its original aim was to formalize and quantify the transmission of information over communication channels. Since then, information theory has found applications in many other fields, including statistics, theoretical computer science, game theory, and machine learning.
In this course, we will introduce the basic concepts of information theory and review some of its main applications, including source coding and channel capacity.
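As a small preview of the course's central quantity, the following is a minimal Python sketch (the function name and example distributions are illustrative, not part of the course material) computing the Shannon entropy H(X) = −∑ p(x) log₂ p(x) of a discrete distribution:

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a discrete distribution given as a list of probabilities."""
    return -sum(px * math.log2(px) for px in p if px > 0)

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # about 0.469
```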
Main References
The main part of the course is based on the following book:
- Thomas M. Cover and Joy A. Thomas, “Elements of Information Theory,” Wiley Series in Telecommunications and Signal Processing, 2nd Edition, July 18, 2006.
We may also use the following excellent book by MacKay:
- David J. C. MacKay, “Information Theory, Inference and Learning Algorithms,” Cambridge University Press, 1st Edition, October 6, 2003.
Fall 2020 Lectures
The lecture videos and notes of the course for the Fall 2020 semester are listed below (both the lectures and the notes are in Persian).
| # | Date | Subject | Video | Note |
|---|------|---------|-------|------|
| 1 | 99-06-31 (20-09-21) | Course Introduction, Review of Probability | Lec 01 | Note 01 |
| 2 | 99-07-05 (20-09-26) | Review of Probability | Lec 02 | Note 02 |
| 3 | 99-07-07 (20-09-28) | Review of Probability, Introduction to Entropy | Lec 03 | Note 03 |
| 4 | 99-07-12 (20-10-03) | Relative Entropy, Mutual Information | Lec 04 | Note 04 |
| 5 | 99-07-14 (20-10-05) | Convex Functions, Properties of Information Measures | Lec 05 | Note 05 |
| 6 | 99-07-19 (20-10-10) | Properties of Information Measures | Lec 06 | Note 06 |
| 7 | 99-07-21 (20-10-12) | Sufficient Statistics, Fano's Inequality, Counting Primes Using Entropy | Lec 07 | Note 07 |
| 8 | 99-07-28 (20-10-19) | Shearer's Lemma, Introduction to Source Coding | Lec 08 | Note 08 |
| 9 | 99-08-03 (20-10-24) | Kraft Inequality, Optimal Instantaneous Source Codes: Lower Bound | Lec 09 | Note 09 |
| 10 | 99-08-05 (20-10-26) | Optimal Instantaneous Source Codes: Lower Bound, One-Shot vs. Multi-Shot Coding, Kraft Inequality for Uniquely Decodable Codes | Lec 10 | Note 10 |
| 11 | 99-08-10 (20-10-31) | Huffman Code | Lec 11 | Note 11 |
| 12 | 99-08-19 (20-11-09) | Optimality of the Huffman Code, Introduction to Typical Sequences | Lec 12 | Note 12 |
| 13 | 99-08-24 (20-11-14) | Typical Set and Typical Sequences, Proof of the Source Coding Theorem Using Typical Sequences | Lec 13 | Note 13 |
| 14 | 99-08-26 (20-11-16) | Review of Stochastic Processes, Entropy Rate of Stochastic Processes | Lec 14 | Note 14 |
| 15 | 99-09-01 (20-11-21) | Introduction to Channel Capacity, Data Transmission over a Noisy Channel, Discrete Memoryless Channel, Information Capacity of a Channel | Lec 15 | Note 15 |
| 16 | 99-09-03 (20-11-23) | Examples of Channel Capacity Computation, Properties of Channel Capacity, An Intuition about Channel Capacity | Lec 16 | Note 16 |
| 17 | 99-09-08 | Some Definitions, Statement of the Noisy Channel Coding Theorem, Converse Proof of the Channel Coding Theorem | Lec 17 | Note 17 |
| 18 | 99-09-10 | The Achievability Proof of the Channel Coding Theorem | Lec 18 | Note 18 |
| 19 | 99-09-15 | The Achievability Proof of the Channel Coding Theorem | Lec 19 | Note 19 |
| 20 | 99-09-17 | Optimal Decoder, Hamming Distance, Optimal Decoder for the BSC, Minimum Distance of a Code | Lec 20 | Note 20 |
| 21 | 99-09-22 | Some Definitions (Group, Field, Vector Space), Linear Codes | Lec 21 | Note 21 |
| 22 | 99-09-24 | Linear Codes, Generator and Parity-Check Matrices of a Linear Code, Examples of Linear Codes | Lec 22 | Note 22 |
| 23 | 99-10-01 | Tanner Graph, LDPC Codes, Differential Entropy, Relation between Differential Entropy and Discrete Entropy | Lec 23 | Note 23 |
| 24 | 99-10-06 | AEP for Continuous Random Variables, Joint and Conditional Differential Entropy, Relative Entropy and Mutual Information, Maximization Property of the Gaussian Distribution for Differential Entropy | Lec 24 | Note 24 |