Welcome to the Information Theory and Coding course (CE-40676, fall 2020). If you are interested, please sign up on the Quera page of the course.
Course Objectives
Information theory was introduced by Claude Shannon in a seminal paper in 1948. Its first aim was to formalize and quantify the transmission of information over communication channels. Since then, information theory has been applied in many other fields, including statistics, theoretical computer science, game theory, and machine learning.
In this course, we will introduce the basic concepts of information theory and review some of its main applications, including source coding and channel capacity.
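As a small taste of the quantities introduced in the first few lectures (entropy and relative entropy), here is a minimal Python sketch; it is not part of the course material, just an illustration of the definitions.

```python
# Minimal sketch (not course material): entropy and relative entropy in bits.
import math

def entropy(p):
    """Shannon entropy H(p) = -sum p_i * log2(p_i) of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """Relative entropy (KL divergence) D(p || q) = sum p_i * log2(p_i / q_i)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

fair = [0.5, 0.5]     # a fair coin
biased = [0.9, 0.1]   # a biased coin

print(entropy(fair))                   # 1.0 bit: maximal uncertainty for a binary source
print(entropy(biased))                 # ~0.47 bits
print(relative_entropy(biased, fair))  # ~0.53 bits: penalty for modeling the biased coin as fair
```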
Main References
The main part of the course is based on the following book:
- Thomas M. Cover and Joy A. Thomas, “Elements of Information Theory,” Wiley Series in Telecommunications and Signal Processing, 2nd Edition, July 18, 2006.
We may also use the following great book by MacKay:
- David J. C. MacKay, “Information Theory, Inference and Learning Algorithms,” Cambridge University Press, 1st Edition, October 6, 2003.
Fall 2020
The lectures and notes for the fall 2020 semester are listed below (both the lectures and the notes are in Persian).
# | Date (Persian / Gregorian) | Subject | Video | Note |
1 | 99-06-31 (20-09-21) | Course Introduction, Review of Probability | Lec 01 | Note 01 |
2 | 99-07-05 (20-09-26) | Review of Probability | Lec 02 | Note 02 |
3 | 99-07-07 (20-09-28) | Review of Probability, Introduction to Entropy | Lec 03 | Note 03 |
4 | 99-07-12 (20-10-03) | Relative Entropy, Mutual Information | Lec 04 | Note 04 |
5 | 99-07-14 (20-10-05) | Convex Functions, Properties of Information Measures | Lec 05 | Note 05 |
6 | 99-07-19 (20-10-10) | Properties of Information Measures | Lec 06 | Note 06 |
7 | 99-07-21 (20-10-12) | Sufficient Statistics, Fano’s Inequality, Counting Primes using Entropy | Lec 07 | Note 07 |
8 | 99-07-28 (20-10-19) | Shearer’s Lemma, Introduction to Source Coding | Lec 08 | Note 08 |
9 | 99-08-03 (20-10-24) | Kraft Inequality, Optimal Instantaneous Source Codes: Lower Bound | Lec 09 | Note 09 |
10 | 99-08-05 (20-10-26) | Optimal Instantaneous Source Codes: Lower Bound, One-Shot vs. Multi-Shot Coding, Kraft Inequality for Uniquely Decodable Codes | Lec 10 | Note 10 |
11 | 99-08-10 (20-10-31) | Huffman Code | Lec 11 | Note 11 |
12 | 99-08-19 (20-11-09) | Optimality of Huffman Code, Introduction to Typical Sequences | Lec 12 | Note 12 |
13 | 99-08-24 (20-11-14) | Typical Set and Typical Sequences, Proof of the Source Coding Theorem Using Typical Sequences | Lec 13 | Note 13 |
14 | 99-08-26 (20-11-16) | Review of Stochastic Processes, Entropy Rate of Stochastic Processes | Lec 14 | Note 14 |
15 | 99-09-01 (20-11-21) | | Lec 15 | Note 15 |
16 | 99-09-03 (20-11-23) | | Lec 16 | Note 16 |
17 | | | | |
18 | | | | |