Information theory was introduced by Claude Shannon in his seminal 1948 paper. Its original aim was to formalize and quantify the transmission of information over communication channels. Since then, information theory has found applications in many other fields, including statistics, theoretical computer science, game theory, and machine learning.
In this course, we will introduce the basic concepts of information theory and review some of its main applications, including source coding and channel capacity.
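As a small taste of the concepts covered, the sketch below computes Shannon entropy, the central quantity behind both source coding and channel capacity. The function name and distributions are illustrative, not from the course materials.

```python
import math

def entropy(p):
    """Shannon entropy (in bits) of a discrete probability distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))
```

Source coding makes this precise: on average, outcomes of a source cannot be compressed below its entropy in bits per symbol.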
If you are interested in the course, please sign up on the piazza site (piazza.com/sharif/fall2019/ce40676).
The main part of the course is based on the following book:
- Thomas M. Cover and Joy A. Thomas, “Elements of Information Theory,” Wiley Series in Telecommunications and Signal Processing, 2nd Edition, July 18, 2006.
We may also use the following great book by MacKay:
- David J. C. MacKay, “Information Theory, Inference and Learning Algorithms,” Cambridge University Press, 1st Edition, October 6, 2003.