ENS Research Course | INFO5147 | Academic Year 2020-21

INFO5147: Selected Topics in Information Theory

École Normale Supérieure de Lyon (ENS de Lyon)
Computer Science Department
Master 2 – 2020/2021 – 5 ECTS

Course Description

The course "INFO5147 – Selected Topics in Information Theory" is divided into two parts: Theoretical Foundations and Applications.

The objective of the first part is to lay the groundwork for studying information theory outside the classical framework of communication theory. The motivation for stepping outside its most prominent application domain is to widen and strengthen its connections with other disciplines and mathematical theories, in particular real analysis, measure theory, probability theory, optimization, game theory, and statistics. This choice provides a more general view of the theory and might inspire new applications in other fields. Indeed, from this perspective, information theory can be truly appreciated and embraced as a developing mathematical theory whose impact on pure and applied sciences is yet to be discovered.

The second part focuses on the applications of information theory in statistics, in particular stochastic approximations. Subsequent lectures focus on data storage and data transmission. These problems are studied from a modern perspective in which asymptotic assumptions are dropped. That is, these problems are formulated taking into account that data storage takes place with finite storage capacity, and data transmission takes place within a finite duration. This raises the need to consider distortion and decoding-error probabilities that are certainly bounded away from zero. Within this context, the fundamental limits of data storage and data transmission are studied in scenarios as close as possible to real-system implementations. Open problems in multi-user information theory in the finite-blocklength regime are briefly presented. The final lecture tackles the problem of compressive sensing and its applications to networking.
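As a flavor of the finite-blocklength viewpoint described above (an illustrative formula, not part of the official syllabus), the maximal coding rate achievable over a discrete memoryless channel at blocklength n with a decoding-error probability ε bounded away from zero admits the normal approximation

```latex
% Normal approximation to the maximal coding rate at blocklength n
% and error probability epsilon (Polyanskiy-Poor-Verdu, 2010):
R^{*}(n,\epsilon) \;=\; C \;-\; \sqrt{\frac{V}{n}}\, Q^{-1}(\epsilon) \;+\; O\!\left(\frac{\log n}{n}\right),
% where C is the channel capacity, V is the channel dispersion,
% and Q^{-1} is the inverse of the Gaussian tail function
% Q(x) = \int_x^{\infty} \frac{1}{\sqrt{2\pi}} e^{-t^2/2}\, dt.
```

which makes explicit the rate penalty, relative to the asymptotic capacity C, incurred by transmitting within a finite duration.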

Evaluation

  • Weekly homework (65 %)

  • In-class work (10 %) – In the form of oral questions.

  • Final Exam (25 %) – In the form of a 30-minute presentation (possibly on the blackboard).

Part I: Theoretical Foundations

Lecture Notes

The lecture notes are available here.

  • Lecture 1: Elements of Measure Theory by S. Perlaza – Sep. 08, 2020. 10h15 - 12h15

    • Review of algebra of sets

    • Review of Darboux-Riemann integration

    • The problem of measure

    • Jordan and Lebesgue measures

  • Lecture 2: Elements of Measure Theory by S. Perlaza – Sep. 10, 2020. 15h45 - 17h45

    • Lebesgue integral

    • Measures, measurable spaces

    • Measurable functions

Homework 1: Deadline: Sep. 27, 2020. 23h59

  • Lecture 3: Elements of Measure Theory by S. Perlaza – Sep. 15, 2020. 10h15 - 12h15

    • A general theory of Lebesgue integration

    • Monotone convergence theorem

    • Dominated convergence theorem

  • Lecture 4: Measure Theoretic Probability by S. Perlaza – Sep. 17, 2020. 15h45 - 17h45

    • The Radon-Nikodym derivative

    • Distance between measures

    • Probability spaces and random variables

    • Expectation, conditional expectation, and independence

Homework 2: Deadline: Oct. 4, 2020. 23h59

  • Lecture 5: Information Measures by S. Perlaza – Sep. 22, 2020. 10h15 - 12h15

    • Information and measures of information

    • Information, joint information, and conditional information

    • Entropy, joint entropy, and conditional entropy

  • Lecture 6: Information Measures by S. Perlaza – Sep. 24, 2020. 15h45 - 17h45

    • Relative information and relative entropy

    • Mutual information

    • Bounds on information measures

Homework 3: Deadline: Oct. 11, 2020. 23h59

  • Lecture 7: Hypothesis Testing by S. Perlaza – Sep. 29, 2020. 10h15 - 12h15

    • The problem of statistical hypothesis testing

    • Bayesian method

    • Minimax method

  • Lecture 8: Hypothesis Testing by S. Perlaza – Oct. 1, 2020. 15h45 - 17h45

    • Neyman-Pearson method

    • Method of Types

    • Sanov's theorem

    • Chernoff-Stein lemma

Homework 4: Deadline: Oct. 25, 2020. 23h59

Part II - A: Applications to Maximum Likelihood Estimation and Model Selection

Lecture Notes

The lecture notes are available here.

  • Lecture 9: Expectation-Maximization Algorithms by M. Egan – Oct. 6, 2020. 10h15 - 12h15

  • Lecture 10: Information Theoretic Criteria for Model Selection by M. Egan – Oct. 8, 2020. 15h45 - 17h45

Homework 5: Deadline: Oct. 31, 2020. 23h59

Part II - B: Applications to Communication Theory

Lecture Notes

The lecture notes are available here.

  • Lecture 11: Information and Estimation by J.-M. Gorce – Oct. 13, 2020. 10h15 - 12h15

  • Lecture 12: Lossless and Lossy Compression by J.-M. Gorce – Oct. 15, 2020. 15h45 - 17h45

Homework 6: Deadline: Nov. 15, 2020. 23h59

  • Lecture 13: Channel Coding by J.-M. Gorce – Oct. 20, 2020. 10h15 - 12h15

  • Lecture 14: Multi-User Networks by J.-M. Gorce – Oct. 22, 2020. 15h45 - 17h45

Final Exams

  • Lecture 15: Final Exams – Nov. 3, 2020. 10h15 - 12h15

    • Students’ Project Presentations – Groups 5 - 9

  • Lecture 16: Final Exams – Nov. 5, 2020. 15h45 - 17h45

    • Students’ Project Presentations – Groups 1 - 4