ENS Research Course | INFO5147 | Academic Year 2020-21

INFO5147: Selected Topics in Information Theory

École Normale Supérieure de Lyon (ENS de Lyon)
Computer Science Department
Master 2 – 2020/2021 – 5 ECTS

Course Description

The course "INFO5147 – Selected Topics in Information Theory" is divided into two parts: Theoretical Foundations and Applications.

The objective of the first part is to lay the groundwork for studying information theory outside the classical framework of communications theory. The motivation for studying information theory beyond its most prominent application domain is to widen and strengthen its connections with other disciplines and mathematical theories, in particular real analysis, measure theory, probability theory, optimization, game theory, and statistics. This choice provides a more general view of information theory and may inspire new applications in other fields. Indeed, from this perspective, information theory can be truly appreciated as a developing mathematical theory whose impact on pure and applied sciences is yet to be discovered.

The second part focuses on the applications of information theory in statistics, in particular stochastic approximation and expectation-maximization algorithms; and in communications theory, more specifically data storage and data transmission. These problems are studied from a modern perspective in which asymptotic assumptions are avoided. That is, they are formulated taking into account that data storage takes place with finite storage capacity, and data transmission takes place within a finite period. This raises the need to consider distortion and decoding-error probabilities that are necessarily bounded away from zero. Within this context, the fundamental limits of data storage and data transmission are studied in scenarios that are close to real-system implementations. Open problems in multi-user information theory in the finite-blocklength regime are briefly presented. The end of this part is dedicated to a brief introduction to compressive sensing and its applications in networking.
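As a concrete instance of the finite-distortion setting described above, the rate-distortion function of a Bernoulli(p) source under Hamming distortion has the closed form R(D) = h2(p) − h2(D) for 0 ≤ D ≤ min(p, 1 − p). A minimal Python sketch (function names are illustrative, not from the lectures):

```python
import math

def h2(p: float) -> float:
    """Binary entropy in bits; h2(0) = h2(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def rate_distortion_bernoulli(p: float, d: float) -> float:
    """R(D) = h2(p) - h2(D) for a Bernoulli(p) source under Hamming
    distortion, valid for 0 <= D <= min(p, 1 - p); zero beyond."""
    if d >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(d)

# A fair-coin source compressed while tolerating a 10% bit-flip
# distortion needs h2(0.5) - h2(0.1) ≈ 0.531 bits per symbol.
print(rate_distortion_bernoulli(0.5, 0.1))
```

Allowing distortion bounded away from zero thus reduces the required rate strictly below the lossless limit of one bit per symbol.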

Evaluation

  • Weekly homework assignments (40 %)

  • In-class work (10 %) – In the form of oral questions

  • Final Exam (50 %) – In the form of a 30-minute presentation (possibly on the blackboard)

Part I: Theoretical Foundations

Lecture Notes

The lecture notes are available here.

  • Lecture 1: Elements of Measure Theory by S. Perlaza – Sep. 08, 2020. 10h15 - 12h15

    • Review of algebra of sets

    • Review of Darboux-Riemann integration

    • The problem of measure

    • Jordan and Lebesgue measures

  • Lecture 2: Elements of Measure Theory by S. Perlaza – Sep. 10, 2020. 15h45 - 17h45

    • Lebesgue measurable functions

    • Lebesgue integral

    • Measures, measurable spaces

Homework 1 – Deadline: Sep. 27, 2020, 23h59

  • Lecture 3: Elements of Measure Theory by S. Perlaza – Sep. 15, 2020. 10h15 - 12h15

    • A general theory of Lebesgue integration

    • Monotone convergence theorem

    • Dominated convergence theorem

  • Lecture 4: Measure Theoretic Probability by S. Perlaza – Sep. 17, 2020. 15h45 - 17h45

    • The Radon-Nikodym derivative

    • Distance between measures

    • Probability spaces and random variables

    • Expectation, conditional expectation, and independence

Homework 2 – Deadline: Oct. 4, 2020, 23h59
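The Radon-Nikodym derivative from Lecture 4 is easiest to see in the discrete case, where dP/dQ reduces to the ratio of the two probability mass functions (assuming P is absolutely continuous with respect to Q), and relative entropy is the P-expectation of its logarithm. A small sketch with a hypothetical pair of pmfs:

```python
import math

# Two pmfs on the same finite set (illustrative values); P << Q holds.
P = {"a": 0.5, "b": 0.3, "c": 0.2}
Q = {"a": 0.25, "b": 0.25, "c": 0.5}

# Discrete Radon-Nikodym derivative: dP/dQ(x) = P(x) / Q(x).
rn = {x: P[x] / Q[x] for x in P}

# Relative entropy D(P||Q) = E_P[log2 dP/dQ], in bits.
kl = sum(P[x] * math.log2(rn[x]) for x in P)
print(kl)
```

As a sanity check, the Q-expectation of dP/dQ equals 1, mirroring the defining property of the Radon-Nikodym derivative.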

  • Lecture 5: Information Measures by S. Perlaza – Sep. 22, 2020. 10h15 - 12h15

    • Information and measures of information

    • Information, joint information, and conditional information

    • Entropy, joint entropy, and conditional entropy

  • Lecture 6: Information Measures by S. Perlaza – Sep. 24, 2020. 15h45 - 17h45

    • Relative information and relative entropy

    • Mutual information

    • Bounds on information measures

Homework 3 – Deadline: Oct. 11, 2020, 23h59
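The identities from Lectures 5-6 can be checked numerically on a toy joint distribution; the sketch below (the pmf is illustrative) computes marginal and joint entropies and recovers mutual information from I(X;Y) = H(X) + H(Y) − H(X,Y):

```python
import math

# Joint pmf of (X, Y) on {0,1} x {0,1} (hypothetical example values)
pxy = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}
px = {x: sum(p for (a, _), p in pxy.items() if a == x) for x in (0, 1)}
py = {y: sum(p for (_, b), p in pxy.items() if b == y) for y in (0, 1)}

def H(dist):
    """Shannon entropy in bits of a pmf given as a dict of probabilities."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Mutual information via the entropy identity I(X;Y) = H(X)+H(Y)-H(X,Y)
mi = H(px) + H(py) - H(pxy)
print(H(px), H(pxy), mi)
```

Here both marginals are uniform, so H(X) = H(Y) = 1 bit, while the positive correlation between X and Y makes I(X;Y) strictly positive.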

  • Lecture 7: Hypothesis Testing by S. Perlaza – Sep. 29, 2020. 10h15 - 12h15

    • The problem of statistical hypothesis testing

    • Bayesian method

    • Minimax method

  • Lecture 8: Hypothesis Testing by S. Perlaza – Oct. 1, 2020. 15h45 - 17h45

    • Neyman-Pearson method

    • Method of Types

    • Sanov's theorem

    • Chernoff-Stein lemma

Homework 4 – Deadline: Oct. 25, 2020, 23h59
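The Neyman-Pearson method and the Chernoff-Stein lemma from Lectures 7-8 can be illustrated on a pair of Bernoulli hypotheses: the optimal test thresholds the log-likelihood ratio, and the best achievable type-II error exponent at any fixed type-I level is D(P0||P1). A minimal sketch (the parameter values are illustrative):

```python
import math

# Hypotheses: X_i i.i.d. Bernoulli(p0) under H0 vs Bernoulli(p1) under H1.
p0, p1 = 0.3, 0.6

def log_likelihood_ratio(samples):
    """Sum of log P1(x)/P0(x) over the samples; the Neyman-Pearson
    test decides H1 when this exceeds a threshold chosen to meet the
    desired type-I error level."""
    llr = 0.0
    for x in samples:
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
    return llr

# Chernoff-Stein lemma: the type-II error of the optimal test decays
# as exp(-n D(P0||P1)) at fixed type-I level (exponent in nats here).
d01 = p0 * math.log(p0 / p1) + (1 - p0) * math.log((1 - p0) / (1 - p1))
print(log_likelihood_ratio([1, 0, 0, 1]), d01)
```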

Part II - A: Applications to Maximum Likelihood Estimation and Model Selection

Lecture Notes

The lecture notes are available here.

  • Lecture 9: Expectation-Maximization Algorithms by M. Egan – Oct. 6, 2020. 10h15 - 12h15

  • Lecture 10: Information Theoretic Criteria for Model Selection by M. Egan – Oct. 8, 2020. 15h45 - 17h45

Homework 5 – Deadline: Oct. 31, 2020, 23h59
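The expectation-maximization iteration of Lecture 9 alternates computing posterior responsibilities (E-step) with weighted maximum-likelihood parameter updates (M-step). A bare-bones didactic sketch for a one-dimensional two-component Gaussian mixture (not the lecture's implementation; initialization and stopping rule are deliberately naive):

```python
import math
import random

def em_gmm_1d(xs, iters=50):
    """EM for a 1-D mixture of two Gaussians; minimal sketch, not robust."""
    mu = [min(xs), max(xs)]          # crude but well-separated init
    var = [1.0, 1.0]
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibilities r[i][k] proportional to w_k N(x_i; mu_k, var_k)
        r = []
        for x in xs:
            pk = [w[k] / math.sqrt(2 * math.pi * var[k])
                  * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                  for k in range(2)]
            s = sum(pk)
            r.append([p / s for p in pk])
        # M-step: weighted maximum-likelihood updates of w, mu, var
        for k in range(2):
            nk = sum(ri[k] for ri in r)
            w[k] = nk / len(xs)
            mu[k] = sum(ri[k] * x for ri, x in zip(r, xs)) / nk
            var[k] = sum(ri[k] * (x - mu[k]) ** 2
                         for ri, x in zip(r, xs)) / nk + 1e-6
    return w, mu, var

random.seed(0)
xs = [random.gauss(-2, 0.5) for _ in range(200)] + \
     [random.gauss(3, 0.5) for _ in range(200)]
w, mu, var = em_gmm_1d(xs)
print(sorted(mu))
```

Each iteration is guaranteed not to decrease the observed-data log-likelihood, which is the property studied in the lecture.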

Part II - B: Applications to Communication Theory

Lecture Notes

The lecture notes are available here for Lectures 11 - 12 and here for Lectures 13 - 14.

  • Lecture 11: Information and Estimation by J.-M. Gorce – Oct. 13, 2020. 10h15 - 12h15

  • Lecture 12: Lossless and Lossy Compression by J.-M. Gorce – Oct. 15, 2020. 15h45 - 17h45

Homework 6 – Download it here. Deadline: Nov. 15, 2020, 23h59
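The lossless compression material of Lecture 12 can be illustrated with a tiny Huffman construction: the average codeword length of an optimal binary prefix code satisfies H(X) ≤ L̄ < H(X) + 1, with equality on the left for dyadic pmfs. A sketch (illustrative, returning only codeword lengths):

```python
import heapq
import itertools
import math

def huffman_lengths(probs):
    """Codeword lengths of an optimal binary prefix (Huffman) code,
    built by repeatedly merging the two least-probable subtrees."""
    tie = itertools.count()  # unique tiebreaker so dicts are never compared
    heap = [(p, next(tie), {i: 0}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, d1 = heapq.heappop(heap)
        p2, _, d2 = heapq.heappop(heap)
        merged = {s: l + 1 for s, l in {**d1, **d2}.items()}
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]  # symbol -> codeword length

probs = [0.5, 0.25, 0.125, 0.125]  # dyadic pmf: bound met with equality
lengths = huffman_lengths(probs)
avg = sum(probs[s] * l for s, l in lengths.items())
entropy = -sum(p * math.log2(p) for p in probs)
print(avg, entropy)
```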

  • Lecture 13: Channel Coding by J.-M. Gorce – Oct. 20, 2020. 10h15 - 12h15

  • Lecture 14: Multi-User Networks by J.-M. Gorce – Oct. 22, 2020. 15h45 - 17h45

Homework 7 – Download it here. Deadline: Nov. 24, 2020, 23h59
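The finite-blocklength theme of Lectures 13-14 can be made concrete for the binary symmetric channel: capacity is C = 1 − h2(p), and the normal approximation of Polyanskiy, Poor, and Verdú, R ≈ C − sqrt(V/n)·Q⁻¹(ε) with channel dispersion V = p(1 − p) log2²((1 − p)/p), quantifies the back-off from capacity at finite n. A sketch (parameter values are illustrative):

```python
import math
from statistics import NormalDist

def bsc_capacity(p):
    """Capacity C = 1 - h2(p) of the binary symmetric channel, in bits."""
    h2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1 - h2

def bsc_normal_approx(p, n, eps):
    """Normal approximation to the maximal coding rate at blocklength n
    and error probability eps: R ~ C - sqrt(V/n) * Qinv(eps), with
    dispersion V = p(1-p) * log2((1-p)/p)**2 for the BSC."""
    c = bsc_capacity(p)
    v = p * (1 - p) * (math.log2((1 - p) / p)) ** 2
    qinv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return c - math.sqrt(v / n) * qinv

# At n = 1000 and eps = 1e-3, a noticeable fraction of capacity is lost.
print(bsc_capacity(0.11), bsc_normal_approx(0.11, 1000, 1e-3))
```

This gap, which vanishes only as n grows, is exactly why the course treats error probabilities bounded away from zero as first-class objects.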

Final Exams

  • Lecture 15: Final Exams (Student Presentations) – Nov. 10, 2020. 10h15 - 12h15

    • Students’ Project Presentations – Groups 5 - 8

  • Lecture 16: Final Exams (Student Presentations) – Nov. 12, 2020. 15h45 - 17h45

    • Students’ Project Presentations – Groups 1 - 4