InfoCoBuild

Information Theory

Information Theory. Instructor: Prof. Himanshu Tyagi, Department of Electrical Engineering, IISc Bangalore. This is a graduate-level introductory course in Information Theory, in which we introduce the mathematical notion of information and justify it through various operational meanings. The basic theory builds on probability theory and lets us quantitatively measure the uncertainty and randomness in a random variable, as well as the information revealed on observing its value. We will encounter quantities such as entropy, mutual information, total variation distance, and KL divergence, and explain the role they play in important problems in communication, statistics, and computer science. Information theory was originally invented as a mathematical theory of communication, but it has since found applications in many areas ranging from physics to biology. In fact, information theory can help in any field where one wants to evaluate how much information about an unknown quantity is revealed by a particular experiment. In this course, we will lay down the foundations of this fundamental field. (from nptel.ac.in)
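To make the quantities mentioned above concrete, here is a minimal sketch (not part of the course materials) of two of them, Shannon entropy and KL divergence, computed for a fair and a biased coin:

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits of a discrete distribution p."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def kl_divergence(p, q):
    """KL divergence D(p || q) in bits; assumes q > 0 wherever p > 0."""
    return sum(x * math.log2(x / y) for x, y in zip(p, q) if x > 0)

# A fair coin carries exactly one bit of uncertainty per toss.
print(entropy([0.5, 0.5]))                      # 1.0
# A biased coin is more predictable, so its entropy is lower.
print(entropy([0.9, 0.1]))
# KL divergence measures how distinguishable the biased coin
# is from the fair one; it governs error rates in hypothesis testing.
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))
```

The course develops the operational meaning behind these formulas: entropy as the limit of lossless compression, and KL divergence via Stein's lemma in hypothesis testing.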



Information and Probabilistic Modeling (Unit 1)
Lecture 01 - What is Information?
Lecture 02 - How to Model Uncertainty?
Lecture 03 - Basic Concepts of Probability
Lecture 04 - Estimates of Random Variables
Lecture 05 - Limit Theorems

Uncertainty, Compression, and Entropy (Unit 2)
Lecture 06 - Unit 1 Review and Source Model
Lecture 07 - Motivating Examples
Lecture 08 - A Compression Problem
Lecture 09 - Shannon Entropy
Lecture 10 - Random Hash

Randomness and Entropy (Unit 3)
Lecture 11 - Unit 2 Review and Uncertainty and Randomness
Lecture 12 - Total Variation Distance
Lecture 13 - Generating almost Random Bits
Lecture 14 - Generating Samples from a Distribution using Uniform Randomness
Lecture 15 - Typical Sets and Entropy

Information and Statistical Inference (Unit 4)
Lecture 16 - Unit 3 Review and Hypothesis Testing and Estimation
Lecture 17 - Examples
Lecture 18 - The Log-Likelihood Ratio Test
Lecture 19 - Kullback-Leibler Divergence and Stein's Lemma
Lecture 20 - Properties of KL Divergence

Information and Statistical Inference (Unit 5)
Lecture 21 - Unit 4 Review and Information per Coin-Toss
Lecture 22 - Multiple Hypothesis Testing
Lecture 23 - Error Analysis of Multiple Hypothesis Testing
Lecture 24 - Mutual Information
Lecture 25 - Fano's Inequality

Properties of Measures of Information (Unit 6)
Lecture 26 - Measures of Information
Lecture 27 - Chain Rules
Lecture 28 - Shape of Measures of Information
Lecture 29 - Data Processing Inequality

Properties of Measures of Information (Unit 7)
Lecture 30 - Review So Far and Proof of Fano's Inequality
Lecture 31 - Variational Formulae
Lecture 32 - Capacity as Information Radius
Lecture 33 - Proof of Pinsker's Inequality
Lecture 34 - Continuity of Entropy

Information Theoretic Lower Bounds (Unit 8)
Lecture 35 - Lower Bound for Compression
Lecture 36 - Lower Bound for Hypothesis Testing
Lecture 37 - Review
Lecture 38 - Lower Bound for Random Number Generation
Lecture 39 - Strong Converse
Lecture 40 - Lower Bound for Minmax Statistical Estimation

Data Compression (Unit 9)
Lecture 41 - Variable Length Source Codes
Lecture 42 - Review and Kraft's Inequality
Lecture 43 - Shannon Code
Lecture 44 - Huffman Code

Universal Compression (Unit 10)
Lecture 45 - Minmax Redundancy
Lecture 46 - Type based Universal Compression
Lecture 47 - Review and Arithmetic Code
Lecture 48 - Online Probability Assignment

Compression of Databases (Unit 11)
Lecture 49 - Compression of Databases: A Scheme
Lecture 50 - Compression of Databases: A Lower Bound

Channel Coding and Capacity (Unit 12)
Lecture 51 - Repetition Code
Lecture 52 - Channel Capacity

Shannon's Channel Coding Theorem Proof (Unit 13)
Lecture 53 - Sphere Packing Bound for BSC
Lecture 54 - Random Coding Bound for BSC
Lecture 55 - Random Coding Bound for General Channel
Lecture 56 - Review
Lecture 57 - Converse Proof for Channel Coding Theorem

Gaussian Channels (Unit 14)
Lecture 58 - Additive Gaussian Noise Channel
Lecture 59 - Mutual Information and Differential Entropy
Lecture 60 - Channel Coding Theorem for Gaussian Channel
Lecture 61 - Parallel Channels and Water-Filling
