InfoCoBuild

Information Theory and Coding

Information Theory and Coding. Instructor: Prof. S. N. Merchant, Department of Electrical Engineering, IIT Bombay. This course covers lessons on information theory and coding: entropy, block codes and their properties, Shannon's theorems, Huffman coding, Shannon-Fano-Elias coding, arithmetic coding, information channels, rate-distortion theory, the Lloyd-Max quantizer, vector quantization, and transform coding. (from nptel.ac.in)
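Two of the course's central ideas, entropy and Huffman coding, can be illustrated with a short sketch. The snippet below (an illustrative example, not course material; the probability distribution is made up) computes the Shannon entropy of a source and the average codeword length of a binary Huffman code, showing that the Huffman code's average length lies between H(X) and H(X) + 1 bits per symbol:

```python
import heapq
from math import log2

def entropy(probs):
    """Shannon entropy H(X) = -sum p*log2(p), in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_lengths(probs):
    """Codeword lengths of a binary Huffman code: repeatedly merge
    the two least probable groups; each merge deepens every symbol
    in the merged groups by one bit."""
    # heap items: (probability, tie-breaker, symbol indices in group)
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:       # every symbol in the merged group
            lengths[s] += 1     # sits one level deeper in the code tree
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

probs = [0.4, 0.2, 0.2, 0.1, 0.1]   # example source distribution
H = entropy(probs)
L = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(f"H(X) = {H:.3f} bits, Huffman average length = {L:.3f} bits")
```

For this example source, H(X) ≈ 2.122 bits while the Huffman code averages 2.2 bits per symbol, consistent with Shannon's first theorem (Lectures 10-12).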

Lecture 05 - Properties of Joint and Conditional Information Measures and a Markov Source


Go to the Course Home or watch other lectures:

Lecture 01 - Introduction to Information Theory and Coding
Lecture 02 - Definition of Information Measure and Entropy
Lecture 03 - Extension of an Information Source and Markov Source
Lecture 04 - Adjoint of an Information Source, Joint and Conditional Information Measure
Lecture 05 - Properties of Joint and Conditional Information Measures and a Markov Source
Lecture 06 - Asymptotic Properties of Entropy and Problem Solving in Entropy
Lecture 07 - Block Code and its Properties
Lecture 08 - Instantaneous Code and its Properties
Lecture 09 - Kraft-McMillan Inequality and Compact Codes
Lecture 10 - Shannon's First Theorem
Lecture 11 - Coding Strategies and Introduction to Huffman Coding
Lecture 12 - Huffman Coding and Proof of its Optimality
Lecture 13 - Competitive Optimality of the Shannon Code
Lecture 14 - Non-Binary Huffman Code and Other Codes
Lecture 15 - Adaptive Huffman Coding, Part I
Lecture 16 - Adaptive Huffman Coding, Part II
Lecture 17 - Shannon-Fano-Elias Coding and Introduction to Arithmetic Coding
Lecture 18 - Arithmetic Coding, Part I
Lecture 19 - Arithmetic Coding, Part II
Lecture 20 - Introduction to Information Channels
Lecture 21 - Equivocation and Mutual Information
Lecture 22 - Properties of Different Information Channels
Lecture 23 - Reduction of Information Channels
Lecture 24 - Properties of Mutual Information and Introduction to Channel Capacity
Lecture 25 - Calculation of Channel Capacity for Different Information Channels
Lecture 26 - Shannon's Second Theorem
Lecture 27 - Discussion on Error-Free Communication over a Noisy Channel
Lecture 28 - Error-Free Communication over a Binary Symmetric Channel and Introduction to Continuous Sources and Channels
Lecture 29 - Differential Entropy and Evaluation of Mutual Information for Continuous Sources and Channels
Lecture 30 - Channel Capacity of a Bandlimited Continuous Channel
Lecture 31 - Introduction to Rate-Distortion Theory
Lecture 32 - Definition and Properties of Rate-Distortion Functions
Lecture 33 - Calculation of Rate-Distortion Functions
Lecture 34 - Computational Approach for Calculation of Rate-Distortion Functions
Lecture 35 - Introduction to Quantization
Lecture 36 - Lloyd-Max Quantizer
Lecture 37 - Companded Quantization
Lecture 38 - Variable Length Coding and Problem Solving in Quantizer Design
Lecture 39 - Vector Quantization
Lecture 40 - Transform Coding, Part I
Lecture 41 - Transform Coding, Part II