InfoCoBuild

Principles of Communication

Principles of Communication (Part 2). Instructor: Prof. Aditya K. Jagannatham, Department of Electrical Engineering, IIT Kanpur. This course is a sequel to Principles of Communication - Part I and covers fundamental concepts of communication systems, focusing on modern digital communication. Beginning with the basic theory of digital communication systems pertaining to pulse shaping, modulation, and optimal detection, the course also covers several important digital modulation techniques such as Binary Phase Shift Keying (BPSK), Frequency Shift Keying (FSK), Quadrature Amplitude Modulation (QAM), and M-ary Phase Shift Keying (M-PSK). Other fundamental concepts such as information theory, channel capacity, entropy coding, and error control coding are dealt with in the later parts of the course. (from nptel.ac.in)

Lecture 02 - Spectrum of Transmitted Digital Communication Signal, Wide Sense Stationarity


Go to the Course Home or watch other lectures:

Lecture 01 - Introduction to Digital Communication Systems
Lecture 02 - Spectrum of Transmitted Digital Communication Signal, Wide Sense Stationarity
Lecture 03 - Spectrum of Transmitted Digital Communication Signal, Autocorrelation Function and Power Spectral Density
Lecture 04 - Spectrum of Transmitted Digital Communication Signal, Relation to Energy Spectral Density, Introduction to AWGN Channel
Lecture 05 - Additive White Gaussian Noise (AWGN) Properties, Gaussian Noise and White Noise
Lecture 06 - Structure of Digital Communication Receiver, Receiver Filter and Signal-to-Noise Power Ratio (SNR)
Lecture 07 - Digital Communication Receiver, Noise Properties and Output Noise Power
Lecture 08 - Digital Communication Receiver, Optimal SNR and Matched Filter
Lecture 09 - Probability of Error in Digital Communication, Probability Density Functions of Output
Lecture 10 - Probability of Error in Digital Communication, Optimal Decision Rule and Gaussian Q function
Lecture 11 - Introduction to Binary Phase Shift Keying (BPSK) Modulation
Lecture 12 - Introduction to Amplitude Shift Keying (ASK) Modulation
Lecture 13 - Optimal Decision Rule for Amplitude Shift Keying (ASK)
Lecture 14 - Introduction to Signal Space Concept and Orthonormal Basis Signals
Lecture 15 - Introduction to Frequency Shift Keying (FSK)
Lecture 16 - Optimal Decision Rule for Frequency Shift Keying (FSK)
Lecture 17 - Introduction to Quadrature Phase Shift Keying (QPSK)
Lecture 18 - Waveforms of Quadrature Phase Shift Keying (QPSK)
Lecture 19 - Matched Filtering, Bit Error Rate and Symbol Error Rate for QPSK
Lecture 20 - Introduction to M-ary PAM (Pulse Amplitude Modulation)
Lecture 21 - M-ary PAM: Optimal Decision Rule and Probability of Error
Lecture 22 - Introduction to M-ary QAM (Quadrature Amplitude Modulation)
Lecture 23 - M-ary QAM: Optimal Decision Rule, Probability of Error, Constellation Diagram
Lecture 24 - Introduction to M-ary PSK, Transmitted Waveform and Constellation Diagram
Lecture 25 - M-ary PSK: Optimal Decision Rule, Nearest Neighbor Criterion and Approximate Probability of Error
Lecture 26 - Introduction to Information Theory, Relevance of Information Theory and Characterization of Information
Lecture 27 - Definition of Entropy, Average Information/Uncertainty of Source, Properties of Entropy
Lecture 28 - Entropy Example: Binary Source, Maximum and Minimum Entropy of Binary Source
Lecture 29 - Maximum Entropy of Source with M-ary Alphabet, Concave/Convex Functions, Jensen's Inequality
Lecture 30 - Joint Entropy, Definition of Joint Entropy of Two Sources
Lecture 31 - Properties of Joint Entropy, Relation between Joint Entropy and Marginal Entropies
Lecture 32 - Conditional Entropy, Example and Properties of Conditional Entropy
Lecture 33 - Mutual Information, Diagrammatic Representation, Properties of Mutual Information
Lecture 34 - Examples of Mutual Information
Lecture 35 - Channel Capacity, Implications of Channel Capacity
Lecture 36 - Differential Entropy, Example for Uniform Probability Density Function
Lecture 37 - Differential Entropy of Gaussian Source and Insights
Lecture 38 - Joint/Conditional Differential Entropies, Mutual Information
Lecture 39 - Capacity of Gaussian Channel
Lecture 40 - Capacity of Gaussian Channel: Practical Implications, Maximum Rate in Bits/sec
Lecture 41 - Introduction to Source Coding and Data Compression, Variable Length Codes, Unique Decodability
Lecture 42 - Uniquely Decodable Codes, Prefix-free Code, Instantaneous Code, Average Code Length
Lecture 43 - Binary Tree Representation of Code, Example and Kraft Inequality
Lecture 44 - Lower Bound on Average Code Length, Kullback-Leibler Divergence
Lecture 45 - Optimal Code Length, Constrained Optimization and Morse Code Example
Lecture 46 - Approaching Lower Bound on Average Code Length, Block Coding
Lecture 47 - Huffman Code, Algorithm, Example and Average Code Length
Lecture 48 - Introduction to Channel Coding, Rate of Code, Repetition Code, Hamming Distance
Lecture 49 - Introduction to Convolutional Codes, Binary Field Arithmetic and Linear Codes
Lecture 50 - Example of Convolutional Code Output, Convolution Operation for Code Generation
Lecture 51 - Matrix Representation of Convolutional Codes, Generator Matrix
Lecture 52 - State Diagram Representation of Convolutional Code, State Transitions
Lecture 53 - Trellis Representation of Convolutional Code, Valid Code Words
Lecture 54 - Decoding of the Convolutional Code, Minimum Hamming Distance, Maximum Likelihood Codeword Estimate
Lecture 55 - Principle of Decoding of Convolutional Code
Lecture 56 - Viterbi Decoder for Maximum Likelihood Decoding of Convolutional Code using Trellis Representation
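To connect the thread running through Lectures 08-11 (matched filtering, optimal decision rule, and the Gaussian Q-function), here is a minimal sketch, not part of the course material, of how the theoretical BPSK bit error rate Q(sqrt(2 Eb/N0)) can be checked against a Monte Carlo simulation of an AWGN channel with a threshold detector. All function names and parameter choices here are illustrative assumptions.

```python
import math
import random

def q_function(x):
    """Gaussian Q-function: tail probability of the standard normal,
    Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def bpsk_ber_simulation(ebn0_db, num_bits=200_000, seed=1):
    """Monte Carlo bit error rate for BPSK over an AWGN channel.

    Bits map to +/- sqrt(Eb); the real AWGN sample has variance N0/2,
    and the optimal decision rule is a simple sign (zero-threshold) test.
    """
    rng = random.Random(seed)
    ebn0 = 10 ** (ebn0_db / 10)   # convert dB to linear Eb/N0
    eb = 1.0
    n0 = eb / ebn0
    sigma = math.sqrt(n0 / 2)     # noise standard deviation per dimension
    errors = 0
    for _ in range(num_bits):
        bit = rng.randint(0, 1)
        s = math.sqrt(eb) if bit else -math.sqrt(eb)
        r = s + rng.gauss(0.0, sigma)
        decided = 1 if r > 0 else 0   # optimal threshold detector
        errors += (decided != bit)
    return errors / num_bits

for ebn0_db in (0, 4, 8):
    sim = bpsk_ber_simulation(ebn0_db)
    theory = q_function(math.sqrt(2 * 10 ** (ebn0_db / 10)))
    print(f"Eb/N0 = {ebn0_db} dB: simulated {sim:.5f}, theory {theory:.5f}")
```

With a few hundred thousand bits the simulated error rate should track the Q-function prediction closely at low-to-moderate SNR; at high SNR, where errors are rare, many more bits are needed for a reliable estimate.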