
6.050J Information and Entropy

6.050J/2.110J Information and Entropy (Spring 2008, MIT OCW). Instructors: Professor Paul Penfield and Professor Seth Lloyd. This course explores the ultimate limits to communication and computation, with an emphasis on the physical nature of information and information processing. Topics include: information and computation, digital signals, codes and compression, applications such as biological representations of information, logic circuits, computer architectures, and algorithmic information, noise, probability, error correction, reversible and irreversible operations, physics of computation, and quantum computation. The concept of entropy is applied to channel capacity and to the second law of thermodynamics. (from ocw.mit.edu)
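As a minimal illustration of the central quantity the course builds on (this sketch is not course material), the Shannon entropy of a discrete source with probabilities p_i is H = -sum(p_i * log2(p_i)), measured in bits:

```python
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete distribution given as a list of probabilities.

    Terms with p = 0 contribute nothing (the limit p*log2(p) -> 0 as p -> 0).
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit per flip; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # about 0.469 bits
```

A fair coin maximizes entropy over two outcomes; any bias lowers it, which is why compression (Unit 2) can shorten messages from a biased source.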

01 - Unit 1: Bits and Codes, Lecture 2
02 - Unit 2: Compression, Lecture 1
03 - Unit 3: Noise and Errors, Lecture 2
04 - Unit 4: Probability, Lecture 1
05 - Unit 4: Probability, Lecture 2
06 - Unit 5: Communications, Lecture 1
07 - Unit 5: Communications, Lecture 2
08 - Unit 6: Processes, Lecture 1
09 - Unit 7: Inference, Lecture 1
10 - Unit 7: Inference, Lecture 2
11 - Unit 8: Maximum Entropy, Lecture 1
12 - Unit 8: Maximum Entropy, Lecture 2
13 - Unit 10: Physical Systems, Lecture 1
14 - Unit 10: Physical Systems, Lecture 3
15 - Unit 11: Energy, Lecture 1
16 - Unit 11: Energy, Lecture 2
17 - Unit 12: Temperature, Lecture 1
18 - Unit 12: Temperature, Lecture 2
19 - Unit 13: Quantum Information, Lecture 1

References
Information and Entropy
Instructors: Prof. Paul Penfield and Prof. Seth Lloyd. The course site provides online textbook chapters, assignments, and solutions.