LEC # | TOPICS | LECTURE NOTES |
---|---|---|
1 | Introduction, entropy | (PDF) |
2 | Jensen’s inequality, data processing theorem, Fano’s inequality | (PDF) |
3 | Different types of convergence, asymptotic equipartition property (AEP), typical set, joint typicality | (PDF) |
4 | Entropy rates of stochastic processes | (PDF) |
5 | Data compression, Kraft inequality, optimal codes | (PDF) |
6 | Huffman codes (see the first sketch after this table) | (PDF) |
7 | Shannon-Fano-Elias codes, Slepian-Wolf | (PDF 1) (PDF 2) |
8 | Channel capacity, binary symmetric and erasure channels | (PDF) |
9 | Maximizing capacity, Blahut-Arimoto (see the second sketch after this table) | (PDF) |
10 | The channel coding theorem | (PDF) |
11 | Strong coding theorem, types of errors | (PDF) |
12 | Strong coding theorem, error exponents | (PDF) |
13 | Fano’s inequality and the converse to the coding theorem | (PDF) |
14 | Feedback capacity | (PDF) |
15 | Joint source channel coding | (PDF) |
16 | Differential entropy, maximizing entropy | (PDF) |
17 | Additive Gaussian noise channel | (PDF) |
18 | Gaussian channels: parallel, colored noise, inter-symbol interference (see the water-filling sketch after this table) | (PDF) |
19 | Gaussian channels with feedback | (PDF) |
20 | Multiple access channels | (PDF) |
21 | Broadcast channels | (PDF) |
22 | Finite state Markov channels | (PDF) |
23 | Channel side information, wide-band channels | (PDF) |
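
The following minimal Python sketch (not taken from the course materials) illustrates the material of lectures 1, 5, and 6: it computes the entropy of a small source, builds a Huffman code with a binary heap, and checks that the codeword lengths satisfy the Kraft inequality and that the expected length lies between H(X) and H(X) + 1. The source distribution and function names are chosen here purely for illustration.

```python
# Illustrative sketch: entropy, Huffman coding, and the Kraft inequality.
import heapq
import math
from itertools import count


def entropy(p):
    """Shannon entropy in bits of a distribution given as {symbol: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)


def huffman_code(p):
    """Return {symbol: codeword}, built by repeatedly merging the two least likely nodes."""
    tie = count()  # tie-breaker so the heap never has to compare symbol lists
    heap = [(q, next(tie), [s]) for s, q in p.items()]
    heapq.heapify(heap)
    code = {s: "" for s in p}
    while len(heap) > 1:
        q0, _, syms0 = heapq.heappop(heap)
        q1, _, syms1 = heapq.heappop(heap)
        for s in syms0:                      # extend codewords of the merged subtrees
            code[s] = "0" + code[s]
        for s in syms1:
            code[s] = "1" + code[s]
        heapq.heappush(heap, (q0 + q1, next(tie), syms0 + syms1))
    return code


if __name__ == "__main__":
    # Hypothetical source distribution, for illustration only.
    p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
    code = huffman_code(p)
    H = entropy(p)
    L = sum(p[s] * len(code[s]) for s in p)           # expected codeword length
    kraft = sum(2 ** -len(c) for c in code.values())  # Kraft sum, must be <= 1
    print(code, f"H = {H:.3f} bits, L = {L:.3f} bits, Kraft sum = {kraft:.3f}")
```

For this dyadic distribution the Huffman code is optimal with L = H = 1.75 bits and Kraft sum exactly 1.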
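
A second sketch, again illustrative rather than drawn from the notes, implements the Blahut-Arimoto iteration of lecture 9 and runs it on the binary symmetric channel of lecture 8, comparing the numerical capacity with the closed form C = 1 − H_b(ε).

```python
# Illustrative sketch: Blahut-Arimoto capacity computation for a discrete memoryless channel.
import numpy as np


def blahut_arimoto(W, iters=200):
    """Capacity (bits/use) of a DMC with transition matrix W[x, y] = P(y | x)."""
    n_x, _ = W.shape
    p = np.full(n_x, 1.0 / n_x)               # start from the uniform input distribution
    for _ in range(iters):
        q = p @ W                              # output distribution induced by p
        # d[x] = D(W(.|x) || q), the divergence of row x from the current output distribution
        d = np.sum(W * np.log2(W / q, where=W > 0, out=np.zeros_like(W)), axis=1)
        p = p * np.exp2(d)                     # multiplicative update
        p /= p.sum()
    q = p @ W
    d = np.sum(W * np.log2(W / q, where=W > 0, out=np.zeros_like(W)), axis=1)
    return float(p @ d), p                     # mutual information at p, and the maximizing input


if __name__ == "__main__":
    eps = 0.1                                  # crossover probability of the BSC (hypothetical)
    W = np.array([[1 - eps, eps], [eps, 1 - eps]])
    C, p_star = blahut_arimoto(W)
    Hb = -eps * np.log2(eps) - (1 - eps) * np.log2(1 - eps)
    print(f"Blahut-Arimoto: {C:.4f} bits/use, closed form 1 - H_b(eps): {1 - Hb:.4f} bits/use")
```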
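
Finally, a sketch of water-filling for the parallel Gaussian channels of lecture 18: power P_i = max(0, ν − N_i) is poured over sub-channels with noise variances N_i, with the water level ν found by bisection so that the allocations sum to the total power budget. The numbers in the example are hypothetical.

```python
# Illustrative sketch: water-filling power allocation over parallel Gaussian channels.
import numpy as np


def water_filling(noise, total_power, tol=1e-12):
    """Return (power allocation, capacity in bits/use) for parallel Gaussian channels."""
    noise = np.asarray(noise, dtype=float)
    lo, hi = noise.min(), noise.max() + total_power    # bracket for the water level nu
    while hi - lo > tol:
        nu = 0.5 * (lo + hi)
        if np.maximum(nu - noise, 0.0).sum() > total_power:
            hi = nu                                     # too much water: lower the level
        else:
            lo = nu
    power = np.maximum(lo - noise, 0.0)
    capacity = 0.5 * np.log2(1.0 + power / noise).sum()
    return power, capacity


if __name__ == "__main__":
    # Hypothetical example: three sub-channels, total power budget of 10.
    power, C = water_filling([1.0, 2.0, 5.0], 10.0)
    print("power allocation:", power.round(3), "capacity:", round(C, 3), "bits/use")
```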