Lecture Videos

Lecture 1: Overview: Information and Entropy

Description: This lecture covers some history of digital communication, with a focus on Samuel Morse and Claude Shannon; defining and measuring information; the significance of entropy for encodings; and Huffman's coding algorithm.

Instructor: George Verghese
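
The description names two of the lecture's central ideas: entropy as a measure of information, and Huffman's algorithm for building efficient codes. The sketch below is not taken from the lecture; it is a minimal illustration, assuming a made-up four-symbol source, of how Shannon entropy bounds the average code length and how a Huffman code approaches that bound.

```python
# A minimal sketch (symbol probabilities are assumed, not from the course):
# compute the Shannon entropy of a source and compare it with the average
# codeword length of a Huffman code for the same source.
import heapq
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs.values() if p > 0)

def huffman_code(probs):
    """Build a binary Huffman code; returns {symbol: bitstring}."""
    # Each heap entry: (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two least probable subtrees,
        # prefixing their codewords with 0 and 1 respectively.
        p1, _, codes1 = heapq.heappop(heap)
        p2, _, codes2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in codes1.items()}
        merged.update({s: "1" + c for s, c in codes2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

if __name__ == "__main__":
    probs = {"A": 0.5, "B": 0.25, "C": 0.125, "D": 0.125}  # assumed toy source
    code = huffman_code(probs)
    avg_len = sum(probs[s] * len(c) for s, c in code.items())
    print("codewords:", code)
    print(f"entropy      = {entropy(probs):.3f} bits/symbol")
    print(f"avg code len = {avg_len:.3f} bits/symbol")
```

For this dyadic example the Huffman code's average length (1.75 bits/symbol) equals the entropy exactly; for general probabilities it lies within one bit of the entropy.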
