Potential list of topics/lectures
A very preliminary and partial list of possible topics.
My plan is to have somebody lecture on prefix codes and the Kraft inequality
on Wednesday, Feb. 14, somebody lecture on Shannon's first theorem (the
source coding theorem) on Friday, Feb. 16, and somebody lecture on Huffman
codes (and possibly adaptive Huffman codes) on Monday, Feb. 19, followed by
various topics in source coding for the next few lectures, depending on how
much interest there is in the topic. After that we continue on to Shannon's
second theorem and channel coding, and then to other information theory
topics.
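Since the opening lecture covers prefix codes and the Kraft inequality, here is a minimal sketch (in Python; the function names are my own, purely illustrative) of checking the inequality for a list of binary codeword lengths:

```python
# Kraft inequality: a prefix-free binary code with codeword lengths
# l_1, ..., l_n exists if and only if sum(2 ** -l_i) <= 1.

def kraft_sum(lengths):
    """Return the Kraft sum: sum of 2^-l over the codeword lengths."""
    return sum(2.0 ** -l for l in lengths)

def satisfies_kraft(lengths):
    """True iff a binary prefix code with these lengths can exist."""
    return kraft_sum(lengths) <= 1.0

# Lengths of the prefix code {0, 10, 110, 111}: 1/2 + 1/4 + 1/8 + 1/8 = 1
print(satisfies_kraft([1, 2, 3, 3]))   # True
# No prefix code has two length-1 binary codewords plus anything else:
print(satisfies_kraft([1, 1, 2]))      # False
```

This is only the existence check; the constructive half (building a code from lengths satisfying the inequality) would belong in the lecture itself.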
Source Coding
- Lempel-Ziv codes. Sections 13.4 and 14.5 of Cover and Thomas.
  There's enough material for two or three lectures here.
- Arithmetic codes, Section 13.3 of Cover and Thomas.
- The Burrows-Wheeler transform (Wikipedia has a good intro,
  and references to the scientific papers).
- Tunstall codes
- Shannon-Fano-Elias coding
- Lossy source coding methods, e.g. Fourier transforms for images and
  audio, etc.
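For the Burrows-Wheeler transform topic, a naive quadratic-time sketch of the transform and its inverse (illustration only; using a `$` end-of-string sentinel is one common convention, not the only one):

```python
def bwt(s):
    """Naive Burrows-Wheeler transform: append a sentinel, sort all
    rotations of the string, and read off the last column."""
    s = s + "$"  # sentinel, assumed lexicographically smallest character
    rotations = sorted(s[i:] + s[:i] for i in range(len(s)))
    return "".join(r[-1] for r in rotations)

def inverse_bwt(last):
    """Invert the transform by repeatedly prepending the last column
    to the sorted table, then reading the row ending in the sentinel."""
    table = [""] * len(last)
    for _ in range(len(last)):
        table = sorted(last[i] + table[i] for i in range(len(last)))
    return next(row for row in table if row.endswith("$"))[:-1]

print(bwt("banana"))             # annb$aa
print(inverse_bwt("annb$aa"))    # banana
```

The point for the lecture is that the transform is invertible yet tends to group identical characters together, which is why it helps downstream compressors.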
Channel Coding
- Shannon's theorem
- Reverse Shannon theorem
- Converse to Shannon's theorem
- The Gilbert-Varshamov bound
- Sparse graph codes and message-passing (probably two lectures)
Continuous channels
- Mutual information for continuous variables
- Gaussian channels, Shannon's theorem, signal-to-noise ratio.
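For the Gaussian channel lecture, the capacity formula C = (1/2) log2(1 + S/N) bits per channel use can be evaluated directly; a small sketch (function name is my own):

```python
import math

def gaussian_capacity(snr):
    """Capacity of the discrete-time AWGN channel in bits per channel
    use, C = (1/2) * log2(1 + S/N), with snr the ratio S/N."""
    return 0.5 * math.log2(1.0 + snr)

print(gaussian_capacity(1.0))   # 0.5 bits per use at SNR = 1
print(gaussian_capacity(3.0))   # 1.0 bits per use at SNR = 3
```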
Information theory and statistical inference
- Asymptotic hypothesis testing and relative entropy: Stein's theorem
Mathematical axioms for entropy and its uniqueness
Stock portfolios and information theory
Rate distortion theory
Kolmogorov complexity
Vector quantization?