When and where: The class meets on MWF from 3:00 to 4:00 in room 2-136.
Instructor: Peter Shor, room 2-369.
Office hours: I will be available
by appointment to help prepare the presentations if need be.
I will also hold scheduled office hours on the days before homework
is due. The next will be Thursday, April 10, from 2:30 to 3:30,
and Friday, April 11, before class (1:00-3:00).
Homework
Homework 1: Do problems 2.6, 2.9, 2.10, 2.14, and 2.25 in Cover and Thomas. Due Tuesday, Feb. 20.
Homework 2: Do problems 5.5, 5.19, 5.25, 5.27, 5.30, and 5.32 in Cover and Thomas. Due Monday, March 5.
Homework 3 is posted. Due Monday, March 19.
Homework 4: Write a brief paragraph about the subject of your term
paper, and do problems 7.24, 11.5, 11.15a, 13.5, and 13.6. Due Friday, April 13.
This is a new undergraduate seminar in information theory. It carries CIM credit for the math department. Enrollment is limited by the department.
The textbook is Elements of Information Theory, 2nd edition, by Cover and Thomas. The good news about this textbook is that the 2nd edition just came out, with numerous mistakes fixed, many more exercises, and additional material. It is available in the bookstore. The bad news is that, because it's new, you may not be able to get a secondhand copy of the 2nd edition cheaply. We may also cover some material from MacKay's book, which is available online as well as in hardcopy.
This is a seminar class, so you will actually be teaching it to yourselves. Everybody will have to give three periods' worth of lectures.
I think information theory is a fairly good topic for a seminar class since, except for certain explicit constructions of error-correcting codes, it is quite broad. A small amount of material underlies all of it: the concepts of information and entropy, and the properties of these quantities. Many different branches of information theory build on this foundation but are relatively independent of one another. Everybody is thus going to have to learn the underlying material well. So I will start off by giving three lectures on the properties of information, the asymptotic equipartition property, and Shannon's source coding theorem. I will also give approximately four homework assignments during the term.
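To give a concrete feel for the central quantity everything builds on, the Shannon entropy H(X) = -Σ p(x) log₂ p(x) of a distribution can be computed in a few lines. Here is a minimal Python sketch (the function name is my own choice, not anything from the textbook):

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits.

    Terms with p(x) = 0 contribute 0, by the usual convention
    that 0 log 0 = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of uncertainty:
print(entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less:
print(entropy([0.9, 0.1]))   # ~0.469
# A uniform distribution on 8 outcomes: log2(8) = 3 bits.
print(entropy([1/8] * 8))    # 3.0
```

The maximum-entropy property illustrated in the last line (the uniform distribution maximizes H among distributions on a fixed number of outcomes) is one of the basic inequalities covered in the first lectures.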
Students will then give lectures on various methods of source coding. After the source coding section, we'll do the channel coding theorem, possibly some methods for channel coding, and certain other applications of information theory. These might include rate-distortion theory, the information capacity of networks, or how information theory relates to probabilistic inference or to gambling. Exactly what we do will depend on the students' interests. I expect around half of the topics will come out of the textbook.
You will be expected to present for two and a half class periods during the term. You will also be expected to write a 10-page paper on some topic related to information theory.
Dates. You should come up with a topic for your term paper by around April 2. The first draft of the paper will be due on April 25.
Grading policy. Your grade will be based on your participation (i.e., your own presentations and your participation during others' presentations), the homework, and your project, in the ratio 50% participation, 25% homework, and 25% project. You are expected to attend all lectures unless excused.
The prerequisites are courses in linear algebra and probability theory.
Here is the schedule of presentations and topics.
Date | Presenter | Topic | Reference |
---|---|---|---|
W Feb 7 | Peter Shor | Introduction | - |
F Feb 9 | Peter Shor | Information inequalities | Cover and Thomas, 2.1-2.6, 2.8 |
M Feb 12 | Peter Shor | Asymptotic Equipartition Property | Cover and Thomas, Chapter 3 |
W Feb 14 | Joel | Source coding, prefix codes and Kraft's inequality | 5.1-5.3 |
F Feb 16 | Steve Hall | Optimal codes, block coding, and McMillan's inequality. | 5.4-5.5 |
T Feb 20 | Yahli | Huffman codes (and maybe adaptive Huffman codes if time permits) | 5.6 - 5.8 |
W Feb 21 | Ravi | the Burrows-Wheeler transform and move-to-front coding | Wikipedia, original paper and this paper |
F Feb 23 | Boris | Shannon-Fano-Elias codes | 5.9, 5.10 |
M Feb 26 | Omari | arithmetic codes | 13.3 |
W Feb 28 | Sergio | gambling and information theory | 6.1,6.2 |
F Mar 2 | Yoyo Zhou | entropy rates of stochastic processes | 4.1-4.3 |
M Mar 5 | Jonathan Santiago | Channel Capacities and Hamming Codes | 7.1, 7.2, 7.11 |
W Mar 7 | Kevin | Channel Capacity Theorem, I: Jointly typical sequences | 7.4, 7.6, and the pieces of 7.5 needed for 7.6 |
F Mar 9 | Yahli | Channel Capacity Theorem, II | the rest of 7.5 and 7.7 |
M Mar 12 | Mike Anderson | Fano's Inequality and the converse to the Channel Capacity Theorem | 2.10, 7.9 (and 7.10 if time permits) |
W Mar 14 | Steve Hall | Perfect codes | MacKay Chap. 13.1-13.4 |
F Mar 16 | Ravi | Gilbert-Varshamov bound and Shannon's bound for linear codes | MacKay Chap. 13.5 (and maybe some of 13.6 - 13.8) and 14.1 |
M Mar 19 | Joel | method of types | 11.1-2 |
W Mar 21 | Boris | universal source coding | 11.3 |
F Mar 23 | Sergio | Sanov's theorem | 11.4, 11.5 |
M Apr 2 | Omari | Lempel-Ziv coding | 13.4.1, 13.5.1 |
W Apr 4 | Mike | More source coding? (tree-structured Lempel-Ziv or something else) | ? |
F Apr 6 | Steve Hall | differential entropy | 8.1-8.5 |
M Apr 9 | Yoyo | Gaussian channels | 9.1-9.2 |
W Apr 11 | Peter Shor | more on continuous channels | 9.3-9.4 |
F Apr 13 | Kevin | Gaussian multi-user channels | 15.1 |
W Apr 18 | Jon | multiple access channels | 15.3-15.3.2 |
F Apr 20 | Jon | more multiple access channels | 15.3.3-15.3.4 |
M Apr 23 | Sergio | Shannon's paper on secrecy | On-line here |
W Apr 25 | Peter Shor | Wyner "The Wire-Tap Channel" | |
F Apr 27 | Peter Shor | Maurer, "Secret key agreement by public discussion from common information" | ? |
M Apr 30 | Mike | Renner and Wolf "New bounds in secret-key agreement" | ? |
W May 2 | Yahli | MacKay Chap. 16 | ? |
F May 4 | Kevin | MacKay Chap. 17 | ? |
M May 7 | Ravi | MacKay Chap. 18 | ? |
W May 9 | Joel | Gambling and Entropy of English | Cover and Thomas, 6.4-6.6 |
F May 11 | Boris | MacKay Chap. 47 | |
M May 14 | Yoyo | Arimoto-Blahut algorithm | ? |
W May 16 | Omari | - | ? |