ECE 729 Theory of Information Processing and Transmission, Spring 2004


Last Modified: Tue 15 Oct 2019, 01:45 PM, CDT

Instructor Office Hours: M W F 11-noon or by appointment.
See what we did in Spring 2001. Spring 2004 will be ....
Syllabus
Homework Solutions
You can download Shannon's original paper, A Mathematical Theory of
Communication, which started the whole field of Information Theory.
Books about information theory.
R. M. Gray's book, Entropy and Information Theory, is available free as a PDF file.
References on Source Coding, Rate Distortion, and Multiple Descriptions
Information Theory Resources Available on the web.
Class Schedule for Spring 2004
1/21	Wednesday.  Overview of main topics we'll cover in information theory.
	Started study of variable-length codes.

1/23	Distributed syllabus.  Defined code, codebook, expected length of a
	source code.  Extension code.  Unique decodability.  Examples.  Prefix
	codes, also called instantaneous codes.  Stated and proved the Kraft
	Inequality.  Assigned HW #1 handout.  DUE WED 1/28.

1/26	Stated and derived McMillan's Theorem.  Defined entropy.  Stated and
	proved ln(x) <= x-1 for all x > 0.

1/28	Went through proof of theorem on p. 9 that E[l(C(X))] >= H(X)/log D.
	Proved theorem on Shannon-Fano codes.  Compared extension of
	Shannon-Fano code to N symbols on p. 11.1 with theorem on p. 12 that
	uses Shannon-Fano code on composite symbol (X_1,...,X_N).  Discussed
	Huffman coding for binary code letters.  Assigned HW #2 handout.
	DUE WED 2/4.

1/30	Showed how to do ternary Huffman codes.  Discussed entropy as a
	measure of uncertainty - compared uniformly distributed vs. constant
	RVs, etc.  Went back to p. 11 to discuss what happens when
	Shannon-Fano codes are constructed from the "wrong" pmf - this leads
	to the Kullback-Leibler informational divergence.  Started NEW SECTION
	on Data Compression--Fixed-Length Codes.  Gave example of providing
	codewords only for source sequences that have high probability, but
	not for source sequences that have low probability.

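A minimal MATLAB sketch (mine, not from the course handouts; the pmf is an
assumed example) illustrating the 1/28 results for D = 2: the Shannon-Fano
lengths l_i = ceil(-log2(p_i)) satisfy the Kraft Inequality, and
H(X) <= E[l(C(X))] < H(X) + 1.

	p = [0.5 0.25 0.125 0.125];   % assumed example pmf
	H = -sum(p .* log2(p));       % entropy in bits
	l = ceil(-log2(p));           % Shannon-Fano codeword lengths
	kraft = sum(2.^(-l));         % <= 1, so a prefix code with these lengths exists
	EL = sum(p .* l);             % expected codeword length
	fprintf('H = %.3f, E[l] = %.3f, Kraft sum = %.3f\n', H, EL, kraft)
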
2/2	Defined achievable rate.  Stated and proved Shannon's Source Coding
	Theorem.

2/4	Discussed differences between fixed-length and variable-length source
	coding (p. 6).  Discussed Properties of Achievable Rates (pp. 7-8):
	proved the set of achievable rates is closed and convex.  Defined
	Gaussian random vectors (not in your notes).  Assigned Design Project
	#1 handout.  DUE WED 2/11.  When getting your script to run, you might
	try to run it on smaller problems with only 3-bit words.  When it's
	working, use 5-bit words with p = 0.25 and lambda = 0.07.

2/6	Went over HW #2 problems 2(b) and 2(c).  Finished discussion of
	Gaussian random vectors, Gaussian random processes, and white noise.
	Started antipodal signaling over the AWGN channel.

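A hypothetical MATLAB sketch (not the Design Project #1 script; the
thresholding rule here is my assumption) of the idea from 1/30 of providing
codewords only for high-probability source sequences, for a Bernoulli(p)
source:

	n = 5;  p = 0.25;  lambda = 0.07;
	words = dec2bin(0:2^n-1) - '0';          % all n-bit source sequences
	k = sum(words, 2);                       % number of 1s in each sequence
	prob = p.^k .* (1-p).^(n-k);             % probability of each sequence
	ps = sort(prob, 'descend');
	m = find(cumsum(ps) >= 1 - lambda, 1);   % smallest set with prob >= 1 - lambda
	fprintf('Encode %d of %d sequences; P(no codeword) = %.4f\n', ...
	        m, 2^n, 1 - sum(ps(1:m)))
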
2/9	Finished deriving bit-error probability for antipodal signaling over
	the AWGN channel.  Mentioned binary symmetric channel (BSC) and binary
	erasure channel (BEC).  Started NEW SECTION of notes on The Discrete
	Memoryless Channel (DMC) and Channel Capacity.

2/11	Discussed Design Project #1.  Continued DMC notes.  Derived explicit
	expression for P(phi(Y) ~= M).  Defined channel code rate and
	achievable rate for a channel code.  Assigned HW #3 handout.  DUE WED
	2/18.

2/13	Defined the DMC; emphasized its stationarity, lack of memory in the
	sense of not depending on past inputs or outputs, and
	nonanticipativeness in the sense of not depending on future inputs or
	outputs.  Stated and proved lemma that says that if the average of
	g(u) is < lambda, then for some u_0, g(u_0) < lambda.  Sketched the
	idea of Shannon's Random Coding Argument.  Also gave interpretation of
	random codes as in frequency-hopped spread spectrum.  Defined the set
	A_n on p. 9.  Stated Lemma on p. 10 (which is basically the weak law
	of large numbers).

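A minimal MATLAB sketch (mine) checking the 2/9 result that antipodal
signaling over the AWGN channel has bit-error probability Q(sqrt(2*Eb/N0)),
using Q(x) = (1/2)*erfc(x/sqrt(2)) and Eb = 1:

	EbN0 = 10.^((0:2:8)/10);            % Eb/N0 values (linear scale)
	Pb_theory = 0.5*erfc(sqrt(EbN0));   % = Q(sqrt(2*Eb/N0))
	N = 1e5;                            % bits per simulation point
	bits = rand(1,N) > 0.5;
	x = 1 - 2*bits;                     % map bit 0 -> +1, bit 1 -> -1
	Pb_sim = zeros(size(EbN0));
	for i = 1:length(EbN0)
	    y = x + sqrt(1/(2*EbN0(i)))*randn(1,N);   % AWGN with variance N0/2
	    Pb_sim(i) = mean((y < 0) ~= bits);        % sign detector
	end
	disp([Pb_theory; Pb_sim])           % the two rows should roughly agree
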
2/16	Discussed specification of decoding functions using collections of
	sets that may not be disjoint or whose union may not be the whole
	space.  Defined the decoder for proving Shannon's noisy channel coding
	theorem using the sets A_n.

2/18	Finished proof of forward part of noisy channel coding theorem.
	Assigned Design Project #2: Download the two MATLAB files PC.m and
	W.m.  Run the script PC.m; look at the script and its output and
	understand how it works.  Then modify the script PC.m to accommodate
	n = 7 bit words and N = 4 codewords.  You must choose the decoding
	sets D_i to try to minimize the average probability of error.  Turn in
	a printout of your modified program and a copy of its output.  DUE WED
	2/25.

2/20	Re-capped the Noisy Channel Coding Theorem.  Stated Lemma from p. 14.1
	that is NOT in your notes.  Stated weak converse theorem.  Started
	developing properties of mutual information and entropy; e.g.,
	I(X^Y) = H(Y) - H(Y|X) >= 0, etc.

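A hypothetical MATLAB sketch (not PC.m; the codewords are an assumed
example) of the Design Project #2 task: for N = 4 codewords of length n = 7
over a BSC(eps), choosing each decoding set D_i as the received words
nearest codeword i in Hamming distance minimizes the average probability of
error for equally likely messages.

	n = 7;  eps = 0.1;
	C = [0 0 0 0 0 0 0;                 % assumed codewords
	     1 1 1 1 0 0 0;
	     0 0 1 1 1 1 0;
	     1 1 0 0 1 1 1];
	Y = dec2bin(0:2^n-1) - '0';         % all 2^n possible received words
	d = zeros(2^n, 4);
	for i = 1:4
	    d(:,i) = sum(Y ~= C(i,:), 2);   % Hamming distance to codeword i
	end
	[~, dec] = min(d, [], 2);           % decoding sets: D_i = {y : dec == i}
	Pe = 0;
	for i = 1:4
	    Py = eps.^d(:,i) .* (1-eps).^(n - d(:,i));   % W(y|codeword i)
	    Pe = Pe + (1/4)*sum(Py(dec ~= i));           % mass of y outside D_i
	end
	fprintf('Average probability of error: %.4f\n', Pe)
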
2/23	Derived H(Y|X) = sum_k H(Y_k|X_k) if X and Y are connected through a
	DMC and X has any joint pmf.  Defined conditional average mutual
	information I(X^Y|Z) and showed that I(XZ^Y) = I(Y^Z) + I(X^Y|Z)
	>= I(Y^Z).  Derived the Data Processing Lemma.  Derived Fano's
	inequality.

2/25	Proved Weak Converse Theorem for the DMC.  Discussed the Strong
	Converse for the DMC.  Started discussion of the maximal probability
	of error criterion.  Assigned HW #4 handout.  DUE WED 3/3.

2/27	Finished proof that if a rate is achievable under the average
	probability of error criterion, then the rate is achievable under the
	maximal probability of error criterion (pp. 35-37).  Started
	Evaluation of the DMC Capacity for Weakly Symmetric Channels, p. 24.
	Announced Night Exam for Wed. Mar. 24.

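A minimal MATLAB sketch (mine; the transition matrix is an assumed example)
of the weakly symmetric capacity evaluation started on 2/27: such channels
have C = log|Y| - H(row), achieved by a uniform input.

	W = [0.5 0.3 0.2;                % rows are permutations of each other
	     0.2 0.5 0.3;                % and all column sums are equal, so
	     0.3 0.2 0.5];               % the channel is weakly symmetric
	Hrow = -sum(W(1,:) .* log2(W(1,:)));   % entropy of any row
	C = log2(size(W,2)) - Hrow;            % capacity in bits per channel use
	fprintf('C = %.4f bits per channel use\n', C)
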
3/1	Went over examples for which we can compute sup_p I(X^Y) on pp. 25-27,
	including the BSC and BEC.  Started NEW SECTION on the Gaussian
	channel.

3/3	Allowed decoder to declare an error (output 0).  Introduced average
	power constraint on channel codewords.  Defined achievable rate and
	capacity for channels with continuous-valued inputs and outputs.  Gave
	formula for the capacity of the Gaussian channel with average power
	constraint P and noise variance N as (1/2)log(1+P/N).  Introduced the
	joint distribution P(M=i, Y in A).  Defined differential entropy and
	average mutual information for continuous random variables.  Assigned
	HW #5 handout.  DUE WED 3/10.

3/5	For Gaussian channel showed I(X^Y) is maximized subject to
	E[X^2] <= P by p(x) ~ N(0,P) (pp. 5-6).  Showed differential entropy
	of X is maximized by Gaussian pdf (p. 7).  Started proof of
	achievability of rate R under a power constraint (pp. 8-10) for
	channel w(y|x), not necessarily Gaussian.

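A minimal MATLAB sketch (mine; eps and alpha are assumed values) of the 3/1
capacity examples: C = 1 - h(eps) for the BSC and C = 1 - alpha for the
BEC, where h is the binary entropy function.

	h = @(q) -q.*log2(q) - (1-q).*log2(1-q);   % binary entropy, 0 < q < 1
	eps = 0.11;  alpha = 0.25;                 % assumed crossover/erasure probabilities
	fprintf('BSC(%.2f): C = %.4f bits/use\n', eps, 1 - h(eps))
	fprintf('BEC(%.2f): C = %.4f bits/use\n', alpha, 1 - alpha)
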
3/8	Finished achievability proof.  Proved weak converse for Gaussian
	channel.

3/10	Derived capacity in bits/second for the bandlimited AWGN channel.
	Started NEW SECTION on Lloyd-Max Algorithm.  Assigned HW #6 handout.
	DUE WED 3/31.

3/12	Finished derivation of the Lloyd-Max Algorithm.  Briefly discussed
	rate-distortion theory.  Handed out three old midterm exams.

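A minimal MATLAB sketch (mine; it trains on samples rather than using the
density-based conditions derived in class) of the Lloyd-Max iteration for a
4-level quantizer of an N(0,1) source: alternate midpoint thresholds
(nearest-neighbor regions) and conditional means (centroid levels).

	x = randn(1e5, 1);               % training samples from N(0,1)
	levels = linspace(-2, 2, 4)';    % initial reconstruction levels
	for it = 1:50
	    t = (levels(1:end-1) + levels(2:end))/2;   % midpoint thresholds
	    idx = 1 + sum(x > t', 2);                  % region of each sample
	    for j = 1:numel(levels)
	        levels(j) = mean(x(idx == j));         % centroid condition
	    end
	end
	mse = mean((x - levels(idx)).^2);
	fprintf('levels = %s, MSE = %.4f\n', mat2str(levels.', 3), mse)
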
3/15	No class -- Spring Break.

3/17	No class -- Spring Break.

3/19	No class -- Spring Break.

3/22	Went over old midterms.

3/24	Went over old midterms.  Night Exam 5-6:15 pm in 2345 EH.
	Design Project 3 Directory.

3/26	Went over midterm.  Started NEW SECTION on Rate Distortion Theory.
	Assigned HW #6 handout (previously distributed).  DUE WED 3/31.

3/29	Stated several Lemmas about R(D) and \tilde R(D).  Proved converse of
	rate-distortion theorem.

3/31	Covered Lemma on p. 7 of notes and proved forward part of
	rate-distortion coding theorem.  Assigned HW #7 via download.  DUE WED
	4/7.

4/2	Characterized \tilde R^*.  Found the rate-distortion function of a
	binary source with the Hamming distortion.

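A minimal MATLAB sketch (mine; p is an assumed value) of the 4/2 result:
for a Bernoulli(p) source with Hamming distortion, R(D) = h(p) - h(D) for
0 <= D <= min(p, 1-p), and R(D) = 0 for larger D.

	h = @(q) -q.*log2(q) - (1-q).*log2(1-q);   % binary entropy function
	p = 0.3;                                   % assumed source parameter
	D = [0.05 0.1 0.15 0.2 0.25 0.3];          % distortion levels
	R = (D <= min(p, 1-p)) .* (h(p) - h(D));   % rate-distortion function
	fprintf('D = %.2f  R(D) = %.4f bits\n', [D; R])
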
4/5	Computed rate-distortion function for a uniformly distributed m-ary
	source with mod-m addition.  Derived the Shannon lower bound on the
	rate-distortion function.

4/7	Started NEW SECTION on Source Coding and Rate Distortion for encoding
	of correlated sources (download Slepian-Wolf notes).  Discussed key
	results.  No derivations today.  Assigned HW #8 via download.  DUE WED
	4/14.

4/9	Gave a derivation of basic source coding theorem using random codes.
	Started proving forward part of the Slepian-Wolf theorem.

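A minimal MATLAB sketch (mine; the joint pmf is an assumed example) of the
key Slepian-Wolf result discussed on 4/7: the achievable rate pairs satisfy
R1 >= H(X|Y), R2 >= H(Y|X), and R1 + R2 >= H(X,Y).

	Pxy = [0.4 0.1;                              % assumed joint pmf of (X,Y)
	       0.1 0.4];
	Hxy = -sum(Pxy(:) .* log2(Pxy(:)));          % joint entropy H(X,Y)
	Hx = -sum(sum(Pxy,2) .* log2(sum(Pxy,2)));   % H(X) from row sums
	Hy = -sum(sum(Pxy,1) .* log2(sum(Pxy,1)));   % H(Y) from column sums
	fprintf('H(X|Y) = %.4f, H(Y|X) = %.4f, H(X,Y) = %.4f\n', ...
	        Hxy - Hy, Hxy - Hx, Hxy)
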
4/12	Finished proving Slepian-Wolf theorem.  Stated and started to prove
	Caratheodory's theorem.

4/14	Used Caratheodory's theorem to prove the Lemma on pp. 10-11 of the
	notes.  This lemma allows the restriction on the cardinality of the
	set U in the theorem on p. 9.  Started NEW SECTION on the DM MAC.
	Assigned HW #9 handout.  DUE WED 4/21.

4/16	Proved weak converse for DM MAC.  Showed R^* is convex.  Showed that
	|V| can be restricted to 3.

4/19	Reviewed DM MAC.  Proved that R^* is closed.  Introduced convex hull
	and R_1(p,q).

4/21	Showed that R^* = co(union R_1(p,q)).  Briefly discussed two DM MAC
	examples.  Started NEW SECTION on Broadcast Channels.  Assigned HW #10
	handout.  DUE WED 4/28.

4/23	Continued discussion of broadcast channels (BCs), degraded BCs.
	Stated theorem for capacity region of the degraded BC.  Started proof
	through p. 7.

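A minimal MATLAB sketch (mine; whether this was one of the two examples
from 4/21 is an assumption) of a standard DM MAC computation, the binary
adder channel Y = X1 + X2 (real addition): with independent uniform binary
inputs, R1 <= I(X1^Y|X2) = 1 bit, R2 <= 1 bit, and the sum rate satisfies
R1 + R2 <= I(X1,X2 ^ Y) = H(Y).

	pY = [0.25 0.5 0.25];          % pmf of Y = X1 + X2 on {0, 1, 2}
	HY = -sum(pY .* log2(pY));     % H(Y) = 1.5 bits; the channel is
	                               % noiseless, so I(X1,X2 ^ Y) = H(Y)
	fprintf('R1 <= 1, R2 <= 1, R1 + R2 <= %.2f bits\n', HY)
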
4/26	Continued proof of forward part for degraded BC; only emphasized main
	points.  Started weak converse.

4/28	Finished proof of weak converse.  Pointed out that R^* is convex and
	that in its definition, we can restrict the cardinality of the set V
	to be <= 3.  Showed that R^* is equal to the convex hull of the union
	of R_1(r,p).  Gave a detailed derivation using the time-sharing
	argument that the capacity region of the broadcast channel is convex.

4/30	Reviewed significance of entropy and channel capacity.  Started
	discussion of stationary sources.  Handed out teaching evaluations.
	Handed out additional problems (HW #11); they will NOT be collected.

5/3	Finished discussion of source coding for discrete, stationary,
	ergodic sources.  Proved the Shannon-McMillan-Breiman Theorem.  Handed
	out review questions for the final exam.

5/5	Worked review problems.

5/7	Friday.  Last day of classes.  Worked review problems.

5/9	Sunday.  FINAL EXAM 2:45-4:45 in 3534 Engr. Hall.  OPEN NOTES.  YOU
	MUST BRING ALL YOUR NOTES - YOU WILL NEED THEM.

Web Page Contact: John (dot) Gubner (at) wisc (dot) edu