ECE 729 Theory of Information Processing and Transmission, Spring 2001


See what we did in Spring 1997.
Syllabus
Homework Solutions.
You can download Shannon's original paper, A Mathematical Theory of Communication,
          which started the whole field of Information Theory.
Books about information theory.
R. M. Gray's book, Entropy and Information Theory, is available free as a PDF file.
References on Source Coding, Rate Distortion, and Multiple Descriptions
Information Theory Resources Available on the web.
Class Schedule for Spring 2001
1/22	Monday.  Went over comm. system diagram, "big picture",
	list of topics we hope to cover.

1/24	Began discussion of variable length codes for data compression.
	Key ideas: unique decodability, expected length of a codeword,
	and high-probability letters should have short codewords.
	Defined code, codebook, extension of a code.
	Started discussion of prefix or instantaneous codes.

1/26	Derived the Kraft Inequality and started proof of McMillan's Theorem.
	Assigned HW #1 DUE WED 1/31.
	Extra copies of HW (on both sides of paper) are outside my office.
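The Kraft Inequality from the 1/26 lecture is easy to check numerically: a D-ary prefix (instantaneous) code with codeword lengths l_1, ..., l_n exists if and only if sum_i D^(-l_i) <= 1. A minimal sketch (function names are mine, not from the notes):

```python
def kraft_sum(lengths, D=2):
    """Return sum_i D**(-l_i) for the given codeword lengths."""
    return sum(D ** (-l) for l in lengths)

def prefix_code_exists(lengths, D=2):
    """Kraft Inequality test: a D-ary prefix code with these lengths
    exists if and only if the Kraft sum is at most 1."""
    return kraft_sum(lengths, D) <= 1.0

# Lengths of the binary prefix code {0, 10, 110, 111}:
print(kraft_sum([1, 2, 3, 3]))        # 1.0 -> a prefix code exists
print(prefix_code_exists([1, 1, 2]))  # False: 1/2 + 1/2 + 1/4 > 1
```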

1/29	Proved McMillan's Theorem and the theorem on the lower bound on
	E[l(C(X))]. Showed that Shannon-Fano codes achieve the lower
	bound + 1.

1/31	Summarized what we've done so far. Introduced the discrete
	memoryless source (DMS) and discussed variable length codes for
	sequences of source letters. For a DMS, shorter expected lengths
	(per source letter) are possible by coding blocks of source
	letters together. Introduced binary Huffman codes. Discussed
	entropy as a measure of uncertainty.
	Assigned HW #2 DUE WED 2/7.

2/2	Worked an example for ternary Huffman codes. Started fixed length
	codes for data compression. Started the proof that H(p) is an
	achievable rate.
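The binary Huffman construction from the 1/31 lecture can be sketched in a few lines: repeatedly merge the two least-likely nodes, and every source letter under a merge gains one bit of codeword length. The sketch below (my own function names, not the course's code) also compares the expected length against the entropy H(p); for a dyadic pmf they coincide exactly:

```python
import heapq
from math import log2

def huffman_lengths(p):
    """Binary Huffman code: codeword length for each probability.
    Heap entries are (probability, tiebreak counter, symbol indices)."""
    heap = [(pi, i, [i]) for i, pi in enumerate(p)]
    heapq.heapify(heap)
    lengths = [0] * len(p)
    counter = len(p)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # two least-likely nodes
        p2, _, s2 = heapq.heappop(heap)
        for i in s1 + s2:                 # each symbol under the merge
            lengths[i] += 1               # gains one bit
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

def entropy(p):
    """Entropy in bits of a pmf given as a list."""
    return -sum(pi * log2(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.125, 0.125]            # a dyadic pmf
L = huffman_lengths(p)
print(sum(pi * li for pi, li in zip(p, L)))  # expected length 1.75
print(entropy(p))                            # H(p) = 1.75 bits
```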
2/5	Finished the proof that for DMS's, the set of achievable rates is
	exactly the interval [H(p), infinity). This included showing that
	the set of achievable rates is closed and convex.

2/7	Review of Gaussian random variables, vectors, and processes.
	Discussed "Sending Bits over the AWGN Channel." Binary channels,
	the Binary Symmetric Channel (BSC).
	No HW assigned.

2/9	Introduction to error correcting codes. Hamming distance, dmin of
	a code, decoding sets and their error-correction and
	error-detection capabilities. Brief description of Reed-Solomon
	codes.
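The 2/9 quantities are easy to compute for small codes: dmin is the smallest pairwise Hamming distance, a code corrects up to floor((dmin - 1)/2) errors and detects up to dmin - 1. A quick sketch (my own helper names):

```python
from itertools import combinations

def hamming(x, y):
    """Hamming distance between two equal-length codewords (strings)."""
    return sum(a != b for a, b in zip(x, y))

def dmin(code):
    """Minimum distance of a block code: least pairwise Hamming distance."""
    return min(hamming(x, y) for x, y in combinations(code, 2))

# Binary repetition code of length 3:
rep3 = ["000", "111"]
d = dmin(rep3)
print(d)             # 3
print((d - 1) // 2)  # corrects up to 1 error
print(d - 1)         # detects up to 2 errors
```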
2/12	Started notes on The Discrete Memoryless Channel (DMC). Focused
	on pages 4-8.

2/14	Proved achievability of rates < I(p x W) for the DMC, covering
	pages 9-14. (Skipped 10.1 and 10.2.)
	Assigned HW #3 DUE WED 2/21.

2/16	Went over page 14.1 and properties of mutual information through
	page 18. Stated the Data Processing Lemma on p. 19.
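The quantity I(p x W) from the 2/14 achievability proof can be evaluated directly from an input pmf p and channel matrix W(y|x): I = sum_{x,y} p(x) W(y|x) log2( W(y|x) / q(y) ), where q is the output pmf. A small sketch, hedged (the indexing convention W[x][y] is my own choice):

```python
from math import log2

def mutual_information(p, W):
    """I(p x W) in bits for a DMC with input pmf p and W[x][y] = W(y|x)."""
    ny = len(W[0])
    q = [sum(p[x] * W[x][y] for x in range(len(p))) for y in range(ny)]
    I = 0.0
    for x in range(len(p)):
        for y in range(ny):
            if p[x] > 0 and W[x][y] > 0:
                I += p[x] * W[x][y] * log2(W[x][y] / q[y])
    return I

# BSC with crossover 0.1 and uniform input: I = 1 - h(0.1) ~ 0.531 bits
eps = 0.1
W = [[1 - eps, eps], [eps, 1 - eps]]
print(mutual_information([0.5, 0.5], W))
```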
2/19	Derived the Data Processing Lemma. Gave an alternative proof of
	Fano's inequality, different from the one in the notes. Proved
	the Weak Converse Theorem, but glossed over some details to be
	discussed next time.

2/21	Showed that the hypotheses of the Data Processing Lemma are
	satisfied by the RVs in the proof of the weak converse. Proved
	Lemma: if P(u|v,w) does not depend on w, then P(u|v,w) = P(u|v).
	Applied it several times. Discussed the fact that a problem is
	not well posed until you can write down the joint pmf of all
	basic RVs.
	Assigned HW #4 DUE WED 2/28.

2/23	Defined lambda achievability, and discussed the strong converse
	as compared with the weak converse (pp. 23.1-23.2). Introduced
	the maximum probability of error criterion (p. 4) and compared it
	to the average probability of error criterion used in the
	original definition of achievable rate on p. 6. See also pages
	35-37, where it is shown that for the DMC, a rate R is achievable
	under the average probability of error criterion if and only if
	it is achievable under the maximum probability of error
	criterion.
2/26	pp. 24-27: Notion of symmetric and weakly symmetric channels and
	how to compute their capacity almost by inspection. Examples
	include the BSC and BEC.

2/28	Outlined the Arimoto-Blahut Algorithm and a couple of lemmas.
	Assigned HW #5: Write a program, e.g., in Matlab, to implement
	the Arimoto-Blahut Algorithm. Your program should print out the
	maximizing pmf vector and the corresponding capacity (in bits per
	channel use). Test your program on the BSC with crossover
	probability epsilon = 1/10 and on the BEC with epsilon = 1/10.
	Use your program to find the capacity of the channel
		W(.|1) = (2/3, 1/3,   0)
		W(.|2) = (1/3, 1/3, 1/3)
		W(.|3) = (  0, 1/3, 2/3).
	DUE WED 3/7.

3/2	Proved convergence of the Arimoto-Blahut Algorithm to the DMC
	capacity.
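For reference, the Arimoto-Blahut iteration assigned in HW #5 can be sketched as follows (a minimal Python sketch under my own naming, not the course's reference solution): starting from the uniform input pmf, compute c(x) = 2^{D(W(.|x) || q)} with q the current output pmf, and stop when the upper bound log2(max_x c(x)) meets the lower bound log2(sum_x p(x)c(x)):

```python
from math import log2

def arimoto_blahut(W, tol=1e-9, max_iter=100000):
    """Capacity (bits/channel use) and maximizing pmf for a DMC with
    channel matrix W[x][y] = W(y|x), via the Arimoto-Blahut iteration."""
    nx, ny = len(W), len(W[0])
    p = [1.0 / nx] * nx                          # start from uniform input
    for _ in range(max_iter):
        q = [sum(p[x] * W[x][y] for x in range(nx)) for y in range(ny)]
        # c[x] = 2^{ D( W(.|x) || q ) }
        c = [2 ** sum(W[x][y] * log2(W[x][y] / q[y])
                      for y in range(ny) if W[x][y] > 0)
             for x in range(nx)]
        s = sum(p[x] * c[x] for x in range(nx))
        if log2(max(c)) - log2(s) < tol:         # bounds have converged
            break
        p = [p[x] * c[x] / s for x in range(nx)]
    return log2(s), p

# BSC, crossover eps = 1/10: capacity = 1 - h(0.1)
C, p = arimoto_blahut([[0.9, 0.1], [0.1, 0.9]])
print(C, p)
# BEC, erasure eps = 1/10: capacity = 1 - eps = 0.9 bits
C, p = arimoto_blahut([[0.9, 0.1, 0.0], [0.0, 0.1, 0.9]])
print(C)
```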
3/5	Started notes on the Gaussian channel, pp. 1-6.

3/7	Continued discussing the Gaussian channel, pp. 7-9. Distributed
	two old exams.

3/9	Finished the proof of the forward theorem for the Gaussian
	channel, pp. 10-13.
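The forward theorem for the discrete-time Gaussian channel gives the familiar closed form C = (1/2) log2(1 + P/N) bits per channel use, with power constraint P and noise variance N. A one-liner for checking exam-style numbers (my own function name):

```python
from math import log2

def awgn_capacity(P, N):
    """Capacity of the discrete-time AWGN channel with power constraint P
    and noise variance N: C = (1/2) log2(1 + P/N) bits per channel use."""
    return 0.5 * log2(1 + P / N)

print(awgn_capacity(1.0, 1.0))   # 0.5 bits at SNR = 1 (0 dB)
print(awgn_capacity(15.0, 1.0))  # 2.0 bits at SNR = 15
```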
3/12	NO CLASS - Spring Break

3/14	NO CLASS - Spring Break

3/16	NO CLASS - Spring Break
3/19	Went over problems from old exams.

3/20	TUESDAY. Exam 1 - night exam in 3534 Engineering Hall from
	7:15-8:30. Closed book, closed notes. Bring a calculator.

3/21	Went over Exam 1. Used the sampling theorem to show that a
	bandlimited function cannot be time limited. Also argued that the
	set of bandlimited functions that are approximately time limited
	to duration T has approximate dimension 2WT.

3/23	Returned Exam 1. Proved the weak converse for the Gaussian
	channel. Extended the results to the ideal bandlimited Gaussian
	channel. Distributed a handout covering material from the lecture
	of 3/21.
	Assigned HW #6 DUE FRI Mar. 30.
3/26	Distributed a handout covering the ideal bandlimited Gaussian
	channel and a handout covering average mutual information for
	mixed random variables. Covered notes on the Lloyd-Max Algorithm.

3/28	Started notes on Rate Distortion Theory, pp. 1-3.2.

3/30	Showed that R~(D) is nonincreasing and convex on [0, infinity).
	Skipped the proof of the lemma on pp. 5-5.1. Proved the converse
	on p. 6. Proved the lemma on p. 7.
	Assigned HW #7 DUE FRI Apr. 6.
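The Lloyd-Max conditions from the 3/26 lecture (each quantizer level is the centroid of its cell; each sample goes to its nearest level) suggest the familiar alternating iteration. A minimal sample-based sketch, assuming a scalar quantizer designed from training data rather than from a density (everything here is my own naming):

```python
def lloyd(samples, levels, iters=100):
    """Alternate nearest-level partition and centroid update (Lloyd
    iteration) to design a scalar quantizer from training samples."""
    levels = list(levels)
    for _ in range(iters):
        cells = [[] for _ in levels]
        for s in samples:                 # nearest-neighbor partition
            j = min(range(len(levels)), key=lambda k: (s - levels[k]) ** 2)
            cells[j].append(s)
        # centroid condition: replace each level by the mean of its cell
        levels = [sum(c) / len(c) if c else levels[j]
                  for j, c in enumerate(cells)]
    return levels

# A symmetric 4-point training set and a 1-bit quantizer:
samples = [-3.0, -1.0, 1.0, 3.0]
print(sorted(lloyd(samples, [-0.5, 0.5])))   # -> [-2.0, 2.0]
```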
4/2	Proved the forward part of the rate-distortion coding theorem,
	pp. 8-12.

4/4	Showed that R(D) = 0 if and only if D >= ~D*, pp. 13-14.
	Evaluated the rate distortion function for a Bernoulli sequence
	with Hamming distortion, pp. 15-17.

4/6	Evaluated the rate distortion function for an i.i.d. source
	uniformly distributed on {0,...,m-1} with the Hamming distortion,
	pp. 19-21. Discussed the Shannon Lower Bound on the rate
	distortion function, pp. 17-18.
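The Bernoulli/Hamming evaluation from 4/4 has the closed form R(D) = h(p) - h(D) for 0 <= D < min(p, 1-p), and R(D) = 0 beyond that, where h is the binary entropy function. A quick numerical check (my own function names):

```python
from math import log2

def h(x):
    """Binary entropy function in bits."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

def R_bernoulli(p, D):
    """Rate distortion function of a Bernoulli(p) source under Hamming
    distortion: h(p) - h(D) below Dmax = min(p, 1-p), zero at or above."""
    Dmax = min(p, 1 - p)
    return 0.0 if D >= Dmax else h(p) - h(D)

print(R_bernoulli(0.5, 0.0))   # 1.0 bit: lossless
print(R_bernoulli(0.5, 0.1))   # ~0.531 bits
print(R_bernoulli(0.5, 0.5))   # 0.0: D has reached Dmax
```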
4/9	Finished the discussion of the Shannon Lower Bound. Skipped the
	Arimoto-Blahut Algorithm for computing the rate distortion
	function. Discussed multiple descriptions.

4/11	Distributed a handout on "Source Coding and Rate Distortion
	Theory for Multiple Correlated Sources." Extra copies outside my
	office. Summarized the systems covered, and went over pp. 1-3,
	giving a new proof of the single-source source coding theorem
	(forward part).
	Assigned HW #8 DUE WED Apr. 18. Extra copies outside my office.

4/13	Proved the forward part of the Slepian-Wolf Theorem, pp. 4-6.
4/16	Proved the converse of the Slepian-Wolf Theorem, pp. 7-8.
	Discussed Source Coding with Side Information, p. 9. Discussed
	time sharing and convexity of the achievable rate region. Briefly
	discussed Rate Distortion with Side Information, p. 10.
	Distributed a handout on Carathéodory's Theorem.

4/18	Handed out solutions to HW #8 and discussed the solution of
	Problem 4. Handed out and went over a proof that the cardinality
	of the set U on p. 9 of the notes can be restricted to the
	cardinality of Y + 2.
	Handed out HW #9 DUE WED Apr. 25.

4/20	Engineering Expo - Classes Canceled.
4/23	Started notes on "Source Coding Revisited -- Discrete Stationary
	Sources." Covered everything except pp. 4-5 and the top of p. 6.

4/25	Reviewed the solution of Problem 3 on HW #8. Finished the notes
	on "Source Coding Revisited." Discussed the solution of Problem 4
	on HW #9.
	Handed out HW #10 DUE WED May 2.

4/27	Discussed the Lempel-Ziv Algorithm, pp. 1-2. Stated and proved
	the Lemma on pp. 4-6, which is a discrete version of a HW
	problem.
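The incremental (LZ78-style) parsing step behind the 4/27 lecture is short enough to sketch: each new phrase is the shortest prefix of the remaining string that has not yet appeared as a phrase. This is only the parsing step, not the full encoder, and the function name is mine:

```python
def lz78_parse(s):
    """Incremental parsing: each phrase is the shortest prefix of the
    remainder not yet in the dictionary (the last phrase may repeat)."""
    phrases, dictionary = [], set()
    i = 0
    while i < len(s):
        j = i + 1
        while j <= len(s) and s[i:j] in dictionary:
            j += 1                      # grow until the prefix is new
        phrase = s[i:j]
        phrases.append(phrase)
        dictionary.add(phrase)
        i = j
    return phrases

print(lz78_parse("ababbabbb"))   # ['a', 'b', 'ab', 'ba', 'bb', 'b']
```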
4/30

5/2

5/4

5/7

5/9