ECE 729 Theory of Information Processing and Transmission, Spring 2006


Last Modified: Tue 15 Oct 2019, 01:45 PM, CDT

Instructor Office Hours: Spring 06: W 11:00 am - noon or by appointment.
See what we did in Spring 2004. Spring 2006 will be ....
Syllabus
Homework Solutions

Books about information theory.
R. M. Gray's book, Entropy and Information Theory, is available free as a PDF file.
References on Source Coding, Rate Distortion, and Multiple Descriptions
Information Theory Resources Available on the web.
Class Schedule for Spring 2006
1/18	Wednesday.  Overview of main topics we'll cover in information theory.
	Started study of variable-length codes.

1/20	Covered pages 1-5 of notes titled "Data Compression (Variable-length
	codes)":  codewords, codes, expected length of codes, good codes,
	one-to-one codes, extension codes, uniquely decodable codes, and
	prefix or instantaneous codes.  Stated the Kraft Inequality.

1/23	Derived the Kraft Inequality.  Stated and proved McMillan's Theorem.
	Defined entropy and the Discrete Memoryless Source (DMS).  Stated and
	proved the log inequality, ln t <= t-1 with equality if and only if
	t=1.  Assigned HW #1 (sol.) & #2 (sol.) DUE WED 2/1.

1/25	Showed that for a uniquely decodable code, E[l(C(X))] >= H(X)/log D.
	Showed that Shannon-Fano codes achieve E[l(C(X))] <= H(X)/log D + 1.

1/27	Demonstrated the Huffman coding procedure.  Discussed entropy as a
	measure of uncertainty.  Started discussion of fixed-length codes.
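The two bounds from 1/25 are easy to check numerically. Below is a short sketch (in Python rather than the course's MATLAB; all function names are my own) that computes binary Shannon-Fano lengths l_i = ceil(-log2 p_i), verifies that they satisfy the Kraft Inequality, and checks H(X) <= E[l(C(X))] <= H(X) + 1:

```python
import math

def entropy_bits(p):
    """Entropy H(X) in bits of a probability vector p."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def shannon_fano_lengths(p):
    """Binary Shannon-Fano codeword lengths l_i = ceil(-log2 p_i)."""
    return [math.ceil(-math.log2(pi)) for pi in p]

def kraft_sum(lengths, D=2):
    """Left-hand side of the Kraft Inequality, sum_i D^(-l_i)."""
    return sum(D ** (-l) for l in lengths)

p = [0.5, 0.25, 0.125, 0.125]        # a dyadic source distribution
lengths = shannon_fano_lengths(p)    # -> [1, 2, 3, 3]
avg_len = sum(pi * li for pi, li in zip(p, lengths))
H = entropy_bits(p)

print("lengths:", lengths)
print("Kraft sum:", kraft_sum(lengths))   # <= 1, so a prefix code exists
print("H(X), E[l]:", H, avg_len)          # H <= E[l] <= H + 1
```

For a dyadic distribution like this one the lengths are exactly -log2 p_i, so E[l] meets the entropy bound with equality.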
1/30	Stated and proved Shannon's Source Coding Theorem for fixed-length
	codes.

2/1	Finished discussion of fixed-length source coding.  Asymptotic
	Equipartition Property (AEP).  Showed that the set of achievable
	rates is closed and convex.  Started discussion of Communication
	System Models, Channel Capacity, and the Discrete Memoryless Channel.
	Assigned Design Project #1 DUE WED 2/8.  When getting your script to
	run, you might try to run it on smaller problems with only 3-bit
	words.  Use p = 0.2 and lambda = 0.05 for the printout you turn in.

2/3	Talked about encoders and the probabilistic channel model; defined
	achievable rate and channel capacity.
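The AEP from 2/1 can be illustrated exactly for a Bernoulli(p) DMS, since sequences with the same number of ones have the same probability. The sketch below (a Python illustration of my own, not part of the posted notes) computes the probability mass and size of the typical set A_eps, and checks the bound |A_eps| <= 2^{n(H+eps)}:

```python
import math

def typical_set_mass_and_size(n, p, eps):
    """For a Bernoulli(p) DMS, return (probability, size) of the set of
    length-n sequences whose sample entropy -(1/n)log2 p(x^n) is within
    eps of H(p).  Sequences are grouped by their number of ones, k."""
    H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    mass, size = 0.0, 0
    for k in range(n + 1):
        sample_H = (-(k / n) * math.log2(p)
                    - ((n - k) / n) * math.log2(1 - p))
        if abs(sample_H - H) <= eps:
            count = math.comb(n, k)
            mass += count * (p ** k) * ((1 - p) ** (n - k))
            size += count
    return mass, size

n, p, eps = 50, 0.2, 0.1
H = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
mass, size = typical_set_mass_and_size(n, p, eps)
print("P(A_eps) =", mass)                      # approaches 1 as n grows
print("log2|A_eps|/n =", math.log2(size) / n)  # at most H + eps
```

Rerunning with larger n shows P(A_eps) climbing toward 1, which is exactly why rates above H suffice for fixed-length coding.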
2/6	Presented the random coding approach to proving the channel coding
	theorem.  Defined notation, including A_n.  Showed P((X,Y) \notin A_n)
	goes to zero.

2/8	Continued proving the channel coding theorem.  Assigned HW #3 (sol.)
	DUE WED 2/15.

2/10	Completed derivation of the forward theorem.  Started discussing
	properties of entropy, conditional entropy, and mutual information.
2/13	Continued deriving properties of entropy and conditional mutual
	information.  Derived the Data Processing Lemma; stated Fano's
	inequality.

2/15	Gave an alternative proof of Fano's inequality.  Proved the weak
	converse for the DMC.  Assigned Design Project #2 DUE WED 2/22.
	Download the two MATLAB files PC.m and W.m.  Run the script PC.m;
	look at the script and its output and understand how it works.  Then
	modify the script PC.m to accommodate n=7 bit words and N=4
	codewords.  You must choose the decoding sets D_i to try to minimize
	the average probability of error.  Turn in a printout of your
	modified program and a copy of its output.

2/17	Covered the strong converse for the DMC.  Discussed maximizing the
	average mutual information.  Introduced weakly symmetric channels.
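A quick numerical companion to the mutual-information material above (a Python sketch of my own, not from the course files): compute I(X;Y) directly from an input distribution and a channel matrix, and confirm that for the binary symmetric channel with uniform input it equals 1 - h2(p), the BSC capacity:

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(px, W):
    """I(X;Y) in bits for input distribution px and channel matrix
    W[x][y] = P(Y=y | X=x)."""
    nx, ny = len(px), len(W[0])
    py = [sum(px[x] * W[x][y] for x in range(nx)) for y in range(ny)]
    I = 0.0
    for x in range(nx):
        for y in range(ny):
            if px[x] > 0 and W[x][y] > 0:
                I += px[x] * W[x][y] * math.log2(W[x][y] / py[y])
    return I

p = 0.1                                  # BSC crossover probability
W = [[1 - p, p], [p, 1 - p]]
I_uniform = mutual_information([0.5, 0.5], W)
print(I_uniform, 1 - h2(p))              # both equal C = 1 - h2(p)
```

Trying skewed input distributions in place of [0.5, 0.5] gives strictly smaller I(X;Y), consistent with the uniform input being capacity-achieving for this symmetric channel.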
2/20	Continued discussion of weakly symmetric channels and examples.
	Briefly discussed the Arimoto-Blahut Algorithm.

2/22	Showed that if a rate is achievable under the average error
	criterion, then it is achievable under the maximal error criterion.
	Assigned HW #4 (sol.) DUE WED 3/1.  In Problem 6, p and q are as in
	Problem 4.

2/24	Started the Gaussian Channel.  Defined the input power constraint,
	achievable rates under an input power constraint, and the capacity
	region and capacity under an input power constraint.  Defined
	differential entropy and average mutual information for continuous
	random variables.
2/27	Derived results about the entropy of Gaussian RVs.  Started the
	forward theorem for channels with a continuous transition density
	w(y|x).  Supplement for Gaussian Channel Notes.

3/1	Continued the proof of the forward theorem for continuous channels.
	Assigned HW #5 (sol.) DUE WED 3/8.

3/3	Proved the weak converse for the Gaussian channel.
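The key fact from 2/27, that a Gaussian N(0, sigma^2) has differential entropy h = (1/2) log2(2*pi*e*sigma^2), can be sanity-checked by Monte Carlo, since h(X) = E[-log2 f(X)]. A small Python sketch of my own (not course code):

```python
import math
import random

def gaussian_diff_entropy(sigma):
    """Differential entropy h = (1/2) log2(2*pi*e*sigma^2) in bits."""
    return 0.5 * math.log2(2 * math.pi * math.e * sigma ** 2)

# Monte Carlo check: h(X) = E[-log2 f(X)] for X ~ N(0, sigma^2)
random.seed(0)
sigma = 2.0
n = 200_000
total = 0.0
for _ in range(n):
    x = random.gauss(0.0, sigma)
    # log2 of the N(0, sigma^2) density at x
    log_f = (-0.5 * math.log2(2 * math.pi * sigma ** 2)
             - (x ** 2) / (2 * sigma ** 2) * math.log2(math.e))
    total += -log_f
estimate = total / n
print(estimate, gaussian_diff_entropy(sigma))   # should agree closely
```

The same estimator applied to any other density with the same variance comes out smaller, reflecting the result that the Gaussian maximizes differential entropy under a variance constraint.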
3/6	Discussed handout, Channel Coding - The Big Picture.  Also discussed
	handout on the ideal bandlimited Gaussian channel.

3/8	Finished the bandlimited Gaussian channel.  Started Lloyd-Max
	Quantizers.  Assigned HW #6 (sol.) DUE WED 3/22.  Old exams are here.

3/10	Lloyd-Max Algorithm:  Vector Quantization, Voronoi Regions, Linear
	Estimation, and the Lloyd-Max Algorithm.
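The Lloyd-Max iteration from 3/8-3/10 alternates two optimality conditions: partition the space into nearest-level (Voronoi) regions, then move each level to the conditional mean of its region. A scalar sketch on empirical data (my own Python illustration, not the course's MATLAB):

```python
import random

def lloyd_max(samples, levels, iters=100):
    """Scalar Lloyd-Max iteration on empirical data: alternate between
    nearest-level partitioning and centroid (conditional-mean) updates.
    Returns (quantizer levels, empirical mean-squared distortion)."""
    s = sorted(samples)
    # initialize levels at evenly spaced sample quantiles
    q = [s[(2 * i + 1) * len(s) // (2 * levels)] for i in range(levels)]
    for _ in range(iters):
        buckets = [[] for _ in range(levels)]
        for x in samples:
            j = min(range(levels), key=lambda i: (x - q[i]) ** 2)
            buckets[j].append(x)          # nearest-level (Voronoi) regions
        q = [sum(b) / len(b) if b else q[i]   # centroid update
             for i, b in enumerate(buckets)]
    mse = sum(min((x - qi) ** 2 for qi in q)
              for x in samples) / len(samples)
    return q, mse

random.seed(1)
data = [random.gauss(0.0, 1.0) for _ in range(20_000)]
q, mse = lloyd_max(data, levels=4)
print(sorted(q), mse)
```

For a standard Gaussian the optimal 4-level quantizer is known to have levels near +/-0.45 and +/-1.51 with distortion about 0.1175, and the empirical result lands close to that.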
3/13	NO CLASS -- SPRING BREAK

3/15	NO CLASS -- SPRING BREAK

3/17	NO CLASS -- SPRING BREAK
3/20	Started section on Rate Distortion Theory.

3/22	Proved the converse part of the rate distortion theorem.

3/24	Exam 1 in class.
3/27	Went over Exam 1.

3/29	Proved the forward part of the rate distortion theorem.  Assigned
	HW #7 (sol.) DUE WED 4/5.

3/31	Proved the Proposition about \tilde D^* on p. 13.  Started the
	Example on pp. 15-17.
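As a concrete instance of the rate distortion theorem (my own Python sketch, not one of the course examples): for a Bernoulli(p) source with Hamming distortion, the rate-distortion function has the closed form R(D) = h2(p) - h2(D) for 0 <= D < min(p, 1-p), and R(D) = 0 beyond that.

```python
import math

def h2(p):
    """Binary entropy function in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def binary_rd(p, D):
    """Rate-distortion function of a Bernoulli(p) source with Hamming
    distortion: R(D) = h2(p) - h2(D) for 0 <= D < min(p, 1-p), else 0."""
    if D >= min(p, 1 - p):
        return 0.0
    return h2(p) - h2(D)

p = 0.5
for D in (0.0, 0.05, 0.1, 0.25):
    print(D, binary_rd(p, D))
# R(0) = h2(p) recovers the lossless source coding limit;
# R(D) hits 0 once D >= min(p, 1-p), where guessing suffices.
```

The two endpoints are the useful sanity checks: at D = 0 the curve meets the entropy (lossless) limit, and at D = min(p, 1-p) no rate at all is needed.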
4/3	Finished the Example on pp. 15-17.  Worked the Example on pp. 19-21.
	Started the Shannon Lower Bound on p. 17.

4/5	Finished the Shannon Lower Bound.  Started Separate Encoding of
	Correlated Sources.