Spring 2007 - Noise in Computation (236601)
Time and place: Tuesday, 10:30-12:30, Taub 8
Noise is an essential part of the world. Computers and communication devices must function properly in its presence and overcome the problems it creates. The first part of this course discusses methods for overcoming noise.
Paradoxically, noise is not only a problem, but often also a solution. The second part of this course will deal with the positive aspects of noise in various applications of computation.
Tentative list of subjects:
Estimation Theory
Noise in Signal Processing
Noise in Communication - Error Correcting Codes
Fault-tolerant computation
Noise in algorithms, cryptography and complexity
The course is theoretical in nature and assumes mathematical maturity.
The final grade is based on class participation (30%), homework (40%) and a final project (30%).
Schedule, References and slides
Lecture 1 - Introduction; Review of probability theory. [slides]
Lecture 2 - Parametric Estimation Theory I: Unbiased Estimators, Fisher Information, Cramer-Rao Inequality. [See also Section 12.11 in "Elements of Information Theory" by Cover & Thomas].
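As a rough numerical companion (my own sketch, not course material): for i.i.d. samples from N(mu, sigma^2) with known sigma, the Fisher information per sample is 1/sigma^2, so the Cramer-Rao bound on the variance of any unbiased estimator of mu is sigma^2/n. The sample mean attains this bound, which a small simulation can confirm. All function names below are my own.

```python
import random
import statistics

def crb_gaussian_mean(sigma, n):
    # Cramer-Rao bound for estimating the mean of N(mu, sigma^2)
    # from n i.i.d. samples with sigma known: sigma^2 / n.
    return sigma ** 2 / n

def sample_mean_variance(mu, sigma, n, trials=2000, seed=0):
    # Empirical variance of the sample-mean estimator over many trials.
    rng = random.Random(seed)
    estimates = [
        statistics.fmean(rng.gauss(mu, sigma) for _ in range(n))
        for _ in range(trials)
    ]
    return statistics.pvariance(estimates)

bound = crb_gaussian_mean(sigma=2.0, n=50)            # 4/50 = 0.08
empirical = sample_mean_variance(mu=1.0, sigma=2.0, n=50)
print(bound, empirical)  # empirical variance should be close to the bound
```

The sample mean is an efficient estimator here, so the empirical variance sits essentially on the bound; for most other estimators it would sit strictly above it.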
Lecture 3 - Parametric Estimation Theory II: The Expectation-Maximization algorithm. [Demo by Abhimanyu Lad, three tutorials by Thomas Minka, by Jeff Bilmes and by Frank Dellaert]
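A minimal hand-rolled sketch of EM (my illustration, simpler than the general algorithm): a 1-D mixture of two Gaussians with known unit variance and equal weights, where only the two means are estimated. The E-step computes posterior responsibilities, the M-step takes responsibility-weighted means.

```python
import math
import random

def em_two_means(data, mu_init=(0.0, 1.0), iters=50):
    # EM for an equal-weight mixture of N(mu0, 1) and N(mu1, 1).
    mu = list(mu_init)
    for _ in range(iters):
        # E-step: posterior responsibility of component 0 for each point
        resp = []
        for x in data:
            w0 = math.exp(-0.5 * (x - mu[0]) ** 2)
            w1 = math.exp(-0.5 * (x - mu[1]) ** 2)
            resp.append(w0 / (w0 + w1))
        # M-step: responsibility-weighted means
        s0 = sum(resp)
        s1 = len(data) - s0
        mu[0] = sum(r * x for r, x in zip(resp, data)) / s0
        mu[1] = sum((1 - r) * x for r, x in zip(resp, data)) / s1
    return mu

rng = random.Random(1)
data = [rng.gauss(-3, 1) for _ in range(300)] + [rng.gauss(3, 1) for _ in range(300)]
print(em_two_means(data, mu_init=(-1.0, 1.0)))  # should approach (-3, 3)
```

Each iteration provably does not decrease the likelihood; with well-separated components as here, the estimates settle near the true means.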
Lecture 4 - Noise in Signal Processing I: Introduction; The Fourier representation of signals; Band filters.
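A toy illustration of band filtering (my own sketch, using a naive O(n^2) DFT rather than an FFT): a slow sinusoid contaminated by a high-frequency term is recovered by zeroing all DFT coefficients above a cutoff and inverting.

```python
import cmath
import math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
            for k in range(n)]

def idft(X):
    n = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * t / n) for k in range(n)) / n
            for t in range(n)]

def lowpass(x, cutoff):
    # Ideal low-pass band filter: keep only frequency indices <= cutoff
    # (counting both positive and negative frequencies).
    X = dft(x)
    n = len(X)
    X = [c if min(k, n - k) <= cutoff else 0 for k, c in enumerate(X)]
    return [v.real for v in idft(X)]

n = 64
clean = [math.sin(2 * math.pi * 3 * t / n) for t in range(n)]   # slow component
noisy = [c + 0.5 * math.sin(2 * math.pi * 25 * t / n)           # + fast "noise"
         for t, c in zip(range(n), clean)]
recovered = lowpass(noisy, cutoff=5)
err = max(abs(r - c) for r, c in zip(recovered, clean))
print(err)  # tiny: the high-frequency term lies entirely above the cutoff
```

Because the interfering sinusoid falls exactly on a DFT bin above the cutoff, the filter removes it completely; real noise spreads across bins and is only attenuated.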
Lecture 5 - Noise in Signal Processing II: Wavelets; Denoising via Shrinkage. [slides]; [Section 10.2 in "A wavelet tour of signal processing" by Stephane Mallat; Ideal Spatial Adaptation via Wavelet Shrinkage by David L. Donoho and Iain M. Johnstone].
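A hedged sketch of denoising via shrinkage (my own simplification): a one-level Haar transform, soft thresholding of the detail coefficients, and inversion. The threshold below is ad hoc, not the universal threshold of Donoho and Johnstone.

```python
import random

def haar_forward(x):
    # One level of the Haar transform: pairwise averages and differences.
    avgs = [(a + b) / 2 for a, b in zip(x[::2], x[1::2])]
    dets = [(a - b) / 2 for a, b in zip(x[::2], x[1::2])]
    return avgs, dets

def haar_inverse(avgs, dets):
    out = []
    for a, d in zip(avgs, dets):
        out.extend((a + d, a - d))
    return out

def soft(v, t):
    # Soft thresholding: shrink v toward zero by t.
    return max(abs(v) - t, 0.0) * (1 if v > 0 else -1)

def denoise(x, threshold):
    avgs, dets = haar_forward(x)
    return haar_inverse(avgs, [soft(d, threshold) for d in dets])

rng = random.Random(0)
clean = [1.0] * 16 + [5.0] * 16                    # piecewise-constant signal
noisy = [c + rng.gauss(0, 0.3) for c in clean]
denoised = denoise(noisy, threshold=0.3)           # ad hoc threshold choice
```

Shrinkage works here because a piecewise-constant signal has almost all its Haar detail coefficients equal to zero, so thresholding kills mostly noise.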
Lecture 6 - Noise in Signal Processing III: Guest lecture by Prof. Miki Elad.
Lecture 7 - Noise in Signal Processing IV: Stable signal recovery from incomplete and inaccurate measurements. [ Paper by E. J. Candès, J. Romberg and T. Tao.].
Lecture 8 - Noise in Communication I: Error correcting codes, Shannon's entropy and the noisy channel coding theorem [Introduction to Coding Theory by Ron Roth; Madhu Sudan's lecture notes on coding theory].
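A quick numeric companion (my own illustration): the binary entropy function H(p) and the capacity 1 - H(p) of the binary symmetric channel with crossover probability p, which the noisy channel coding theorem identifies as the best achievable rate.

```python
import math

def binary_entropy(p):
    # H(p) = -p log2 p - (1-p) log2 (1-p), with H(0) = H(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    # Capacity of the binary symmetric channel with crossover p.
    return 1 - binary_entropy(p)

print(bsc_capacity(0.0))             # 1.0 : noiseless channel
print(bsc_capacity(0.5))             # 0.0 : output independent of input
print(round(bsc_capacity(0.11), 3))  # ~0.5 : half a bit per channel use
```

The endpoints are the intuition check: a noiseless channel carries one bit per use, while a channel that flips each bit with probability 1/2 carries nothing.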
Lecture 9 - Noise in Communication II: Linear codes; Random LDPC codes and the Sipser-Spielman decoding algorithm [Madhu Sudan's lecture notes on coding theory].
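A concrete linear code to keep in mind (my own illustration, much smaller than an LDPC code): the [7,4] Hamming code over GF(2) in systematic form, with syndrome decoding that corrects any single bit flip because the syndrome equals the column of H at the error position.

```python
G = [  # generator matrix rows, systematic form [I | P]
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
H = [  # parity-check matrix rows, [P^T | I]
    [1, 1, 0, 1, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [0, 1, 1, 1, 0, 0, 1],
]

def encode(msg):
    # Codeword = msg * G over GF(2).
    return [sum(m * g for m, g in zip(msg, col)) % 2 for col in zip(*G)]

def decode(word):
    # Syndrome decoding: a nonzero syndrome picks out the flipped position.
    syndrome = tuple(sum(h * w for h, w in zip(row, word)) % 2 for row in H)
    if any(syndrome):
        err = [tuple(col) for col in zip(*H)].index(syndrome)
        word = word[:]
        word[err] ^= 1
    return word[:4]  # systematic: the message is the first 4 bits

msg = [1, 0, 1, 1]
code = encode(msg)
corrupted = code[:]
corrupted[5] ^= 1
print(decode(corrupted))  # recovers [1, 0, 1, 1]
```

Sipser-Spielman decoding for LDPC codes replaces this table lookup with local bit-flipping guided by unsatisfied parity checks, but the linear-algebra setup is the same.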
Lecture 10 - Noise in Communication III: Reed-Solomon codes; The Welch-Berlekamp decoding algorithm [Madhu Sudan's lecture notes on coding theory].
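A toy Reed-Solomon sketch over the prime field GF(13) (my own illustration, and deliberately simpler than Welch-Berlekamp, which also handles errors): the message is the coefficient vector of a polynomial of degree < k, the codeword is its evaluations at n points, and any k intact symbols recover the message by Lagrange interpolation.

```python
P = 13  # field size; arithmetic is mod the prime P

def poly_mul(a, b):
    # Multiply two coefficient lists over GF(P).
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] = (out[i + j] + x * y) % P
    return out

def encode(msg, n):
    # Evaluate the message polynomial at the points 0..n-1.
    return [sum(c * pow(x, i, P) for i, c in enumerate(msg)) % P
            for x in range(n)]

def interpolate(points):
    # Lagrange interpolation over GF(P); returns the coefficient list.
    k = len(points)
    coeffs = [0] * k
    for i, (xi, yi) in enumerate(points):
        basis = [1]   # product over j != i of (x - xj)
        denom = 1     # product over j != i of (xi - xj)
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            basis = poly_mul(basis, [-xj % P, 1])
            denom = denom * (xi - xj) % P
        scale = yi * pow(denom, P - 2, P) % P  # inverse via Fermat's little theorem
        for d, c in enumerate(basis):
            coeffs[d] = (coeffs[d] + scale * c) % P
    return coeffs

msg = [5, 2, 7]                                 # degree-2 polynomial, k = 3
code = encode(msg, n=7)
survivors = [(x, code[x]) for x in (1, 4, 6)]   # any 3 of the 7 symbols suffice
print(interpolate(survivors))  # recovers [5, 2, 7]
```

This shows only erasure recovery; Welch-Berlekamp instead solves a linear system for an error-locator polynomial, so it corrects symbols that are wrong rather than merely missing.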
Lecture 11 - Noise in Communication IV: List decoding [Madhu Sudan's lecture notes on coding theory].
Lectures 12-13 - Benefits of noise: Randomized algorithms, smoothed analysis.
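A small example of noise as a solution (my own choice of illustration): Freivalds' randomized algorithm checks whether A*B = C with O(n^2) work per trial instead of multiplying the matrices, and a wrong C is caught with probability at least 1/2 per trial.

```python
import random

def freivalds(A, B, C, trials=20, seed=0):
    # Monte Carlo check of A*B == C: pick a random 0/1 vector r and
    # compare A*(B*r) with C*r, which needs only matrix-vector products.
    rng = random.Random(seed)
    n = len(A)
    for _ in range(trials):
        r = [rng.randrange(2) for _ in range(n)]
        br = [sum(B[i][j] * r[j] for j in range(n)) for i in range(n)]
        abr = [sum(A[i][j] * br[j] for j in range(n)) for i in range(n)]
        cr = [sum(C[i][j] * r[j] for j in range(n)) for i in range(n)]
        if abr != cr:
            return False   # definitely A*B != C
    return True            # A*B == C except with probability <= 2^-trials

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
good = [[19, 22], [43, 50]]    # the true product A*B
bad = [[19, 22], [43, 51]]     # one wrong entry
print(freivalds(A, B, good), freivalds(A, B, bad))
```

The algorithm never rejects a correct product; randomness only risks (exponentially rarely) accepting a wrong one, a trade that buys a quadratic-time verifier.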
Homework
Final project
Read and summarize a research paper related to the course topics. You may suggest your own paper. Pointers to possible articles