ECE 7251: Signal Detection and Estimation
Georgia Institute of Technology
Spring 2002
Instructor: Prof. Aaron Lanterman
Office: GCATT 334B
Phone: 404-385-2548
Email: lanterma@ece.gatech.edu
Course website: users.ece.gatech.edu/~lanterma/ece7251
When and where: MWF, 12:05-12:55,
212 Engineering Science and Mechanics Building
Syllabus
News
 The first quiz has been postponed to Monday, Feb. 11.
 The second quiz has been postponed to Monday, March 18.
Lectures, Suggested Readings, and Suggested Problems
Lecture 1: Introduction (1/4/02)
(ppt) (pdf)

Lecture 2: Sufficient Statistics and Exponential Families (1/7/02)
(ppt) (pdf)
(Poor, pp. 158-160, 164; Hero, pp. 24-30; some lecture examples taken from pp. 30-33 of Srinath)
Suggested problems: Hero, Sec. 3.5, #5, #6 (parts a and b only)

Lecture 3: Introduction to Bayesian Estimation (1/9/02)
(ppt) (pdf)
(Poor, pp. 142-147; Hero, pp. 32-38)

Lecture 4: Examples of Bayesian Estimation (1/11/02)
(whiteboard)
(Poor, pp. 147-152, pay particular attention to Example IV.B.2; Hero, pp. 38-42)
Suggested problems: Poor, Sec. IV.F: #1, #7

Lecture 5: More Examples and Properties of Bayesian Estimation (1/14/02)
(whiteboard)
(Hero, pp. 42-46)
Suggested problem: Poor, Sec. IV.F: #25 (parts a and b only)
Highly suggested problem: Try to derive the CME (conditional mean), CmE (conditional median), and MAP estimators on pp. 43-44 of Hero (good practice with erf functions; you may need to do integration by parts)

Lecture 6: The Orthogonality Principle in MMSE Estimation (1/16/02)
(ppt) (pdf)
(Poor, pp. 221-229; Hero, pp. 82-96)
Suggested problems:
Three interesting MMSE problems with solutions (PDF)
Scan of an old UIUC exam problem (This one is interesting since it shows how the orthogonality principle is useful for things beyond computing linear MMSE. Here, you compute a quadratic MMSE!)

Lecture 7: Examples of Linear and Nonlinear MMSE Estimation (1/18/02)
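As a quick numerical companion to the orthogonality-principle material (my own sketch, not part of the course handouts; the observation model and all parameter values below are made up for illustration): the linear MMSE estimate Xhat = a*Y + b is pinned down by requiring the error X - Xhat to be orthogonal to Y and to constants, which gives a = Cov(X,Y)/Var(Y) and b = E[X] - a*E[Y].

```python
import numpy as np

# Sketch of linear MMSE via the orthogonality principle.
# Assumed toy model (not from the notes): X ~ N(1, 4), Y = X + W, W ~ N(0, 1).
rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(1.0, 2.0, n)          # the quantity to estimate
y = x + rng.normal(0.0, 1.0, n)      # the observation

# Orthogonality conditions solved in closed form:
a = np.cov(x, y)[0, 1] / np.var(y)   # sample Cov(X,Y) / Var(Y)
b = x.mean() - a * y.mean()
err = x - (a * y + b)

# Check: for this model, theory gives a = 4/5, b = 1/5, and the error
# should be (nearly) uncorrelated with Y.
print(f"a = {a:.3f}, b = {b:.3f}, E[err*Y] = {np.mean(err * y):.4f}")
```

The orthogonality check E[err*Y] ≈ 0 is exactly the condition derived in lecture; any other linear estimate leaves a residual correlated with Y.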
Lecture 8: Nonrandom Parameter Estimation (1/23/02)
(ppt) (pdf)
(Poor, pp. 173-185; the discussion on p. 179 and continuing on to the top of p. 180 is particularly enlightening; Hero, pp. 51-60, pp. 70-76)

Lecture 9: The Cramér-Rao Bound (1/25/02)
(ppt) (pdf)
(Poor, pp. 167-173, pp. 185-186; Hero, pp. 60-70)
Suggested problems: Poor, Sec. IV.F: #15, #25 (now try parts c and d)

Lecture 10: Estimation Under Additive Gaussian Noise (a.k.a. Least Squares Solutions) (1/28/02)
(ppt, revised 2/08) (pdf, revised 2/08)
(Poor, pp. 155-157)

Lecture 11: Examples with Non-Gaussian Data, Part I

Lecture 12: Examples with Non-Gaussian Data, Part II

Lecture 13: “The” Expectation-Maximization Algorithm (Basic Formulation and Simple Example) (2/4/02)
(ppt) (pdf)

Lecture 14: “The” Expectation-Maximization Algorithm (Theory) (2/6/02)
(ppt) (pdf)

Lecture 15: EM Algorithm for Gaussian Mixture
Lecture 16: The Kalman Filter
(ppt) (pdf)

Lecture 17: Variations of the Kalman Filter
(ppt) (pdf)

Lecture 18: Wiener Filtering
(ppt) (pdf)

Lecture 19: Introduction to Detection Theory (including Bayesian and Minimax Tests)
(ppt) (pdf)
(read Poor, pp. 5-18, but try not to get too bogged down in the proofs, and skim the parts on randomization; Hero, pp. 140-160)
Suggested problems: Poor, Sec. II.F, #2 (parts a and b), #4 (parts a and b), #6 (parts a and b)

Lecture 20: Gaussian Detection Example (Equal Variances)
(Poor, Example II.C.1)

Lecture 21: Gaussian Detection Example (Equal Means)
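A quick numerical companion to the equal-variance Gaussian detection example (my own sketch; the values of mu and the threshold tau are chosen arbitrarily for illustration). For H0: Y ~ N(0,1) versus H1: Y ~ N(mu,1), the likelihood ratio test reduces to "decide H1 when y > tau", with false-alarm probability P_F = Q(tau) and detection probability P_D = Q(tau - mu), where Q is the standard Gaussian tail:

```python
import numpy as np
from math import erf, sqrt

def Q(t):
    """Standard Gaussian tail probability P(Z > t), Z ~ N(0, 1)."""
    return 0.5 * (1.0 - erf(t / sqrt(2.0)))

mu, tau = 1.5, 1.0                    # assumed signal level and threshold
rng = np.random.default_rng(2)
n = 500_000

# Monte Carlo: fraction of threshold crossings under each hypothesis.
pf_mc = np.mean(rng.normal(0.0, 1.0, n) > tau)   # false alarms under H0
pd_mc = np.mean(rng.normal(mu, 1.0, n) > tau)    # detections under H1

print(f"P_F: MC {pf_mc:.4f} vs Q(tau) {Q(tau):.4f}")
print(f"P_D: MC {pd_mc:.4f} vs Q(tau - mu) {Q(tau - mu):.4f}")
```

Sweeping tau and plotting (P_F, P_D) pairs traces out the ROC curve discussed in the Neyman-Pearson lecture.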
Lecture 22: Neyman-Pearson Tests and ROC Curves
(ppt) (pdf)
(Poor, pp. 22-29; Hero, pp. 160-174)
Suggested problems: Poor, Sec. II.F, #2 (part c), #4 (part c)

Lecture 23: Chernoff Bounds (Theory)
(ppt, revised 3/18) (pdf, revised 3/18)
Suggested problems: See exercises on lecture slides

Lecture 24: Chernoff Bounds (Gaussian Examples)
(ppt, revised 3/18) (pdf, revised 3/18)
Suggested problems: See exercises on lecture slides

Lecture 25: The Role of Information Theory in Detection Theory
(ppt) (pdf)

Lecture 26: Uniformly Most Powerful Tests
(ppt) (pdf)

Lecture 28: Locally Most Powerful Tests
(ppt) (pdf)

Lecture 29: Generalized Likelihood Ratio Tests and Model Order Selection Criteria
(ppt) (pdf)

Lecture 30: Karhunen-Loève Expansions, Part I

Lecture 31: Karhunen-Loève Expansions, Part II

Lecture 32: Karhunen-Loève Expansions, Part III
Suggested problems: A Hard KL Expansion Problem

Lecture 33: Detecting Deterministic Signals in Noise
(ppt)

Lecture 34: General Multivariate Gaussian Detection Problems
(ppt)

Lecture 35: Continuous-Time Detection of Deterministic Signals in White Gaussian Noise
(ppt)

Lecture 36: A Peek at Channel Capacity
(ppt)

Lecture 37: Continuous-Time Detection of Deterministic Signals in Colored Gaussian Noise
(ppt)

Lecture 38: Incoherent Detection
(ppt)
Data Files
Exam Solutions

In-class Quiz I Solutions (pdf)
Other Goodies
Farshid Delgosha’s solutions to problems 2-5 starting on p. 29 of Hero’s book (MS Word doc) (pdf), and to problems 1-7 (MS Word doc) (pdf) and 25(a-b) (MS Word doc) (pdf) in Chapter 4 of Poor’s book

Brief Comments on the EM Algorithm by Prof. Donald L. Snyder. This contains a very nice description of the “classic” EM algorithm and a review of research on the EM algorithm at Washington University circa 1986.

Volkan gave a presentation on estimation theory in Jim McClellan’s research group. He let me post his slides and a related short paper with the proofs. Volkan did a great job; it’s a nice distillation of some of the material in Ch. 4 of Poor. Enjoy!