ECE 7251: Signal Detection and Estimation
Georgia Institute of Technology
Spring 2002

Instructor: Prof. Aaron Lanterman
Office: GCATT 334B
Phone: 404-385-2548
Course website:

When and where: MWF, 12:05-12:55,
212 Engineering Science and Mechanics Building



  • The first quiz has been postponed to Monday, Feb. 11.
  • The second quiz has been postponed to Monday, March 18.

Lectures, Suggested Readings, and Suggested Problems

  • Lecture 1: Introduction (1/4/02)

  • Lecture 2: Sufficient Statistics and Exponential Families (1/7/02)
    (pdf) (Poor, pp. 158-160, 164; Hero, pp. 24-30; some lecture
    examples taken from pp. 30-33 of Srinath)
    Suggested problems: Hero, Sec. 3.5, #5, #6 (parts a and b only)

  • Lecture 3: Introduction to Bayesian Estimation (1/9/02)
    (pdf) (Poor, pp. 142-147)

  • Lecture 4: Examples of Bayesian Estimation (1/11/02)
    (Poor, pp. 147-152, pay particular attention to Example IV.B.2;
    Hero, pp. 38-42)
    Suggested problems: Poor, Sec. IV.F: #1, #7

  • Lecture 5: More Examples and Properties
    of Bayesian Estimation
    (Hero, pp. 42-46)
    Suggested problem: Poor, Sec. IV.F: #25 (parts a and b only)
    Highly suggested problem: Try to derive the CME (conditional mean),
    CmE (conditional median), and MAP estimators
    on pp. 43-44 of Hero (good practice with erf functions; you may need to do
    integration by parts)

  • Lecture 6: The Orthogonality Principle in MMSE Estimation (1/16/02)
    (Poor, pp. 221-229; Hero, pp. 82-96)
    Suggested problems:
    Three interesting MMSE problems with solutions;
    a scan of an old UIUC exam problem
    (This one is interesting since it shows how the orthogonality principle
    is useful for things beyond computing linear MMSE. Here, you compute a
    quadratic MMSE!)
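
    The orthogonality principle can be seen numerically with a small
    sketch (my own illustration, not from the texts; the model Y = 2X + 1 +
    noise and all variable names are made up for the example):

```python
import numpy as np

# Linear MMSE estimation of Y from X via the orthogonality principle:
# choose a, b so that the error Y - (a*X + b) is orthogonal to
# (uncorrelated with) X and has zero mean.  The normal equations give
# a = Cov(X, Y) / Var(X) and b = E[Y] - a*E[X].
# (Illustrative sketch, not taken from the course notes.)

def linear_mmse(x, y):
    """Return (a, b) for the linear MMSE estimator a*x + b (sample version)."""
    a = np.cov(x, y, bias=True)[0, 1] / np.var(x)
    b = y.mean() - a * x.mean()
    return a, b

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = 2.0 * x + 1.0 + rng.normal(size=10_000)   # assumed model: Y = 2X + 1 + noise

a, b = linear_mmse(x, y)
residual = y - (a * x + b)
# Orthogonality: the residual has (sample) zero mean and zero correlation with X.
print(a, b, residual.mean(), (residual * x).mean())
```

    The two near-zero quantities at the end are exactly the orthogonality
    conditions: the error is uncorrelated with the data and has zero mean.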

  • Lecture 7: Examples of Linear and Nonlinear MMSE Estimation (1/18/02)
  • Lecture 8: Nonrandom Parameter Estimation (1/23/02)
    (Poor, pp. 173-185; the discussion on p. 179 and continuing on to the
    top of p. 180 is particularly enlightening; Hero, pp. 51-60, pp. 70-76)

  • Lecture 9: The Cramer-Rao Bound (1/25/02)
    (Poor, pp. 167-173, pp. 185-186; Hero, pp. 60-70)
    Suggested problems: Poor, Sec. IV.F: #15, #25 (now try parts c and d)
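
    As a quick sanity check on the Cramer-Rao bound, here is a Monte
    Carlo sketch of my own (not from the texts): for n i.i.d. N(theta,
    sigma^2) samples the Fisher information is n/sigma^2, and the sample
    mean attains the bound sigma^2/n (it is efficient):

```python
import numpy as np

# Cramer-Rao bound sanity check for estimating the mean of N(theta, sigma^2)
# from n i.i.d. samples: the CRB is sigma^2 / n, and the sample mean
# achieves it.  (Illustrative Monte Carlo sketch, not from the texts.)

rng = np.random.default_rng(3)
n, sigma, theta, trials = 25, 2.0, 1.0, 20_000

crb = sigma**2 / n                                   # = 0.16
samples = rng.normal(theta, sigma, size=(trials, n))
est_var = samples.mean(axis=1).var()                 # empirical estimator variance
print(crb, est_var)   # the empirical variance should be near the CRB
```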

  • Lecture 10: Estimation Under Additive Gaussian Noise (a.k.a. Least
    Squares Solutions) (1/28/02)
    (ppt, revised 2/08)
    (pdf, revised 2/08)
    (Poor, pp. 155-157)

  • Lecture 11: Examples with Non-Gaussian Data, Part I
  • Lecture 12: Examples with Non-Gaussian Data, Part II
  • Lecture 13: “The” Expectation-Maximization Algorithm
    (Basic Formulation and Simple Example) (2/4/02)

  • Lecture 14: “The” Expectation-Maximization Algorithm
    (Theory) (2/6/02)

  • Lecture 15: EM Algorithm for Gaussian Mixture
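
    A minimal one-dimensional EM sketch for a two-component Gaussian
    mixture (my own illustration, assuming known, equal variances; the
    notation does not follow Hero's notes):

```python
import numpy as np

# One EM pass for a two-component 1-D Gaussian mixture with known, equal
# variances: the E-step computes responsibilities, the M-step re-estimates
# the means and mixture weights.  (Illustrative sketch only.)

def em_step(x, mu, w, sigma=1.0):
    """One EM iteration. mu: (2,) means, w: (2,) mixture weights."""
    # E-step: posterior probability that each sample came from each component.
    lik = np.exp(-0.5 * ((x[:, None] - mu[None, :]) / sigma) ** 2) * w[None, :]
    resp = lik / lik.sum(axis=1, keepdims=True)
    # M-step: responsibility-weighted updates.
    w_new = resp.mean(axis=0)
    mu_new = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
    return mu_new, w_new

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 1, 500)])

mu, w = np.array([-1.0, 1.0]), np.array([0.5, 0.5])
for _ in range(50):
    mu, w = em_step(x, mu, w)
print(mu, w)   # means should approach roughly (-2, 3), weights roughly (0.5, 0.5)
```
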
  • Lecture 16: The Kalman Filter

  • Lecture 17: Variations of the Kalman Filter
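
    A minimal scalar Kalman filter sketch (my own illustration of the
    standard predict/update recursion, not taken from the lecture slides;
    the symbols a, q, r are generic):

```python
import numpy as np

# Scalar Kalman filter for the model
#   x[k] = a * x[k-1] + process noise (variance q)
#   y[k] = x[k] + measurement noise (variance r)
# Each step runs the usual predict/update recursion.
# (Illustrative sketch; not Hero's or Poor's notation.)

def kalman_filter(ys, a=1.0, q=0.01, r=1.0, x0=0.0, p0=1.0):
    x, p = x0, p0
    estimates = []
    for y in ys:
        # Predict.
        x_pred = a * x
        p_pred = a * a * p + q
        # Update with measurement y.
        k = p_pred / (p_pred + r)        # Kalman gain
        x = x_pred + k * (y - x_pred)
        p = (1.0 - k) * p_pred
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(2)
truth = 5.0                               # constant state (a=1, q small)
ys = truth + rng.normal(0, 1, 200)        # noisy measurements
xs = kalman_filter(ys)
print(xs[-1])   # the final estimate should settle near the true value, 5
```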

  • Lecture 18: Wiener Filtering

  • Lecture 19: Introduction to Detection Theory
    (including Bayesian and Minimax Tests)
    (read Poor, pp. 5-18, but try not to get too bogged down in the proofs, and
    skim the parts on randomization; Hero, pp. 140-160)

    Suggested problems: Poor, Sec. II.F, #2 (parts a and b), #4 (parts
    a and b), #6 (parts a and b)

  • Lecture 20: Gaussian Detection Example (Equal Variances)
    (Poor, Example II.C.1)

  • Lecture 21: Gaussian Detection Example (Equal Means)
  • Lecture 22: Neyman-Pearson Tests and ROC Curves
    (Poor, pp. 22-29; Hero, pp. 160-174)
    Suggested problems: Poor, Sec. II.F, #2 (part c), #4 (part c),
    Hero #7.3, #7.4
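
    The ROC for the classic Gaussian shift-in-mean problem can be traced
    numerically; a small sketch of the standard textbook example, written
    from memory (the value mu = 2 is just an assumed example):

```python
import math

# ROC curve for the Gaussian shift-in-mean problem:
#   H0: Y ~ N(0, 1)   vs.   H1: Y ~ N(mu, 1),
# where the Neyman-Pearson test thresholds Y itself.  Then
#   P_FA = Q(tau) and P_D = Q(tau - mu), with Q the Gaussian tail function.
# (Standard textbook example, sketched here from memory.)

def Q(x):
    """Gaussian right-tail probability."""
    return 0.5 * math.erfc(x / math.sqrt(2.0))

def roc_point(tau, mu):
    return Q(tau), Q(tau - mu)          # (P_FA, P_D)

mu = 2.0
for tau in [-1.0, 0.0, 1.0, 2.0, 3.0]:
    pfa, pd = roc_point(tau, mu)
    print(f"tau={tau:+.1f}  P_FA={pfa:.3f}  P_D={pd:.3f}")
```

    Sweeping the threshold tau traces the whole ROC; for mu > 0 the curve
    lies above the chance line P_D = P_FA, as it must.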

  • Lecture 23: Chernoff Bounds (Theory)
    (ppt, revised 3/18)
    (pdf, revised 3/18)

    Suggested problems: See exercises on lecture slides

  • Lecture 24: Chernoff Bounds (Gaussian Examples)
    (ppt, revised 3/18)
    (pdf, revised 3/18)

    Suggested problems: See exercises on lecture slides
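
    To get a feel for how tight the Chernoff bound is in the Gaussian
    case, here is a small sketch (the classic single-Gaussian tail bound,
    my own illustration, not from the slides):

```python
import math

# Chernoff bound on the standard Gaussian tail: for X ~ N(0,1) and a > 0,
#   P(X > a) <= min_s exp(-s*a) * E[exp(s*X)] = exp(-a^2 / 2),
# since the MGF is E[exp(s*X)] = exp(s^2 / 2) and the minimizing s is s = a.
# (Classic example; compared here against the exact tail Q(a).)

def Q(a):
    """Exact Gaussian right-tail probability."""
    return 0.5 * math.erfc(a / math.sqrt(2.0))

def chernoff_bound(a):
    return math.exp(-a * a / 2.0)

for a in [1.0, 2.0, 3.0, 4.0]:
    print(f"a={a:.0f}  Q(a)={Q(a):.2e}  bound={chernoff_bound(a):.2e}")
```

    The bound captures the exponential decay rate, though it is off by a
    polynomial factor, which is the usual story with Chernoff bounds.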

  • Lecture 25: The Role of Information Theory in Detection Theory

  • Lecture 26: Uniformly Most Powerful Tests
    (Hero, pp. 177-192)

    Suggested problems: Hero, #7.2

  • Lecture 28: Locally Most Powerful Tests
    (pdf) (look over solution to Poor,
    Sec. II.F, #15; Hero, pp. 196-208)

    Suggested problems: Hero, #8.1

  • Lecture 29: Generalized Likelihood Ratio Tests and
    Model Order Selection Criteria
    (Hero, pp. 209-211; skim Hero, Chapter 9)

    Suggested problems: Hero, #8.2, #8.4, #8.5

  • Lecture 30: Karhunen-Loeve Expansions, Part I
  • Lecture 31: Karhunen-Loeve Expansions, Part II
  • Lecture 32: Karhunen-Loeve Expansions, Part III (Hero, pp. 325-327)

    Suggested problems:

    A Hard K-L Expansion Problem

  • Lecture 33: Detecting Deterministic Signals in Noise
    (Poor, pp. 45-63;
    read the solution to Poor, Sec. III.F, #1)

    Suggested problems: Poor, Sec. III.F, #6, #13
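
    For a known signal in white Gaussian noise, the optimal test reduces
    to a correlator (matched filter); a minimal sketch of my own, with an
    assumed sinusoidal signal (not taken from the course notes):

```python
import numpy as np

# Detecting a known signal s in white Gaussian noise: the optimal test
# correlates the data with s (a matched filter) and compares the result
# to a threshold.  (Minimal sketch of the standard result.)

rng = np.random.default_rng(4)
s = 2.0 * np.sin(2 * np.pi * np.arange(64) / 8)   # assumed known signal

def matched_filter_stat(y, s):
    """Correlator statistic  T(y) = <y, s>."""
    return float(y @ s)

noise_only = rng.normal(0, 1, 64)                 # H0 data
signal_plus_noise = s + rng.normal(0, 1, 64)      # H1 data
print(matched_filter_stat(noise_only, s),
      matched_filter_stat(signal_plus_noise, s))
```

    Under H1 the statistic has mean ||s||^2, under H0 mean zero, with the
    same variance ||s||^2 in both cases, so detection performance depends
    only on the signal energy, not on the signal shape.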

  • Lecture 34: General Multivariate Gaussian Detection Problems
    (Hero, pp. 249-261, pp. 269-279)

  • Lecture 35: Continuous-Time Detection of Deterministic Signals
    in White Gaussian Noise

  • Lecture 36: A Peek at Channel Capacity
    (ppt) (Hero, pp. 339-343)

    Suggested problems: Poor, Sec. III.F, #3

  • Lecture 37: Continuous-Time Detection of Deterministic Signals
    in Colored Gaussian Noise
    (Hero, pp. 327-339; pay particular attention to pp. 327-330; notice
    that what Van Trees calls Q, Hero calls r^{-1})

  • Lecture 38: Incoherent Detection
    (Poor, pp. 65-72; skim Hero, pp. 356-362)

    Suggested problem: Poor, Sec III.F, #15

  • Lecture 39: Parameter Estimation with Continuous-Time Data, Part I
    (Poor, pp. 327-331)

  • Lecture 40: Parameter Estimation with Continuous-Time Data, Part II
    (Poor, pp. 331-333)

Data Files

Exam Solutions

  • In-class Quiz I Solutions

Other Goodies

  • Kalman’s
    original paper on Kalman filtering,
    recently re-typeset

  • Farshid Delgosha’s solutions to problems 2-5 starting on p. 29 of
    Hero’s book (MS Word doc)
    (pdf) and problems 1-7
    (MS Word doc) (pdf)
    and 25(ab)
    (MS Word doc)
    in Chapter 4 of Poor’s book

  • Brief
    Comments on the EM Algorithm
    by Prof.
    L. Snyder. This contains a
    very nice description of the “classic” EM algorithm, and a review of
    research on the EM algorithm at Washington University circa 1986.

  • Volkan gave a presentation on estimation theory in Jim McClellan’s
    research group. He let me post his
    slides and a related document
    with proofs.
    Volkan did a great job; it’s
    a nice distillation of some of the material in Ch. 4 of Poor. Enjoy!