
ECE 7251: Signal Detection and Estimation
Georgia Institute of Technology
Spring 2002

Instructor: Prof. Aaron Lanterman
Office: GCATT 334B
Phone: 404-385-2548
E-mail: lanterma@ece.gatech.edu
Course website: users.ece.gatech.edu/~lanterma/ece7251


When and where: MWF, 12:05-12:55,
212 Engineering Science and Mechanics Building


Prerequisites: Knowledge of probability and random processes
at the level of the texts by Papoulis, Stark and Woods,
or Leon-Garcia, i.e., ECE 6601 or equivalent.
Some questions on the take-home
portion of the exams will require some basic programming skills;
although you are welcome to use whatever programming language you like,
competency in MATLAB will be extremely helpful.
Note that ECE 6250 is NOT a prerequisite for this offering of ECE 7251!


Textbooks

Required texts:

  • A.O. Hero, “Statistical Methods for Signal Processing,” 1998-2002.
    This is a set of course notes written by Prof. Al Hero (ECE, Univ. of
    Michigan-Ann Arbor), which he plans to eventually turn into a textbook.
    A link to the
    PDF file will be provided, and copies will be made available for purchase
    in the campus bookstore.

  • H.V. Poor, “An Introduction to Signal Detection and Estimation,”
    2nd Edition, Springer, 1994.


Prof. Hero’s notes will provide the main structure
of the course and the inspiration for my lectures. However, the notes are
a bit lacking in textual descriptions at the moment, so I thought it best
that everyone have a copy of Vince Poor’s book too, in order to have a
solid reference readily available. Prof. Hero welcomes comments on his notes;
send your comments to me, and I’ll compile them into a coherent
report.


Texts on reserve in the library:

  • H.L. Van Trees, “Detection, Estimation, and Modulation Theory,”
    Vol. I, 1968, recently reprinted in paperback (Sept. 2001)
    by John Wiley & Sons. (Amazon.com sells this for $66.)

  • L.L. Scharf, “Statistical Signal Processing: Detection, Estimation,
    and Time Series Analysis,” Addison-Wesley, 1991.

  • S.M. Kay, “Fundamentals of Statistical Signal Processing, Volume I:
    Estimation Theory,” Prentice-Hall, 1998.

  • S.M. Kay, “Fundamentals of Statistical Signal Processing, Volume II:
    Detection Theory,” Prentice-Hall, 1998.

  • M.D. Srinath, P.K. Rajasekaran, and R. Viswanathan, “Introduction to
    Statistical Signal Processing with Applications,” Prentice-Hall, 1996.

Grades, Exams, and Other Necessities of Life


Grade breakdown:
Exam 1 (Feb. 6): 20%
Exam 2 (March 13): 20%
Exam 3 (April 3): 20%
Final Exam: 40%
(If you have a conflict with any of these times, please let me know
ASAP.)


Each exam will have an in-class part and a take-home part.
The in-class part will be open book and open notes, although this is
primarily so you won’t panic; the in-class portion will emphasize your “intuitive”
understanding of the material. If you find yourself spending
most of your time on the in-class portion frantically flipping through your
notes trying to find the answer, you will run out of time.


The take-home portion will consist of more
in-depth problems that couldn’t possibly be completed in an in-class
exam, although they are not intended to be exceptionally time-consuming.
You will be given a generous amount of time to complete it.
The take-home portion may involve some highly instructive computational
experiments. You will be allowed to consult with any resources in the
library, posted on web sites, etc., as long as these are properly
cited in your solution. You will not, however, be
allowed to discuss the take-home
portion in any way with anyone besides myself.


I’m giving three exams (which is more than usual for a graduate-level
class) since doing so
helps reduce the bad day effect. The bad day effect hits
you when you happen to be having a bad day (you know, one of those days
where nothing is going right and you’ve run out of coffee
and you can’t seem to get your brain working)
on the one day you’re taking
the single midterm exam. When there are several midterm exams, you’re not
likely to have a bad day on all the exam days, so if you do have a
bad day, it can get averaged out.


A note on homework, or the lack thereof:
Notice that no homework will be collected and graded. In lecture, various
problems will be suggested for you to try at your leisure; working through
as many problems as you can will be the best way to prepare for the exams,
particularly the in-class portion.
You are strongly encouraged to work together in groups, preferably with
a lot of coffee. It’s also best
to tackle a problem or two per day, perhaps soon after the lecture while
the material is still fresh in your head,
rather than sit down and try a whole
bunch at once in a marathon session right before the exam.


Office hours:
If you have an office in GCATT, just go ahead and drop by; I will be there
most afternoons and evenings, although never in the morning. If you are
stationed outside of GCATT (or are otherwise having trouble getting hold
of me in person),
send me an e-mail, and I’ll set up a time to meet with you in Van Leer
(or wherever is most convenient for you).
I will
generally go to lunch after class; people are welcome to join me for lunch
if they have questions or generally want to chat. I will have more formal
office hours before the exams.

A note on e-mail:
As anyone who’s taken a course from me before will attest, I tend to send
out a lot of course-related e-mail. Make sure your e-mail account is working
(and not over quota or something like that) so you don’t miss anything good.

Tentative schedule

  • General Structure
    • Sufficient Statistics; Exponential Families
  • Parameter Estimation
    • Bayesian Estimation (MAP and MMSE)
    • Orthogonality Principle of MMSE
    • Linear Minimum Mean Squared Error Estimation
    • Maximum-Likelihood
    • Method of Moments
    • Cramér-Rao Bounds (for both random and nonrandom parameters)
  • Computational Techniques
    • EM Algorithm: Theory
    • EM Algorithm: Examples
    • Markov Chain Monte Carlo algorithms
  • Filtering for Discrete-Time Processes
    • Kalman Filter
    • Wiener Filter
  • Simple Hypothesis Testing
    • Bayesian Detection
    • Minimax Detection
    • Neyman-Pearson Lemma; ROC Curves
    • Chernoff Bounds
  • Composite Hypothesis Testing
    • Uniformly Most Powerful (UMP) Tests
    • Locally Most Powerful (LMP) Tests
    • Generalized Likelihood Ratio Tests (GLRT)
    • Detector Structures for Discrete-Time Data with Gaussian, Laplacian,
      and Cauchy Noise
  • Model Order Estimation
    • Schwarz’s Bayesian Information Criterion
    • Minimum Description Length Criterion
    • Stochastic Complexity
  • Continuous-Time Extensions
    • Karhunen-Loève Expansions
    • Grenander’s Theorem
    • Detection with Continuous Data
    • Parameter Estimation with Continuous Data