3 credits
Fall 2025 Lecture Upper Division
Probability measures, conditional probability, and Bayes' theorem. The gambler's ruin problem. Google's PageRank search algorithm. The classical distributions, including the uniform, exponential, Poisson, binomial, Gaussian, Wigner, and Rayleigh distributions. Functions of a random variable. Joint distributions. Expectation, variance, and median. An introduction to Hilbert space and the projection theorem. Applications to interpolation problems and observability in control theory. The projection theorem and estimation problems. The projection theorem and conditional expectation. Applying the projection theorem to approximate the conditional expectation, that is, nonlinear estimation problems. Conditional expectation and the ADAM and EVE laws (total expectation and total variance). Gaussian random vectors. Computing the conditional expectation for Gaussian random vectors by using the orthogonal projection. The Moore-Penrose inverse. The central limit theorem. Maximum likelihood estimation.
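For reference, the link between the projection theorem, conditional expectation, and the ADAM and EVE laws mentioned above can be summarized as follows; this is a standard statement for random variables X and Y with finite second moments, not an excerpt from the course materials:

```latex
% Conditional expectation as the orthogonal projection of Y onto L^2(\sigma(X))
% (projection theorem), followed by the ADAM and EVE identities
% (laws of total expectation and total variance); assumes E[Y^2] < \infty.
\mathbb{E}[Y \mid X] \;=\; \operatorname*{argmin}_{Z \in L^{2}(\sigma(X))} \mathbb{E}\big[(Y - Z)^{2}\big],
\qquad
\mathbb{E}[Y] \;=\; \mathbb{E}\big[\mathbb{E}[Y \mid X]\big],
\qquad
\operatorname{Var}(Y) \;=\; \mathbb{E}\big[\operatorname{Var}(Y \mid X)\big] + \operatorname{Var}\big(\mathbb{E}[Y \mid X]\big).
```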
Learning Outcomes
1. Demonstrate a knowledge of probability measures and distributions.
2. Know how to apply the classical distribution functions.
3. Know how to apply the Google PageRank algorithm (see the sketch following this list).
4. Have a good understanding of mean, variance, and median in applications.
5. Have a deep understanding of conditional expectation.
6. Use the projection theorem to solve both linear and "nonlinear" estimation problems.
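As a concrete illustration of outcome 3, here is a minimal power-iteration sketch of PageRank. The link matrix, the damping factor of 0.85, and the convergence tolerance are illustrative assumptions, not course materials.

```python
import numpy as np

def pagerank(adj, damping=0.85, tol=1e-10, max_iter=1000):
    """Power iteration for PageRank on a small link graph.

    adj[i, j] = 1 if page i links to page j (illustrative convention).
    """
    n = adj.shape[0]
    out_degree = adj.sum(axis=1)
    # Row-normalize to get link-following probabilities; pages with no
    # outgoing links (dangling pages) jump uniformly to any page.
    # Transposing gives a column-stochastic transition matrix P.
    P = np.where(out_degree[:, None] > 0,
                 adj / np.maximum(out_degree[:, None], 1),
                 1.0 / n).T
    r = np.full(n, 1.0 / n)  # start from the uniform distribution
    for _ in range(max_iter):
        r_next = damping * P @ r + (1 - damping) / n
        if np.abs(r_next - r).sum() < tol:
            break
        r = r_next
    return r_next

# Example: a hypothetical 4-page web graph.
links = np.array([[0, 1, 1, 0],
                  [0, 0, 1, 0],
                  [1, 0, 0, 1],
                  [0, 0, 1, 0]])
print(pagerank(links))  # stationary rank vector, entries sum to 1
```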