1 - 2 of 2 results for: EE378A

EE 378A: Statistical Signal Processing

Basic concepts of statistical decision theory; Bayes decision theory; HMMs and their state estimation (forward-backward algorithm), the Kalman filter as a special case, approximate state estimation (particle filtering, extended Kalman filter), unknown parameters; inference under logarithmic loss, mutual information as a fundamental measure of statistical relevance, properties of mutual information: data processing, chain rules. Directed information. Prediction under logarithmic loss; Context Tree Weighting algorithm; sequential decision making in general: prediction under general loss functions, causal estimation, estimation of directed information. Non-sequential inference via sequential probability assignments. Universal denoising; denoising from a decision-theoretic perspective: nonparametric function estimation, wavelet shrinkage, density estimation; estimation of mutual information on large alphabets, with applications such as boosting the Chow-Liu algorithm. Estimation of the total variation distance; estimating the fundamental limit is easier than achieving it. Peetre's K-functional and bias analysis: bias correction using the jackknife, the bootstrap, and Taylor series; nonparametric functional estimation. Prerequisites: Familiarity with probability theory and linear algebra at the undergraduate level.
Last offered: Spring 2017
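
A minimal sketch of one topic from this syllabus: HMM state estimation with the forward-backward recursions, of which the Kalman filter is the linear-Gaussian special case. The function name, array shapes, and example numbers below are illustrative assumptions, not part of any course material.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Scaled forward-backward smoothing for a discrete HMM.

    pi  : (K,)   initial state distribution
    A   : (K, K) transition matrix, A[i, j] = P(z_{t+1}=j | z_t=i)
    B   : (K, M) emission matrix,   B[i, m] = P(x_t=m | z_t=i)
    obs : (T,)   integer observation sequence
    Returns gamma (T, K): posterior marginals P(z_t | x_{1:T}).
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))   # scaled forward messages
    beta = np.zeros((T, K))    # scaled backward messages
    c = np.zeros(T)            # per-step normalizers

    # Forward pass: filter and rescale at each step for numerical stability.
    alpha[0] = pi * B[:, obs[0]]
    c[0] = alpha[0].sum()
    alpha[0] /= c[0]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        c[t] = alpha[t].sum()
        alpha[t] /= c[t]

    # Backward pass, reusing the forward scaling constants.
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1]) / c[t + 1]

    gamma = alpha * beta
    return gamma / gamma.sum(axis=1, keepdims=True)

# Tiny usage example with a 2-state, 2-symbol chain (all numbers illustrative):
pi = np.array([0.6, 0.4])
A = np.array([[0.9, 0.1], [0.2, 0.8]])
B = np.array([[0.7, 0.3], [0.1, 0.9]])
gamma = forward_backward(pi, A, B, np.array([0, 1, 1, 0]))
```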

EE 378B: Inference, Estimation, and Information Processing

Techniques and models for signal, data, and information processing, with emphasis on incomplete data, non-ordered index sets, and robust low-complexity methods. Linear models; regularization and shrinkage; dimensionality reduction; streaming algorithms; sketching; clustering and search in high dimensions; low-rank models; principal component analysis. Applications include: positioning from pairwise distances; distributed sensing; measurement/traffic monitoring in networks; finding communities/clusters in networks; recommendation systems; inverse problems. Prerequisites: EE 278 and EE 263 or equivalent. Recommended but not required: EE 378A.
Last offered: Spring 2019
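
A minimal sketch of two topics listed above, regularization/shrinkage (ridge regression) and PCA-based dimensionality reduction. Function names and shapes are illustrative assumptions, not from any course material.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge regression (shrinkage): argmin_w ||Xw - y||^2 + lam * ||w||^2."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def pca_reduce(X, k):
    """Project rows of X onto the top-k principal components (dimensionality reduction)."""
    Xc = X - X.mean(axis=0)                        # center each feature
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T, Vt[:k]                   # (n, k) scores and (k, d) loadings
```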