1 - 6 of 6 results for: STATS 116: Theory of Probability

CS 229T: Statistical Learning Theory (STATS 231)

How do we formalize what it means for an algorithm to learn from data? How do we use mathematical thinking to design better machine learning methods? This course focuses on developing mathematical tools for answering these questions. We will present various learning algorithms and prove theoretical guarantees about them. Topics include generalization bounds, implicit regularization, the theory of deep learning, spectral methods, and online learning and bandit problems. Prerequisites: a solid background in linear algebra (MATH 104, MATH 113, or CS 205), probability theory (CS 109 or STATS 116), and statistics and machine learning (STATS 315A, CS 229, or STATS 216).
Last offered: Autumn 2018

MATH 230A: Theory of Probability I (STATS 310A)

Mathematical tools: sigma algebras, measure theory, connections between coin tossing and Lebesgue measure, basic convergence theorems. Probability: independence, Borel-Cantelli lemmas, almost sure and Lp convergence, weak and strong laws of large numbers. Large deviations. Weak convergence; central limit theorems; Poisson convergence; Stein's method. Prerequisites: STATS 116, MATH 171.
Terms: Aut | Units: 3

STATS 116: Theory of Probability

Probability spaces as models for phenomena with statistical regularity. Discrete spaces (binomial, hypergeometric, Poisson). Continuous spaces (normal, exponential) and densities. Random variables, expectation, independence, conditional probability. Introduction to the laws of large numbers and central limit theorem. Prerequisites: MATH 52 and familiarity with infinite series, or equivalent.
Terms: Aut, Spr, Sum | Units: 4 | UG Reqs: GER:DB-Math, WAY-AQR, WAY-FR

STATS 231: Statistical Learning Theory (CS 229T)

How do we formalize what it means for an algorithm to learn from data? How do we use mathematical thinking to design better machine learning methods? This course focuses on developing mathematical tools for answering these questions. We will present various learning algorithms and prove theoretical guarantees about them. Topics include generalization bounds, implicit regularization, the theory of deep learning, spectral methods, and online learning and bandit problems. Prerequisites: a solid background in linear algebra (MATH 104, MATH 113, or CS 205), probability theory (CS 109 or STATS 116), and statistics and machine learning (STATS 315A, CS 229, or STATS 216).
Last offered: Autumn 2018

STATS 300A: Theory of Statistics I

Finite sample optimality of statistical procedures; Decision theory: loss, risk, admissibility; Principles of data reduction: sufficiency, ancillarity, completeness; Statistical models: exponential families, group families, nonparametric families; Point estimation: optimal unbiased and equivariant estimation, Bayes estimation, minimax estimation; Hypothesis testing and confidence intervals: uniformly most powerful tests, uniformly most accurate confidence intervals, optimal unbiased and invariant tests. Prerequisites: Real analysis, introductory probability (at the level of STATS 116), and introductory statistics.
Terms: Aut | Units: 3

STATS 310A: Theory of Probability I (MATH 230A)

Mathematical tools: sigma algebras, measure theory, connections between coin tossing and Lebesgue measure, basic convergence theorems. Probability: independence, Borel-Cantelli lemmas, almost sure and Lp convergence, weak and strong laws of large numbers. Large deviations. Weak convergence; central limit theorems; Poisson convergence; Stein's method. Prerequisites: STATS 116, MATH 171.
Terms: Aut | Units: 3
© Stanford University