1 - 10 of 11 results for: CS109

CS 109: Introduction to Probability for Computer Scientists

Topics include: counting and combinatorics, random variables, conditional probability, independence, distributions, expectation, point estimation, and limit theorems. Applications of probability in computer science, including machine learning and the use of probability in the analysis of algorithms. Prerequisites: CS 103; CS 106B or 106X; and multivariate calculus at the level of MATH 51 or CME 100 or equivalent.
Terms: Aut, Win, Spr | Units: 3-5 | UG Reqs: GER:DB-EngrAppSci, WAY-AQR, WAY-FR
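
As a brief illustration of two topics named in the CS 109 description above (conditional probability and expectation), the following Python sketch estimates both by simulation; the dice example and everything in it are assumptions chosen for illustration, not course material.

```python
# Minimal sketch (illustrative only): estimating an expectation and a
# conditional probability by simulation, two topics listed for CS 109 above.
import random

def simulate(trials=100_000):
    hits = given = 0
    total = 0
    for _ in range(trials):
        a, b = random.randint(1, 6), random.randint(1, 6)
        total += a + b                     # for E[sum of two dice]
        if a + b >= 9:                     # conditioning event
            given += 1
            if a == 6:                     # event of interest
                hits += 1
    print("E[sum] ~", total / trials)                        # exact value: 7
    print("P(first die is 6 | sum >= 9) ~", hits / given)    # exact value: 0.4

if __name__ == "__main__":
    simulate()
```

Running it prints estimates near the exact values of 7 and 0.4.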

CS 124: From Languages to Information (LINGUIST 180, LINGUIST 280)

Extracting meaning, information, and structure from human language text, speech, web pages, social networks. Introducing methods (string algorithms, edit distance, language modeling, machine learning classifiers, neural embeddings, inverted indices, collaborative filtering, PageRank), applications (chatbots, sentiment analysis, information retrieval, question answering, text classification, social networks, recommender systems), and ethical issues in both. Prerequisites: CS103, CS107, CS109.
Terms: Win | Units: 3-4 | UG Reqs: WAY-AQR
Instructors: Jurafsky, D. (PI)
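
As an illustrative sketch of one method named in the CS 124 description above (string algorithms / edit distance), here is a standard dynamic-programming Levenshtein distance in Python; it is a textbook formulation, not code from the course.

```python
# Minimal sketch: Levenshtein edit distance via dynamic programming.
def edit_distance(s: str, t: str) -> int:
    """Minimum number of insertions, deletions, and substitutions turning s into t."""
    m, n = len(s), len(t)
    # dp[i][j] = edit distance between s[:i] and t[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # deletion
                           dp[i][j - 1] + 1,        # insertion
                           dp[i - 1][j - 1] + cost) # substitution (or match)
    return dp[m][n]

print(edit_distance("intention", "execution"))  # 5, the classic textbook example
```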

CS 229: Machine Learning (STATS 229)

Topics: statistical pattern recognition, linear and non-linear regression, non-parametric methods, exponential family, GLMs, support vector machines, kernel methods, deep learning, model/feature selection, learning theory, ML advice, clustering, density estimation, EM, dimensionality reduction, ICA, PCA, reinforcement learning and adaptive control, Markov decision processes, approximate dynamic programming, and policy search. Prerequisites: knowledge of basic computer science principles and skills at a level sufficient to write a reasonably non-trivial computer program in Python/numpy; familiarity with probability theory equivalent to CS109 or STATS116; and familiarity with multivariable calculus and linear algebra equivalent to MATH51.
Terms: Aut, Spr | Units: 3-4
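
As a hedged sketch in the spirit of the CS 229 topic list above (and its Python/numpy prerequisite), the following fits a linear regression by batch gradient descent; the synthetic data and parameter choices are assumptions for illustration only.

```python
# Minimal sketch (illustrative only): linear regression fit by batch gradient
# descent with numpy, touching on the regression topic listed for CS 229 above.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))                   # 200 examples, 3 features
true_w = np.array([2.0, -1.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=200)     # noisy linear targets

w = np.zeros(3)
lr = 0.1
for _ in range(500):
    grad = X.T @ (X @ w - y) / len(y)           # gradient of mean squared error
    w -= lr * grad

print(np.round(w, 2))                           # close to [ 2.  -1.   0.5]
```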

CS 229T: Statistical Learning Theory (STATS 231)

How do we formalize what it means for an algorithm to learn from data? How do we use mathematical thinking to design better machine learning methods? This course focuses on developing mathematical tools for answering these questions. We will present various learning algorithms and prove theoretical guarantees about them. Topics include generalization bounds, implicit regularization, the theory of deep learning, spectral methods, and online learning and bandit problems. Prerequisites: a solid background in linear algebra (MATH 104, MATH 113, or CS 205), probability theory (CS 109 or STATS 116), and statistics and machine learning (STATS 315A, CS 229, or STATS 216).
Last offered: Autumn 2018
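
For a sense of what "generalization bounds" in the CS 229T description above refers to, here is the standard Hoeffding-plus-union-bound statement for a finite hypothesis class; this is a textbook result, not material taken from the course.

```latex
% With probability at least 1 - \delta over an i.i.d. sample of size n,
% simultaneously for every hypothesis h in a finite class \mathcal{H}:
\forall h \in \mathcal{H}:\quad
  L(h) \;\le\; \hat{L}_n(h) + \sqrt{\frac{\log\lvert\mathcal{H}\rvert + \log(2/\delta)}{2n}},
% where L(h) is the population risk and \hat{L}_n(h) the empirical risk on the sample.
```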

CS 250: Algebraic Error Correcting Codes (EE 387)

Introduction to the theory of error correcting codes, emphasizing algebraic constructions, and diverse applications throughout computer science and engineering. Topics include basic bounds on error correcting codes; Reed-Solomon and Reed-Muller codes; list-decoding, list-recovery and locality. Applications may include communication, storage, complexity theory, pseudorandomness, cryptography, streaming algorithms, group testing, and compressed sensing. Prerequisites: Linear algebra, basic probability (at the level of, say, CS109, CME106 or EE178) and "mathematical maturity" (students will be asked to write proofs). Familiarity with finite fields will be helpful but not required.
Last offered: Winter 2019
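
As a hedged illustration of the Reed-Solomon construction named in the CS 250 / EE 387 description above, the sketch below encodes a message as evaluations of its polynomial over a small prime field; practical codes work over GF(2^m) and pair this with a decoder, and every concrete choice here (the prime 929, the message, the code length) is an assumption for illustration.

```python
# Minimal sketch (illustrative only): Reed-Solomon encoding over the prime
# field GF(p) as evaluation of the message polynomial.
p = 929                                   # a small prime; field is integers mod p

def rs_encode(message, n):
    """Encode k message symbols as evaluations of the degree-<k polynomial at n points."""
    k = len(message)
    assert k <= n <= p
    codeword = []
    for x in range(n):                    # evaluation points 0, 1, ..., n-1
        # Horner's rule: message[i] is the coefficient of x**i
        value = 0
        for coeff in reversed(message):
            value = (value * x + coeff) % p
        codeword.append(value)
    return codeword                       # any k symbols determine the polynomial,
                                          # so up to (n - k) // 2 errors are correctable

print(rs_encode([3, 1, 4], n=7))          # k=3 message, n=7 codeword symbols
```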

CS 336: Robot Perception and Decision-Making: Optimal and Learning-based Approaches

How can robots perceive the world and their own motion so that they can accomplish navigation and manipulation tasks? In this course, we will study how this question has been approached, specifically when the robot is equipped with visual sensing capabilities. We focus on how a robot can make decisions based on raw, high-dimensional sensory data that represents only partial, noisy observations of the environment. The course is therefore divided into two main themes, (i) Estimation and (ii) Decision-Making and Control; in each theme we will study traditional approaches, learning-based methods, and combinations of the two. Prerequisites: CS106B, MATH 51 or CME 100, CS109, CS 221 or CS 229.
Terms: Aut | Units: 3-4
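
As a toy illustration of the "Estimation" theme in the CS 336 description above, here is a one-dimensional Kalman filter in Python that fuses noisy position measurements with a constant-velocity motion model; the noise values and trajectory are invented for the example and are not course material.

```python
# Minimal sketch (illustrative only): a 1-D Kalman filter tracking a robot's
# position from noisy measurements, in the spirit of the Estimation theme above.
import numpy as np

def kalman_1d(observations, q=0.01, r=0.5, velocity=1.0, dt=1.0):
    """Track position given noisy position measurements and a constant-velocity motion model."""
    x, var = 0.0, 1.0                 # initial state estimate and its variance
    estimates = []
    for z in observations:
        # Predict: advance by the motion model; variance grows by process noise q
        x, var = x + velocity * dt, var + q
        # Update: blend prediction with measurement z (measurement noise variance r)
        k = var / (var + r)           # Kalman gain
        x, var = x + k * (z - x), (1 - k) * var
        estimates.append(x)
    return estimates

true_positions = np.arange(1, 11, dtype=float)
noisy = true_positions + np.random.default_rng(1).normal(scale=0.7, size=10)
print([round(e, 2) for e in kalman_1d(noisy)])
```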

EE 387: Algebraic Error Correcting Codes (CS 250)

Introduction to the theory of error correcting codes, emphasizing algebraic constructions, and diverse applications throughout computer science and engineering. Topics include basic bounds on error correcting codes; Reed-Solomon and Reed-Muller codes; list-decoding, list-recovery and locality. Applications may include communication, storage, complexity theory, pseudorandomness, cryptography, streaming algorithms, group testing, and compressed sensing. Prerequisites: Linear algebra, basic probability (at the level of, say, CS109, CME106 or EE178) and "mathematical maturity" (students will be asked to write proofs). Familiarity with finite fields will be helpful but not required.
Last offered: Winter 2019

LINGUIST 180: From Languages to Information (CS 124, LINGUIST 280)

Extracting meaning, information, and structure from human language text, speech, web pages, social networks. Introducing methods (string algorithms, edit distance, language modeling, machine learning classifiers, neural embeddings, inverted indices, collaborative filtering, PageRank), applications (chatbots, sentiment analysis, information retrieval, question answering, text classification, social networks, recommender systems), and ethical issues in both. Prerequisites: CS103, CS107, CS109.
Terms: Win | Units: 3-4 | UG Reqs: WAY-AQR
Instructors: Jurafsky, D. (PI)

LINGUIST 280: From Languages to Information (CS 124, LINGUIST 180)

Extracting meaning, information, and structure from human language text, speech, web pages, social networks. Introducing methods (string algorithms, edit distance, language modeling, machine learning classifiers, neural embeddings, inverted indices, collaborative filtering, PageRank), applications (chatbots, sentiment analysis, information retrieval, question answering, text classification, social networks, recommender systems), and ethical issues in both. Prerequisites: CS103, CS107, CS109.
Terms: Win | Units: 3-4
Instructors: Jurafsky, D. (PI)

STATS 229: Machine Learning (CS 229)

Topics: statistical pattern recognition, linear and non-linear regression, non-parametric methods, exponential family, GLMs, support vector machines, kernel methods, deep learning, model/feature selection, learning theory, ML advice, clustering, density estimation, EM, dimensionality reduction, ICA, PCA, reinforcement learning and adaptive control, Markov decision processes, approximate dynamic programming, and policy search. Prerequisites: knowledge of basic computer science principles and skills at a level sufficient to write a reasonably non-trivial computer program in Python/numpy; familiarity with probability theory equivalent to CS109 or STATS116; and familiarity with multivariable calculus and linear algebra equivalent to MATH51.
Terms: Aut, Spr | Units: 3-4