CS 109: Introduction to Probability for Computer Scientists
Topics include: counting and combinatorics, random variables, conditional probability, independence, distributions, expectation, point estimation, and limit theorems. Applications of probability in computer science, including machine learning and the use of probability in the analysis of algorithms. Prerequisites: CS 103; CS 106B or CS 106X; MATH 51 or equivalent.
Terms: Win, Spr, Sum | Units: 3-5 | UG Reqs: WAY-AQR, WAY-FR, GER:DB-EngrAppSci
CS 109L: Statistical Computing with R Laboratory
Supplemental lab to CS 109. Introduces the R programming language for statistical computing. Topics include basic facilities of R including mathematical, graphical, and probability functions, building simulations, introductory data fitting and machine learning. Provides exposure to the functional programming paradigm. Corequisite: CS 109.
Terms: Win, Spr | Units: 1
Instructors: Holtz, B. (PI); Sahami, M. (PI)
CS 124: From Languages to Information (LINGUIST 180, LINGUIST 280)
Automated processing of less structured information: human language text and speech, web pages, social networks, genome sequences, with the goal of automatically extracting meaning and structure. Methods include: string algorithms, automata and transducers, hidden Markov models, graph algorithms, XML processing. Applications such as information retrieval, text classification, social network models, machine translation, genomic sequence alignment, word meaning extraction, and speech recognition. Prerequisites: CS 103, CS 107, CS 109.
Terms: Win | Units: 3-4
Instructors: Jurafsky, D. (PI)
CS 246: Mining Massive Data Sets
Distributed file systems: Hadoop, map-reduce; PageRank, topic-sensitive PageRank, spam detection, hubs-and-authorities; similarity search; shingling, minhashing, random hyperplanes, locality-sensitive hashing; analysis of social-network graphs; association rules; dimensionality reduction: UV, SVD, and CUR decompositions; algorithms for very-large-scale mining: clustering, nearest-neighbor search, gradient descent, support-vector machines, classification, and regression; submodular function optimization. Prerequisites: At least one of CS 107 or CS 145; at least one of CS 109 or STAT 116, or equivalent.
Terms: Win | Units: 3-4
Instructors: Leskovec, J. (PI)
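Among the similarity-search topics listed for CS 246, minhashing is compact enough to sketch. The following is a minimal illustration, not the course's implementation: the XOR-mask hash family and the function names are illustrative choices, and a real system would use many more hash functions plus locality-sensitive hashing to bucket candidates.

```python
import random

def minhash_signature(shingles, num_hashes=50, seed=0):
    """Minhash signature of a set of shingles (illustrative sketch).

    Each of the num_hashes hash functions is simulated by XOR-ing
    Python's built-in hash with a fixed random 32-bit mask; the
    signature records the minimum value per hash function.
    """
    rng = random.Random(seed)
    masks = [rng.getrandbits(32) for _ in range(num_hashes)]
    return [min(hash(s) ^ m for s in shingles) for m in masks]

def estimated_jaccard(sig_a, sig_b):
    """Fraction of agreeing signature positions, which in expectation
    equals the Jaccard similarity of the underlying shingle sets."""
    matches = sum(a == b for a, b in zip(sig_a, sig_b))
    return matches / len(sig_a)
```

Identical sets always produce identical signatures (estimate 1.0); for partially overlapping sets the estimate converges to the true Jaccard similarity as the number of hash functions grows.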
LINGUIST 180: From Languages to Information (CS 124, LINGUIST 280)
Automated processing of less structured information: human language text and speech, web pages, social networks, genome sequences, with the goal of automatically extracting meaning and structure. Methods include: string algorithms, automata and transducers, hidden Markov models, graph algorithms, XML processing. Applications such as information retrieval, text classification, social network models, machine translation, genomic sequence alignment, word meaning extraction, and speech recognition. Prerequisites: CS 103, CS 107, CS 109.
Terms: Win | Units: 3-4
LINGUIST 280: From Languages to Information (CS 124, LINGUIST 180)
Automated processing of less structured information: human language text and speech, web pages, social networks, genome sequences, with the goal of automatically extracting meaning and structure. Methods include: string algorithms, automata and transducers, hidden Markov models, graph algorithms, XML processing. Applications such as information retrieval, text classification, social network models, machine translation, genomic sequence alignment, word meaning extraction, and speech recognition. Prerequisites: CS 103, CS 107, CS 109.
Terms: Win | Units: 3-4
Instructors: Jurafsky, D. (PI)