1-7 of 7 results for: CS224N

CS 25: Transformers United V4

Since their introduction in 2017, Transformers have taken the world by storm, and are finding applications all over Deep Learning. They have enabled the creation of powerful language models like ChatGPT and Gemini, and are a critical component in other ML applications such as text-to-image and video generation (e.g. DALL-E and Sora). They have significantly elevated the capabilities and impact of Artificial Intelligence. In CS 25, which has become one of Stanford's hottest and most exciting seminars, we examine the details of how Transformers work, and dive deep into the different kinds of Transformers and how they're applied in various fields and applications. We do this through a combination of instructor lectures, guest lectures, and classroom discussions. Potential topics include LLM architectures, creative use cases (e.g. art and music), healthcare/biology and neuroscience applications, robotics and RL (e.g. physical tasks, simulations, or games), and so forth. We invite folks at the forefront of Transformers research for talks, which will also be livestreamed and recorded through YouTube/Zoom. Past speakers have included Andrej Karpathy, Geoffrey Hinton, Jim Fan, Ashish Vaswani, and folks from OpenAI, Google DeepMind, NVIDIA, etc. Our class includes social events and networking sessions and has a popular reception within and outside Stanford, with around 1 million total views on YouTube. This is a 1-unit S/NC course, where attendance is the only homework! Please enroll on Axess or audit by joining the livestream (or in person if seats are available). Prerequisites: basic knowledge of Deep Learning (should understand attention) or CS224N/CS231N/CS230. Course website: https://web.stanford.edu/class/cs25/
Terms: Spr | Units: 1

CS 129X: Human Centered NLP (CS 329X)

Recent advances in natural language processing (NLP), especially around large pretrained models, have enabled extensive successful applications. However, there are growing concerns about the negative aspects of NLP systems, such as biases and a lack of input from users. This course gives an overview of human-centered techniques and applications for NLP, ranging from human-centered design thinking to human-in-the-loop algorithms, fairness, and accessibility. Along the way, we will cover machine-learning techniques which are especially relevant to NLP and to human experiences. Prerequisite: CS224N or CS224U, or equivalent background in natural language processing.

CS 224N: Natural Language Processing with Deep Learning (LINGUIST 284, SYMSYS 195N)

Methods for processing human language information and the underlying computational properties of natural languages. Focus on deep learning approaches: understanding, implementing, training, debugging, visualizing, and extending neural network models for a variety of language understanding tasks. Exploration of natural language tasks ranging from simple word level and syntactic processing to coreference, question answering, and machine translation. Examination of representative papers and systems and completion of a final project applying a complex neural network model to a large-scale NLP problem. Prerequisites: calculus and linear algebra; CS124, CS221, or CS229.
Terms: Win | Units: 3-4

CS 224S: Spoken Language Processing (LINGUIST 285)

Introduction to spoken language technology with an emphasis on dialogue and conversational systems. Deep learning and other methods for automatic speech recognition, speech synthesis, affect detection, dialogue management, and applications to digital assistants and spoken language understanding systems. Prerequisites: CS124, CS221, CS224N, or CS229.
Terms: Spr | Units: 2-4
Instructors: Maas, A. (PI)

CS 329S: Machine Learning Systems Design

This project-based course covers the iterative process for designing, developing, and deploying machine learning systems. It focuses on systems that require massive datasets and compute resources, such as large neural networks. Students will learn about data management, data engineering, approaches to model selection, training, scaling, how to continually monitor and deploy changes to ML systems, as well as the human side of ML projects. In the process, students will learn about important issues including privacy, fairness, and security. Prerequisites: at least one of the following: CS229, CS230, CS231N, CS224N, or equivalent. Students should have a good understanding of machine learning algorithms and should be familiar with at least one framework such as TensorFlow, PyTorch, or JAX.
Last offered: Winter 2022

CS 329X: Human Centered NLP (CS 129X)

Recent advances in natural language processing (NLP), especially around large pretrained models, have enabled extensive successful applications. However, there are growing concerns about the negative aspects of NLP systems, such as biases and a lack of input from users. This course gives an overview of human-centered techniques and applications for NLP, ranging from human-centered design thinking to human-in-the-loop algorithms, fairness, and accessibility. Along the way, we will cover machine-learning techniques which are especially relevant to NLP and to human experiences. Prerequisite: CS224N or CS224U, or equivalent background in natural language processing.
Terms: Aut | Units: 3-4

LINGUIST 285: Spoken Language Processing (CS 224S)

Introduction to spoken language technology with an emphasis on dialogue and conversational systems. Deep learning and other methods for automatic speech recognition, speech synthesis, affect detection, dialogue management, and applications to digital assistants and spoken language understanding systems. Prerequisites: CS124, CS221, CS224N, or CS229.
Last offered: Spring 2024