Dr. Sarah Chen
Associate Professor of Computer Science
Stanford University
My research focuses on natural language processing, machine learning, and the intersection of language understanding and knowledge representation.
Teaching
CS 224N: Natural Language Processing with Deep Learning
Winter 2024, Winter 2025, Winter 2026, Stanford University
Graduate-level course providing a comprehensive introduction to cutting-edge neural network methods for NLP. Topics include word vectors, recurrent and attention-based architectures, transformers, pre-training, and applications to question answering, machine translation, and text generation. Students complete a substantial final research project.
CS 329T: Trustworthy Machine Learning
Spring 2025, Spring 2026, Stanford University
Graduate seminar exploring the challenges of building ML systems that are robust, fair, interpretable, and safe. Covers adversarial robustness, distribution shift, fairness metrics, explainability methods, and alignment techniques. Students engage with recent research papers and develop their own trustworthiness evaluations.
CS 124: From Languages to Information
Fall 2024, Fall 2025, Stanford University
Undergraduate introduction to natural language processing. Covers fundamental topics including text classification, sentiment analysis, information extraction, question answering, and chatbots. Emphasis on practical programming assignments and real-world applications.