Welcome to CPSC 477/577!

This course provides a deep dive into Natural Language Processing (NLP), a pivotal and dynamic subfield of Artificial Intelligence (AI) that focuses on the interaction between computers and human language.

The course begins by exploring the fundamental principles of NLP, providing a solid grounding in how natural language is processed and understood by machines. Students will first study traditional NLP methods and classic NLP tasks, along with their historical significance and foundational role. These methods, based on statistical and machine learning approaches, lay the groundwork for understanding how machines interpret language.

Transitioning to modern NLP, the course delves into the revolutionary impact of deep learning and neural networks. Here, students will learn about representation learning methods, including word and sentence representations. The course then dives into the foundations of language modeling and self-supervised learning in NLP. Specifically, we will discuss sequence-to-sequence models, transformers, and transfer learning, including models like GPT (Generative Pre-trained Transformer), BERT (Bidirectional Encoder Representations from Transformers), and T5. These models have transformed the landscape of NLP by enabling more general language understanding and generation capabilities. We then move on to contemporary topics in NLP, including large language models (LLMs), parameter-efficient fine-tuning, efficiency, and incorporating other modalities.

Through a blend of lectures, hands-on projects and assignments, and case studies, students will gain practical experience with both traditional and modern NLP techniques. The goal of the course is to introduce students to the field and provide them with a comprehensive overview of the fundamentals that helped shape today’s advanced AI models.

Prerequisites

Intro to AI or Intro to Machine Learning or permission of instructor.

Resources

  • Dan Jurafsky and James H. Martin. Speech and Language Processing (2024 pre-release)
  • Yoav Goldberg. A Primer on Neural Network Models for Natural Language Processing
  • Jacob Eisenstein. Natural Language Processing

We will also use papers from major conferences in the field, including ACL, EMNLP, NAACL, ICLR, and NeurIPS.

Logistics

  • Lectures: Tue/Thur 2:30PM - 3:45PM
  • Lecture Location: Osborn Memorial Laboratories 202