Deep Learning Natural Language Processing (DNLP)
(The course design is adapted from Stanford's NLP with Deep Learning course, with some modifications.)
Gary Geunbae Lee, Eng 2-211, gblee@postech.ac.kr, 279-2254
1. Course objectives
This course covers cutting-edge research in deep learning for natural language processing. Through lectures, students will learn the skills needed to design, implement, and understand their own neural network models for various NLP problems, such as word embeddings and contextual word embeddings, text classification, syntactic parsing, recurrent language modeling, machine translation, question answering, natural language generation, dialog systems, and multi-task deep learning models.
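To give a taste of the kind of models the course builds up to, the following is a minimal sketch (not taken from the course materials) of one of the simplest ideas listed above: a bag-of-embeddings text classifier. The toy corpus, vocabulary, dimensions, and hyperparameters are all hypothetical; it is written in plain NumPy so every gradient step is visible.

```python
# Hypothetical minimal example: average word embeddings + logistic regression,
# with both the embeddings and the classifier trained by gradient descent.
import numpy as np

rng = np.random.default_rng(0)

# Toy sentiment corpus: label 1 = positive, 0 = negative (made up for illustration).
texts = [("good great fun", 1), ("bad awful boring", 0),
         ("great fun", 1), ("boring bad", 0)]
vocab = {w: i for i, w in enumerate(sorted({w for t, _ in texts for w in t.split()}))}

dim = 8
E = rng.normal(scale=0.1, size=(len(vocab), dim))  # word embedding table
w = np.zeros(dim)                                  # classifier weights
b = 0.0

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def features(text):
    # "Bag of embeddings": average the embeddings of the words in the text.
    ids = [vocab[t] for t in text.split()]
    return E[ids].mean(axis=0), ids

lr = 0.5
for _ in range(300):                        # plain SGD over the toy corpus
    for text, y in texts:
        x, ids = features(text)
        p = sigmoid(w @ x + b)
        g = p - y                           # dLoss/dlogit for logistic loss
        E[ids] -= lr * g * w / len(ids)     # backpropagate into the embeddings
        w -= lr * g * x
        b -= lr * g

def predict(text):
    x, _ = features(text)
    return int(sigmoid(w @ x + b) > 0.5)
```

After training, `predict` maps short texts built from the toy vocabulary to a sentiment label; the contextual and neural models covered later in the course replace the simple averaging step with far richer encoders.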
2. Course prerequisites
No required prerequisites.
3. Grading
Midterm exam: 35%
Final exam: 35%
Homework: 30%
4. Texts and references
Dan Jurafsky and James H. Martin. Speech and Language Processing (3rd ed. draft)
Jacob Eisenstein. Natural Language Processing
Delip Rao and Brian McMahan. Natural Language Processing with PyTorch
Lewis Tunstall, Leandro von Werra, and Thomas Wolf. Natural Language Processing with Transformers
Yoav Goldberg. A Primer on Neural Network Models for Natural Language Processing
Ian Goodfellow, Yoshua Bengio, and Aaron Courville. Deep Learning
Michael A. Nielsen. Neural Networks and Deep Learning
Eugene Charniak. Introduction to Deep Learning
5. Others
Instruction language: English
Two homework assignments: solving deep learning NLP application problems, including Python programming.
6. Course schedule
11.1 LLMs, prompting, and RLHF
12.1 Code generation
13.1 Knowledge in language models
14.1 Multimodal NLP