Natural Language Processing
graduate course, National Kaohsiung University of Science and Technology, 2021
Course Description
The course introduces the theories and applications of Natural Language Processing (NLP), starting from basic concepts and working up to state-of-the-art algorithms such as the Transformer. It also shows how to apply the concepts learned in this course to task-oriented dialogue systems.
Lecture 1: Introduction to Natural Language Processing
- Lecture Slides: PPT
- Reading Materials
- Can We Automate Scientific Reviewing?
- The Design and Implementation of XiaoIce, an Empathetic Social Chatbot
- Recipes for building an open-domain chatbot
- Semantically-Aligned Equation Generation for Solving and Reasoning Math Word Problems
- Stanza: A Python Natural Language Processing Toolkit for Many Human Languages
Lecture 2: Distributed Representation of Words
- Lecture Slides: PPT
- Reading Materials
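To make "distributed representation" concrete, here is a minimal NumPy sketch of word2vec's skip-gram objective (predict context words from a center word). The toy corpus, vector size, window, and learning rate are illustrative assumptions, not course materials.

```python
# Minimal skip-gram sketch: learn word vectors by predicting context words.
# Toy corpus and hyperparameters are made up for illustration.
import numpy as np

corpus = "the quick brown fox jumps over the lazy dog".split()
vocab = sorted(set(corpus))
w2i = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 16, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(0, 0.1, (V, D))   # center-word vectors
W_out = rng.normal(0, 0.1, (V, D))  # context-word vectors

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

for epoch in range(200):
    for i, center in enumerate(corpus):
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if j == i:
                continue
            c, o = w2i[center], w2i[corpus[j]]
            v = W_in[c]                       # center vector
            p = softmax(W_out @ v)            # P(context | center) over vocab
            grad = p.copy()
            grad[o] -= 1.0                    # dL/dscores for cross-entropy
            dv = W_out.T @ grad               # cache before updating W_out
            W_out -= lr * np.outer(grad, v)   # update context matrix
            W_in[c] -= lr * dv                # update center vector

print("vector for 'fox':", W_in[w2i["fox"]][:4], "...")
```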
Lecture 3: Advanced Word Vector and Introduction to Neural Network
- Lecture Slides: PPT
- Reading Materials
- GloVe: Global Vectors for Word Representation
- Speech and Language Processing: Information Extraction (Chapter number may vary according to the edition)
- Stanford University CS224N Lectures 2 and 3 (2019 Edition)
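The GloVe paper above fits word vectors by weighted least squares on co-occurrence statistics. The sketch below implements that objective on a toy co-occurrence matrix; all sizes and constants are chosen purely for illustration.

```python
# Sketch of the GloVe objective: weighted least squares fitting
#   w_i . w~_j + b_i + b~_j  ~  log X_ij
# on a toy co-occurrence matrix X. Sizes and constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)
V, D = 6, 8
X = rng.integers(1, 10, (V, V)).astype(float)   # toy co-occurrence counts

W  = rng.normal(0, 0.1, (V, D))    # word vectors
Wt = rng.normal(0, 0.1, (V, D))    # context vectors
b, bt = np.zeros(V), np.zeros(V)   # biases
xmax, alpha, lr = 10.0, 0.75, 0.05

def f(x):  # weighting: downweights rare pairs, caps frequent ones at 1
    return (x / xmax) ** alpha if x < xmax else 1.0

for step in range(300):
    for i in range(V):
        for j in range(V):
            diff = W[i] @ Wt[j] + b[i] + bt[j] - np.log(X[i, j])
            g = f(X[i, j]) * diff
            dW, dWt = g * Wt[j], g * W[i]      # cache grads before updating
            W[i]  -= lr * dW
            Wt[j] -= lr * dWt
            b[i]  -= lr * g
            bt[j] -= lr * g
```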
Lecture 4: Backpropagation and Computation Graphs
- Lecture Slides: PPT
- Reading Materials
- Maxout Networks
- Understanding the difficulty of training deep feedforward neural networks
- Dropout: A Simple Way to Prevent Neural Networks from Overfitting
- Adam: A Method for Stochastic Optimization
- An overview of gradient descent optimization algorithms
- Stanford University CS224N Lecture 4 (2019 Edition)
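To connect the computation-graph view with concrete code, here is a small NumPy sketch of a two-layer network whose backward pass is written out node by node via the chain rule. The architecture and data are toy assumptions.

```python
# A two-layer network as an explicit computation graph, with the backward
# pass derived in reverse topological order (the topic of this lecture).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                       # batch of 4 inputs
y = rng.integers(0, 2, size=(4, 1)).astype(float) # binary labels
W1, b1 = rng.normal(0, 0.5, (3, 5)), np.zeros(5)
W2, b2 = rng.normal(0, 0.5, (5, 1)), np.zeros(1)

# forward: x -> h = relu(x W1 + b1) -> p = sigmoid(h W2 + b2)
z1 = x @ W1 + b1
h = np.maximum(0, z1)
z2 = h @ W2 + b2
p = 1 / (1 + np.exp(-z2))
loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

# backward: apply the chain rule in reverse through each node
dz2 = (p - y) / len(x)          # d(loss)/dz2 for sigmoid + cross-entropy
dW2, db2 = h.T @ dz2, dz2.sum(0)
dh = dz2 @ W2.T
dz1 = dh * (z1 > 0)             # ReLU gate passes gradient where z1 > 0
dW1, db1 = x.T @ dz1, dz1.sum(0)
print("loss:", loss, "| grad norms:", np.linalg.norm(dW1), np.linalg.norm(dW2))
```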
Lecture 5: Dependency Parsing
- Lecture Slides: PPT
- Reading Materials
- Universal Dependency Parsing
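As a hands-on complement, the snippet below runs a dependency parser with Stanza (the toolkit from the Lecture 1 readings). It assumes `pip install stanza` and a one-time English model download.

```python
# Dependency parsing with Stanza: print each word, its relation, and its head.
import stanza

stanza.download("en")  # fetch the English models once
nlp = stanza.Pipeline("en", processors="tokenize,pos,lemma,depparse")

doc = nlp("The quick brown fox jumps over the lazy dog.")
for sent in doc.sentences:
    for word in sent.words:
        # word.head is 1-indexed; 0 means the word attaches to ROOT
        head = sent.words[word.head - 1].text if word.head > 0 else "ROOT"
        print(f"{word.text:>6} --{word.deprel}--> {head}")
```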
Lecture 6: Recurrent Neural Networks and Language Models
- Lecture Slides: PPT
- Reading Materials
- Speech and Language Processing: N-gram Language Models (Chapter number may vary according to the edition)
- Deep Visual-Semantic Alignments for Generating Image Descriptions
- Speechless? Here’s how AI learns to finish your sentences
- Stanford University CS224N Lecture 6 (2019 Edition)
- Deep Learning - Sequence Modeling: Recurrent and Recursive Nets
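A minimal sketch of one step of a vanilla RNN language model, assuming a toy one-hot vocabulary: the recurrence h_t = tanh(W_hh h_{t-1} + W_xh x_t) followed by a softmax over next tokens. All sizes are illustrative.

```python
# One step of a vanilla RNN language model over a toy vocabulary.
import numpy as np

rng = np.random.default_rng(0)
V, H = 10, 16                         # toy vocab and hidden sizes
W_xh = rng.normal(0, 0.1, (H, V))
W_hh = rng.normal(0, 0.1, (H, H))
W_hy = rng.normal(0, 0.1, (V, H))

def step(h_prev, token_id):
    x = np.zeros(V); x[token_id] = 1.0        # one-hot input
    h = np.tanh(W_hh @ h_prev + W_xh @ x)     # recurrent state update
    logits = W_hy @ h
    p = np.exp(logits - logits.max()); p /= p.sum()  # next-token distribution
    return h, p

h = np.zeros(H)
for t, tok in enumerate([1, 4, 2]):
    h, p = step(h, tok)
    print(f"t={t}: most likely next token = {p.argmax()}")
```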
Lecture 7: Vanishing Gradients and Fancy RNNs
- Lecture Slides: PPT
- Reading Materials
- On the difficulty of training Recurrent Neural Networks
- Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies
- Learning Phrase Representations using RNN Encoder-Decoder for Statistical Machine Translation
- Deep Residual Learning for Image Recognition
- Densely Connected Convolutional Networks
- Highway Networks
- Stanford University CS224N Lecture 7 (2019 Edition)
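A small numerical demonstration of the lecture's central problem: backpropagating through many tanh steps multiplies the gradient by W_hh^T diag(1 - h_t^2) at every step, so with small recurrent weights the norm shrinks geometrically. The constants below are illustrative assumptions.

```python
# Demonstrating vanishing gradients in a plain RNN.
import numpy as np

rng = np.random.default_rng(0)
H, T = 16, 50
W_hh = rng.normal(0, 0.1, (H, H))   # small weights: contractive recurrent Jacobian

hs, h = [], np.zeros(H)
for t in range(T):
    h = np.tanh(W_hh @ h + rng.normal(0, 1, H))  # random inputs stand in for data
    hs.append(h)

grad = np.ones(H)                    # arbitrary dL/dh_T
for t in reversed(range(T)):
    grad = W_hh.T @ (grad * (1 - hs[t] ** 2))    # chain rule through tanh
    if t % 10 == 0:
        print(f"||dL/dh_{t}|| = {np.linalg.norm(grad):.2e}")  # decays toward 0
```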
Lecture 8: Machine Translation, Seq2Seq, Attention, and Transformer
- Lecture Slides: PPT
- Reading Materials
- Attention Is All You Need
- Attention Is All You Need (Code Explanation)
- Layer Normalization
- Universal Language Model Fine-tuning for Text Classification
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
- BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension
- Stanford University CS224N Lecture 8 (2019 Edition)
- Stanford University CS224N Lecture 13 (2019 Edition)
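The core operation of the Transformer, as defined in "Attention Is All You Need" above, is scaled dot-product attention: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V. The sketch below implements exactly that equation; the toy shapes are assumptions for illustration.

```python
# Scaled dot-product attention from "Attention Is All You Need".
import numpy as np

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # query-key similarities
    w = np.exp(scores - scores.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)            # softmax over the keys
    return w @ V, w

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))   # 3 queries of dimension d_k = 8
K = rng.normal(size=(5, 8))   # 5 keys
V = rng.normal(size=(5, 8))   # 5 values
out, w = attention(Q, K, V)
print(out.shape, w.sum(axis=-1))  # (3, 8); each attention row sums to 1
```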
Lecture 9: Task-Oriented Dialogue Systems & Multi-Modal Dialogue Systems
- Lecture Slides: PPT
- Reading Materials
- Speech and Language Processing: Chatbots and Dialogue Systems (Chapter number may vary according to the edition)
- Continuously Learning Neural Dialogue Management
- Sample-efficient Actor-Critic Reinforcement Learning with Supervised Data for Dialogue Management
- Augment Information with Multimodal Information
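As a rough illustration of the frame-based (slot-filling) approach described in the Jurafsky and Martin chapter above, here is a toy dialogue manager. The restaurant domain, slot names, and regex patterns are invented purely for the example.

```python
# A minimal frame-based (slot-filling) dialogue manager sketch.
import re

FRAME = {"cuisine": None, "area": None, "time": None}
PATTERNS = {
    "cuisine": r"\b(thai|italian|japanese)\b",
    "area": r"\b(downtown|east side|west side)\b",
    "time": r"\b(\d{1,2}(:\d{2})?\s?(am|pm))\b",
}

def update_frame(frame, utterance):
    """Natural-language understanding: fill slots from regex matches."""
    for slot, pat in PATTERNS.items():
        m = re.search(pat, utterance.lower())
        if m:
            frame[slot] = m.group(1)
    return frame

def policy(frame):
    """Dialogue policy: ask for the first missing slot, else act."""
    for slot, value in frame.items():
        if value is None:
            return f"What {slot} would you like?"
    return f"Booking a {frame['cuisine']} place {frame['area']} at {frame['time']}."

for turn in ["book me a thai restaurant", "downtown please", "around 7 pm"]:
    FRAME = update_frame(FRAME, turn)
    print("USER:", turn, "\nSYS: ", policy(FRAME))
```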
Textbooks
- Natural Language Processing
- Jurafsky and Martin, Speech and Language Processing (3rd ed.)
- Deep Learning
- Goodfellow, Bengio, and Courville, Deep Learning
- Zhang et al., Dive into Deep Learning
Online Courses
- Natural Language Processing with Deep Learning (Stanford CS224N)