InfoCoBuild

CS224N - Natural Language Processing with Deep Learning

CS224N: Natural Language Processing with Deep Learning. Instructor: Prof. Chris Manning, Departments of Computer Science and Linguistics, Stanford University. Natural language processing (NLP), or computational linguistics, is one of the most important technologies of the information age. Applications of NLP are everywhere because people communicate almost everything in language: web search, advertising, email, customer service, machine translation, virtual agents, medical reports, and more. Over the last decade, deep learning (neural network) approaches have achieved very high performance across many different NLP tasks, using single end-to-end neural models that do not require traditional, task-specific feature engineering. In this course, students gain a thorough introduction to cutting-edge research in deep learning for NLP. Through lectures, assignments, and a final project, students learn the skills needed to design, implement, and understand their own neural network models using the PyTorch framework. More information about this course, such as lecture slides and the syllabus, is available on the course home page. (from Stanfordonline)

Lecture 16 - Coreference Resolution
Go to the Course Home or watch other lectures:

Lecture 01 - Introduction and Word Vectors
Lecture 02 - Word Vectors and Word Senses
Lecture 03 - Neural Networks
Lecture 04 - Backpropagation
Lecture 05 - Dependency Parsing
Lecture 06 - Language Models and RNNs
Lecture 07 - Vanishing Gradients, Fancy RNNs
Lecture 08 - Translation, Seq2Seq, Attention
Lecture 09 - Practical Tips for Projects
Lecture 10 - Question Answering
Lecture 11 - Convolutional Networks for NLP
Lecture 12 - Subword Models
Lecture 13 - Contextual Word Embeddings
Lecture 14 - Transformers and Self-Attention
Lecture 15 - Natural Language Generation
Lecture 16 - Coreference Resolution
Lecture 17 - Multitask Learning
Lecture 18 - Constituency Parsing, TreeRNNs
Lecture 19 - Bias in AI
Lecture 20 - Future of NLP + Deep Learning
Lecture 21 - Low Resource Machine Translation
Lecture 22 - BERT and Other Pre-trained Language Models