Course Material

(1) Lecture 1: The Basics of NLP

This session introduces what NLP is, why it is challenging, and how to approach an NLP problem.

(2) Lecture 2: Representing Text as Vectors

This lecture covers representation learning techniques for Natural Language Processing.

(3) Lecture 3: Deep Learning Methods for NLP

This lecture presents deep learning techniques used in NLP. We cover design and training principles, then present the Multi-Layer Perceptron (MLP), recurrent architectures (RNN and LSTM), and the Transformer architecture.
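To make the MLP concrete, here is a minimal sketch of a forward pass in NumPy; the layer sizes, random initialization scale, and two-layer depth are illustrative choices, not the course's reference implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Element-wise rectified linear activation
    return np.maximum(0.0, x)

def softmax(x):
    # Numerically stable softmax over the last axis
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# A 2-layer MLP: input (dim 4) -> hidden (dim 8, ReLU) -> 3 output classes
W1 = rng.normal(size=(4, 8)) * 0.1
b1 = np.zeros(8)
W2 = rng.normal(size=(8, 3)) * 0.1
b2 = np.zeros(3)

def mlp_forward(x):
    h = relu(x @ W1 + b1)
    return softmax(h @ W2 + b2)

# Batch of 2 examples; each row of the result is a probability distribution
probs = mlp_forward(rng.normal(size=(2, 4)))
```

Training would add a loss (e.g. cross-entropy) and gradient updates, which deep learning frameworks handle automatically.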

(4) Lecture 4: Language Modeling

This lecture introduces language modeling.

(5) Lecture 5: Sequence Labeling & Classification

This lecture covers sequence labeling and classification with deep learning models such as RNN-based models and Transformers.

(6) Lecture 6: Sequence Generation

This lecture covers sequence generation using encoder-decoder architectures.

Lab 1: Introduction to textual data with Python

This lab introduces basic processing operations required for any NLP experiment. After introducing preprocessing tools for data cleaning and tokenization, we compute descriptive statistics on textual data.
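A minimal sketch of this pipeline, assuming a simple lowercase-and-regex tokenizer (real lab tooling may differ):

```python
import re
from collections import Counter

text = "NLP is fun. NLP is challenging!"

# Cleaning + tokenization: lowercase, keep alphabetic word spans
tokens = re.findall(r"[a-z]+", text.lower())

# Descriptive statistics: token count, vocabulary size, top frequencies
counts = Counter(tokens)
n_tokens = len(tokens)        # 6 tokens
vocab_size = len(counts)      # 4 distinct words
top = counts.most_common(2)   # the two most frequent words
```

The token/type distinction shown here (6 tokens, 4 types) underlies most corpus statistics, such as type-token ratios and frequency plots.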

Lab 2: Word Embeddings and their evaluation

This lab explores representation learning techniques for words and documents. It covers models such as tf-idf and Word2vec and develops quantitative and qualitative evaluation methods.
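To illustrate the tf-idf part, a minimal sketch in plain Python; the toy documents and the raw-tf / unsmoothed-idf weighting are illustrative choices (libraries such as scikit-learn use slightly different variants):

```python
import math
from collections import Counter

docs = [
    "the cat sat on the mat".split(),
    "the dog ate my homework".split(),
    "the cat ate the fish".split(),
]
N = len(docs)

# Document frequency: in how many documents each word appears
df = Counter(w for doc in docs for w in set(doc))

def tfidf(doc):
    """tf-idf weights for one document: raw term frequency times log(N / df)."""
    tf = Counter(doc)
    return {w: tf[w] * math.log(N / df[w]) for w in tf}

vec = tfidf(docs[0])
# "the" occurs in every document, so its idf is log(3/3) = 0:
# frequent-everywhere words are down-weighted, rare words stand out
```

Word2vec, in contrast, learns dense vectors by predicting context words; both representations can then be compared with quantitative (e.g. similarity benchmarks) and qualitative (e.g. nearest neighbors) evaluations.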

Lab 3-4: Sequence Labeling and Sequence Classification with Deep Learning Models

This lab implements, trains, and evaluates sequence classification and labeling models based on Recurrent Neural Networks and the Transformer architecture.
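The core of sequence labeling is emitting one prediction per input position. A minimal vanilla-RNN forward pass in NumPy sketches this; dimensions, initialization, and the tanh cell are illustrative (the lab itself uses a deep learning framework with trained weights):

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_h, n_tags = 5, 8, 3  # illustrative sizes

Wx = rng.normal(size=(d_in, d_h)) * 0.1   # input-to-hidden weights
Wh = rng.normal(size=(d_h, d_h)) * 0.1    # hidden-to-hidden (recurrent) weights
Wo = rng.normal(size=(d_h, n_tags)) * 0.1 # hidden-to-tag output weights

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def tag_sequence(xs):
    """Run a vanilla RNN over a sequence; emit one tag distribution per step."""
    h = np.zeros(d_h)
    out = []
    for x in xs:
        h = np.tanh(x @ Wx + h @ Wh)  # hidden state carries left context
        out.append(softmax(h @ Wo))
    return np.stack(out)

# A sequence of 4 token vectors -> 4 tag distributions
tag_probs = tag_sequence(rng.normal(size=(4, d_in)))
```

Sequence classification differs only in the readout: instead of one distribution per step, a single prediction is made from the final (or pooled) hidden state.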

Lab 5: Machine Translation

This lab builds and evaluates a machine translation system, applying the encoder-decoder sequence generation models introduced in Lecture 6.