This session introduces what NLP is, why it is challenging, and how to approach an NLP problem.
This lecture covers representation learning techniques for Natural Language Processing.
This lecture presents Deep Learning techniques used in NLP. We cover design and training principles, and present the Multi-Layer Perceptron (MLP), recurrent architectures (RNN and LSTM), and the transformer architecture.
This lecture introduces language modeling.
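A language model assigns probabilities to word sequences, typically factored into next-word predictions: P(w_1, ..., w_n) = ∏ P(w_i | w_{i-1}) under a bigram assumption. As an illustration only (the toy corpus and maximum-likelihood estimator below are not material from the lecture), here is a minimal count-based bigram model:

```python
from collections import Counter

# Toy corpus; a real experiment would use a much larger one.
sentences = [["<s>", "the", "cat", "sat", "</s>"],
             ["<s>", "the", "dog", "sat", "</s>"],
             ["<s>", "the", "cat", "ran", "</s>"]]

bigrams = Counter()
unigrams = Counter()
for sent in sentences:
    unigrams.update(sent[:-1])           # contexts (every token that has a successor)
    bigrams.update(zip(sent, sent[1:]))  # adjacent word pairs

def prob(word, prev):
    # Maximum-likelihood estimate: P(word | prev) = count(prev, word) / count(prev)
    return bigrams[(prev, word)] / unigrams[prev]

print(prob("cat", "the"))  # 2/3: "cat" follows "the" in two of three sentences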
This lecture covers sequence labeling and classification with Deep Learning models such as RNN-based models and transformers.
This lecture covers sequence generation using encoder-decoder architectures.
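As an illustration of the encoder-decoder idea, here is a minimal sketch in PyTorch (an assumed framework choice; the GRU sizes and dummy batches are illustrative): the encoder compresses the source sequence into a hidden state, and the decoder generates the target conditioned on it.

```python
import torch
import torch.nn as nn

class Seq2Seq(nn.Module):
    """Encode a source sequence into a context vector, then decode step by step."""
    def __init__(self, vocab_size, embed_dim=64, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.encoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Final encoder hidden state summarizes the whole source sequence.
        _, context = self.encoder(self.embed(src_ids))
        # Teacher forcing: the decoder reads the gold target tokens.
        dec_states, _ = self.decoder(self.embed(tgt_ids), context)
        return self.out(dec_states)  # next-token logits at each target position

model = Seq2Seq(vocab_size=500)
src = torch.randint(0, 500, (2, 10))  # dummy source batch
tgt = torch.randint(0, 500, (2, 8))   # dummy target batch (shifted right in practice)
print(model(src, tgt).shape)          # (2, 8, 500)
```

At inference time the decoder instead feeds back its own predictions token by token; teacher forcing is used here only because it makes training straightforward.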
This lab introduces the basic processing operations required for any NLP experiment. After introducing preprocessing tools for data cleaning and tokenization, we compute descriptive statistics on textual data. A minimal example follows.
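Here is a minimal sketch of this kind of pipeline in plain Python; the regex tokenizer and the two-document corpus are illustrative placeholders for the lab's actual tools and data:

```python
import re
from collections import Counter

def tokenize(text: str) -> list[str]:
    # Lowercase and keep alphanumeric runs: a deliberately simple
    # stand-in for the lab's tokenization tooling.
    return re.findall(r"[a-z0-9']+", text.lower())

corpus = [
    "Natural Language Processing studies how computers handle text.",
    "Tokenization splits raw text into units such as words.",
]

tokens = [tok for doc in corpus for tok in tokenize(doc)]
counts = Counter(tokens)

# Descriptive statistics of the kind computed in the lab.
print("number of tokens :", len(tokens))
print("vocabulary size  :", len(counts))
print("type/token ratio :", len(counts) / len(tokens))
print("most frequent    :", counts.most_common(3))
```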
This lab explores representation learning techniques for words and documents. It covers models such as tf-idf and Word2vec and develops quantitative and qualitative evaluation methods.
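For illustration, here is a minimal sketch using scikit-learn for tf-idf and gensim (4.x) for Word2vec; these libraries, the toy documents, and the hyperparameters are assumptions, not necessarily the lab's actual tooling. Documents are compared by cosine similarity, and word vectors are queried for nearest neighbours:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from gensim.models import Word2Vec

docs = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "word embeddings capture lexical similarity",
]

# tf-idf: each document becomes a sparse vector weighted by term rarity.
tfidf = TfidfVectorizer().fit_transform(docs)
print(cosine_similarity(tfidf[0], tfidf[1]))  # docs 0 and 1 share most terms

# Word2vec: each word becomes a dense vector trained on local context.
model = Word2Vec([d.split() for d in docs], vector_size=32, window=2,
                 min_count=1, epochs=50, seed=0)
print(model.wv.most_similar("cat", topn=2))
```

On a corpus this small the embeddings are meaningless; the point is only the API shape of the two representation methods.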
This lab implements, trains, and evaluates sequence classification and labeling models based on Recurrent Neural Networks and the transformer architecture.
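Here is a minimal sketch of an RNN-based sequence classifier in PyTorch (an assumed framework; the vocabulary size, dimensions, and random batch are illustrative placeholders for the lab's data):

```python
import torch
import torch.nn as nn

class RNNClassifier(nn.Module):
    """Embed tokens, encode with an LSTM, classify from the final hidden state."""
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_classes):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):          # (batch, seq_len)
        embedded = self.embed(token_ids)   # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(embedded)  # h_n: (1, batch, hidden_dim)
        return self.head(h_n[-1])          # (batch, num_classes)

model = RNNClassifier(vocab_size=1000, embed_dim=64, hidden_dim=128, num_classes=2)
logits = model(torch.randint(0, 1000, (4, 12)))  # dummy batch of 4 sequences
loss = nn.CrossEntropyLoss()(logits, torch.tensor([0, 1, 1, 0]))
loss.backward()  # gradient computation for one training step
print(logits.shape, float(loss))
```

Taking the final LSTM state as the sequence representation is the simplest pooling choice for classification; for sequence labeling, one would instead predict a label at every time step from the full LSTM output.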