Wednesday, Jan. 19 - DAILY AGENDA

Build Language Models

The workshop will be an in-depth tutorial on transfer learning for natural language. It will cover state-of-the-art models and how to perform transfer learning in general, with practical examples. Topics covered include the following:


  • Language models

  • Attention & Self-attention

  • Transformers

  • Transfer learning

  • SOTA models


  • Question answering models

  • Dialog models

Our lives are entrenched in data and technology, and most of that data is in the form of text: websites, social media posts, newspapers, transcripts from videos and meetings, and so on. To understand and leverage this human language data, we turn to the field of Natural Language Processing (NLP). One of the most important and foundational problems within NLP is language modelling, which serves as the backbone for nearly all other NLP tasks. In particular, breakthroughs with Transformers have achieved state-of-the-art results on nearly every NLP problem, and they have been incorporated into real-world applications (e.g., Google Search, chatbots, Google Translate).

In this lecture, we introduce language modelling and walk through the advances that led to Transformers (e.g., LSTMs, attention, self-attention), and we close by highlighting two particular Transformer models, BERT and GPT-2. In the labs, we walk you through a series of three notebooks that illustrate different ways to use Transformers: Lab 1 concerns language modelling and generating new text; Lab 2 shows how to use Transformers to build a question-answering system; and Lab 3 uses a Transformer to create a dialog system/chatbot. Enjoy!

P.S. Did a human write this description, or did a Transformer?
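To make the core idea of language modelling concrete before the lecture, here is a minimal sketch (not part of the official course materials) of a bigram language model in plain Python: it estimates the probability of the next word given the current word from a tiny toy corpus. The Transformers covered in the workshop solve the same prediction problem, just with far richer context.

```python
from collections import Counter, defaultdict

def train_bigram_lm(corpus):
    """Count word bigrams, then normalize the counts into
    conditional probabilities P(next word | current word)."""
    counts = defaultdict(Counter)
    for sentence in corpus:
        # <s> and </s> mark sentence start and end
        tokens = ["<s>"] + sentence.split() + ["</s>"]
        for cur, nxt in zip(tokens, tokens[1:]):
            counts[cur][nxt] += 1
    return {
        cur: {w: c / sum(nxts.values()) for w, c in nxts.items()}
        for cur, nxts in counts.items()
    }

# Toy corpus purely for illustration
corpus = ["the cat sat", "the dog sat", "the cat ran"]
lm = train_bigram_lm(corpus)
print(lm["the"])  # P(cat|the) = 2/3, P(dog|the) = 1/3
```

A real language model differs mainly in scale and in how much context it conditions on: an LSTM or Transformer replaces the count table with a learned neural network over long token histories.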

IACS lead: Chris Tanner


10:00 AM - 11:00 AM     Lecture

11:00 AM - 12:30 PM     Workshops

12:30 PM - 1:30 PM      Lunch break

1:30 PM - 4:00 PM       Lecture / Workshops
