The Amazing World of Neural Language Generation (T7)

Yangfeng Ji, Antoine Bosselut, Thomas Wolf and Asli Celikyilmaz

Introduction

Neural Language Generation (NLG), the use of neural network models to generate coherent text, is among the most promising methods for automated text creation. Recent years have seen a paradigm shift in neural text generation, driven by advances in deep contextual language modeling (e.g., LSTMs, GPT, GPT-2) and transfer learning (e.g., ELMo, BERT). While these tools have dramatically improved the state of NLG, particularly for low-resource tasks, state-of-the-art NLG models still face many challenges: a lack of diversity in generated text, commonsense violations in depicted situations, difficulty in making use of factual information, and the absence of reliable evaluation metrics. In this tutorial, we will present an overview of the current state of the art in neural network architectures and how they have shaped recent research directions in text generation. We will discuss how and why these models succeed or fail at generating coherent text, and provide insights on several applications.
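
As a concrete taste of the kind of model covered below, the following minimal sketch samples a continuation from a pretrained GPT-2 model via the Hugging Face transformers library. It is an illustration only, not part of the tutorial materials; the prompt, model size, and sampling parameters are arbitrary choices.

```python
# Minimal sketch: sampling a continuation from a pretrained causal
# language model (GPT-2) via the Hugging Face transformers library.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

prompt = "Neural language generation is"  # arbitrary example prompt
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

# Nucleus (top-p) sampling; greedy decoding tends to produce the
# repetitive, low-diversity text mentioned above.
output_ids = model.generate(
    input_ids,
    do_sample=True,
    top_p=0.9,
    max_length=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```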

Schedule

Title | Slides | Description | Speaker
Introduction | PDF | Introduces the tutorial by presenting the recent impact of neural network modeling approaches on the field. | Asli Celikyilmaz
Neural Network Modeling | PDF | Modeling strategies for neural text generation. | Yangfeng Ji
Training and Decoding | PDF | Decoding and learning strategies for neural NLG models (see the decoding sketch after this table). | Antoine Bosselut
Benchmarks and Evaluation | PDF | Automatic and human evaluation of NLG systems. | Asli Celikyilmaz
Building Neural Generation Models | PDF | Challenges in deploying natural language generation models in production. | Thomas Wolf
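
To give a flavor of the decoding strategies covered in the Training and Decoding section, here is a toy sketch of nucleus (top-p) sampling in PyTorch. The function and the dummy logits are hypothetical illustrations, not code from the tutorial; the slides are the authoritative reference.

```python
# Toy sketch of nucleus (top-p) sampling over one decoding step.
# `logits` stands in for a model's next-token scores (dummy values).
import torch

def nucleus_sample(logits: torch.Tensor, p: float = 0.9) -> int:
    """Sample a token id from the smallest set of tokens whose
    cumulative probability reaches at least p."""
    probs = torch.softmax(logits, dim=-1)
    sorted_probs, sorted_ids = torch.sort(probs, descending=True)
    cumulative = torch.cumsum(sorted_probs, dim=-1)
    # Keep every token needed to push the cumulative mass to p.
    cutoff = int((cumulative < p).sum().item()) + 1
    kept = sorted_probs[:cutoff] / sorted_probs[:cutoff].sum()
    choice = torch.multinomial(kept, num_samples=1)
    return int(sorted_ids[choice].item())

# Dummy next-token scores over a five-token vocabulary.
logits = torch.tensor([2.0, 1.0, 0.5, 0.1, -1.0])
print(nucleus_sample(logits, p=0.9))
```

Truncating the low-probability tail in this way is one common response to the diversity and degeneration issues raised in the introduction.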

Speakers

Yangfeng Ji, University of Virginia
Antoine Bosselut, Stanford University
Thomas Wolf, Hugging Face
Asli Celikyilmaz, Microsoft Research