OpenAI’s GPT-2 Language Model

[Photo: Monterey]

(T) OpenAI released this week its second-generation language model, GPT-2. The model performs reading comprehension, summarization, translation, and question answering. It is so good that OpenAI fears it could be used for malicious applications, and so it has released only a smaller version of the model (which has only 117 million parameters).
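As a rough illustration of what the publicly released 117-million-parameter model does, the sketch below samples a continuation from a short prompt. It assumes the Hugging Face transformers package and its "gpt2" checkpoint; the prompt and sampling parameters are illustrative choices, not values documented by OpenAI.

```python
# Minimal sampling sketch (assumes the Hugging Face "transformers" package and
# its small "gpt2" checkpoint; not part of OpenAI's announcement).
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "In a shocking finding, scientists discovered"
input_ids = tokenizer.encode(prompt, return_tensors="pt")

# Sample a continuation adapted to the prompt's content and style.
with torch.no_grad():
    output = model.generate(
        input_ids,
        max_length=80,
        do_sample=True,   # sample rather than greedy-decode
        top_k=40,         # keep only the 40 most likely tokens at each step
    )

print(tokenizer.decode(output[0], skip_special_tokens=True))
```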

The key features of GPT-2 include:

  • GPT-2 generates synthetic text samples that are lengthy continuations of a text input – each sample is adapted to the content and style of that input
  • GPT-2 is trained as an unsupervised multi-task learner that predicts the next word given the text input and the requested task, such as translation or reading comprehension
  • GPT-2’s model architecture is based on the Transformer with self-attention mechanisms, developed by a Google team, and includes 1.5 billion parameters (see the sketch after this list)
  • GPT-2 is trained on 8 million documents representing 40 GB of text, extracted from 45 million Web page links – the Web pages are human-curated, so the content is considered to be of high quality
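Since the Transformer’s self-attention mechanism is the core operation GPT-2 builds on, here is a minimal NumPy sketch of scaled dot-product self-attention with the causal mask a language model needs. The dimensions and random weights are illustrative assumptions, not GPT-2’s actual configuration.

```python
# Minimal sketch of causal scaled dot-product self-attention.
# Sizes and random weights are illustrative only.
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """x: (seq_len, d_model) token representations."""
    q = x @ w_q                      # queries
    k = x @ w_k                      # keys
    v = x @ w_v                      # values
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)  # similarity of each position to every other
    # Causal mask: a language model may only attend to earlier positions.
    mask = np.triu(np.ones_like(scores), k=1).astype(bool)
    scores[mask] = -1e9
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over positions
    return weights @ v               # weighted sum of value vectors

rng = np.random.default_rng(0)
seq_len, d_model, d_head = 5, 16, 8
x = rng.normal(size=(seq_len, d_model))
w_q, w_k, w_v = (rng.normal(size=(d_model, d_head)) for _ in range(3))
print(self_attention(x, w_q, w_k, w_v).shape)  # (5, 8)
```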

An example of text generated by GPT-2:

 

[Image: sample of text generated by GPT-2]

 


Note: The picture above is from Monterey.

Copyright © 2005-2019 by Serge-Paul Carrasco. All rights reserved.
Contact Us: asvinsider at gmail dot com.