GPT-2 is a Transformer model pretrained on a very large corpus of English data in a self-supervised fashion, i.e. trained to predict the next word in raw text without any human labeling.
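The self-supervised objective here is causal language modeling: each token is predicted from the tokens before it, so the training targets are just the input sequence shifted by one. A minimal sketch (the token IDs are made up for illustration, not real GPT-2 vocabulary entries):

```python
def causal_lm_pairs(token_ids):
    """Return (context, next-token target) pairs for next-token prediction,
    the objective GPT-2 is pretrained with."""
    return [(token_ids[: i + 1], token_ids[i + 1])
            for i in range(len(token_ids) - 1)]

# Hypothetical token IDs standing in for a short sentence.
tokens = [464, 3290, 318, 922]
for context, target in causal_lm_pairs(tokens):
    print(context, "->", target)
```

Because the labels come for free from the text itself, no manual annotation is needed, which is what allows pretraining on such a large corpus.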