GPT-2 is a Transformer-based model pretrained on a very large corpus of English data in a self-supervised fashion.