Phi-2 is a 2.7-billion-parameter Transformer. It was trained on the same data sources as Phi-1.5, augmented with a new data source consisting of various synthetic NLP texts and filtered websites (source: Microsoft).

Model size: 3B
105 pulls · updated 8 months ago

1 tag:
fb80c38134f0 · 2.3 GB