vdelv/phi-2:latest

503 pulls · Updated 1 year ago

Phi-2 is a Transformer with 2.7 billion parameters. It was trained using the same data sources as Phi-1.5, augmented with a new data source that consists of various NLP synthetic texts and filtered websites (source: Microsoft).
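To try the model locally (a usage sketch, not part of this listing): assuming a running Ollama server on its default port 11434 and that the model has already been pulled as vdelv/phi-2:latest, a generate request from Python could look like the following. The prompt text and timeout are illustrative values only.

import requests

# Sketch only: assumes a local Ollama server at the default address and that
# "ollama pull vdelv/phi-2:latest" has been run beforehand.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "vdelv/phi-2:latest",
        "prompt": "Summarize what Phi-2 is in one sentence.",
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["response"])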

fb80c38134f0 · 2.3GB · 1 year ago

architecture phi2 · parameters 2.78B · quantization Q6_K

Template: Instruct:{{ .Prompt }} Output:
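The template is an Ollama (Go text/template) prompt template: {{ .Prompt }} is replaced by the user's prompt, so the model receives plain "Instruct: ... Output:" text in Phi-2's instruct format. A minimal illustration of that substitution (the real rendering is done by Ollama's templating, not by this code):

# Illustrative rendering of the "Instruct:{{ .Prompt }} Output:" template.
def render_phi2_prompt(user_prompt: str) -> str:
    return f"Instruct:{user_prompt} Output:"

print(render_phi2_prompt("Why is the sky blue?"))
# -> Instruct:Why is the sky blue? Output: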

Readme

No readme