vdelv/phi-2:latest

541 pulls · Updated 2 years ago

Phi-2 is a Transformer with 2.7 billion parameters. It was trained using the same data sources as Phi-1.5, augmented with a new data source that consists of various NLP synthetic texts and filtered websites (source: Microsoft).

ollama run vdelv/phi-2
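
Once the model has been pulled, it can also be queried through Ollama's local REST API (by default on localhost:11434). A minimal sketch in Python, assuming the requests package is installed, the Ollama server is running, and the example prompt text is arbitrary:

import requests

# Send a single prompt to the locally running Ollama server.
# Assumes `ollama run vdelv/phi-2` (or `ollama pull vdelv/phi-2`) has already fetched the model.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "vdelv/phi-2",
        "prompt": "Write a haiku about transformers.",
        "stream": False,  # return the full completion as one JSON object
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])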

Details

Updated 2 years ago

fb80c38134f0 · 2.3GB

Architecture: phi2
Parameters: 2.78B
Quantization: Q6_K
Template: Instruct:{{ .Prompt }} Output:
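
The template above wraps each user prompt in Phi-2's instruct format before it reaches the model. A minimal sketch of the equivalent client-side formatting, assuming Ollama's raw option is used to bypass the server-side template and using a hypothetical prompt:

import requests

def render_template(user_prompt: str) -> str:
    # Mirror the Modelfile template: Instruct:{{ .Prompt }} Output:
    return f"Instruct:{user_prompt} Output:"

# Skip server-side templating and send the pre-rendered prompt as-is.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "vdelv/phi-2",
        "prompt": render_template("Explain what Q6_K quantization means."),
        "raw": True,      # prompt is passed to the model without the template
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])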

Readme

No readme