latest · 2.3GB
Phi-2 is a Transformer with 2.7 billion parameters. It was trained using the same data sources as Phi-1.5, augmented with a new data source that consists of various NLP synthetic texts and filtered websites (source: Microsoft).
3B · 105 Pulls · Updated 8 months ago
fb80c38134f0 · 2.3GB
model · 2.3GB
  arch phi2 · parameters 2.78B · quantization Q6_K
template · 31B
  Instruct:{{ .Prompt }}
  Output:
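The template above uses Ollama's Go-template placeholder `{{ .Prompt }}`, which is replaced with the user's prompt before the text is sent to the model. A minimal Python sketch of that substitution (the example prompt is made up for illustration):

```python
# Sketch of how the Instruct/Output template is rendered.
# "{{ .Prompt }}" is Ollama's placeholder for the user prompt; here it is
# modeled with a Python format field. The example prompt is hypothetical.
TEMPLATE = "Instruct:{prompt}\nOutput:"

def render(prompt: str) -> str:
    """Substitute the user prompt into the Instruct/Output template."""
    return TEMPLATE.format(prompt=prompt)

print(render("Why is the sky blue?"))
# Instruct:Why is the sky blue?
# Output:
```

The model then generates its completion after the trailing `Output:` line.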
Readme
No readme