vdelv/phi-2:latest
503 Downloads · Updated 1 year ago
Phi-2 is a Transformer with 2.7 billion parameters. It was trained using the same data sources as Phi-1.5, augmented with a new data source that consists of various NLP synthetic texts and filtered websites (source: Microsoft).
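The model can be pulled and queried through a local Ollama instance. The sketch below is an assumption-laden example, not part of this page: it assumes an Ollama server running on the default port (11434) and uses the documented /api/generate endpoint; the prompt text is purely illustrative.

```python
import json
import urllib.request

# Assumes a local Ollama server on the default port with the model already
# pulled, e.g. via `ollama pull vdelv/phi-2`.
request = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "vdelv/phi-2",
        "prompt": "Explain what a transformer model is in one sentence.",
        "stream": False,
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    body = json.loads(response.read())
    print(body["response"])  # the model's completion
```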
fb80c38134f0 · 2.3GB · Updated 1 year ago

model      arch phi2 · parameters 2.78B · quantization Q6_K · 2.3GB
template   Instruct:{{ .Prompt }} Output: · 31B
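The template is a Go-template string: Ollama substitutes the user prompt for {{ .Prompt }} before the text is sent to the model. As a rough illustration only (the real rendering is done by Ollama's Go text/template engine, not by this code), the substitution for this single-placeholder template looks like:

```python
# Illustration only: mimics how the template above wraps a user prompt.
TEMPLATE = "Instruct:{{ .Prompt }} Output:"

def render(prompt: str) -> str:
    # A plain string replacement reproduces the result of this particular
    # one-placeholder template; it is not a general Go-template renderer.
    return TEMPLATE.replace("{{ .Prompt }}", prompt)

print(render("Write a haiku about autumn."))
# -> Instruct:Write a haiku about autumn. Output:
```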
Readme
No readme