Thanks to bartowski for the quants on HF.
168 Pulls · Updated 9 months ago
5422b93c86d1 · 6.1GB
model · 6.1GB
arch llama · parameters 14B · quantization Q3_K_S
params · 74B
{
  "num_ctx": 4096,
  "stop": [
    "<|im_start|>",
    "<|im_end|>"
  ]
}
template · 181B
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
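
For anyone rebuilding this tag locally, here is a minimal Modelfile sketch under two assumptions: the GGUF filename is a placeholder for whichever Q3_K_S file you download from the bartowski repo, and the user/assistant tail of the template is the standard ChatML continuation (the listing above truncates it at 181B).

```
# Placeholder filename for the Q3_K_S GGUF downloaded from
# bartowski/dolphin-2.9.2-Phi-3-Medium-GGUF
FROM ./dolphin-2.9.2-Phi-3-Medium-Q3_K_S.gguf

# Parameters taken from the params layer above
PARAMETER num_ctx 4096
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"

# ChatML prompt template; the user/assistant portion is assumed,
# since the listing above cuts the template off
TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
{{ .Prompt }}<|im_end|>
{{ end }}<|im_start|>assistant
"""
```

Build and run it with `ollama create <your-model-name> -f Modelfile` and then `ollama run <your-model-name>`; the model name is whatever you choose.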
Readme
Original Quants on HF: bartowski/dolphin-2.9.2-Phi-3-Medium-GGUF
Original Model on HF: cognitivecomputations/dolphin-2.9.2-Phi-3-Medium
Discord: https://discord.gg/cognitivecomputations
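
Recent Ollama builds can also pull GGUF quants straight from Hugging Face; a sketch, assuming the Q3_K_S file is exposed as a tag on the bartowski repo:

```
ollama run hf.co/bartowski/dolphin-2.9.2-Phi-3-Medium-GGUF:Q3_K_S
```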