herald/phi3-128k:latest

12.6K pulls · Updated 1 year ago

Phi-3 128K-context version, Q5 quantization


d8f0cff3daa1 · 2.7GB · llama · 3.82B · Q5_K_M
{ "stop": [ "<|user|>", "<|assistant|>", "<|system|>", "<|end|>"
template: {{ if .System }}<|system|> {{ .System }}<|end|> {{ end }}{{ if .Prompt }}<|user|> {{ .Prompt }}<|end|> …
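
The parameter and template entries above are truncated in this listing. As a minimal sketch (assuming a standard Ollama install and that the tag has been pulled locally), the full Modelfile, prompt template, and stop parameters can be printed with:

    ollama show herald/phi3-128k:latest --modelfile
    ollama show herald/phi3-128k:latest --template
    ollama show herald/phi3-128k:latest --parameters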

Readme

Converted from PrunaAI/Phi-3-mini-128k-instruct-GGUF-Imatrix-smashed, adopting Q5_K_8_4 quantization.

Its multilingual capabilities are clearly superior to those of the Q4 quantization.
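
A minimal usage sketch, assuming Ollama is installed and the tag above is still published (the example prompt is only illustrative):

    ollama pull herald/phi3-128k:latest
    ollama run herald/phi3-128k:latest "Translate 'long context matters' into French and German."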