Quantized variants of a German large language model (LLM).
1,057 Pulls · Updated 6 months ago
fdee13d679f1 · 3.5GB
model: arch llama · parameters 7.24B · quantization Q3_K_M · 3.5GB
params: {"temperature":0.5} · 20B
Readme
em_german_leo_mistral
Example Modelfile:
# Build from a local GGUF file (Q4_K_M shown; other quantizations from the GGUF repo work the same way)
FROM ./em_german_leo_mistral.Q4_K_M.gguf
# Sampling temperature, matching the published params layer
PARAMETER temperature 0.5
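A minimal sketch of building and running the model from this Modelfile with the Ollama CLI. It assumes the GGUF file has already been downloaded next to the Modelfile; the model tag em_german_leo_mistral and the prompt are illustrative:

# Create a local model from the Modelfile in the current directory
ollama create em_german_leo_mistral -f Modelfile

# Run the model with a German prompt (the model is German-tuned)
ollama run em_german_leo_mistral "Erkläre kurz, was ein Large Language Model ist."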
Derived from the GGUF models at: https://huggingface.co/TheBloke/em_german_leo_mistral-GGUF
Original model from: https://github.com/jphme/EM_German