Quantized variants of a German large language model (LLM).
model    ad2dbef52fc8 · 5.9GB · arch llama · parameters 7.24B · quantization Q6_K
params   {"temperature":0.5} · 20B
Readme
em_german_leo_mistral
Example Modelfile:
FROM ./em_german_leo_mistral.Q4_K_M.gguf
PARAMETER temperature 0.5
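As a minimal usage sketch, assuming Ollama is installed and the GGUF file named in the Modelfile above has been downloaded into the current directory, the model can be built and run like this (the prompt text is only an illustration):

```shell
# Build a local Ollama model from the Modelfile in the current directory
ollama create em_german_leo_mistral -f Modelfile

# Run a single prompt against the newly created model
ollama run em_german_leo_mistral "Wer bist du?"
```

The `FROM` line points at the local Q4_K_M GGUF file, and `PARAMETER temperature 0.5` bakes a default sampling temperature into the model, so no extra flags are needed at run time.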
Derived from the GGUF models at: https://huggingface.co/TheBloke/em_german_leo_mistral-GGUF
Original model from: https://github.com/jphme/EM_German