3834dd2467ae · 5.1GB
model: arch llama · parameters 7.24B · quantization Q5_K_M · 5.1GB
params: {"temperature":0.5} · 20B
em_german_leo_mistral
Quantized variants of a German large language model (LLM).
Example Modelfile:
# Build from the local GGUF file (Q4_K_M quantization)
FROM ./em_german_leo_mistral.Q4_K_M.gguf
# Lower temperature for more focused, less random output
PARAMETER temperature 0.5
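To use this Modelfile, download the GGUF file from the Hugging Face repository linked below, then create and run the model with the Ollama CLI. The model name and the prompt below are illustrative examples, not fixed values:
ollama create em_german_leo_mistral -f Modelfile
ollama run em_german_leo_mistral "Erkläre kurz, was ein Sprachmodell ist."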
Derived from the GGUF models at: https://huggingface.co/TheBloke/em_german_leo_mistral-GGUF
Original model from: https://github.com/jphme/EM_German