2,088 pulls · Updated 1 year ago

Quantized variants of a German large language model (LLM).

ollama run bengt0/em_german_leo_mistral

Models

12 models

em_german_leo_mistral:latest · 4.4GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q2_K · 3.1GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q3_K_S · 3.2GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q3_K_M · 3.5GB · 32K context window · Text · 1 year ago
em_german_leo_mistral:Q3_K_L · 3.8GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q4_0 · 4.1GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q4_K_S · 4.1GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q4_K_M · 4.4GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q5_0 · 5.0GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q5_K_M · 5.1GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q6_K · 5.9GB · 32K context window · Text · 2 years ago
em_german_leo_mistral:Q8_0 · 7.7GB · 32K context window · Text · 2 years ago

Readme

em_german_leo_mistral

Quantized GGUF variants of EM German Leo Mistral, a German-language large language model (LLM).

Example Modelfile, using the Q4_K_M GGUF file (build the model with: ollama create em_german_leo_mistral -f Modelfile):

FROM ./em_german_leo_mistral.Q4_K_M.gguf
PARAMETER temperature 0.5
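The EM German models document a prompt format along the lines of "Du bist ein hilfreicher Assistent. USER: ... ASSISTANT:". A sketch of a Modelfile that encodes it via SYSTEM and TEMPLATE directives; the exact wording of the template is an assumption here and should be verified against the model card before use:

```
FROM ./em_german_leo_mistral.Q4_K_M.gguf
PARAMETER temperature 0.5

# Prompt format as documented for EM German; verify against the model card.
SYSTEM """Du bist ein hilfreicher Assistent."""
TEMPLATE """{{ .System }} USER: {{ .Prompt }} ASSISTANT:"""
```

With SYSTEM and TEMPLATE baked in, ollama run sends prompts in the format the model was fine-tuned on instead of the raw text.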

Derived from the GGUF models at: https://huggingface.co/TheBloke/em_german_leo_mistral-GGUF

Original model from: https://github.com/jphme/EM_German