The Dorna models are a family of decoder-only models developed by Part AI and specifically trained/fine-tuned on Persian data. As an initial release, Dorna-Llama3-8B-Instruct, an 8B instruct model from this family, is built on top of the Meta Llama 3 Instruct model.
In this repo, we provide the bf16 model as well as quantized models in GGUF format, including Q2_K, Q3_K, Q3_K_L, Q3_K_M, Q3_K_S, Q4_0, Q4_1, Q4_K_M, Q4_K_S, Q5_0, Q5_1, Q5_K_M, Q5_K_S, and Q8_0.
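To use a quantized variant instead of the default, pull it by tag. The tag names below are only illustrative; the exact tags published for this model may differ, so check the model's Tags page before pulling:

ollama pull dorna-llama3              # default tag
ollama pull dorna-llama3:q4_K_M       # hypothetical quantization tag; verify the exact name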
An in-depth report with several performance charts is available here; check it out.
Open the terminal and run:
ollama run dorna-llama3
Example using curl:
curl -X POST http://localhost:11434/api/generate -d '{
"model": "llama3",
"prompt":"سلام!"
}'
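By default, /api/generate streams the response back as a series of JSON objects. For a single non-streamed reply, or for multi-turn conversations, the /api/chat endpoint can be used in the same way. The example below is a minimal sketch; it assumes the model has been pulled under the name dorna-llama3:

curl -X POST http://localhost:11434/api/chat -d '{
  "model": "dorna-llama3",
  "stream": false,
  "messages": [
    { "role": "user", "content": "سلام!" }
  ]
}'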