PartAI Dorna-Llama3: The most powerful Persian LLM to date, under 10B parameters
Dorna-Llama3
The Dorna models are a family of decoder-only models developed by Part AI and specifically trained/fine-tuned on Persian data. The initial release from this family, Dorna-Llama3-8B-Instruct, is an 8B instruct model built on the Meta Llama 3 Instruct model.
In this repo, we provide the bf16 model and quantized models in the GGUF format, including Q2_K, Q3_K, Q3_K_L, Q3_K_M, Q3_K_S, Q4_0, Q4_1, Q4_K_M, Q4_K_S, Q5_0, Q5_1, Q5_K_M, Q5_K_S, and Q8_0.
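If the model page publishes a tag per quantization, a specific GGUF variant can be pulled by tag; the tag name below is only an assumption for illustration, so check the model's Tags page for the exact names:
ollama pull dorna-llama3:q4_K_M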
An in-depth report with several performance charts is available; check it out.
CLI
Open the terminal and run:
ollama run dorna-llama3
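For a quick one-off generation instead of an interactive session, the prompt can also be passed directly as an argument (the Persian prompt below simply means "Hello!"):
ollama run dorna-llama3 "سلام!"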
API
Example using curl:
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "dorna-llama3",
  "prompt": "سلام!"
}'
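Ollama also serves a chat-style endpoint on the same port. The sketch below assumes the default server address and the model name used above, and disables streaming so the reply comes back as a single JSON object rather than a token stream:
curl -X POST http://localhost:11434/api/chat -d '{
  "model": "dorna-llama3",
  "messages": [
    {"role": "user", "content": "سلام!"}
  ],
  "stream": false
}'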