
PartAI Dorna-Llama3: The most powerful Persian LLM to date, under 10B parameters


c18d18aaeb6e · 8.5GB · llama · 8.03B · Q8_0
System prompt: You are a helpful Persian assistant. Please answer questions in the asked language.
{ "stop": [ "<|start_header_id|>", "<|end_header_id|>", "<|eot_id|>"
{{ if .System }}<|start_header_id|>system<|end_header_id|> {{ .System }}<|eot_id|>{{ end }}{{ if .Pr

Readme

Dorna-Llama3

The Dorna models are a family of decoder-only models, specifically trained/fine-tuned on Persian data and developed by Part AI. As an initial release from this family, Dorna-Llama3-8B-Instruct is an 8B instruct model built on the Meta Llama 3 Instruct model.

In this repo, we provide the bf16 model as well as quantized models in GGUF format, including Q2_K, Q3_K, Q3_K_L, Q3_K_M, Q3_K_S, Q4_0, Q4_1, Q4_K_M, Q4_K_S, Q5_0, Q5_1, Q5_K_M, Q5_K_S, and Q8_0. If you download one of these GGUF files directly, you can register it with Ollama yourself, as sketched below.
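The following is a minimal sketch of loading a downloaded GGUF file into Ollama via a Modelfile; the local filename (dorna-llama3-8b-instruct.Q4_K_M.gguf) and the model name (dorna-llama3-q4) are hypothetical placeholders, so substitute the file and name you actually use:

# Hypothetical filename; point FROM at the GGUF file you actually downloaded.
echo 'FROM ./dorna-llama3-8b-instruct.Q4_K_M.gguf' > Modelfile
ollama create dorna-llama3-q4 -f Modelfile
ollama run dorna-llama3-q4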

An in-depth report that includes several performance charts is also available; check it out.

CLI

Open the terminal and run:

ollama run dorna-llama3
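You can also pass a prompt directly on the command line instead of starting an interactive session; this is a minimal sketch, and the Persian prompt ("Hello! Introduce yourself.") is only an illustrative example:

ollama run dorna-llama3 "سلام! خودت را معرفی کن."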

API

Example using curl:

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "dorna-llama3",
  "prompt": "سلام!"
}'
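
The chat endpoint can be used in the same way when you want to supply the system prompt explicitly; this is a minimal non-streaming sketch, and the system and user messages are only illustrative:

curl -X POST http://localhost:11434/api/chat -d '{
  "model": "dorna-llama3",
  "messages": [
    { "role": "system", "content": "You are a helpful Persian assistant. Please answer questions in the asked language." },
    { "role": "user", "content": "سلام!" }
  ],
  "stream": false
}'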