cenker/llama-turkishwiki-3-8b-4bit
A version of Llama 3 (8B, 4-bit quantized) fine-tuned on the WikiRAG-TR dataset, which contains 6K question-answer pairs generated from the introduction sections of Turkish Wikipedia articles.