A version of the Llama 3 model (8B, 4-bit quantized) fine-tuned on the WikiRAG-TR dataset, which contains 6K question-answer pairs generated from the introduction sections of Turkish Wikipedia articles.
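
As a minimal usage sketch (not part of the original release), the fine-tune could be loaded with Hugging Face `transformers` and `bitsandbytes` 4-bit quantization. The model ID and the example prompt below are placeholders, not names taken from this repository.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Hypothetical repository ID -- replace with the actual model name.
MODEL_ID = "your-username/llama-3-8b-wikirag-tr"

# 4-bit quantization config, matching the quantization mentioned above.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",
)

# Example Turkish question-answer prompt (illustrative only).
prompt = "Soru: Türkiye'nin başkenti neresidir?\nCevap:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```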
