
- **GemmaTR-WikiQA-4bit**: An efficient 27B-parameter, 4-bit quantized language model fine-tuned on Turkish Wikipedia question-answer data.
- **llama-turkishwiki-3-8b-4bit**: A fine-tuned version of Llama 3 (8B, 4-bit quantized) on the WikiRAG-TR dataset, which contains 6K question-answer pairs generated from the introduction sections of Turkish Wikipedia articles.
- **mistralv03**: A fine-tuned model for finding hotel GKS codes.