latest · 7B · 4.4GB · 35 Pulls · Updated 6 months ago

Note: The model family is actually Mistral, but it could not be changed when pushing.
0f08b93c3c0e · 4.4GB
model · arch llama · parameters 7.34B · quantization Q4_K_M · 4.4GB
system · 117B
Sen yardımcı bir asistansın ve sana verilen talimatlar doğrultusunda en iyi cevabı üretmeye çalışacaksın.
(English: "You are a helpful assistant, and you will try to produce the best answer in line with the instructions given to you.")
params · 42B
{"stop":["[/INST]","[INST]","[/INST ]"]}
template · 44B
[INST] {{ .System }} {{ .Prompt }} [/INST]
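The template above is a Go template that Ollama fills in before the text reaches the model. A minimal Python sketch (not the Ollama implementation itself, just an illustration of the rendered result) of how the system prompt and user prompt combine:

```python
# Sketch: how "[INST] {{ .System }} {{ .Prompt }} [/INST]" renders.
# Ollama substitutes .System and .Prompt into the modelfile template
# before sending the text to the model.
def render_template(system: str, prompt: str) -> str:
    return f"[INST] {system} {prompt} [/INST]"

# Example values (hypothetical request):
rendered = render_template(
    "Sen yardımcı bir asistansın.",  # "You are a helpful assistant."
    "Türkiye'de kaç il var?",        # "How many provinces are in Turkey?"
)
print(rendered)
```

The stop strings in params ("[/INST]", "[INST]") cut generation off if the model starts emitting new instruction markers of its own.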
Readme
This model is a quantized version of https://huggingface.co/Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0
The quantized model comes from https://huggingface.co/tolgadev; huge thanks to the Trendyol team for the original model 🙏🏼
Quantization method: Q4_K_M (GGUF)
Usage
CLI
ollama run ytagalar/trendyol-llm-7b-chat-dpo-v1.0-gguf
API
Example:
curl -X POST http://localhost:11434/api/generate -d '{
"model": "ytagalar/trendyol-llm-7b-chat-dpo-v1.0-gguf",
"prompt":"Türkiyede kaç il var?"
}'
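By default, /api/generate streams its answer as newline-delimited JSON: each line carries a "response" fragment, and the final line has "done": true. A small Python sketch (assuming the default local Ollama server at port 11434; the helper names are my own) that sends a prompt and joins the fragments:

```python
import json
import urllib.request

def collect_response(ndjson_lines):
    """Join the "response" fragments of a streamed NDJSON reply."""
    parts = []
    for line in ndjson_lines:
        if not line.strip():
            continue  # skip blank lines between chunks
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break  # last chunk of the stream
    return "".join(parts)

def generate(prompt, model="ytagalar/trendyol-llm-7b-chat-dpo-v1.0-gguf"):
    # Same request as the curl example, built with the standard library.
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({"model": model, "prompt": prompt}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return collect_response(resp)  # the response iterates line by line
```

Adding `"stream": false` to the request body instead returns a single JSON object with the full "response" field.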