The model family is actually Mistral, but I couldn't change it when pushing.

This model is a quantized version of https://huggingface.co/Trendyol/Trendyol-LLM-7b-chat-dpo-v1.0.
I got the quantized model thanks to https://huggingface.co/tolgadev, and huge thanks to the Trendyol team for the original model 🙏🏼
Quantization method: Q4_K_M (GGUF)
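
For reference, a Q4_K_M GGUF like this one is usually produced with llama.cpp's conversion and quantization tools. The commands below are only a sketch of that general workflow, not the exact steps used for this file; script and binary names vary between llama.cpp versions:

# Convert a local copy of the original Hugging Face checkpoint to an f16 GGUF
python convert_hf_to_gguf.py ./Trendyol-LLM-7b-chat-dpo-v1.0 --outfile trendyol-llm-7b-f16.gguf

# Quantize the f16 GGUF down to Q4_K_M
./llama-quantize trendyol-llm-7b-f16.gguf trendyol-llm-7b-chat-dpo-v1.0.Q4_K_M.gguf Q4_K_M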

Usage

CLI

ollama run ytagalar/trendyol-llm-7b-chat-dpo-v1.0-gguf
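
ollama run pulls the model automatically on first use. If you prefer to download it ahead of time, you can also pull it explicitly:

ollama pull ytagalar/trendyol-llm-7b-chat-dpo-v1.0-gguf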

API

Example (the prompt asks "How many provinces are there in Turkey?"):

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "ytagalar/trendyol-llm-7b-chat-dpo-v1.0-gguf",
  "prompt": "Türkiyede kaç il var?"
}'
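
For multi-turn use, the same model also works with Ollama's chat endpoint. A minimal sketch, assuming a default local install; setting "stream": false makes the server return a single JSON object instead of streamed chunks:

curl http://localhost:11434/api/chat -d '{
  "model": "ytagalar/trendyol-llm-7b-chat-dpo-v1.0-gguf",
  "messages": [
    { "role": "user", "content": "Türkiyede kaç il var?" }
  ],
  "stream": false
}'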