BgGPT-7B is a Bulgarian language model trained from mistralai/Mistral-7B-v0.1.
345 Pulls Updated 8 months ago
5a770a14b56b · 4.4GB
BgGPT
Meet BgGPT-7B, a Bulgarian language model trained from mistralai/Mistral-7B-v0.1. BgGPT is distributed under the Apache 2.0 license.
This model was created by INSAIT Institute, part of Sofia University, in Sofia, Bulgaria.
Model description
The model is fine-tuned to improve its Bulgarian language capabilities using multiple datasets, including Bulgarian web crawl data, a range of specialized Bulgarian datasets sourced by INSAIT Institute, and machine translations of popular English datasets. This Bulgarian data was augmented with English datasets to retain English and logical reasoning skills.
The model’s tokenizer has been extended to encode Bulgarian words written in Cyrillic more efficiently. This not only increases throughput on Cyrillic text but also improves model performance.
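To see why an extended vocabulary helps, note that every Cyrillic character occupies two bytes in UTF-8, so a tokenizer that falls back to byte-level pieces spends roughly twice as many units per character on Bulgarian text as on ASCII English. A small illustrative check (this measures raw UTF-8 encoding, not the BgGPT tokenizer itself):

```python
def utf8_bytes_per_char(s: str) -> float:
    """Average number of UTF-8 bytes needed per character of s."""
    return len(s.encode("utf-8")) / len(s)

# Cyrillic characters take 2 bytes each in UTF-8; ASCII takes 1.
bg = "Софийският университет"  # "Sofia University" in Bulgarian
en = "the Sofia University"

print(utf8_bytes_per_char(bg))  # close to 2.0 (only the space is 1 byte)
print(utf8_bytes_per_char(en))  # exactly 1.0
```

A larger Cyrillic-aware vocabulary lets common Bulgarian words map to single tokens instead of long runs of byte pieces, which is where the throughput gain comes from.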
Usage
CLI
ollama run todorov/bggpt
API
Example:
curl -X POST http://localhost:11434/api/generate -d '{
"model": "todorov/bggpt",
"prompt":"Кога е основан Софийският университет?"
}'
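The same request can be issued from Python with only the standard library. The sketch below assumes a locally running Ollama server on the default port 11434; `generate` and `parse_stream` are illustrative names, not part of any official client. By default `/api/generate` streams newline-delimited JSON chunks, each carrying a `response` fragment, which the parser joins into the full text:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def parse_stream(chunks):
    """Join the 'response' fields of Ollama's newline-delimited JSON
    stream chunks into the complete generated text."""
    return "".join(json.loads(c)["response"] for c in chunks if c.strip())

def generate(prompt, model="todorov/bggpt"):
    """POST a prompt to a locally running Ollama server and return
    the full response text (illustrative helper, not an official API)."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps({"model": model, "prompt": prompt}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_stream(resp.read().decode("utf-8").splitlines())
```

With the server running, `generate("Кога е основан Софийският университет?")` returns the model's answer as a single string.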