from Blizado/discolm-mfto-7b-german-v0.1
980 Pulls Updated 14 months ago
e3f44d88e49e · 7.7GB
Readme
This is a GGUF upload of the experimental merge of the pre-trained language models OpenPipe/mistral-ft-optimized-1227 and DiscoResearch/DiscoLM_German_7b_v1, created with mergekit and taken from https://huggingface.co/Blizado/discolm-mfto-7b-german-v0.1
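For reference, a minimal mergekit configuration along these lines could reproduce such a merge. This is only a sketch: the merge method (SLERP), layer ranges, interpolation factor, and dtype are assumptions for illustration, as the actual settings used by Blizado are not documented on this page.

```yaml
# Hypothetical mergekit config (SLERP merge) -- the real method and parameters
# used for discolm-mfto-7b-german-v0.1 are not stated in this card.
slices:
  - sources:
      - model: OpenPipe/mistral-ft-optimized-1227
        layer_range: [0, 32]   # Mistral-7B has 32 transformer layers
      - model: DiscoResearch/DiscoLM_German_7b_v1
        layer_range: [0, 32]
merge_method: slerp
base_model: OpenPipe/mistral-ft-optimized-1227
parameters:
  t: 0.5                       # assumed 50/50 interpolation between the two models
dtype: bfloat16
```

A config like this would be run with `mergekit-yaml config.yml ./merged-model`, and the resulting weights converted to GGUF (e.g. with llama.cpp's conversion script) before being imported into Ollama.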
As Blizado states: “DiscoLM German 7B is, as of this date (01/21/2024), by far the best German model: it makes far fewer grammatical errors and its German generally sounds good. But it is fine-tuned on Mistral V0.2 or even V0.1.
Mistral FT Optimized 1227 is much better at German than Mistral 7B V0.2 and other German fine-tuned models, which make grammar errors in almost every sentence. But even that model is a good step behind DiscoLM German 7B and does not produce as well-formed German sentences.
The motivation was to combine these two models to get an even better German model, especially for German roleplay.”
Uploaded to Ollama for experimentation.
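To try the model locally, a typical Ollama workflow looks like the sketch below; the model tag is a placeholder, so substitute the tag shown at the top of this page.

```shell
# Pull and run the model (replace <tag> with this page's model tag),
# then prompt it in German to judge grammar and fluency.
ollama run <tag> "Schreibe eine kurze Geschichte über einen Roboter in Berlin."
```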