from mayflowergmbh/Wiedervereinigung-7b-dpo-laser-GGUF
199 Pulls Updated 14 months ago
f87eb374b259 · 4.4GB
model
arch llama · parameters 7.24B · quantization Q4_K_M · 4.4GB
params
{
  "num_ctx": 8192,
  "stop": [
    "<|im_start",
    "<|im_end",
    "|im_start",
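The parameters above can be passed as `options` when calling a local Ollama server. Below is a minimal sketch of the request payload for the `/api/generate` endpoint, assuming a default local setup; the page truncates the stop strings, so the full ChatML delimiters `<|im_start|>` and `<|im_end|>` used here are an assumption based on the template that follows.

```python
import json

# Options mirroring the params above. The stop strings shown on the page are
# truncated; the complete ChatML delimiters are an assumption.
options = {
    "num_ctx": 8192,
    "stop": ["<|im_start|>", "<|im_end|>"],
}

# Request payload for Ollama's /api/generate endpoint (hypothetical local run;
# send it with any HTTP client to http://localhost:11434/api/generate).
payload = {
    "model": "mayflowergmbh/Wiedervereinigung-7b-dpo-laser-GGUF",
    "prompt": "Was bedeutet Wiedervereinigung?",
    "options": options,
    "stream": False,
}

print(json.dumps(payload, indent=2))
```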
template
{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}{{ if .Prompt }}<|im_start|>user
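The template above is the ChatML format rendered by Ollama's Go templating. As a sketch, the same rendering can be reproduced in Python; the page truncates the template after the user turn, so the user/assistant continuation below is an assumption based on the standard ChatML pattern.

```python
def render_chatml(prompt: str, system: str = "") -> str:
    """Render a prompt the way the ChatML template above does.

    The user/assistant continuation is assumed from the standard
    ChatML pattern, since the page shows the template truncated.
    """
    out = ""
    if system:
        out += f"<|im_start|>system\n{system}<|im_end|>\n"
    if prompt:
        out += f"<|im_start|>user\n{prompt}<|im_end|>\n"
    out += "<|im_start|>assistant\n"
    return out

print(render_chatml("Hallo!", system="Du bist ein hilfreicher Assistent."))
```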
Readme
The quantized GGUF of the LazyMergekit merge of DiscoResearch/DiscoLM_German_7b_v1, DRXD1000/Phoenix, VAGOsolutions/SauerkrautLM-7b-v1-mistral, and malteos/hermeo-7b, DPO-trained with a German translation of intel-orca-dpo and treated with laserRMT using German datasets. From https://huggingface.co/mayflowergmbh/Wiedervereinigung-7b-dpo-laser
Uploaded for experimenting with the model.
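For local experiments, a Modelfile along these lines could recreate the configuration from the GGUF file; the local filename is an assumption, and the stop strings are completed from the ChatML template above rather than taken verbatim from the truncated params display.

```
# Hypothetical local GGUF filename (adjust to the downloaded blob).
FROM ./wiedervereinigung-7b-dpo-laser.Q4_K_M.gguf

PARAMETER num_ctx 8192
# Assumed full ChatML delimiters; the page shows these truncated.
PARAMETER stop <|im_start|>
PARAMETER stop <|im_end|>

# TEMPLATE would repeat the ChatML template shown above.
```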