ifioravanti/neutrixomnibe-dpo:7b-q8_0
81 Downloads · Updated 1 year ago
NeuTrixOmniBe-DPO is a merge of NeuralTrix-7B-dpo and OmniBeagleSquaredMBX-v3-7B-v2.
Tag: 7b (neutrixomnibe-dpo:7b-q8_0)
params · 7ed19e244ef4 · 92B
{
  "num_ctx": 32768,
  "stop": [
    "</s>",
    "<|im_start|>",
    "<|im_end|>"
  ]
}
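The params blob above sets a 32k context window (`num_ctx`) and three stop sequences that end generation. A minimal sketch of how these defaults could be overridden per request, assuming a local Ollama server and its standard `/api/generate` endpoint (the prompt text here is a placeholder):

```python
import json

# Parameters matching the params file shown above (7ed19e244ef4 · 92B).
PARAMS = json.loads("""
{
  "num_ctx": 32768,
  "stop": ["</s>", "<|im_start|>", "<|im_end|>"]
}
""")

# Request payload for Ollama's /api/generate endpoint; values in
# "options" override the parameters baked into the model tag.
payload = {
    "model": "ifioravanti/neutrixomnibe-dpo:7b-q8_0",
    "prompt": "Why is the sky blue?",  # placeholder prompt
    "stream": False,
    "options": PARAMS,
}

print(json.dumps(payload, indent=2))
```

Sending this payload as a POST body to `http://localhost:11434/api/generate` (e.g. with `curl` or `requests`) would run the model with the same context size and stop tokens the params file declares.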