ifioravanti/neutrixomnibe-dpo:7b-q8_0
81 Downloads · Updated 1 year ago
NeuTrixOmniBe-DPO is a merge of the NeuralTrix-7B-dpo and OmniBeagleSquaredMBX-v3-7B-v2 models.
neutrixomnibe-dpo:7b-q8_0 / template
206bc0071007 · 66B
### System:
{{ .System }}
### User:
{{ .Prompt }}
### Assistant:
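
For reference, the template above can be wired into a custom build with a Modelfile. The following is a minimal sketch, not part of the published model card: the FROM reference, the derived model name, and the comments are illustrative assumptions.

# Hypothetical Modelfile sketch; the FROM reference is an assumption for illustration.
FROM ifioravanti/neutrixomnibe-dpo:7b-q8_0

# Reuse the prompt template shown above so requests are rendered
# as System / User / Assistant sections before reaching the model.
TEMPLATE """### System:
{{ .System }}
### User:
{{ .Prompt }}
### Assistant:
"""

A typical workflow would then be to build and run it locally, e.g. `ollama create my-neutrix -f Modelfile` followed by `ollama run my-neutrix`; pulling the published tag directly with `ollama run ifioravanti/neutrixomnibe-dpo:7b-q8_0` uses this same template without any Modelfile.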