NeuTrixOmniBe-DPO is a 7B merge of NeuralTrix-7B-dpo and OmniBeagleSquaredMBX-v3-7B-v2.
68 Pulls · Updated 10 months ago
5e3083f673e8 · 7.7GB
model
arch llama · parameters 7.24B · quantization Q8_0 · 7.7GB
template
### System:
{{ .System }}
### User:
{{ .Prompt }}
### Assistant:
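Ollama renders this template with Go's text/template engine, substituting the system message and user prompt into the `{{ .System }}` and `{{ .Prompt }}` slots. A minimal Python stand-in, only to illustrate the resulting prompt layout (the exact trailing whitespace is an assumption):

```python
def render(system: str, prompt: str) -> str:
    # Illustrative sketch of the template above; Ollama itself uses
    # Go's text/template, not this function.
    return (
        "### System:\n"
        f"{system}\n"
        "### User:\n"
        f"{prompt}\n"
        "### Assistant:\n"
    )

print(render("You are a helpful assistant.", "Tell me about llamas."))
```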
params
{
  "num_ctx": 32768,
  "stop": [
    "</s>",
    "<|im_start|>",
    "<|im_end|>"
  ]
}
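If you want to build a local variant that reuses these parameters, they map directly onto `PARAMETER` directives in an Ollama Modelfile. A sketch (the model tag `my-neutrix` is hypothetical):

```
FROM ifioravanti/neutrixomnibe-dpo
PARAMETER num_ctx 32768
PARAMETER stop "</s>"
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
```

Then build it with `ollama create my-neutrix -f Modelfile`.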
Readme
NOT READY FOR PRIME TIME!
I'm still trying to figure out the right template here, stay tuned!
CLI
ollama run ifioravanti/neutrixomnibe-dpo
API
Example:
curl -X POST http://localhost:11434/api/generate -d '{
"model": "ifioravanti/neutrixomnibe-dpo",
"prompt": "Here is a story about llamas eating grass"
}'
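By default, `/api/generate` streams its reply as newline-delimited JSON objects, each carrying a `"response"` fragment, with `"done": true` on the final one. A small helper to reassemble the full text from such a stream (the sample payload below is made up for illustration):

```python
import json

def join_stream(ndjson_text: str) -> str:
    """Concatenate the "response" fragments from a streamed
    /api/generate reply (newline-delimited JSON objects)."""
    out = []
    for line in ndjson_text.splitlines():
        if not line.strip():
            continue
        obj = json.loads(line)
        out.append(obj.get("response", ""))
        if obj.get("done"):
            break
    return "".join(out)

# Illustrative sample of a streamed reply (made-up fragments):
sample = (
    '{"response": "Llamas ", "done": false}\n'
    '{"response": "eat grass.", "done": true}\n'
)
print(join_stream(sample))  # Llamas eat grass.
```

Passing `"stream": false` in the request body instead returns a single JSON object.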
Memory requirements
- 7b models generally require at least 8GB of RAM
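As a sanity check on the 7.7GB blob size: Q8_0 stores each block of 32 weights as 32 one-byte quants plus a 2-byte fp16 scale, i.e. 34 bytes per 32 weights (about 1.06 bytes per weight). A rough estimate, ignoring metadata overhead and assuming decimal gigabytes:

```python
params = 7.24e9             # reported parameter count
bytes_per_weight = 34 / 32  # Q8_0: 32 int8 quants + 2-byte fp16 scale per block
size_gb = params * bytes_per_weight / 1e9
print(f"{size_gb:.1f} GB")  # ≈ 7.7 GB, matching the listed blob size
```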