ifioravanti/neutrixomnibe-dpo
81 Downloads · Updated 1 year ago
NeuTrixOmniBe-DPO is a merge of the NeuralTrix-7B-dpo and OmniBeagleSquaredMBX-v3-7B-v2 models.
5 models

Name                           Size   Context  Input  Digest        Updated
neutrixomnibe-dpo:latest       4.1GB  32K      Text   c946c072a969  1 year ago
neutrixomnibe-dpo:7b (latest)  4.1GB  32K      Text   c946c072a969  1 year ago
neutrixomnibe-dpo:7b-q4_0      4.1GB  32K      Text   c946c072a969  1 year ago
neutrixomnibe-dpo:7b-q5_k_m    5.1GB  32K      Text   892e3c8ad575  1 year ago
neutrixomnibe-dpo:7b-q8_0      7.7GB  32K      Text   5e3083f673e8  1 year ago
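The quantization suffixes (q4_0, q5_k_m, q8_0) account for the size differences between tags. As a rough sanity check, a minimal sketch can reproduce the listed file sizes from a parameter count and average bits per weight; the ~7.24B parameter count (a Mistral-7B-class base) and the bits-per-weight figures below are assumptions, not stated on this page:

```python
# Estimate a quantized model's file size from parameter count and
# average bits per weight (including per-block scale overhead).
# PARAMS and the bits-per-weight values are assumptions for a
# Mistral-7B-class model, not figures from the page above.

PARAMS = 7.24e9  # assumed parameter count

BITS_PER_WEIGHT = {
    "q4_0": 4.5,     # 4-bit weights + block scales (approximate)
    "q5_k_m": 5.67,  # mixed 5/6-bit k-quant (approximate)
    "q8_0": 8.5,     # 8-bit weights + block scales (approximate)
}

def estimated_size_gb(quant: str, params: float = PARAMS) -> float:
    """Rough file size in decimal gigabytes (1 GB = 1e9 bytes)."""
    return params * BITS_PER_WEIGHT[quant] / 8 / 1e9

for quant in BITS_PER_WEIGHT:
    print(f"{quant}: ~{estimated_size_gb(quant):.1f} GB")
# q4_0: ~4.1 GB
# q5_k_m: ~5.1 GB
# q8_0: ~7.7 GB
```

Under these assumptions the estimates line up with the 4.1GB, 5.1GB, and 7.7GB sizes in the tag list, which is why the q8_0 build is nearly twice the size of the q4_0 build.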