ifioravanti/neutrixomnibe-dpo
NeuTrixOmniBe-DPO is a merge of NeuralTrix-7B-dpo and OmniBeagleSquaredMBX-v3-7B-v2.
7b · 68 Pulls · Updated 10 months ago

Tags (5):
Tag         Digest         Size    Updated
latest      c946c072a969   4.1GB   10 months ago
7b          c946c072a969   4.1GB   10 months ago
7b-q4_0     c946c072a969   4.1GB   10 months ago
7b-q5_k_m   892e3c8ad575   5.1GB   10 months ago
7b-q8_0     5e3083f673e8   7.7GB   10 months ago
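For reference, a tag from the table above can be pulled and run with the Ollama CLI. This is a minimal sketch assuming Ollama is installed locally; the `latest`, `7b`, and `7b-q4_0` tags share the same digest (c946c072a969), so they resolve to the identical 4-bit weights, while `7b-q5_k_m` and `7b-q8_0` are larger, higher-precision quantizations.

```shell
# Pull a specific quantization of the model (5-bit K-quant, ~5.1GB)
ollama pull ifioravanti/neutrixomnibe-dpo:7b-q5_k_m

# Run it interactively; omitting the tag defaults to :latest (~4.1GB)
ollama run ifioravanti/neutrixomnibe-dpo:7b-q5_k_m

# List downloaded models and their sizes to confirm the pull
ollama list
```

The q4_0 tag is the smallest download; q8_0 trades roughly 3.6GB of extra disk and memory for less quantization loss.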