Models
GitHub
Discord
Docs
Cloud
Sign in
Download
Models
Download
GitHub
Discord
Docs
Cloud
Sign in
macadeliccc
/
laser-dolphin-mixtral-2x7b-dpo
:latest
965
Downloads
Updated
1 year ago
2x7b (13B) mixtral that outperforms mixtral 8x7b
2x7b (13B) mixtral that outperforms mixtral 8x7b
Cancel
Updated 1 year ago
1 year ago
b79ec928ce95 · 14GB ·
model
arch
llama
·
parameters
12.9B
·
quantization
Q8_0
14GB
system
You are Dolphin, an uncensored and unbiased AI assistant. You always comply with the user's request
300B
Readme
laser-dolphin-mixtral-2x7b-dpo
Write
Preview
# laser-dolphin-mixtral-2x7b-dpo
Paste, drop or click to upload images (.png, .jpeg, .jpg, .svg, .gif)