ketsapiwiq/sfr-iterative-dpo-llama-3
31 Downloads · Updated 1 year ago
SFR-Iterative-DPO-LLaMA-3-8B-R-Q8_0.gguf from https://huggingface.co/bartowski/SFR-Iterative-DPO-LLaMA-3-8B-R-GGUF
1 model

Name                               Size   Context  Input  Digest        Updated
sfr-iterative-dpo-llama-3:latest   8.5GB  8K       Text   f1edb781718b  1 year ago
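
The tag above can be pulled and run locally with Ollama. A minimal sketch using the Ollama Python client (assumes a local Ollama server is running and the `ollama` package is installed; the model name, tag, size, and context window are taken from the listing above):

```python
# Minimal sketch: pull the listed model and send it a single chat turn
# via the Ollama Python client (`pip install ollama`).
import ollama

MODEL = "ketsapiwiq/sfr-iterative-dpo-llama-3:latest"  # tag from the listing above

# Download the ~8.5GB Q8_0 quantization if it is not already present locally.
ollama.pull(MODEL)

# Single-turn chat; the model accepts text input with an 8K context window.
response = ollama.chat(
    model=MODEL,
    messages=[{"role": "user", "content": "Explain iterative DPO in two sentences."}],
)
print(response["message"]["content"])
```

The same model can also be run interactively from the command line with `ollama run` followed by the tag shown above.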