qllama/bge-m3

539.3K Downloads · Updated 6 months ago
Quantizations of https://huggingface.co/BAAI/bge-m3 in f16, q8_0 (latest), and q4_k_m.
Capabilities: embedding
4 models

Name            Digest         Size    Context  Input  Updated
bge-m3:latest   cf7e89be82c9   635MB   8K       Text   6 months ago
bge-m3:q4_k_m   b3ea13943d56   438MB   8K       Text   6 months ago
bge-m3:q8_0     cf7e89be82c9   635MB   8K       Text   6 months ago
bge-m3:f16      5e44f5b9b0e9   1.2GB   8K       Text   6 months ago
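Any of the tags above can be pulled and queried for embeddings through a locally running Ollama server. The sketch below is a minimal example, assuming the default localhost:11434 endpoint and the /api/pull and /api/embeddings routes of the Ollama HTTP API; the chosen tag and the sample sentence are illustrative, not prescribed by this page.

```python
import requests

OLLAMA_URL = "http://localhost:11434"   # default local Ollama endpoint (assumption)
MODEL = "qllama/bge-m3:q4_k_m"          # any tag from the table above works here

# Pull the chosen quantization if it is not already available locally.
pull = requests.post(
    f"{OLLAMA_URL}/api/pull",
    json={"model": MODEL, "stream": False},
    timeout=600,
)
pull.raise_for_status()

# Request an embedding vector for a piece of text.
resp = requests.post(
    f"{OLLAMA_URL}/api/embeddings",
    json={"model": MODEL, "prompt": "BGE-M3 is a multilingual embedding model."},
    timeout=60,
)
resp.raise_for_status()
embedding = resp.json()["embedding"]
print(len(embedding), embedding[:5])
```

The other tags (latest, q8_0, f16) drop in the same way; the q4_k_m build is the smallest on disk (438MB versus 635MB for q8_0 and 1.2GB for f16), at the cost of some quantization-induced loss in embedding fidelity.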