qllama/bge-m3:f16

539.4K Downloads · Updated 6 months ago
Quantizations of https://huggingface.co/BAAI/bge-m3 in f16, q8_0 (latest), and q4_k_m.
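Since the page has no readme, here is a minimal usage sketch. It assumes a local Ollama server on the default port and the f16 tag listed above; the `/api/embed` endpoint is Ollama's standard embedding API, not something specific to this model.

```shell
# Pull the f16 quantization of the model
# (q8_0 and q4_k_m tags are also listed on this page)
ollama pull qllama/bge-m3:f16

# Request an embedding from the local Ollama server
curl http://localhost:11434/api/embed -d '{
  "model": "qllama/bge-m3:f16",
  "input": "Hello, world"
}'
```

The response is a JSON object whose `embeddings` field holds the dense vector(s) for the input text.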
embedding
5e44f5b9b0e9 · 1.2GB
model: arch bert · parameters 567M · quantization F16 · 1.2GB
Readme
No readme