qllama/bge-m3:f16

539.4K pulls · updated 6 months ago

Quantizations of https://huggingface.co/BAAI/bge-m3 to f16, q8_0 (latest), and q4_k_m.

Capability: embedding


5e44f5b9b0e9 · 1.2GB

Architecture: bert · Parameters: 567M · Quantization: F16
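
A minimal usage sketch, assuming a local Ollama server on the default port (11434) and that the tag has already been pulled (for example with "ollama pull qllama/bge-m3:f16"). It calls Ollama's /api/embeddings REST route; the embed helper name and the sample sentence are illustrative, not part of this listing.

# Minimal sketch: fetch an embedding from this model via a local Ollama server.
# Assumes Ollama is running on localhost:11434 and the model tag is pulled.
import requests

OLLAMA_URL = "http://localhost:11434/api/embeddings"  # default local Ollama endpoint

def embed(text: str, model: str = "qllama/bge-m3:f16") -> list[float]:
    """Return the embedding vector for text using the given Ollama model tag."""
    resp = requests.post(OLLAMA_URL, json={"model": model, "prompt": text})
    resp.raise_for_status()
    return resp.json()["embedding"]

if __name__ == "__main__":
    vec = embed("BGE-M3 is a multilingual embedding model.")
    print(len(vec), vec[:5])  # dimensionality and first few components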
