28 Downloads Updated 1 month ago
2 models

| Name | Size | Context | Input | Updated |
|------|------|---------|-------|---------|
| multilingual-e5-large-instruct:latest | 603MB | 512 | Text | 1 month ago |
| multilingual-e5-large-instruct:Q8_0 | 603MB | 512 | Text | 2 months ago |
Multilingual E5 Text Embeddings: A Technical Report. Liang Wang, Nan Yang, Xiaolong Huang, Linjun Yang, Rangan Majumder, Furu Wei. arXiv, 2024.

The model has 24 layers and an embedding size of 1024.
Original Model on HF
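A minimal sketch of using this embedding model through a local Ollama server. It assumes the default server address (`http://localhost:11434`) and the `/api/embed` endpoint; the `Instruct: ... / Query: ...` prefix follows the query format described in the E5-instruct technical report (documents are embedded without a prefix). The task description string below is a hypothetical example.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/embed"  # default local Ollama address (assumed running)

def format_e5_query(task: str, query: str) -> str:
    """E5-instruct models expect queries prefixed with a one-line task
    description; passages/documents are embedded as plain text."""
    return f"Instruct: {task}\nQuery: {query}"

def embed(texts, model="multilingual-e5-large-instruct:latest"):
    """Request embeddings from a local Ollama server.

    Returns one 1024-dimensional vector per input text (matching the
    embedding size stated above)."""
    payload = json.dumps({"model": model, "input": texts}).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embeddings"]

if __name__ == "__main__":
    # Example task description (hypothetical); adapt to your retrieval task.
    q = format_e5_query(
        "Given a web search query, retrieve relevant passages",
        "how do transformers work",
    )
    print(q)
```

Note that the 512-token context window above limits how much text each input can carry; longer documents should be chunked before embedding.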