The first family of Large Language Models pretrained from scratch on Italian!
611 Pulls · Updated 10 months ago
9c071a9d5ece · llama architecture · 2.89B parameters · Q5_K_M quantization · 2.1GB
Minerva is the first family of LLMs pretrained from scratch on Italian, developed by Sapienza NLP in collaboration with Future Artificial Intelligence Research (FAIR) and CINECA.
The Minerva models are truly open (both data and model weights) Italian-English LLMs, with approximately half of the pretraining data composed of Italian text.
| Tag | HuggingFace |
|---|---|
| 3b-base_Q8_0 | https://huggingface.co/VaiTon/Minerva-3B-base-v1.0-Q8_0-GGUF |
| 3b-base_Q5_K_M | https://huggingface.co/VaiTon/Minerva-3B-base-v1.0-Q5_K_M-GGUF |
| 3b-instruct-Q5_K_M | https://huggingface.co/NikolayKozloff/Minerva-3B-Instruct-v1.0-Q5_K_M-GGUF |
| 1b-base-Q8_0 | https://huggingface.co/VaiTon/Minerva-1B-base-v1.0-Q8_0-GGUF |
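A minimal usage sketch for running one of the builds above with the Ollama CLI. The `minerva` model name and tag here are illustrative (substitute the actual name of this model page and a tag from the table); the `hf.co/...` form assumes a recent Ollama release that supports pulling GGUF repositories directly from Hugging Face.

```shell
# Pull and run a local quantized build (model name/tag are illustrative --
# use this page's model name and a tag from the table above).
ollama pull minerva:3b-instruct-Q5_K_M
ollama run minerva:3b-instruct-Q5_K_M "Qual è la capitale d'Italia?"

# Alternatively, run a GGUF repository straight from Hugging Face
# (supported by recent Ollama versions):
ollama run hf.co/VaiTon/Minerva-3B-base-v1.0-Q5_K_M-GGUF
```

Note that the base models are not instruction-tuned, so they are best used for raw text completion; for chat-style prompts, prefer the `3b-instruct` tag.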