The first family of Large Language Models pretrained from scratch on Italian!
626 Pulls Updated 10 months ago
f6409fdd453c · 2.1GB
model
arch llama · parameters 2.89B · quantization Q5_K_M · 2.1GB
template
{{ if .System }}{{ .System }}
{{ end }}
{{ if .Prompt }}### Input:
{{ .Prompt }}
{{ end }}
### Risp
106B
system
Di seguito è riportata un'istruzione che descrive un'attività, abbinata ad un input che fornisce u ("Below is an instruction that describes a task, paired with an input that provides a…"; shown truncated on this page)
184B
Readme
Minerva is the first family of LLMs pretrained from scratch on Italian, developed by Sapienza NLP in collaboration with Future Artificial Intelligence Research (FAIR) and CINECA.
The Minerva models are truly open (data and model) Italian-English LLMs, with approximately half of the pretraining data composed of Italian text.
| Tag | HuggingFace |
|---|---|
| 3b-base_Q8_0 | https://huggingface.co/VaiTon/Minerva-3B-base-v1.0-Q8_0-GGUF |
| 3b-base_Q5_K_M | https://huggingface.co/VaiTon/Minerva-3B-base-v1.0-Q5_K_M-GGUF |
| 3b-instruct-Q5_K_M | https://huggingface.co/NikolayKozloff/Minerva-3B-Instruct-v1.0-Q5_K_M-GGUF |
| 1b-base-Q8_0 | https://huggingface.co/VaiTon/Minerva-1B-base-v1.0-Q8_0-GGUF |