Velvet is an Italian family of large language models (2B, 14B) developed from scratch, featuring a dense architecture.
Tags: tools · 2b · 14b
333 Pulls · Updated 2 weeks ago
model 925eee96383d · 8.5GB
arch: llama · parameters: 14.1B · quantization: Q4_K_M
params (156B)
{
  "num_ctx": 16384,
  "repeat_last_n": 0,
  "repeat_penalty": 0.5,
  "stop": ["<i
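These defaults can be overridden when deriving a new model. A minimal Modelfile sketch, assuming the model is pulled under the tag `velvet:14b` and that a larger context window is desired (both the tag and the 32768 value are assumptions, not from the source):

```
# Hypothetical Modelfile deriving from Velvet-14B
FROM velvet:14b
# widen the context window beyond the shipped default of 16384
PARAMETER num_ctx 32768
# keep the shipped repetition settings
PARAMETER repeat_last_n 0
PARAMETER repeat_penalty 0.5
```

Building it with `ollama create my-velvet -f Modelfile` would then produce a variant with these parameters baked in.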
template (727B)
<s>{{- if .Messages }}
{{- range $index, $_ := .Messages }}
{{- if eq .Role "user" }}<instruction>{{
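The visible fragment suggests the Go template wraps each user turn in `<instruction>` tags. A minimal Python sketch of that rendering; note the closing `</instruction>` tag and the handling of non-user turns are assumptions, since the stored template is truncated right after `<instruction>{{`:

```python
# Hypothetical re-implementation of the visible part of Velvet's chat template.
# Assumptions: a matching </instruction> close tag, and non-user content
# emitted verbatim -- the real template is truncated in the listing above.
def render(messages):
    out = "<s>"  # the template starts with the BOS token, as shown
    for m in messages:
        if m["role"] == "user":
            out += f"<instruction>{m['content']}</instruction>"
        else:
            out += m["content"]
    return out

print(render([{"role": "user", "content": "Ciao!"}]))
```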
license (11kB)
Apache License, Version 2.0, January 2004
http://w
Readme
Velvet is an Italian family of large language models, developed from scratch, featuring a dense architecture. This model was trained on the HPC Leonardo infrastructure hosted by CINECA, utilizing public data that underwent extensive curation.
Velvet-14B: instruct model trained on 6 languages (Italian, English, Spanish, Brazilian Portuguese, German, French).
Velvet-2B: instruct model trained on 2 languages (Italian, English).