Aya 23, released by Cohere, is a new family of state-of-the-art, multilingual models that support 23 languages.
Tags: 8b · 35b · 116.9K Pulls · Updated 6 months ago
9f223e28a5ce · 24GB

- model — arch: command-r · parameters: 35B · quantization: Q5_0 · 24GB
- params — {"stop":["<|START_OF_TURN_TOKEN|>","<|END_OF_TURN_TOKEN|>"]} · 81B
- template — {{ if .System }}<|START_OF_TURN_TOKEN|><|SYSTEM_TOKEN|>{{ .System }}<|END_OF_TURN_TOKEN|>{{ end }}{{ … · 270B
- license — Creative Commons Attribution-NonCommercial 4.0 International Public License with Acceptable Use Addendum · 14kB
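
The params layer above sets the default stop strings for the command-r turn format, and the template layer wraps the system prompt in the matching turn tokens. As a minimal sketch, assuming a local Ollama server on its default port 11434 and the 35B tag already pulled, the same stop strings can also be passed per request through the options field of the generate API:

```python
import json
import urllib.request

# Local Ollama generate endpoint (default port 11434 is an assumption).
OLLAMA_URL = "http://localhost:11434/api/generate"

payload = {
    "model": "aya:35b",
    "prompt": "Translate to Turkish: The weather is lovely today.",
    "stream": False,
    # Mirrors the stop tokens shown in the params layer above.
    "options": {"stop": ["<|START_OF_TURN_TOKEN|>", "<|END_OF_TURN_TOKEN|>"]},
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Passing options this way only overrides the baked-in defaults for that single call; the values stored in the params layer remain the model's defaults.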
Readme
Aya 23, released by Cohere, is a new family of state-of-the-art, multilingual, generative large language research models (LLMs) covering 23 different languages.
It is available in 8B and 35B parameter sizes:
- 8B: ollama run aya:8b
- 35B: ollama run aya:35b
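
Besides the CLI, a running Ollama server exposes a local HTTP chat API that the same tags can be called through. A minimal sketch, assuming the 8B tag has been pulled and the server is listening on the default port 11434:

```python
import json
import urllib.request

# Local Ollama chat endpoint (default port 11434 is an assumption).
url = "http://localhost:11434/api/chat"

payload = {
    "model": "aya:8b",
    "messages": [
        {"role": "user", "content": "Summarize in French: Aya 23 supports 23 languages."},
    ],
    "stream": False,  # return one complete response instead of a token stream
}

req = urllib.request.Request(
    url,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.loads(resp.read())
    print(reply["message"]["content"])
```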
References
- Aya 23: Open Weight Releases to Further Multilingual Progress (paper)