623 Downloads · Updated 1 year ago
14 models
mistral_7b_portuguese:q2_K
2.7GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q3_K_S
3.2GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q3_K_M
3.5GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q3_K_L
3.8GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q4_0
4.1GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q4_1
4.6GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q4_K_S
4.1GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q4_K_M
4.4GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q5_0
5.0GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q5_1
5.4GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q5_K_S
5.0GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q5_K_M
5.1GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q6_K
5.9GB · 32K context window · Text · 1 year ago
mistral_7b_portuguese:q8_0
7.7GB · 32K context window · Text · 1 year ago
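The tags above differ only in quantization: lower-bit variants (q2, q3) are smaller and faster but lose more quality, while higher-bit variants (q6, q8) are closer to the original weights. A common rule of thumb is to pick the largest quantization whose file fits your memory budget. A minimal sketch of that choice, using the sizes from the list above (the helper function itself is illustrative, not part of this model page):

```python
from typing import Optional

# Approximate download sizes in GB for each mistral_7b_portuguese tag,
# copied from the tag list above.
SIZES_GB = {
    "q2_K": 2.7, "q3_K_S": 3.2, "q3_K_M": 3.5, "q3_K_L": 3.8,
    "q4_0": 4.1, "q4_1": 4.6, "q4_K_S": 4.1, "q4_K_M": 4.4,
    "q5_0": 5.0, "q5_1": 5.4, "q5_K_S": 5.0, "q5_K_M": 5.1,
    "q6_K": 5.9, "q8_0": 7.7,
}

def largest_fitting_tag(budget_gb: float) -> Optional[str]:
    """Return the largest quantization tag whose file fits the budget.

    Bigger files generally mean less quantization loss, so among the
    tags that fit we pick the largest one.
    """
    fitting = [(size, tag) for tag, size in SIZES_GB.items() if size <= budget_gb]
    if not fitting:
        return None  # nothing fits; even q2_K is too large
    return max(fitting)[1]

print(largest_fitting_tag(4.5))  # → q4_K_M
print(largest_fitting_tag(8.0))  # → q8_0
```

The chosen tag would then be used in the usual Ollama invocation, e.g. `ollama run mistral_7b_portuguese:q4_K_M`.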
This is a fine-tuned version of mistralai/Mistral-7B-Instruct-v0.2, trained with Unsloth on a Portuguese instruction dataset in an attempt to improve the model's performance in Portuguese.
No benchmarks have been run yet.
Dataset used: cnmoro/WizardVicuna-PTBR-Instruct-Clean