
Fine-tuned meta-llama/Llama-3.2-1B for English QA tasks, quantized with llama.cpp using the q4_k_m format.

params · dacf4e9ed1ea · 118B
{
  "num_ctx": 4096,
  "stop": [
    "<|end_of_text|>",
    "<|eot_id|>"
  ],
  "temperature": 0.6,
  "top_k": 50,
  "top_p": 0.9
}
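
A minimal sketch of querying this model through Ollama's REST API from Python, assuming the Ollama server is running on its default port (11434) and the model has been pulled locally; the tag "llama3.2-1b-qa" is a hypothetical placeholder, so substitute the actual name shown on this page. The parameters above act as defaults baked into the model, and the /api/generate endpoint accepts an "options" object that overrides them per request.

import requests

# Hypothetical model tag; replace with the tag this model is published under.
MODEL = "llama3.2-1b-qa"

payload = {
    "model": MODEL,
    "prompt": "Answer concisely: What is the capital of France?",
    "stream": False,
    # Per-request overrides; omit this block to use the defaults listed above.
    "options": {
        "num_ctx": 4096,
        "temperature": 0.6,
        "top_k": 50,
        "top_p": 0.9,
        "stop": ["<|end_of_text|>", "<|eot_id|>"],
    },
}

resp = requests.post("http://localhost:11434/api/generate", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["response"])

With "stream" set to False the server returns a single JSON object whose "response" field holds the generated answer; leaving it True streams newline-delimited JSON chunks instead.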