Kimi K2-Instruct-0905 is a state-of-the-art mixture-of-experts (MoE) language model that demonstrates significant improvements on public benchmarks and in real-world coding agent tasks.
Capabilities: tools
Parameters: 1026B
{
"num_gpu": 1,
"repeat_penalty": 1,
"stop": [
"<|im_start|>",
"<|im_user|>",
"<|im_end|>"
],
"temperature": 0.6,
"top_k": 20,
"top_p": 0.95
}
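These defaults can also be overridden per request through Ollama's HTTP API by passing an `options` object. A minimal sketch in Python's standard library, assuming a local Ollama server on the default port 11434 and a hypothetical local model tag `kimi-k2`:

```python
import json
import urllib.request

# Sampling options mirroring the model's default parameters above.
options = {
    "num_gpu": 1,
    "repeat_penalty": 1,
    "stop": ["<|im_start|>", "<|im_user|>", "<|im_end|>"],
    "temperature": 0.6,
    "top_k": 20,
    "top_p": 0.95,
}

payload = {
    "model": "kimi-k2",  # hypothetical tag; use the name you pulled the model under
    "prompt": "Write a short function that reverses a string.",
    "stream": False,
    "options": options,
}

body = json.dumps(payload).encode("utf-8")

# Uncomment to send the request against a running Ollama instance:
# req = urllib.request.Request(
#     "http://localhost:11434/api/generate",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.loads(resp.read())["response"])
```

The `stop` strings halt generation at the model's chat-template delimiters, while `temperature`, `top_k`, and `top_p` control sampling diversity; a `repeat_penalty` of 1 leaves repetition unpenalized.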