1,025 pulls · Updated 1 year ago

(unofficial) An instruction-tuned 7B-parameter multilingual large language model (LLM), pre-trained on 4T tokens across all 24 official European Union languages and released as part of the research project OpenGPT-X.

7b
e1d53abb2a83 · 92B
{
  "num_ctx": 4096,
  "stop": [
    "<|im_start|>",
    "<|im_end|>"
  ],
  "temperature": 0.7
}
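The JSON above lists the model's default Ollama runtime parameters: a 4096-token context window, the ChatML-style stop sequences, and a sampling temperature of 0.7. The same settings can be expressed in an Ollama Modelfile; a minimal sketch follows (the base model tag "teuken-7b" is an assumption for illustration, not taken from this page — substitute the tag you actually pulled):

```
# Hypothetical Modelfile reproducing the parameters above.
# "teuken-7b" is a placeholder base model tag (an assumption).
FROM teuken-7b

PARAMETER num_ctx 4096
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
PARAMETER temperature 0.7
```

A custom model built from such a Modelfile (`ollama create my-model -f Modelfile`) would inherit these defaults, which can still be overridden per request.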