
A strong Mixture-of-Experts (MoE) language model with 671B total parameters, of which 37B are activated for each token.

params · 1c16f08d3b69 · 164B
{
"num_ctx": 64000,
"stop": [
"<|begin▁of▁sentence|>",
"<|end▁of▁sentence|>",
"<|User|>",
"<|Assistant|>"
]
}
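
A minimal sketch of how this params layer (context window and stop tokens) can be passed as request options to a locally running Ollama server via its REST API. The model tag "deepseek-v3" and the prompt are placeholders for illustration, not taken from this page; substitute the tag shown for this model.

import json
import urllib.request

# Placeholder model tag; replace with the tag listed on this model page.
MODEL = "deepseek-v3"

payload = {
    "model": MODEL,
    "prompt": "Explain mixture-of-experts routing in two sentences.",
    "stream": False,
    # Mirror the params layer above: context window and stop tokens.
    "options": {
        "num_ctx": 64000,
        "stop": [
            "<|begin▁of▁sentence|>",
            "<|end▁of▁sentence|>",
            "<|User|>",
            "<|Assistant|>",
        ],
    },
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])

These options only override the request; the values baked into the params layer shown above are used by default when none are supplied.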