A general-purpose chat model based on Llama and Llama 2, with context sizes from 2K to 16K tokens.

Sizes: 7b, 13b, 33b

167K pulls · Updated 14 months ago

params 7adfc8235793 · 76B
{
  "num_ctx": 16384,
  "rope_frequency_scale": 0.125,
  "stop": [
    "USER:",
    "ASSISTANT:"
  ]
}
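These parameters can also be passed per request through Ollama's REST API (`POST /api/generate`), overriding the model's defaults. The sketch below builds such a request payload; the model tag is hypothetical, and the 0.125 RoPE frequency scale corresponds to the usual linear-scaling ratio of base to extended context (2048 / 16384 = 0.125).

```python
import json

# Payload for Ollama's /api/generate endpoint. The "options" block mirrors
# this model's parameter file: a 16K context window, RoPE frequencies scaled
# by 0.125 to stretch the base 2K Llama context, and the chat template's
# stop markers so generation halts at the next turn boundary.
payload = {
    "model": "example:13b-16k",  # hypothetical tag; substitute an installed model
    "prompt": "USER: Summarize RoPE scaling in one sentence. ASSISTANT:",
    "stream": False,
    "options": {
        "num_ctx": 16384,
        "rope_frequency_scale": 0.125,
        "stop": ["USER:", "ASSISTANT:"],
    },
}

print(json.dumps(payload, indent=2))
# Send with e.g.: curl http://localhost:11434/api/generate -d @payload.json
```

Request-level options apply only to that call; to bake them in permanently, set them as `PARAMETER` lines in a Modelfile instead.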