granite3-moe:1b-instruct-fp16

85.5K pulls · Updated 10 months ago

The IBM Granite 1B and 3B models are the first mixture of experts (MoE) Granite models from IBM, designed for low latency usage.
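
A minimal sketch of chatting with this model through the Ollama Python client, assuming the ollama package is installed, a local Ollama server is running, and this tag has already been pulled; the prompt text is illustrative.

# Minimal sketch: one chat turn with granite3-moe via the Ollama Python client.
# Assumes: pip install ollama, a running Ollama server, and the tag pulled locally.
import ollama

response = ollama.chat(
    model="granite3-moe:1b-instruct-fp16",
    messages=[
        {"role": "user", "content": "Explain mixture of experts in two sentences."}
    ],
)
print(response["message"]["content"])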

tools · 1b · 3b
params  5d4667f097b4 · 15B
{
  "num_gpu": 23
}
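
The params blob above pins num_gpu, the number of model layers Ollama offloads to the GPU, to 23. Below is a sketch of overriding that value for a single request through the options field of the Python client; the option name comes from the blob above, the prompt and the rest of the call are illustrative.

# Sketch: override num_gpu for one request instead of relying on the model's params blob.
# Assumes the same setup as the chat example above.
import ollama

response = ollama.generate(
    model="granite3-moe:1b-instruct-fp16",
    prompt="Summarize why MoE models can serve responses with low latency.",
    options={"num_gpu": 23},  # layers offloaded to GPU; mirrors the params blob
)
print(response["response"])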