Ministral 8B is an 8B-parameter model featuring an interleaved sliding-window attention pattern for faster, memory-efficient inference. Designed for edge use cases, it supports a context length of up to 128k tokens and excels at knowledge and reasoning tasks.
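A minimal sketch of what sliding-window attention means in practice: each token attends only to a fixed window of preceding tokens instead of the full causal prefix, so attention cost grows linearly rather than quadratically with sequence length. The window size and even/odd interleaving pattern below are purely illustrative assumptions, not Ministral's actual configuration.

```python
import numpy as np

def sliding_window_mask(seq_len: int, window: int) -> np.ndarray:
    """Boolean mask where token i attends only to the previous `window` tokens."""
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return (j <= i) & (j > i - window)

def full_causal_mask(seq_len: int) -> np.ndarray:
    """Standard causal mask: token i attends to all tokens j <= i."""
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def layer_mask(layer_idx: int, seq_len: int, window: int) -> np.ndarray:
    """Illustrative interleaving: alternate windowed and full-causal layers.
    (Assumed pattern for demonstration only.)"""
    if layer_idx % 2 == 0:
        return sliding_window_mask(seq_len, window)
    return full_causal_mask(seq_len)

# With window=3, token 5 can see tokens 3..5 but not token 2.
m = sliding_window_mask(seq_len=6, window=3)
```

The memory saving comes from the KV cache: a windowed layer only needs to retain the last `window` keys/values, regardless of total context length.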

Capabilities: tools · Sizes: 8b

324 pulls · Updated 2 months ago

2 Tags
ef9a9e6e6c8b • 4.9GB • 2 months ago