GigaChat3-10B-A1.8B is a dialogue model in the GigaChat family. It is built on a Mixture-of-Experts (MoE) architecture with 10B total parameters, of which 1.8B are active per token. The architecture also includes Multi-head Latent Attention and Multi-Token Prediction.
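The "10B total / 1.8B active" split comes from MoE routing: a learned gate picks a small number of experts per token, so only a fraction of the weights participate in each forward pass. The toy top-k router below is an illustrative sketch only, not the actual GigaChat3 implementation; all names and shapes here are made up.

```python
import numpy as np

def moe_forward(x, experts, gate_w, top_k=2):
    """Route each token to its top_k experts and mix their outputs
    by re-normalized gate probabilities (toy MoE layer)."""
    logits = x @ gate_w                               # (n_tokens, n_experts)
    probs = np.exp(logits - logits.max(-1, keepdims=True))
    probs /= probs.sum(-1, keepdims=True)             # softmax over experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        top = np.argsort(probs[t])[-top_k:]           # indices of chosen experts
        w = probs[t, top] / probs[t, top].sum()       # re-normalize chosen gates
        for weight, e in zip(w, top):                 # only top_k experts run
            out[t] += weight * experts[e](x[t])
    return out

rng = np.random.default_rng(0)
d, n_experts, n_tokens = 8, 4, 3
# Each "expert" is just a small linear map here.
experts = [lambda v, W=rng.normal(size=(d, d)) / d: v @ W for _ in range(n_experts)]
gate_w = rng.normal(size=(d, n_experts))
x = rng.normal(size=(n_tokens, d))
y = moe_forward(x, experts, gate_w)
print(y.shape)
```

With `top_k=2` of 4 experts, only half the expert parameters are touched per token; scaling the same idea up is how a 10B-parameter model can run with roughly 1.8B active parameters.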

ollama run Bored/gigachat3-10B-A1.8
