The Mixtral-8x22B Large Language Model (LLM) is a pretrained generative Sparse Mixture-of-Experts model.
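
Below is a minimal sketch of one way to load and prompt the model with the Hugging Face `transformers` library. It assumes the weights are published under the repo id `mistralai/Mixtral-8x22B-v0.1`, that `accelerate` is installed for `device_map="auto"`, and that enough GPU memory is available; adjust these details to your setup.

```python
# Minimal sketch: loading and sampling from the pretrained (base) model.
# The repo id and hardware assumptions below are illustrative, not prescriptive.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x22B-v0.1"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # shard across available GPUs (requires accelerate)
)

# Base model: plain text completion, no chat template.
prompt = "A Sparse Mixture of Experts routes each token to"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because this is the pretrained base model rather than an instruction-tuned variant, it is best suited to completion-style prompting as shown above.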