
MET2e is a new model family in the MET2 series, built on the latest large M21 dataset and offering a set of models based on Qwen.


[MET-2e logo]

[ just make it free ]

MET-2e is a series of new-generation language models based on the Qwen2 architecture and the MET project's own developments. The models are designed for complex tasks that require structured, deep data analysis. The largest version, MET-2e-3B, demonstrates performance comparable to larger models such as Llama 3.1 8B or DeepSeek-R1 14B.

Try MET.chat

Models

The MET-2experimental (MET 2.1) series is a line of experimental models based on Qwen and several of the MET project's own technologies. The key changes are a custom high-quality dataset and the partial use of Micro-Directions during training, followed by conversion to a Mixture-of-Experts (MoE) architecture, which enables quick fine-tuning and the addition of specialized focus in various fields; a minimal illustration of MoE routing follows.
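
This page does not describe MET's MoE internals, so the following is only a minimal sketch of top-k expert routing in PyTorch. The hidden size, expert MLP shape, and k value are illustrative assumptions; the 128-expert count mirrors the Davinci model card below.

```python
# Minimal top-k Mixture-of-Experts routing sketch. Illustrative assumptions:
# dim=512 and k=2 are NOT MET2e's actual configuration; num_experts=128
# mirrors the Davinci model card below.
import torch
import torch.nn as nn

class TopKMoE(nn.Module):
    def __init__(self, dim=512, num_experts=128, k=2):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)  # scores every expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.k = k

    def forward(self, x):  # x: (num_tokens, dim)
        scores = self.router(x).softmax(dim=-1)            # (tokens, experts)
        weights, idx = scores.topk(self.k, dim=-1)         # keep k best experts per token
        weights = weights / weights.sum(-1, keepdim=True)  # renormalize kept weights
        out = torch.zeros_like(x)
        for t in range(x.size(0)):                         # naive per-token loop for clarity
            for w, e in zip(weights[t], idx[t]):
                out[t] += w * self.experts[int(e)](x[t])   # weighted sum of expert outputs
        return out

moe = TopKMoE()
print(moe(torch.randn(4, 512)).shape)  # torch.Size([4, 512])
```

Only the k selected experts run for each token, which is how an MoE model can hold many specialized experts while keeping per-token compute close to that of a much smaller dense model.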

MET2e-Davinci: an advanced model designed for engaging conversations, deep world understanding, lyrical creativity, and much more.

  • Based on MD2MoE
  • Has 128 experts

MET2.1 0.8b Davinci

ollama run oleg_pivo2014/met2e:0.8b
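
Beyond the CLI, the model can be called from code. A minimal sketch using the ollama Python client (pip install ollama), assuming a running Ollama server with the tag above already pulled; the prompt is illustrative:

```python
# Minimal chat call via the ollama Python client; requires a running
# Ollama server with the model pulled (ollama pull oleg_pivo2014/met2e:0.8b).
import ollama

response = ollama.chat(
    model="oleg_pivo2014/met2e:0.8b",
    messages=[{"role": "user", "content": "Explain Mixture-of-Experts in two sentences."}],  # illustrative prompt
)
print(response["message"]["content"])
```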

MET2e-xxl: the largest and most powerful model for the most complex tasks. Requires high-performance hardware.

  • Classic fully connected (dense) neural network
  • The smartest model in the MET series

MET2.1 3b XXL

ollama run oleg_pivo2014/met2e:3b
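
For long generations on the larger model, streaming prints tokens as they arrive instead of waiting for the full response. A sketch with the same ollama Python client; the prompt is again illustrative:

```python
# Stream tokens from the 3B model as they are generated.
import ollama

stream = ollama.chat(
    model="oleg_pivo2014/met2e:3b",
    messages=[{"role": "user", "content": "Write a short poem about open models."}],  # illustrative prompt
    stream=True,
)
for chunk in stream:
    print(chunk["message"]["content"], end="", flush=True)
```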

Comparative Tests (Benchmarks)

Comparative performance results for the models are presented in the diagram below.

[Benchmark results diagram]

Licensing

The MET2e series models are based on Alibaba Cloud's open Qwen2.5 models and are licensed differently depending on the version:

  • MET2e-0.8B (based on Qwen2.5-0.5B) is distributed under the Apache 2.0 license.
  • MET2e-3B (based on Qwen2.5-3B) is distributed under the Qwen License, while the
    additional modifications by the MET project are distributed under Apache 2.0.

Use of the MET2e models requires compliance with the terms of the respective licenses.

Copyrights

  • © 2025, Oleg_Pivo2014: Fine-tuning and modification of the models.
  • © 2024, Alibaba Cloud: Development of the original Qwen2.5-0.5B and Qwen2.5-3B models and of the qwen2moe architecture.

Acknowledgements

I express my personal gratitude to Alibaba Cloud🧸 for providing high-quality open-source models and architectures, and for the opportunity to further fine-tune and modify them.