A merge of several great Qwen2-1.5B models.
179 Pulls Updated 4 months ago
86c0c00f152f · 1.5GB
CleverQwen2-1.5B
This is a merge of pre-trained language models created using mergekit.
It has grown by about 300M parameters, and I don't know why. I would like to know, though. It works as expected - amazing - I just can't see any reason for the Qwen2 models to gain parameters when merged.
Merge Details
Merge Method
This model was merged using the Model Stock merge method, with trollek/Qwen2-1.5B-Instruct-Abliterated as the base.
Models Merged
The following models were included in the merge:

- cognitivecomputations/dolphin-2.9.3-qwen2-1.5b
- M4-ai/Hercules-5.0-Qwen2-1.5B
- Replete-AI/Replete-Coder-Qwen2-1.5b
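
A mergekit configuration along these lines could reproduce a merge like this one. This is a sketch based only on the method and model list above; the `dtype` setting and everything else are assumptions, not the config actually used:

```yaml
# Hypothetical mergekit config (assumed settings, not the original).
# Model Stock merges the listed models toward the base model's weights.
models:
  - model: cognitivecomputations/dolphin-2.9.3-qwen2-1.5b
  - model: M4-ai/Hercules-5.0-Qwen2-1.5B
  - model: Replete-AI/Replete-Coder-Qwen2-1.5b
merge_method: model_stock
base_model: trollek/Qwen2-1.5B-Instruct-Abliterated
dtype: bfloat16  # assumed; match the source checkpoints' precision
```

With a config like this saved as `merge.yaml`, running `mergekit-yaml merge.yaml ./output` would write the merged checkpoint to `./output`.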