Solar Pro Preview: an advanced large language model (LLM) with 22 billion parameters designed to fit on a single GPU
21.3K Pulls · Updated 2 months ago
8e88c5664027 · 12GB
Readme
Solar Pro Preview is an advanced large language model (LLM) featuring 22 billion parameters, optimized to operate on a single GPU. It demonstrates superior performance compared to LLMs with fewer than 30 billion parameters and delivers results comparable to much larger models, such as Llama 3.1 with 70 billion parameters.
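A minimal usage sketch with the official `ollama` Python client, assuming the Ollama server is running locally and that the model tag on this page is `solar-pro` (check the tags list for the exact name):

```python
# Minimal chat example via the ollama Python client.
# Assumes a local Ollama server and the model tag "solar-pro".
import ollama

response = ollama.chat(
    model="solar-pro",
    messages=[{"role": "user", "content": "Explain depth up-scaling in one sentence."}],
)
print(response["message"]["content"])
```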
Solar Pro Preview was developed with an enhanced version of the depth up-scaling method, which scales a Phi-3-medium model with 14 billion parameters up to 22 billion parameters, sized to run on a GPU with 80GB of VRAM. The training strategy and dataset were meticulously curated, leading to significant performance improvements over Phi-3-medium, especially on benchmarks such as MMLU-Pro and IFEval, which assess a model's knowledge and instruction-following capabilities.
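The sketch below illustrates the general idea behind depth up-scaling: two copies of a base model are stacked, with overlapping layers dropped so the result is deeper than the original. The layer counts are illustrative (the arrangement described for SOLAR 10.7B), not the undisclosed "enhanced" recipe used for Solar Pro Preview.

```python
# Illustrative sketch of depth up-scaling (DUS). The numbers are examples
# only; Upstage's enhanced recipe for Solar Pro Preview is not public here.

def depth_up_scale(num_base_layers: int, overlap_to_drop: int) -> list[int]:
    """Return the base-layer indices of an up-scaled model built from two
    copies of the base: copy A keeps its first (n - k) layers, copy B keeps
    its last (n - k) layers, and the two stacks are concatenated."""
    n, k = num_base_layers, overlap_to_drop
    copy_a = list(range(0, n - k))   # bottom of the first copy
    copy_b = list(range(k, n))       # top of the second copy
    return copy_a + copy_b           # 2 * (n - k) layers in total


if __name__ == "__main__":
    # A 32-layer base with 8 overlapping layers dropped per copy
    # yields a 48-layer up-scaled model.
    scaled = depth_up_scale(num_base_layers=32, overlap_to_drop=8)
    print(len(scaled), scaled[:4], "...", scaled[-4:])
```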
As a pre-release version of the official Solar Pro, Solar Pro Preview comes with limited language coverage and a maximum context length of 4K tokens. Despite these restrictions, the model stands out for its efficiency and capability, with potential for future extensions to support additional languages and capabilities.
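Because of the 4K-token limit, it can be useful to cap the context window explicitly when calling the model. A small sketch, again assuming the `solar-pro` tag and using Ollama's `num_ctx` option:

```python
# Cap the context window at 4096 tokens to stay within the preview's limit.
import ollama

response = ollama.chat(
    model="solar-pro",
    messages=[{"role": "user", "content": "Summarize this readme."}],
    options={"num_ctx": 4096},
)
print(response["message"]["content"])
```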
The official version of Solar Pro, scheduled for release in November 2024, will include expanded language support and longer context windows.