
SmallCoder is a compact reasoning-focused coding model, fine-tuned from DeepSeek-R1 1.5B using a code dataset that includes step-by-step reasoning.

ollama run DedeProGames/smallcoder:1.5b



Welcome to SmallCoder, a compact reasoning-focused coding model fine-tuned from DeepSeek-R1 1.5B using a code dataset that includes step-by-step reasoning.
It is designed to deliver strong coding performance for its parameter size, emphasizing structured reasoning and correctness during problem-solving.


Overview

SmallCoder is fine-tuned from DeepSeek-R1 1.5B on a code dataset that includes step-by-step reasoning.
It targets strong code generation for its parameter size, prioritizing correctness-oriented reasoning during coding tasks.

The training approach focuses on exposing the model to structured, multi-step solutions so it can better handle logic-heavy programming tasks while remaining lightweight and efficient.


Benchmark Results

| Model | LCB (v5) (8/1/24–2/1/25) | Codeforces Rating | Codeforces Percentile | HumanEval+ |
|---|---|---|---|---|
| SmallCoder | 25.1 | 963 | 28.5 | 73.0 |
| DeepSeek-R1-Distill-Qwen-1.5B | 16.9 | 615 | 1.9 | 58.3 |

These results highlight improvements in reasoning-driven coding performance while maintaining a compact model size.

SmallCoder aims to balance efficiency and reasoning depth, making it suitable for local deployment, experimentation, and lightweight coding assistance workflows.
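For local experimentation, the model can be queried programmatically through Ollama's standard HTTP API once it has been pulled with `ollama run`. Below is a minimal Python sketch; the helper names are illustrative, and it assumes an Ollama server is running on the default port 11434.

```python
import json
import urllib.request

# Default local Ollama endpoint (assumes a server on port 11434).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="DedeProGames/smallcoder:1.5b"):
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of a
    stream of partial tokens.
    """
    return {"model": model, "prompt": prompt, "stream": False}

def ask_smallcoder(prompt):
    """Send one prompt to the local Ollama server and return the reply text."""
    data = json.dumps(build_request(prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

A call such as `ask_smallcoder("Write a Python function that reverses a string.")` then returns the model's generated answer, including its step-by-step reasoning when the model emits it.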