
Qwen's most powerful open-source coding model: 480B total parameters with a Mixture-of-Experts architecture (35B active) for code generation and understanding.

# Qwen3-Coder-480B-A35B-Instruct IQ2_XS
## Overview
Qwen3-Coder-480B is Qwen's most powerful agentic coding model, with 480 billion total parameters of which 35 billion are active per token (MoE architecture). This IQ2_XS quantization trades quality for extreme compression while remaining usable for coding tasks.
## Key Features
- **MoE Architecture** - 480B total / 35B active parameters
- **256K Native Context** - Extendable to 1M with YaRN
- **Agentic Coding** - Superior performance on multi-step coding tasks
- **Claude Sonnet Comparable** - Competitive with top proprietary models
- **Apache 2.0** - Fully open source
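
As a rough sense of what the MoE and context figures above mean in practice, a quick arithmetic sketch (numbers taken from the card, not measured):

```python
# Back-of-envelope numbers for the headline features above.
# These are arithmetic sketches, not benchmarks.

TOTAL_PARAMS = 480e9    # total MoE parameters
ACTIVE_PARAMS = 35e9    # parameters activated per token

# Fraction of weights touched per token: per-token compute is closer to a
# 35B dense model, even though all 480B must be held in memory.
active_fraction = ACTIVE_PARAMS / TOTAL_PARAMS
print(f"active fraction per token: {active_fraction:.1%}")  # ~7.3%

# YaRN context-extension factor from the native 256K (262,144-token)
# window to the advertised 1M-token maximum.
yarn_factor = 1_000_000 / 262_144
print(f"YaRN scaling factor: {yarn_factor:.2f}x")  # ~3.81x
```

This is why MoE models of this size are memory-bound rather than compute-bound: the full 480B weights must fit in (V)RAM, but each token only pays for ~7% of them.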
## Capabilities
- Complex code generation across all languages
- Multi-file refactoring and architecture design
- Debugging and code analysis
- Tool use and function calling
- Long-context code understanding
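
To illustrate the tool-use capability, here is a minimal sketch of a function-calling request in the OpenAI-compatible chat schema that many GGUF runtimes (e.g. llama.cpp's server) expose. The tool name, model name, and prompt are hypothetical examples, not part of this card:

```python
# Hypothetical tool definition in the OpenAI-compatible "tools" format.
# Field names below follow that schema; the tool itself is made up.
tools = [{
    "type": "function",
    "function": {
        "name": "run_tests",  # hypothetical tool name
        "description": "Run the project's test suite and return failures.",
        "parameters": {
            "type": "object",
            "properties": {
                "path": {"type": "string", "description": "Test directory"},
            },
            "required": ["path"],
        },
    },
}]

request = {
    "model": "qwen3-coder-480b-a35b-instruct",  # name depends on your runtime
    "messages": [
        {"role": "user",
         "content": "Run the tests under ./tests and fix any failure."},
    ],
    "tools": tools,
}
print(request["tools"][0]["function"]["name"])
```

The model is expected to respond with a `tool_calls` entry naming the function and its JSON arguments, which your agent loop executes and feeds back as a `tool` role message.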
## Quantization
IQ2_XS (~2-bit) - roughly 133 GB. Extreme compression for systems where larger quants do not fit; expect some quality loss relative to 4-bit and higher quants.
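
A quick sanity check on that size (a sketch using only the figures quoted above):

```python
# Estimate the effective bits per weight implied by the stated file size.
# IQ2_XS mixes precisions across tensors, so the average lands slightly
# above 2 bits per weight. Inputs come from the card (~133 GB, 480B params).
FILE_SIZE_BYTES = 133e9
TOTAL_PARAMS = 480e9

bits_per_weight = FILE_SIZE_BYTES * 8 / TOTAL_PARAMS
print(f"effective bits per weight: {bits_per_weight:.2f}")  # ~2.22
```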
## Links
- **Original Model**: [Qwen/Qwen3-Coder-480B-A35B-Instruct](https://huggingface.co/Qwen/Qwen3-Coder-480B-A35B-Instruct)
- **GGUF Source**: [bartowski/Qwen_Qwen3-Coder-480B-A35B-Instruct-GGUF](https://huggingface.co/bartowski/Qwen_Qwen3-Coder-480B-A35B-Instruct-GGUF)
## License
Apache 2.0 - Free for commercial and personal use.