A strong, economical, and efficient Mixture-of-Experts language model with Tool Calling support.
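Since the card advertises Tool Calling support, a minimal sketch of how a tool is described to Ollama's chat API may help. The tool name, its parameter schema, and the model tag in the commented-out call are illustrative assumptions, not details taken from this model card.

```python
# Sketch of an Ollama tool definition (OpenAI-style function schema).
# Tool name, parameters, and model tag are hypothetical examples.
import json

get_weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",  # hypothetical tool name
        "description": "Return the current weather for a city.",
        "parameters": {                 # JSON Schema for the arguments
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}

# With the ollama Python client installed and the model pulled, the tool
# list is passed alongside the messages (not executed here):
#
#   import ollama
#   resp = ollama.chat(
#       model="deepseek-v2",  # assumed tag for this card
#       messages=[{"role": "user", "content": "Weather in Oslo?"}],
#       tools=[get_weather_tool],
#   )
#   # resp["message"].get("tool_calls") holds any requested calls.

print(json.dumps(get_weather_tool, indent=2))
```

The schema body under `parameters` is plain JSON Schema, so it can be validated or reused independently of the client library.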
Tags: tools · 16b · 236b
923 Pulls Updated 2 months ago
624d561e7050 · 8.9GB
model · 8.9GB: arch deepseek2 · parameters 15.7B · quantization Q4_0
params · 224B:
{"min_p":0,"mirostat":0,"mirostat_eta":0.1,"mirostat_tau":5,"num_ctx":4096,"num_predict":128,"repeat
license · 1.1kB:
MIT License
Copyright (c) 2023 DeepSeek
Permission is hereby granted, free of charge, to any perso
template · 1.3kB:
{{- if .Messages }}
{{- if or .System .Tools }}
{{- if .System }}
{{ .System }}
{{- end }}
{{
license · 14kB:
DEEPSEEK LICENSE AGREEMENT
Version 1.0, 23 October 2023
Copyright (c) 2023 DeepSeek
Section I: PR
Readme: no readme provided.