A 10.7B model, the depth-upscaled merge of two Mistral-based finetunes.
27 Pulls Updated 10 months ago
36593d2ad690 · 6.5GB
model: arch llama · parameters 10.7B · quantization Q4_K_M · 6.5GB
template (136B):

```
<|im_start|>system {{ .System }} <|im_end|>
<|im_start|>GPT4 Correct User: {{ .Prompt }}
<|im_end|>
```
params (137B):

```
{
  "num_ctx": 8092,
  "stop": [
    "<|im_end|>",
    "<|end_of_turn|>",
    "</s>"
  ]
}
```
system (370B):

You are Chikuma, a constantly learning AI assistant who strives to be insightful, engaging, and helpful…
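The template, parameters, and system prompt above are the pieces an Ollama Modelfile would declare. A minimal sketch of how they fit together, assuming a local GGUF file named `chikuma-10.7b.Q4_K_M.gguf` (the filename is hypothetical):

```
# Hypothetical Modelfile reconstructing the settings shown above.
FROM ./chikuma-10.7b.Q4_K_M.gguf

# Prompt template: ChatML-style system turn, OpenChat-style user turn.
TEMPLATE """<|im_start|>system {{ .System }} <|im_end|>
<|im_start|>GPT4 Correct User: {{ .Prompt }}
<|im_end|>"""

# Context window and stop sequences from the params file.
PARAMETER num_ctx 8092
PARAMETER stop <|im_end|>
PARAMETER stop <|end_of_turn|>
PARAMETER stop </s>

# System prompt (shown truncated on this page; 370B in full).
SYSTEM """You are Chikuma, a constantly learning AI assistant who strives to be insightful, engaging, and helpful."""
```

A model built from such a file would be created with `ollama create chikuma -f Modelfile` and run with `ollama run chikuma`.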
Readme
Chikuma is a 10.7B-parameter model, a merge of the following models using LazyMergekit:

* sethuiyer/SynthIQ-7b
* openchat/openchat-3.5-0106
The name “Chikuma” is inspired by the Chikuma River, the longest in Japan, known for its continuous flow and meandering path. This metaphorically represents the model’s depth, fluidity, and adaptability in processing and understanding language.
The name also fits the approach taken here: Depth Upscaling, inspired by SOLAR 10.7B.
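For illustration, a SOLAR-style depth upscale of two 32-layer Mistral finetunes is expressed in mergekit (which LazyMergekit wraps) as a passthrough merge over overlapping layer slices. The sketch below uses the two parent models named above, but the layer ranges are assumptions borrowed from the SOLAR 10.7B recipe, not this model's published config:

```
# Hypothetical mergekit config: depth upscaling via passthrough merge.
# Two 32-layer 7B models contribute overlapping slices, yielding a
# deeper 48-layer (~10.7B) model, as in SOLAR 10.7B.
slices:
  - sources:
      - model: sethuiyer/SynthIQ-7b
        layer_range: [0, 24]   # first 24 layers (assumed split point)
  - sources:
      - model: openchat/openchat-3.5-0106
        layer_range: [8, 32]   # last 24 layers (assumed split point)
merge_method: passthrough
dtype: bfloat16
```

The passthrough method concatenates the selected layer ranges rather than averaging weights, which is what makes the result deeper, and larger, than its 7B parents.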