Models
Docs
Pricing
Sign in
Download
TCYZ/cokertme:1m
71 Downloads · Updated 2 months ago
Filling a strategic gap in Türkiye's artificial-intelligence push, the Çökertme series was developed under the motto "intelligence that runs on every device," in contrast to massive models. Offered as part of the TCYZ project, this family can run on even the smallest hardware.
Tags: 1m, 1.6m, 6.7m, 28m, 57m
cokertme:1m
model  25b3f70fc21c · 5.0MB
Metadata
general.architecture                    llama
llama.attention.head_count              4
llama.attention.head_count_kv           4
llama.attention.layer_norm_rms_epsilon  1e-06
llama.block_count                       6
llama.context_length                    64
llama.embedding_length                  128
llama.feed_forward_length               256
llama.rope.dimension_count              32
tokenizer.ggml.bos_token_id             0
tokenizer.ggml.eos_token_id             2
tokenizer.ggml.merges                   [Ä ±, o r, a n, Ġ b, e r, ...]
tokenizer.ggml.model                    gpt2
tokenizer.ggml.padding_token_id         1
tokenizer.ggml.tokens                   [<s>, <pad>, </s>, <unk>, <mask>, ...]
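The metadata above is enough to estimate the model's per-request KV-cache footprint. The sketch below assumes the conventional llama-architecture layout (one K and one V tensor per layer, head dimension = embedding_length / head_count, cache held in F32); the formula and variable names are illustrative, not taken from this page:

```python
# Estimate KV-cache size from the GGUF metadata listed above.
embedding_length = 128
head_count = 4
head_count_kv = 4
block_count = 6
context_length = 64
bytes_per_value = 4  # F32 cache (assumption)

# Per-head dimension; consistent with llama.rope.dimension_count = 32 above.
head_dim = embedding_length // head_count
# Values stored per token for K (and again for V) across the KV heads.
kv_dim = head_count_kv * head_dim
# K and V, for every layer, for every position in the context window.
kv_cache_bytes = 2 * block_count * context_length * kv_dim * bytes_per_value

print(head_dim)        # 32
print(kv_cache_bytes)  # 393216 bytes, i.e. 384 KiB at full context
```

At a 64-token context the cache is tiny (384 KiB), consistent with the "runs on the smallest hardware" positioning.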
Tensor
Name                      Type  Shape
token_embd.weight         F32   [128, 1000]
blk.0
blk.0.attn_k.weight       F32   [128, 128]
blk.0.attn_norm.weight    F32   [128]
blk.0.attn_output.weight  F32   [128, 128]
blk.0.attn_q.weight       F32   [128, 128]
blk.0.attn_v.weight       F32   [128, 128]
blk.0.ffn_down.weight     F32   [256, 128]
blk.0.ffn_gate.weight     F32   [128, 256]
blk.0.ffn_norm.weight     F32   [128]
blk.0.ffn_up.weight       F32   [128, 256]
blk.1
blk.1.attn_k.weight       F32   [128, 128]
blk.1.attn_norm.weight    F32   [128]
blk.1.attn_output.weight  F32   [128, 128]
blk.1.attn_q.weight       F32   [128, 128]
blk.1.attn_v.weight       F32   [128, 128]
blk.1.ffn_down.weight     F32   [256, 128]
blk.1.ffn_gate.weight     F32   [128, 256]
blk.1.ffn_norm.weight     F32   [128]
blk.1.ffn_up.weight       F32   [128, 256]
blk.2
blk.2.attn_k.weight       F32   [128, 128]
blk.2.attn_norm.weight    F32   [128]
blk.2.attn_output.weight  F32   [128, 128]
blk.2.attn_q.weight       F32   [128, 128]
blk.2.attn_v.weight       F32   [128, 128]
blk.2.ffn_down.weight     F32   [256, 128]
blk.2.ffn_gate.weight     F32   [128, 256]
blk.2.ffn_norm.weight     F32   [128]
blk.2.ffn_up.weight       F32   [128, 256]
blk.3
blk.3.attn_k.weight       F32   [128, 128]
blk.3.attn_norm.weight    F32   [128]
blk.3.attn_output.weight  F32   [128, 128]
blk.3.attn_q.weight       F32   [128, 128]
blk.3.attn_v.weight       F32   [128, 128]
blk.3.ffn_down.weight     F32   [256, 128]
blk.3.ffn_gate.weight     F32   [128, 256]
blk.3.ffn_norm.weight     F32   [128]
blk.3.ffn_up.weight       F32   [128, 256]
blk.4
blk.4.attn_k.weight       F32   [128, 128]
blk.4.attn_norm.weight    F32   [128]
blk.4.attn_output.weight  F32   [128, 128]
blk.4.attn_q.weight       F32   [128, 128]
blk.4.attn_v.weight       F32   [128, 128]
blk.4.ffn_down.weight     F32   [256, 128]
blk.4.ffn_gate.weight     F32   [128, 256]
blk.4.ffn_norm.weight     F32   [128]
blk.4.ffn_up.weight       F32   [128, 256]
blk.5
blk.5.attn_k.weight       F32   [128, 128]
blk.5.attn_norm.weight    F32   [128]
blk.5.attn_output.weight  F32   [128, 128]
blk.5.attn_q.weight       F32   [128, 128]
blk.5.attn_v.weight       F32   [128, 128]
blk.5.ffn_down.weight     F32   [256, 128]
blk.5.ffn_gate.weight     F32   [128, 256]
blk.5.ffn_norm.weight     F32   [128]
blk.5.ffn_up.weight       F32   [128, 256]
output.weight             F32   [128, 1000]
output_norm.weight        F32   [128]
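The tensor shapes above can be summed to check the reported blob size. A minimal sketch (shapes copied from the table; six identical blocks, so one block's shapes are reused):

```python
# Total parameter count from the tensor table above.
from math import prod

per_block_shapes = [
    [128, 128],  # attn_k.weight
    [128],       # attn_norm.weight
    [128, 128],  # attn_output.weight
    [128, 128],  # attn_q.weight
    [128, 128],  # attn_v.weight
    [256, 128],  # ffn_down.weight
    [128, 256],  # ffn_gate.weight
    [128],       # ffn_norm.weight
    [128, 256],  # ffn_up.weight
]
top_level_shapes = [
    [128, 1000],  # token_embd.weight
    [128, 1000],  # output.weight
    [128],        # output_norm.weight
]

block_params = sum(prod(s) for s in per_block_shapes)  # 164096 per block
total = 6 * block_params + sum(prod(s) for s in top_level_shapes)

print(total)            # 1240704 parameters, i.e. the "1m" in cokertme:1m
print(total * 4 / 1e6)  # ~4.96 MB in F32, matching the 5.0MB blob size
```

Every tensor is stored unquantized in F32 (4 bytes each), so the ~1.24M parameters come out to roughly 5 MB, which agrees with the `25b3f70fc21c · 5.0MB` blob listed above.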