TCYZ/cokertme:1.6m
71 Downloads · Updated 2 months ago
Filling a strategic gap in Türkiye's artificial-intelligence push, the Çökertme series was developed under the motto "intelligence that runs on every device," in contrast to massive models. Offered as part of the TCYZ project, this family can run on even the smallest hardware.
Tags: 1m, 1.6m, 6.7m, 28m, 57m
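Assuming the model is published on the Ollama registry under the namespace shown above, pulling and running a tag from this family would look like the following sketch (requires the Ollama CLI installed and its daemon running):

```shell
# Pull the 1.6m tag (a 7.6 MB download) and chat with it locally.
ollama pull TCYZ/cokertme:1.6m
ollama run TCYZ/cokertme:1.6m "Merhaba"

# Larger tags from the same family are available if more capacity is needed:
ollama pull TCYZ/cokertme:57m
```

The same `namespace/model:tag` string selects among the five published sizes.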
cokertme:1.6m
model · 10e29fcddb54 · 7.6MB
Metadata
general.architecture                    llama
llama.attention.head_count              8
llama.attention.head_count_kv           8
llama.attention.layer_norm_rms_epsilon  1e-05
llama.block_count                       6
llama.context_length                    128
llama.embedding_length                  16
llama.feed_forward_length               42
tokenizer.ggml.model                    llama
tokenizer.ggml.scores                   [0, 0, 0, 0, 0, ...]
tokenizer.ggml.token_type               [1, 1, 1, 1, 1, ...]
tokenizer.ggml.tokens                   [!, ", #, $, %, ...]
Tensor
Name                      Type  Shape
token_embd.weight         F32   [16, 50257]
blk.0.attn_k.weight       F32   [16, 16]
blk.0.attn_norm.weight    F32   [16]
blk.0.attn_output.weight  F32   [16, 16]
blk.0.attn_q.weight       F32   [16, 16]
blk.0.attn_v.weight       F32   [16, 16]
blk.0.ffn_down.weight     F32   [42, 16]
blk.0.ffn_gate.weight     F32   [16, 42]
blk.0.ffn_norm.weight     F32   [16]
blk.0.ffn_up.weight       F32   [16, 42]
blk.1.attn_k.weight       F32   [16, 16]
blk.1.attn_norm.weight    F32   [16]
blk.1.attn_output.weight  F32   [16, 16]
blk.1.attn_q.weight       F32   [16, 16]
blk.1.attn_v.weight       F32   [16, 16]
blk.1.ffn_down.weight     F32   [42, 16]
blk.1.ffn_gate.weight     F32   [16, 42]
blk.1.ffn_norm.weight     F32   [16]
blk.1.ffn_up.weight       F32   [16, 42]
blk.2.attn_k.weight       F32   [16, 16]
blk.2.attn_norm.weight    F32   [16]
blk.2.attn_output.weight  F32   [16, 16]
blk.2.attn_q.weight       F32   [16, 16]
blk.2.attn_v.weight       F32   [16, 16]
blk.2.ffn_down.weight     F32   [42, 16]
blk.2.ffn_gate.weight     F32   [16, 42]
blk.2.ffn_norm.weight     F32   [16]
blk.2.ffn_up.weight       F32   [16, 42]
blk.3.attn_k.weight       F32   [16, 16]
blk.3.attn_norm.weight    F32   [16]
blk.3.attn_output.weight  F32   [16, 16]
blk.3.attn_q.weight       F32   [16, 16]
blk.3.attn_v.weight       F32   [16, 16]
blk.3.ffn_down.weight     F32   [42, 16]
blk.3.ffn_gate.weight     F32   [16, 42]
blk.3.ffn_norm.weight     F32   [16]
blk.3.ffn_up.weight       F32   [16, 42]
blk.4.attn_k.weight       F32   [16, 16]
blk.4.attn_norm.weight    F32   [16]
blk.4.attn_output.weight  F32   [16, 16]
blk.4.attn_q.weight       F32   [16, 16]
blk.4.attn_v.weight       F32   [16, 16]
blk.4.ffn_down.weight     F32   [42, 16]
blk.4.ffn_gate.weight     F32   [16, 42]
blk.4.ffn_norm.weight     F32   [16]
blk.4.ffn_up.weight       F32   [16, 42]
blk.5.attn_k.weight       F32   [16, 16]
blk.5.attn_norm.weight    F32   [16]
blk.5.attn_output.weight  F32   [16, 16]
blk.5.attn_q.weight       F32   [16, 16]
blk.5.attn_v.weight       F32   [16, 16]
blk.5.ffn_down.weight     F32   [42, 16]
blk.5.ffn_gate.weight     F32   [16, 42]
blk.5.ffn_norm.weight     F32   [16]
blk.5.ffn_up.weight       F32   [16, 42]
output.weight             F32   [16, 50257]
output_norm.weight        F32   [16]
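As a sanity check on the "1.6m" tag, the parameter count can be recomputed from the tensor shapes listed above (embedding size 16, FFN size 42, vocabulary 50257, 6 blocks — all constants read directly from the metadata and tensor tables):

```python
# Recompute the parameter count of cokertme:1.6m from the tensor shapes above.
EMBD, FFN, VOCAB, BLOCKS = 16, 42, 50257, 6

per_block = (
    4 * EMBD * EMBD   # attn_q, attn_k, attn_v, attn_output: each [16, 16]
    + 2 * EMBD        # attn_norm, ffn_norm: each [16]
    + 3 * EMBD * FFN  # ffn_gate, ffn_up, ffn_down: each 16 x 42
)

total = (
    EMBD * VOCAB        # token_embd.weight [16, 50257]
    + EMBD * VOCAB      # output.weight [16, 50257] (untied lm_head)
    + EMBD              # output_norm.weight [16]
    + BLOCKS * per_block
)

print(total)            # 1626672 — the "1.6m" in the tag
print(total * 4)        # 6506688 bytes of F32 weights, ~6.5 MB
```

The remaining ~1 MB of the 7.6 MB file is plausibly GGUF metadata and the tokenizer tables, though the listing above does not break that down.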