TCYZ/cokertme2:14m
45 Downloads · Updated 1 month ago
Çökertme 2 is a family of ultra-nano-scale language models optimized for Turkish natural language processing tasks.
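A minimal sketch of calling this model through Ollama's documented `POST /api/generate` HTTP endpoint. The tag `tcyz/cokertme2:14m` is an assumption based on the namespace shown above; adjust it to match how the model was pulled locally. `num_ctx` is set to 128 because the model's `llama.context_length` metadata (below) is only 128 tokens:

```python
import json

# Assumed tag -- derived from the TCYZ/cokertme2:14m page, not verified locally.
MODEL_TAG = "tcyz/cokertme2:14m"

def build_generate_request(prompt: str) -> str:
    """Build a JSON body for Ollama's POST /api/generate endpoint."""
    payload = {
        "model": MODEL_TAG,
        "prompt": prompt,
        "stream": False,
        # llama.context_length is 128, so a larger window buys nothing.
        "options": {"num_ctx": 128},
    }
    return json.dumps(payload)

body = build_generate_request("Merhaba")
print(body)
```

Send `body` to `http://localhost:11434/api/generate` on a machine running Ollama; the non-streaming response carries the completion in its `response` field.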
Available sizes: 0.8m, 1.6m, 5m, 6m, 14m, 100m
cokertme2:14m
model · 23bf1c78eafb · 57MB
Metadata
general.architecture                      llama
llama.attention.head_count                8
llama.attention.head_count_kv             8
llama.attention.layer_norm_rms_epsilon    1e-05
llama.block_count                         6
llama.context_length                      128
llama.embedding_length                    128
llama.feed_forward_length                 341
tokenizer.ggml.model                      llama
tokenizer.ggml.scores                     [0, 0, 0, 0, 0, ...]
tokenizer.ggml.token_type                 [1, 1, 1, 1, 1, ...]
tokenizer.ggml.tokens                     [!, ", #, $, %, ...]
Tensor

Name                          Type    Shape
token_embd.weight             F32     [128, 50257]

blk.0
blk.0.attn_k.weight           F32     [128, 128]
blk.0.attn_norm.weight        F32     [128]
blk.0.attn_output.weight      F32     [128, 128]
blk.0.attn_q.weight           F32     [128, 128]
blk.0.attn_v.weight           F32     [128, 128]
blk.0.ffn_down.weight         F32     [341, 128]
blk.0.ffn_gate.weight         F32     [128, 341]
blk.0.ffn_norm.weight         F32     [128]
blk.0.ffn_up.weight           F32     [128, 341]

blk.1
blk.1.attn_k.weight           F32     [128, 128]
blk.1.attn_norm.weight        F32     [128]
blk.1.attn_output.weight      F32     [128, 128]
blk.1.attn_q.weight           F32     [128, 128]
blk.1.attn_v.weight           F32     [128, 128]
blk.1.ffn_down.weight         F32     [341, 128]
blk.1.ffn_gate.weight         F32     [128, 341]
blk.1.ffn_norm.weight         F32     [128]
blk.1.ffn_up.weight           F32     [128, 341]

blk.2
blk.2.attn_k.weight           F32     [128, 128]
blk.2.attn_norm.weight        F32     [128]
blk.2.attn_output.weight      F32     [128, 128]
blk.2.attn_q.weight           F32     [128, 128]
blk.2.attn_v.weight           F32     [128, 128]
blk.2.ffn_down.weight         F32     [341, 128]
blk.2.ffn_gate.weight         F32     [128, 341]
blk.2.ffn_norm.weight         F32     [128]
blk.2.ffn_up.weight           F32     [128, 341]

blk.3
blk.3.attn_k.weight           F32     [128, 128]
blk.3.attn_norm.weight        F32     [128]
blk.3.attn_output.weight      F32     [128, 128]
blk.3.attn_q.weight           F32     [128, 128]
blk.3.attn_v.weight           F32     [128, 128]
blk.3.ffn_down.weight         F32     [341, 128]
blk.3.ffn_gate.weight         F32     [128, 341]
blk.3.ffn_norm.weight         F32     [128]
blk.3.ffn_up.weight           F32     [128, 341]

blk.4
blk.4.attn_k.weight           F32     [128, 128]
blk.4.attn_norm.weight        F32     [128]
blk.4.attn_output.weight      F32     [128, 128]
blk.4.attn_q.weight           F32     [128, 128]
blk.4.attn_v.weight           F32     [128, 128]
blk.4.ffn_down.weight         F32     [341, 128]
blk.4.ffn_gate.weight         F32     [128, 341]
blk.4.ffn_norm.weight         F32     [128]
blk.4.ffn_up.weight           F32     [128, 341]

blk.5
blk.5.attn_k.weight           F32     [128, 128]
blk.5.attn_norm.weight        F32     [128]
blk.5.attn_output.weight      F32     [128, 128]
blk.5.attn_q.weight           F32     [128, 128]
blk.5.attn_v.weight           F32     [128, 128]
blk.5.ffn_down.weight         F32     [341, 128]
blk.5.ffn_gate.weight         F32     [128, 341]
blk.5.ffn_norm.weight         F32     [128]
blk.5.ffn_up.weight           F32     [128, 341]

output.weight                 F32     [128, 50257]
output_norm.weight            F32     [128]
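The tensor shapes above are enough to verify both the `14m` tag and the 57MB file size. A quick arithmetic check, with every constant read directly from the metadata and tensor tables on this page:

```python
# Constants taken from the metadata/tensor tables: vocab size, embedding width,
# feed-forward width, and block count.
vocab, d_model, d_ffn, n_blocks = 50257, 128, 341, 6

embed = vocab * d_model        # token_embd.weight  [128, 50257]
unembed = vocab * d_model      # output.weight (untied) [128, 50257]
out_norm = d_model             # output_norm.weight [128]

per_block = (
    4 * d_model * d_model      # attn q/k/v/output, each [128, 128]
    + 2 * d_model              # attn_norm + ffn_norm, each [128]
    + 3 * d_model * d_ffn      # ffn gate/up/down, each 128 x 341
)

total = embed + unembed + out_norm + n_blocks * per_block
print(total)                   # 14046336 -> matches the "14m" size tag
print(total * 4 / 1e6)         # F32 = 4 bytes/param -> ~56.2 MB (~57MB listed)
```

Note that the embedding and output projection are stored separately (not weight-tied), so the two 50257-row matrices together account for roughly 12.9M of the ~14M parameters.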