TCYZ/hayati:13m

15 Downloads · Updated 2 months ago

Tags: 0.4m · 5m · 13m (viewing hayati:13m)
model    33ef3a10b7e5 · 53MB
Metadata

general.architecture                      llama
llama.attention.head_count                2
llama.attention.head_count_kv             2
llama.attention.layer_norm_rms_epsilon    1e-05
llama.block_count                         1
llama.context_length                      128
llama.embedding_length                    128
llama.feed_forward_length                 341
tokenizer.ggml.model                      llama
tokenizer.ggml.scores                     [0, 0, 0, 0, 0, ...]
tokenizer.ggml.token_type                 [1, 1, 1, 1, 1, ...]
tokenizer.ggml.tokens                     [!, ", #, $, %, ...]
Tensor

Name                        Type    Shape
token_embd.weight           F32     [128, 50257]
blk.0.attn_k.weight         F32     [128, 128]
blk.0.attn_norm.weight      F32     [128]
blk.0.attn_output.weight    F32     [128, 128]
blk.0.attn_q.weight         F32     [128, 128]
blk.0.attn_v.weight         F32     [128, 128]
blk.0.ffn_down.weight       F32     [341, 128]
blk.0.ffn_gate.weight       F32     [128, 341]
blk.0.ffn_norm.weight       F32     [128]
blk.0.ffn_up.weight         F32     [128, 341]
output.weight               F32     [128, 50257]
output_norm.weight          F32     [128]
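As a sanity check (my own arithmetic, not stated on the page), the tensor shapes listed above can be multiplied out to estimate the model's parameter count and F32 weight size. The total comes to roughly 13M parameters, consistent with the ":13m" tag, and about 52 MB of F32 weights, close to the 53MB blob size (the blob also carries the tokenizer and GGUF headers):

```python
from math import prod

# Tensor shapes exactly as listed in the table above.
shapes = {
    "token_embd.weight":        [128, 50257],
    "blk.0.attn_k.weight":      [128, 128],
    "blk.0.attn_norm.weight":   [128],
    "blk.0.attn_output.weight": [128, 128],
    "blk.0.attn_q.weight":      [128, 128],
    "blk.0.attn_v.weight":      [128, 128],
    "blk.0.ffn_down.weight":    [341, 128],
    "blk.0.ffn_gate.weight":    [128, 341],
    "blk.0.ffn_norm.weight":    [128],
    "blk.0.ffn_up.weight":      [128, 341],
    "output.weight":            [128, 50257],
    "output_norm.weight":       [128],
}

# Total parameters: product of each shape, summed over all tensors.
params = sum(prod(s) for s in shapes.values())
print(params)             # 13062656 parameters, i.e. ~13M -> the ":13m" tag
print(params * 4 / 1e6)   # ~52.25 MB at 4 bytes per F32 weight
```

Note that the two 128×50257 tensors (token_embd.weight and output.weight) dominate: the untied embedding and output projections account for nearly all of the ~13M parameters, while the single transformer block contributes only ~0.2M.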