latest · 7B · 4.1GB
2 Pulls · Updated 2 months ago
9b04903205c3 · 4.1GB
model · 4.1GB
arch llama · parameters 6.74B · quantization Q4_K_M
params · 128B
{"stop":["<|start_header_id|>","<|end_header_id|>","<|eot_id|>","<|reserved_special_token"]}
template · 256B
{{ if .System }}<|start_header_id|>system<|end_header_id|>
{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
{{ .Response }}<|eot_id|>
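The template above is the Llama 3 chat format: system, user, and assistant turns are each wrapped in `<|start_header_id|>…<|end_header_id|>` headers and terminated with `<|eot_id|>`. A minimal Python sketch of how a prompt is assembled from it; the function name is illustrative and this is a simplified port, not Ollama's actual Go-template renderer:

```python
def render_prompt(system: str, prompt: str, response: str = "") -> str:
    """Assemble a Llama 3-style chat prompt, mirroring the Modelfile
    template above: optional system turn, optional user turn, then an
    assistant turn the model is expected to complete."""
    out = ""
    if system:  # corresponds to {{ if .System }} ... {{ end }}
        out += f"<|start_header_id|>system<|end_header_id|>\n{system}<|eot_id|>"
    if prompt:  # corresponds to {{ if .Prompt }} ... {{ end }}
        out += f"<|start_header_id|>user<|end_header_id|>\n{prompt}<|eot_id|>"
    # Assistant header is always emitted; generation stops at <|eot_id|>,
    # which is why that token appears in the stop list above.
    out += f"<|start_header_id|>assistant<|end_header_id|>\n{response}<|eot_id|>"
    return out

text = render_prompt(
    "You are an RTEC event-recognition assistant.",
    "Which vessels are loitering?",
)
print(text)
```

Note that the stop parameters listed above cut generation at `<|eot_id|>`, so only the text of the assistant turn is returned to the caller.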
system · 78B
Welcome to a fine-tuned LLM for Complex Maritime Event Recognition using RTEC
Readme
No readme
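The metadata above can be reassembled into a Modelfile sketch. This is a hypothetical reconstruction for illustration: the `FROM` path is an assumption (the base weights are not named on this page), and the truncated fourth stop token (`<|reserved_special_token`) is omitted because its full name is not shown.

```
# Hypothetical Modelfile reconstructed from the page metadata above.
# FROM path is an assumption; actual base weights are not listed.
FROM ./llama-maritime-q4_k_m.gguf

PARAMETER stop <|start_header_id|>
PARAMETER stop <|end_header_id|>
PARAMETER stop <|eot_id|>

TEMPLATE """{{ if .System }}<|start_header_id|>system<|end_header_id|>
{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>
{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>
{{ .Response }}<|eot_id|>"""

SYSTEM """Welcome to a fined-tuned LLM for Complex Maritime Event Recognition using RTEC"""
```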