This model extends Llama 3 70B's context length from 8K to over 1M tokens. [I-Quants]
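
As a minimal sketch, the extended context can be requested by building on this model with an Ollama Modelfile and raising num_ctx (a standard Ollama parameter); the base tag below is a placeholder for this model's actual tag, and the value should be sized to available memory:

# Hypothetical Modelfile; replace the FROM tag with this model's real tag.
FROM <this-model-tag>
# Request a larger context window than the default.
PARAMETER num_ctx 256000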

70B


8ab4849b038c · 254B
{{ if .System }}<|start_header_id|>system<|end_header_id|>

{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>

{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>

{{ .Response }}<|eot_id|>
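
For illustration, with a sample system message of "You are a helpful assistant." and a sample prompt of "Summarize this document." (both placeholder values), the template above renders a request along these lines, with the model's reply generated after the final assistant header:

<|start_header_id|>system<|end_header_id|>

You are a helpful assistant.<|eot_id|><|start_header_id|>user<|end_header_id|>

Summarize this document.<|eot_id|><|start_header_id|>assistant<|end_header_id|>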