This model extends Llama 3 70B's context length from 8K to over 1M tokens. [I-Quants]

70B


{
  "num_keep": 24,
  "stop": [
    "<|start_header_id|>",
    "<|end_header_id|>",
    "<|eot_id|>"
  ]
}
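These parameters can also be passed per-request via Ollama's `/api/generate` endpoint instead of baking them into a Modelfile. A minimal sketch follows; the model tag `llama3-gradient:70b` is a placeholder (substitute whatever tag you actually pulled), and it assumes a local Ollama server on the default port 11434:

```python
import json

# Hypothetical model tag; replace with the tag you pulled from the registry.
MODEL = "llama3-gradient:70b"

# Options mirroring the parameter block above: retain the first 24 tokens
# when the context window slides, and stop on Llama 3's header/EOT tokens.
options = {
    "num_keep": 24,
    "stop": ["<|start_header_id|>", "<|end_header_id|>", "<|eot_id|>"],
}

payload = {
    "model": MODEL,
    "prompt": "Summarize the plot of Hamlet in one sentence.",
    "stream": False,
    "options": options,
}

# This JSON body would be POSTed to http://localhost:11434/api/generate
body = json.dumps(payload)
print(body)
```

The `stop` strings match Llama 3's chat-template special tokens, so generation halts cleanly at turn boundaries rather than running into the next header.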