latest
8.5GB · 8B · 64 Pulls · Updated 5 months ago
2c5c8dfcde61 · 8.5GB
model
arch llama · parameters 8.03B · quantization Q8_0
8.5GB
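These layer details can also be read back from a running Ollama server. A minimal Python sketch against the local REST API; the tag llama3-8b-chinese-chat:latest is a placeholder for whatever name the model was pulled under, and a reasonably recent Ollama version is assumed.

import requests

# Placeholder tag: substitute the name this model was pulled under.
MODEL = "llama3-8b-chinese-chat:latest"

# POST /api/show returns the model's details plus its parameters, template
# and system layers as shown on this page.
resp = requests.post("http://localhost:11434/api/show", json={"model": MODEL}, timeout=30)
resp.raise_for_status()
info = resp.json()

print(info["details"])     # family, parameter_size, quantization_level (llama / 8.0B / Q8_0)
print(info["parameters"])  # the params layer shown below
print(info["template"])    # the chat template shown below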
params
{"stop":["<|start_header_id|>","<|end_header_id|>","<|eot_id|>"],"temperature":0.6,"top_p":0.9}
126B
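These are the model's default sampling settings; Ollama applies them automatically, but the same values can also be passed per request. A minimal sketch against the /api/chat endpoint, with the model tag again a placeholder:

import requests

MODEL = "llama3-8b-chinese-chat:latest"  # placeholder tag

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "用中文介绍一下你自己。"}],
        # Mirrors the params layer above; redundant in practice because the
        # model already ships with these defaults.
        "options": {
            "temperature": 0.6,
            "top_p": 0.9,
            "stop": ["<|start_header_id|>", "<|end_header_id|>", "<|eot_id|>"],
        },
        "stream": False,
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])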
template
{{ if .System }}<|start_header_id|>system<|end_header_id|>

{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>

{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>

{{ .Response }}<|eot_id|>
256B
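This is the standard Llama 3 instruct template; Ollama substitutes the system message, the user prompt, and the model's reply into the slots. A rough Python equivalent of the expansion for a single turn, for illustration only:

def render_prompt(system, prompt):
    """Rough Python equivalent of the Go template above for one turn."""
    out = ""
    if system:
        out += "<|start_header_id|>system<|end_header_id|>\n\n" + system + "<|eot_id|>"
    if prompt:
        out += "<|start_header_id|>user<|end_header_id|>\n\n" + prompt + "<|eot_id|>"
    # The assistant header is left open so the model writes the reply,
    # which it then terminates with <|eot_id|> (one of the stop tokens above).
    out += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return out

print(render_prompt("You are a helpful assistant.", "你好！"))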
system
You are Llama3-8B-Chinese-Chat, which is finetuned on Llama3-8B-Instruct with Chinese-English mixed data by the ORPO alignment algorithm. You, Llama3-8B-Chinese-Chat, are developed by Shenzhi Wang (王慎执 in Chinese). You are a helpful assistant.
250B
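Because this system message is baked into the model as a layer, a request that sets no system field still receives it through the {{ .System }} slot in the template. A minimal sketch using /api/generate, with the model tag again a placeholder:

import requests

MODEL = "llama3-8b-chinese-chat:latest"  # placeholder tag

# No system message is supplied here; Ollama injects the system layer shown above.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": MODEL, "prompt": "你是谁？", "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])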
Readme
No readme