latest
817MB
Fine-tuning GPT2-medium on a private dataset
19 Pulls Updated 2 months ago
b07206fb2b8e · 817MB
model
arch gpt2 · parameters 406M · quantization F16
817MB
template
<|BEGIN_QUERY|>{{.Prompt}}<|END_QUERY|>
<|END_ANALYSIS|>
<|BEGIN_RESPONSE|>
75B
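At request time, Ollama expands the Go template above by substituting the user prompt for `{{.Prompt}}`. A minimal sketch of the rendered prompt, using Python string formatting as a stand-in for Go templating (the token spellings are taken from the template shown above):

```python
# Stand-in for the Modelfile template; {prompt} plays the role of {{.Prompt}}.
TEMPLATE = (
    "<|BEGIN_QUERY|>{prompt}<|END_QUERY|>\n"
    "<|END_ANALYSIS|>\n"
    "<|BEGIN_RESPONSE|>"
)

def render(prompt: str) -> str:
    """Return the full prompt string the model actually sees."""
    return TEMPLATE.format(prompt=prompt)

print(render("What is in the dataset?"))
```

The model is expected to continue generating after the trailing `<|BEGIN_RESPONSE|>` marker, stopping at one of the stop tokens listed in the params section.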
params
{"repeat_last_n":64,"repeat_penalty":1.1,"stop":["<|STOP|>","<|END_RESPONSE|>"],"temperature":0.4,"top_k":30,"top_p":0.9}
192B
system
You are an AI assistant. Please begin your response with the "<|BEGIN_RESPONSE|>" token.
111B
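The parameters above can be passed as `options` in a request to Ollama's `/api/generate` endpoint; the template and system prompt are applied server-side. A sketch, assuming the model was pulled under the hypothetical tag `archgpt2-private:latest` and an Ollama server on the default port:

```python
import json
import urllib.request

# Sampling options copied from the params section of this page.
OPTIONS = {
    "repeat_last_n": 64,
    "repeat_penalty": 1.1,
    "stop": ["<|STOP|>", "<|END_RESPONSE|>"],
    "temperature": 0.4,
    "top_k": 30,
    "top_p": 0.9,
}

def build_payload(prompt: str, model: str = "archgpt2-private:latest") -> dict:
    """Assemble a non-streaming /api/generate request body."""
    return {"model": model, "prompt": prompt, "options": OPTIONS, "stream": False}

def generate(prompt: str) -> str:
    """POST to a local Ollama server (hypothetical host/tag) and return the text."""
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

With `stream` set to `False` the server returns a single JSON object whose `response` field holds the generated text; leaving it at the default streams newline-delimited JSON chunks instead.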
Readme
No readme