A Phi-3 Mini 4K Instruct model continually pretrained on data extracted from the "Reflexion" LLM research paper.
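A minimal sketch of the continual-pretraining setup described above, assuming the Hugging Face base checkpoint microsoft/Phi-3-mini-4k-instruct and a plain-text file reflexion_paper.txt (hypothetical) holding the text extracted from the Reflexion paper; hyperparameters are placeholders, not the recipe actually used for this model.

from datasets import load_dataset
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_id = "microsoft/Phi-3-mini-4k-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token  # needed for batch padding
model = AutoModelForCausalLM.from_pretrained(model_id)

# Turn the extracted paper text into tokenized causal-LM training examples.
raw = load_dataset("text", data_files={"train": "reflexion_paper.txt"})
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=2048)
train = raw["train"].map(tokenize, batched=True, remove_columns=["text"])

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="phi3-reflexion-cpt",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=1e-5,
        fp16=True,
        logging_steps=10,
    ),
    train_dataset=train,
    # mlm=False gives standard next-token (causal) language-modeling labels.
    data_collator=DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False),
)
trainer.train()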
4 Pulls · Updated 8 weeks ago
cc24f200feed · 7.6GB
model
arch llama · parameters 3.82B · quantization F16 · 7.6GB
template · 80B
{{- range .Messages }}<|{{ .Role }}|>
{{ .Content }}</s>
{{ end }}<|assistant|>
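The template above is a Go template over the chat messages; below is a small Python sketch of how it expands a message list into the prompt string the model sees (the example messages are illustrative, not taken from this page):

def render_prompt(messages):
    # Each message becomes "<|role|>\n{content}</s>\n", and the prompt
    # ends with the "<|assistant|>" cue that asks the model to reply.
    parts = [f"<|{m['role']}|>\n{m['content']}</s>\n" for m in messages]
    return "".join(parts) + "<|assistant|>"

print(render_prompt([{"role": "user", "content": "What is Reflexion?"}]))
# <|user|>
# What is Reflexion?</s>
# <|assistant|>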
params · 98B
{"stop":["<|system|>","</s>","<|user|>","<|assistant|>"]}
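The stop sequences above are baked into the model, so a client does not need to resend them. A usage sketch against a local Ollama server, assuming the model has been pulled under a tag such as phi3-reflexion (hypothetical; substitute the model's actual name):

import requests

resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "phi3-reflexion",  # hypothetical tag
        "messages": [
            {"role": "user", "content": "Summarize the Reflexion idea in one sentence."},
        ],
        "stream": False,  # return one JSON object instead of a chunk stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["message"]["content"])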
Readme
No readme