rfc/whiterabbitneo:latest
2,074 Downloads · Updated 1 year ago
Based on https://huggingface.co/whiterabbitneo/WhiteRabbitNeo-13B
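The model is published on the Ollama registry under the namespace and tag shown above, so (assuming a standard Ollama installation) it can be pulled and run with the usual CLI command:

ollama run rfc/whiterabbitneo:latest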
05e6d79db08d · 7.4GB

model · 7.4GB
arch llama · parameters 13B · quantization Q4_0

system · 1.8kB
Answer the Question by exploring multiple reasoning paths as follows: - First, carefully analyze the …

params · 130B
{ "stop": [ "[INST]", "[/INST]", "<<SYS>>", "<</SYS>>" ], …

template · 94B
[INST] {{ if and .First .System }}<<SYS>>{{ .System }}<</SYS>>{{ end }} {{ .Prompt }} [/INST]
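Taken together, the system, params, and template layers listed above are the pieces an Ollama Modelfile would declare. A minimal sketch, assuming the base weights are available locally as a GGUF file (the filename below is hypothetical) and eliding the full 1.8kB system prompt:

# Sketch only; the FROM path is a placeholder, not the actual source of this model.
FROM ./whiterabbitneo-13b.Q4_0.gguf

# System prompt layer (truncated here; the full 1.8kB text is abbreviated on the page above)
SYSTEM """Answer the Question by exploring multiple reasoning paths as follows: - First, carefully analyze the ..."""

# Stop sequences from the params layer
PARAMETER stop "[INST]"
PARAMETER stop "[/INST]"
PARAMETER stop "<<SYS>>"
PARAMETER stop "<</SYS>>"

# Llama-2-style chat template from the template layer
TEMPLATE """[INST] {{ if and .First .System }}<<SYS>>{{ .System }}<</SYS>>{{ end }} {{ .Prompt }} [/INST]"""

A file like this could be built into a local model with: ollama create <name> -f Modelfile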
Readme
No readme