2.7B uncensored Dolphin model by Eric Hartford, based on the Phi language model by Microsoft Research.
2.7b
50.3K Pulls · Updated 11 months ago
5e99163c66d3 · 1.5GB
model · arch phi2 · parameters 2.78B · quantization Q3_K_M · 1.5GB
system · 40B
You are Dolphin, a helpful AI assistant.
params · 59B
{"stop":["<|im_start|>","<|im_end|>"]}
template · 106B
<|im_start|>system
{{ .System }}<|im_end|>
<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
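The template above is the ChatML prompt format: the system and user turns are wrapped in <|im_start|>/<|im_end|> markers, and the stop tokens declared in the params layer end generation at the next turn boundary. As a rough illustration only, this Python sketch renders a prompt the same way the template lays it out (the example messages are illustrative and not part of the model files):

def render_chatml(system: str, prompt: str) -> str:
    # Substitute {{ .System }} and {{ .Prompt }} following the template layout shown above.
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

# Example with illustrative values:
print(render_chatml("You are Dolphin, a helpful AI assistant.", "Why is the sky blue?"))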
license · 10kB
MICROSOFT RESEARCH LICENSE TERMS
IF YOU LIVE IN THE UNITED STATES, PLEASE READ THE “BINDING ARBIT…
Readme
Dolphin Phi 2.6 is an uncensored model based on the 2.7B Phi model by Microsoft Research, trained on datasets similar to those used for other Dolphin releases such as Dolphin Mixtral.
It was created by Eric Hartford and Cognitive Computations.
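A minimal way to try the model against a local Ollama server, assuming it is published under the tag dolphin-phi (the tag is inferred from this page's description rather than stated on it):

import json
import urllib.request

# Send a non-streaming generate request to the default local Ollama endpoint.
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "dolphin-phi",   # assumed tag; adjust if your local tag differs
        "prompt": "Why is the sky blue?",
        "stream": False,
    }).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])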