Now with 100% more alignment!
41 Pulls · Updated 4 months ago
7e6f9bff937a · 2.8GB

model (2.8GB)
  arch: phi3 · parameters: 3.82B · quantization: Q5_K_M

params (78B)
  {
    "stop": [
      "<|end|>",
      "<|user|>",
      "<|assistant|>"
    ]
  }

template (148B, truncated)
  {{ if .System }}<|system|>
  {{ .System }}<|end|>
  {{ end }}{{ if .Prompt }}<|user|>
  {{ .Prompt }}<|end|>
  …

license (1.1kB, truncated)
  MIT License
  Copyright (c) Microsoft Corporation.
  Permission is hereby granted, free of …
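For reference, here is a minimal sketch of querying a model like this through Ollama's /api/generate endpoint, reusing the stop sequences from the params layer above. The model tag is a placeholder assumption; substitute whatever tag you pulled this model under.

```python
# Minimal sketch: call the model via Ollama's REST API with the stop sequences
# listed in the params layer above. The model tag is a placeholder.
import json
import urllib.request

payload = {
    "model": "phi3-abliterated",  # placeholder tag (assumption), not the actual tag
    "prompt": "Summarize what weight orthogonalization does in one sentence.",
    "stream": False,
    "options": {
        # Mirrors the "stop" entries in the params layer.
        "stop": ["<|end|>", "<|user|>", "<|assistant|>"],
    },
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# Ollama wraps the prompt in the chat template shown above before generation.
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```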
Readme
See on Hugging Face
This is microsoft/Phi-3-mini-128k-instruct with orthogonalized bfloat16 safetensors weights, generated with a refined methodology based on the one described in the preview paper/blog post ‘Refusal in LLMs is mediated by a single direction’, which I encourage you to read to understand more.
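For intuition about what “orthogonalized weights” means here, the sketch below illustrates the directional-ablation idea from that paper on random stand-in data: estimate a refusal direction as a difference of mean activations between harmful and harmless prompts, then project that direction out of matrices that write into the residual stream. This is an illustration under those assumptions, not the refined methodology used to produce these weights.

```python
# Minimal, self-contained sketch (not the author's exact pipeline) of the
# directional-ablation idea from "Refusal in LLMs is mediated by a single
# direction": find a refusal direction, then orthogonalize weight matrices
# so they can no longer write along it.
import numpy as np

def refusal_direction(harmful_acts, harmless_acts):
    """Difference-of-means direction between residual-stream activations on
    harmful vs. harmless prompts, normalized to unit length.
    Both inputs have shape (n_prompts, d_model)."""
    diff = harmful_acts.mean(axis=0) - harmless_acts.mean(axis=0)
    return diff / np.linalg.norm(diff)

def orthogonalize(weight, direction):
    """Project the refusal direction out of a matrix that writes into the
    residual stream: W' = W - r r^T W, with `weight` shaped (d_model, d_in)."""
    r = direction.reshape(-1, 1)
    return weight - r @ (r.T @ weight)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d_model = 3072  # Phi-3-mini hidden size
    # Random stand-ins for real activations and a real output-projection weight.
    harmful = rng.standard_normal((64, d_model))
    harmless = rng.standard_normal((64, d_model))
    W = rng.standard_normal((d_model, d_model))

    r = refusal_direction(harmful, harmless)
    W_abl = orthogonalize(W, r)
    # The ablated matrix has (numerically) zero component along r.
    print(float(np.abs(r @ W_abl).max()))
```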