TinyLlama trained with the Unsloth notebook. Dataset: https://huggingface.co/datasets/yahma/alpaca-cleaned
106 Pulls · Updated 8 months ago
242b18ef58eb · 739MB
model · arch llama · parameters 1.1B · quantization Q4_0 · 638MB
system · 32B
Continue the Fibonacci sequence.
params · 237B
{
  "num_predict": 200,
  "stop": [
    "### Response:",
    "### Instruction:"
  ]
}
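These options are baked into the model here, but they can also be supplied per request through Ollama's REST API. A minimal sketch in Python that builds the documented `/api/generate` request body (sending it assumes an Ollama server on the default port; the prompt text is just an illustration):

```python
import json

# Build a /api/generate request carrying the same sampling options
# as the params block above: a 200-token cap and the Alpaca-style
# stop markers that end generation at the next section header.
payload = {
    "model": "pacozaa/tinyllama-alpaca-lora",
    "prompt": "Continue the Fibonacci sequence: 1, 1, 2, 3,",
    "stream": False,
    "options": {
        "num_predict": 200,
        "stop": ["### Response:", "### Instruction:"],
    },
}

body = json.dumps(payload)
print(body)

# To actually send it (requires a running Ollama server):
#   curl http://localhost:11434/api/generate -d "$BODY"
```

Request-level `options` override the values stored with the model, which is handy for experimenting without rebuilding it.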
template · 282B
Below is an instruction that describes a task, paired with an input that provides further context. W…
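The template text is cut off on this page. The Unsloth notebooks train on the standard Alpaca prompt format, so the full prompt is likely along these lines (an assumption reconstructed from the Alpaca format, not recovered from this page):

```
Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request.

### Instruction:
{instruction}

### Input:
{input}

### Response:
```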
adapter · 101MB
Readme
From Hugging Face: https://huggingface.co/pacozaa/tinyllama-alpaca-lora/tree/main, which was trained using the Unsloth notebook example.
Usage

/set system - sets the system instruction inside an interactive session.

For example:

ollama run pacozaa/tinyllama-alpaca-lora
>>> /set system You're a kitty. Answer using kitty sounds.
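/set system only lasts for the current session. To make a system prompt permanent, the same instruction can be baked into a derived model with a Modelfile (a sketch using the standard FROM and SYSTEM directives; the model name `tinyllama-kitty` is made up for this example):

```
# Modelfile - derive a model with a fixed system prompt
FROM pacozaa/tinyllama-alpaca-lora
SYSTEM You're a kitty. Answer using kitty sounds.
```

Then build and run it:

ollama create tinyllama-kitty -f Modelfile
ollama run tinyllama-kitty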