TinyLlama trained with the Unsloth notebook. Dataset: https://huggingface.co/datasets/yahma/alpaca-cleaned
96 Pulls · Updated 7 months ago
242b18ef58eb · 739MB

model     arch llama · parameters 1.1B · quantization Q4_0 · 638MB
system    Continue the fibonacci sequence. · 32B
params    {"num_predict":200,"stop":["### Response:","### Instruction:","### Input:","Below is an instruction · 237B
template  Below is an instruction that describes a task, paired with an input that provides further context. W · 282B
adapter   101MB
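The template layer above is the Alpaca instruction format (truncated in the listing). A minimal Python sketch of how such a prompt is typically assembled; the exact template wording here is an assumption based on the standard Alpaca template, not read from this model's stored template:

```python
def alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Assemble an Alpaca-style prompt.

    The header text is an assumption based on the common Alpaca
    template; this model's stored template is truncated above and
    may differ slightly.
    """
    header = (
        "Below is an instruction that describes a task, "
        "paired with an input that provides further context. "
        "Write a response that appropriately completes the request."
    )
    if input_text:
        return (
            f"{header}\n\n### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n### Response:\n"
        )
    return f"{header}\n\n### Instruction:\n{instruction}\n\n### Response:\n"
```

The `stop` sequences in the params layer (`### Response:`, `### Instruction:`, `### Input:`) correspond to the section markers of this template, so generation halts before the model starts a new instruction block.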
Readme
From Hugging Face: https://huggingface.co/pacozaa/tinyllama-alpaca-lora/tree/main, a LoRA adapter trained with the Unsloth notebook example.

Usage

/set system - to set the instruction

For example:
ollama run pacozaa/tinyllama-alpaca-lora
>>> /set system You're a kitty. Answer using kitty sounds.
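Besides the interactive CLI, the model can be called over Ollama's HTTP API. A minimal sketch, assuming a local `ollama serve` instance at the default `localhost:11434`, with generation options mirroring the params layer shown above:

```python
import json
import urllib.request

# Request body for Ollama's /api/generate endpoint. The options mirror
# the model's params layer (num_predict plus the Alpaca stop sequences;
# the fourth stop string is truncated in the listing above, so only the
# three complete ones are reproduced here).
payload = {
    "model": "pacozaa/tinyllama-alpaca-lora",
    "prompt": "Continue the fibonacci sequence.\n1, 1, 2, 3,",
    "stream": False,
    "options": {
        "num_predict": 200,
        "stop": ["### Response:", "### Instruction:", "### Input:"],
    },
}

def generate(body: dict, host: str = "http://localhost:11434") -> dict:
    """POST to /api/generate and return the parsed JSON response.

    Requires a running Ollama server with the model pulled.
    """
    req = urllib.request.Request(
        host + "/api/generate",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

# generate(payload)  # uncomment with a local Ollama server running
```

Setting `"stream": False` returns one JSON object instead of a stream of partial responses, which keeps the example simple.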