116 1 year ago

This is a 110M-parameter Llama 2-architecture model trained on the TinyStories dataset.

ollama run DuckingtonLabs/tinyStories
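Beyond the CLI, a pulled model can also be queried through Ollama's local HTTP API. A minimal sketch of building the request body for the `/api/generate` endpoint (the prompt is illustrative, and a running Ollama server on the default `localhost:11434` is assumed):

```python
import json

# Request body for Ollama's /api/generate endpoint. "stream": False asks
# for a single JSON response instead of a token-by-token stream.
payload = {
    "model": "DuckingtonLabs/tinyStories",
    "prompt": "Once upon a time",   # illustrative prompt
    "stream": False,
}

body = json.dumps(payload)
print(body)
# Send it with, e.g.:
#   curl http://localhost:11434/api/generate -d '<body>'
```

The same payload shape works for any locally pulled model; only the `model` field changes.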

Details

e69e9cff82de · 143MB · llama · 134M · Q8_0
MIT License, Copyright (c) 2025 Duckington Labs
Parameters:

{ "num_ctx": 1024, "stop": ["<|im_start|>", "<|im_end|>"] }
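These parameters are what a Modelfile would declare when building a derivative of this model; a sketch of the equivalent Modelfile directives (the FROM tag below is an assumption, not taken from the published Modelfile):

```
FROM DuckingtonLabs/tinyStories

# Default context window and the ChatML-style stop tokens listed above
PARAMETER num_ctx 1024
PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"
```

Building from such a file with `ollama create` produces a new local model that inherits these defaults.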
Template:

{{ if .System }}<|im_start|>system {{ .System }}<|im_end|> {{ end }}{{ if .Prompt }}<|im_start|>user

Readme

This is a 110M-parameter Llama 2-architecture model trained on the TinyStories dataset. The weights are converted from karpathy/tinyllamas; see the llama2.c project for more details.