This model is archived and no longer maintained.

falcon

A large language model built by the Technology Innovation Institute (TII) for use in summarization, text generation, and chatbots.

7B 40B 180B

32.5K Pulls Updated 7 months ago



Falcon is a family of high-performing large language models built by the Technology Innovation Institute (TII), a research center under the Abu Dhabi government's Advanced Technology Research Council.


ollama run falcon "Why is the sky blue?"


curl -X POST http://localhost:11434/api/generate -d '{
  "model": "falcon",
  "prompt": "Why is the sky blue?"
}'

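The generate endpoint streams its answer as one JSON object per line, each carrying a "response" fragment and a "done" flag. A minimal sketch in Python of stitching those fragments back together (the helper only parses already-received lines, so it can be tried without a running server; the two sample chunks below are fabricated for illustration):

```python
import json

def collect_response(json_lines):
    """Concatenate the "response" fragments from Ollama's streaming
    /api/generate output (one JSON object per line) into full text."""
    parts = []
    for line in json_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

# Fabricated example chunks in the streaming format described above:
stream = [
    '{"model":"falcon","response":"The sky is blue because","done":false}',
    '{"model":"falcon","response":" of Rayleigh scattering.","done":true}',
]
print(collect_response(stream))
# → The sky is blue because of Rayleigh scattering.
```

Pass `"stream": false` in the request body instead if you prefer a single JSON reply over line-by-line chunks.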
Parameter counts

Parameter count    Recommended memory    Command
7 billion          8GB                   ollama run falcon:7b
40 billion         32GB                  ollama run falcon:40b
180 billion        192GB                 ollama run falcon:180b
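As a rough sanity check on the table above, the weights alone scale linearly with parameter count and bit width. The sketch below assumes 4-bit quantized weights (an assumption on my part; check the tag you actually pull for its quantization) and shows why the recommended memory sits comfortably above the raw weight size, leaving headroom for the KV cache, activations, and the rest of the system:

```python
def quantized_weight_gb(params_billion, bits=4):
    """Approximate in-memory size of just the weights, assuming
    `bits`-bit quantization. This is a back-of-the-envelope
    estimate, not Ollama's official sizing rule."""
    return params_billion * 1e9 * bits / 8 / 1e9

for params, recommended in [(7, 8), (40, 32), (180, 192)]:
    print(f"falcon:{params}b -> ~{quantized_weight_gb(params):.1f} GB weights, "
          f"{recommended} GB recommended")
```

The gap between the estimate and the recommendation grows with model size because context-dependent memory (KV cache) and safety margin grow too.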


chat Chat models are fine-tuned on a mix of several large-scale conversational and instruction datasets.
instruct Instruct models follow instructions and are fine-tuned on the Baize instructional dataset.
text Text models are the base foundation models without any fine-tuning for conversation, and are best used for simple text completion.

Falcon 180B

As of September 2023, the 180-billion-parameter Falcon 180B is the best-performing openly released LLM, sitting somewhere between OpenAI's GPT-3.5 and GPT-4. Running Falcon 180B requires a powerful system with at least 192GB of total memory.

Note: Falcon 180B is released under a different license than its smaller siblings that restricts commercial use under certain conditions. See the model details and license for more information.

More information