A large language model built by the Technology Innovation Institute (TII) for use in summarization, text generation, and chatbots.

Falcon is a family of high-performing large language models built by the Technology Innovation Institute (TII), a research center under the Abu Dhabi government’s Advanced Technology Research Council, which oversees technology research.

CLI

ollama run falcon "Why is the sky blue?"

API

curl -X POST http://localhost:11434/api/generate -d '{
  "model": "falcon",
  "prompt": "Why is the sky blue?"
}'
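
By default, the generate endpoint streams its reply as a series of newline-delimited JSON objects. To receive the complete answer in a single JSON response instead, add "stream": false to the request body (a minimal variation on the example above, assuming the standard Ollama generate API):

# request a single, non-streaming JSON response
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "falcon",
  "prompt": "Why is the sky blue?",
  "stream": false
}'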

Parameter counts

Parameter count   Recommended memory   Command
7 billion         8GB                  ollama run falcon:7b
40 billion        32GB                 ollama run falcon:40b
180 billion       192GB                ollama run falcon:180b

Variations

chat: Chat models are fine-tuned on a mix of several large-scale conversational and instruction datasets.
instruct: Instruct models follow instructions and are fine-tuned on the Baize instructional dataset.
text: Text models are the base foundation models without any fine-tuning for conversation, and are best used for simple text completion.
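
A specific variation can be selected by appending it to the size tag. The exact tag names available are listed on the model’s tags page; the commands below are illustrative, assuming tags follow the size-variation pattern:

# run an instruction-tuned variant (tag name assumed for illustration)
ollama run falcon:7b-instruct "Write a haiku about the desert."
# run a base text-completion variant (tag name assumed for illustration)
ollama run falcon:40b-text "Once upon a time"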

Falcon 180B

As of September 2023, the 180 billion parameter model, Falcon 180B, is the best-performing openly released LLM, sitting somewhere between OpenAI’s GPT-3.5 and GPT-4. A powerful system with at least 192GB of total memory is recommended for running Falcon 180B.

Note: Falcon 180B is released under a different license than its smaller siblings, one that restricts commercial use under certain conditions. See the model details and license for more information.

More information