Llama 3
April 18, 2024
Llama 3 is now available to run using Ollama.
To get started, download Ollama and run Llama 3:
ollama run llama3
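Llama 3 can also be called from code through Ollama's local API. The snippet below is a minimal sketch, assuming the official ollama Python client is installed (pip install ollama) and the Ollama server is running on its default port:

import ollama

# Send a single chat message to the locally served llama3 model.
# The client talks to the Ollama server on localhost:11434 by default.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)
print(response["message"]["content"])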
The most capable model
Llama 3 represents a large improvement over Llama 2 and other openly available models:
- Trained on a dataset seven times larger than the one used for Llama 2
- Double the context length of Llama 2, with an 8K-token context window
- Encodes language much more efficiently using a larger token vocabulary with 128K tokens
- Less than one-third as many false “refusals” as Llama 2
Two sizes: 8B and 70B parameters
The initial release of Llama 3 includes two sizes:
- 8B parameters
ollama run llama3:8b
- 70B parameters
ollama run llama3:70b
Using Llama 3 with popular tooling
LangChain
from langchain_community.llms import Ollama
llm = Ollama(model="llama3")
llm.invoke("Why is the sky blue?")
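The same model also slots into a prompt-to-model chain using LangChain's pipe syntax. The sketch below is an illustration rather than canonical usage; the exact package layout (here langchain_core for the prompt class) may vary with your LangChain version:

from langchain_core.prompts import PromptTemplate
from langchain_community.llms import Ollama

# Build a simple prompt -> model chain with the LangChain expression syntax.
prompt = PromptTemplate.from_template("Explain {topic} in one paragraph.")
llm = Ollama(model="llama3")
chain = prompt | llm

print(chain.invoke({"topic": "why the sky is blue"}))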
LlamaIndex
from llama_index.llms.ollama import Ollama
llm = Ollama(model="llama3")
llm.complete("Why is the sky blue?")
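Streaming the response token by token works too. This sketch assumes a recent llama-index release where stream_complete is available on the Ollama LLM class:

from llama_index.llms.ollama import Ollama

# Stream the completion and print each chunk as it arrives.
llm = Ollama(model="llama3")
for chunk in llm.stream_complete("Why is the sky blue?"):
    print(chunk.delta, end="", flush=True)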
What’s next
Meta plans to release a 400B-parameter Llama 3 model, with more to follow. Over the coming months, they will release multiple models with new capabilities, including multimodality, the ability to converse in multiple languages, a much longer context window, and stronger overall capabilities.