44.6K Downloads · Updated 1 year ago

8 model tags are available:

| Tag | Size | Context window | Input |
| --- | --- | --- | --- |
| `openbmb-minicpm-llama3-v-2_5:latest` | 6.0GB | 8K | Text |
| `openbmb-minicpm-llama3-v-2_5:IQ3_M` | 4.8GB | 8K | Text |
| `openbmb-minicpm-llama3-v-2_5:q2_K` | 4.2GB | 8K | Text |
| `openbmb-minicpm-llama3-v-2_5:q3_K_M` | 5.1GB | 8K | Text |
| `openbmb-minicpm-llama3-v-2_5:q4_K_M` | 6.0GB | 8K | Text |
| `openbmb-minicpm-llama3-v-2_5:q5_K_S` | 6.6GB | 8K | Text |
| `openbmb-minicpm-llama3-v-2_5:q8_0` | 9.6GB | 8K | Text |
| `openbmb-minicpm-llama3-v-2_5:fp16` | 17GB | 8K | Text |
Go to the release page and download the binary for your platform.
🔥 The ./ollama-linux-arm64 binary in particular was built on Debian, so it can run inside the Termux app on an Android phone.
Start the server:
./ollama-linux-x86_64 serve
Running this model:
ollama run hhao/openbmb-minicpm-llama3-v-2_5
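Once the server is up, you can also talk to it over the REST API instead of the interactive CLI. Below is a minimal sketch that builds a request body for Ollama's `/api/generate` endpoint; vision models take base64-encoded images in the `images` array. `photo.jpg` is a placeholder filename, and the guard line exists only so the sketch runs standalone; point it at a real image in practice.

```shell
# Build a request body for Ollama's /api/generate endpoint.
# "photo.jpg" is a placeholder; the guard below just lets the sketch
# run standalone -- substitute your own image file.
[ -f photo.jpg ] || printf 'placeholder' > photo.jpg
# base64-encode the image; `tr -d '\n'` keeps it portable (macOS base64
# has no -w0 flag), since the JSON value must be a single line.
IMG_B64=$(base64 < photo.jpg | tr -d '\n')
printf '{"model":"hhao/openbmb-minicpm-llama3-v-2_5","prompt":"Describe this image.","stream":false,"images":["%s"]}\n' "$IMG_B64" > request.json
```

With the server running you could then send it with `curl http://localhost:11434/api/generate -d @request.json`.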
# x86_64 arch
docker pull hihao/ollama-amd64
# arm64 arch
# docker pull hihao/ollama-arm64
docker run -d -v ./models:/root/.ollama -p 11434:11434 --name ollama hihao/ollama-amd64
docker exec -it ollama bash
ollama run hhao/openbmb-minicpm-llama3-v-2_5
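A note on the `-v ./models:/root/.ollama` flag above: the container keeps its model store in `/root/.ollama`, and the bind mount maps that to `./models` on the host, so downloaded weights survive the container itself. A brief sketch of the idea:

```shell
# The container writes its model store (blobs, manifests) to /root/.ollama;
# the bind mount maps that path to ./models on the host.
mkdir -p models
# After pulling a model inside the container, a later `docker run` with the
# same -v flag reuses the files in ./models instead of re-downloading them,
# even after `docker rm ollama`.
```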
Prepare both our llama.cpp fork and this Ollama fork.
git clone -b minicpm-v2.5 https://github.com/OpenBMB/ollama.git
cd ollama/llm
git clone -b minicpm-v2.5 https://github.com/OpenBMB/llama.cpp.git
cd ../
Here we give a macOS example. See the developer guide for other platforms.
brew install go cmake gcc
Optionally enable debugging and more verbose logging:
## At build time
export CGO_CFLAGS="-g"
## At runtime
export OLLAMA_DEBUG=1
Get the required libraries and build the native LLM code:
go generate ./...
Build ollama:
go build .
Start the server:
./ollama serve
Running this model:
ollama run hhao/openbmb-minicpm-llama3-v-2_5
Note: The Windows build for Ollama is still under development.
Install required tools:
$env:CGO_ENABLED="1"
go generate ./...
go build .
Start the server:
./ollama serve
Running this model:
ollama run hhao/openbmb-minicpm-llama3-v-2_5
In addition to the common Windows development tools described above, install CUDA after installing MSVC.
In addition to the common Windows development tools described above, install AMD's HIP package after installing MSVC.
Lastly, add the ninja.exe included with MSVC to the system path (e.g. C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\CMake\Ninja).
See the developer guide for Linux.