Bfloat16 version of the Gemma 3 12B and 27B models

Bfloat16 version of the Gemma 3 12B and 27B models. Requires Ollama > v0.6.0 and a GPU with BF16 support (either native, or via llama.cpp's on-the-fly BF16-to-FP32 conversion).
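Since the version requirement matters, a quick way to verify your install before pulling is to compare `ollama --version` against 0.6.0. The `version_gt` helper below is a small sketch, not part of Ollama itself:

```shell
#!/bin/sh
# version_gt A B: succeeds if dotted version A is strictly greater than B.
# Uses sort -V (GNU coreutils) for natural version ordering.
version_gt() {
    [ "$1" != "$2" ] && \
    [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$2" ]
}

# Extract the installed Ollama version and check it meets the > v0.6.0 requirement:
#   ver="$(ollama --version | grep -oE '[0-9]+\.[0-9]+\.[0-9]+' | head -n1)"
#   version_gt "$ver" "0.6.0" && echo "Ollama is new enough for this model"
```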

Includes the vision component.

Full model info: https://ollama.com/library/gemma3, https://huggingface.co/google/gemma-3-27b-it, and https://huggingface.co/google/gemma-3-12b-it