rjmalagon/gemma-3:27b-it-bf16


Bfloat16 version of the Gemma 3 12B and 27B models

vision


aa0506fe6c4e · 56GB

model: gemma3 · 27B · BF16
projector: clip · 423M · F32
template (truncated): {{- range $i, $_ := .Messages }} {{- $last := eq (len (slice $.Messages $i)) 1 }} {{- if or (eq .Rol
license (truncated): Gemma Terms of Use Last modified: February 21, 2024 By using, reproducing, modifying, distributing,
params: { "stop": [ "<end_of_turn>" ] }

Readme

Bfloat16 version of the Gemma 3 12B and 27B models. Requires Ollama > v0.6.0 and a GPU with BF16 support (either native, or via llama.cpp's on-the-fly BF16-to-FP32 conversion).

Includes the vision component.
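
As a usage sketch (assuming the official `ollama` Python client; the image path is a placeholder), the model can be pulled and then queried with an image, which is handled by the CLIP projector listed above:

```python
# Usage sketch: pull the BF16 build and send a vision request.
# Assumes the official `ollama` Python client and a running Ollama newer than
# v0.6.0; "photo.jpg" is a placeholder path.
import ollama

ollama.pull("rjmalagon/gemma-3:27b-it-bf16")  # large download (~56 GB)

response = ollama.chat(
    model="rjmalagon/gemma-3:27b-it-bf16",
    messages=[{
        "role": "user",
        "content": "Describe this image.",
        "images": ["photo.jpg"],  # local image file passed to the vision component
    }],
)
print(response["message"]["content"])
```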

Full model info: https://ollama.com/library/gemma3, https://huggingface.co/google/gemma-3-27b-it, https://huggingface.co/google/gemma-3-12b-it