Bfloat16 version of the Microsoft Phi4 14b model

Updated 2 months ago


Bfloat16 version of the Microsoft Phi-4 14B model. Requires Ollama newer than v0.5.5 and a GPU that supports BF16, either natively or via llama.cpp's on-the-fly BF16-to-FP32 conversion.
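A minimal usage sketch. The model tag below is a placeholder assumption, since this page does not state the published name; substitute the actual tag shown on this model's ollama.com page:

```shell
# Confirm the installed Ollama is newer than v0.5.5
ollama --version

# Pull and run the BF16 build. "phi4-bf16" is a placeholder tag;
# replace it with this model's actual name on ollama.com.
ollama pull phi4-bf16
ollama run phi4-bf16 "Explain bfloat16 in one sentence."
```

If your GPU lacks native BF16 support, llama.cpp converts the weights to FP32 on the fly, at a cost in memory and speed.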

Full model info is available at https://ollama.com/library/phi4 and https://huggingface.co/microsoft/phi-4/