123 Downloads · Updated 1 year ago
3 models

Name                             Size   Context   Input
wolfram-miquliz-120b-v2:Q2_K     44GB   32K       Text
wolfram-miquliz-120b-v2:IQ2_XXS  32GB   32K       Text
wolfram-miquliz-120b-v2:IQ2_XS   35GB   32K       Text
See https://huggingface.co/wolfram/miquliz-120b-v2.0-GGUF for the source GGUF quantizations and model details.
Pushed to Ollama using https://github.com/adrienbrault/hf-gguf-to-ollama.
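
Below is a minimal sketch of how one of the tags listed above might be queried with the official `ollama` Python client. It assumes the `ollama` package is installed (`pip install ollama`), a local Ollama server is running, and the chosen tag has already been pulled; the prompt text is illustrative only.

```python
import ollama

# Tag taken from the table above; assumes it has already been pulled,
# e.g. with `ollama pull wolfram-miquliz-120b-v2:IQ2_XXS`.
MODEL = "wolfram-miquliz-120b-v2:IQ2_XXS"

# Single-turn chat request against the local Ollama server
# (default endpoint http://localhost:11434).
response = ollama.chat(
    model=MODEL,
    messages=[
        {"role": "user", "content": "Explain in one sentence what a GGUF quantization is."}
    ],
)

# Print the assistant's reply text.
print(response["message"]["content"])
```

The smaller IQ2_XXS and IQ2_XS quantizations trade some quality for lower memory use, so the tag can be swapped for the Q2_K variant if enough RAM/VRAM is available.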