granite4-obsidian-tiny-h

A lightweight, OS-neutral system assistant model optimized for multi-platform command-line environments.
Built from Granite 4, tuned for precision, consistency, and responsiveness across macOS, Linux, and other UNIX-like systems.
Designed for integration with the Obsidian CLI by Aurora Foundation as a universal local AI assistant.


🧩 Model Details

  • Base Model: Granite 4
  • Variant: Tiny (optimized for low memory and fast inference)
  • Focus: Cross-platform system administration, DevOps, and automation
  • Framework: Ollama
  • License: Refer to the Granite 4 base model license terms

⚙️ Intended Use

This model is optimized for:

  • macOS, Linux, and UNIX-like environments
  • Cross-platform DevOps and automation tasks
  • Local and offline system assistance
  • Integration with terminal-based AI tools

🚀 Quick Start

Run directly with Ollama:

ollama run granite4-obsidian-tiny-h "how do I check disk usage across systems?"

Or via the Obsidian CLI:

obsidian --model granite4-obsidian-tiny-h "how do I find the system uptime?"
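
To have the model available ahead of time, for example for offline use, it can be pulled first. A minimal sketch, assuming the model is published under the tag shown on this page:

# download the model once so later runs work without a network connection
ollama pull granite4-obsidian-tiny-h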

🧠 Behavior

  • Produces OS-neutral, structured technical responses (see the example below)
  • Markdown-enhanced output with syntax highlighting
  • Balances precision with lightweight performance
  • Tuned to reduce hallucinated commands and platform-specific ambiguity
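
For example, a disk-usage question would typically be answered with commands that behave the same on macOS and Linux. An illustrative sketch of the expected style, not literal model output:

# summary of mounted filesystems (works on macOS and Linux)
df -h

# per-directory usage under the home directory
du -sh ~/*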

🧱 Integration

Seamlessly works with:

  • Obsidian CLI by Aurora Foundation
  • Rich renderer for terminal UI
  • Both streaming and batch inference modes (see the API sketch below)
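
For scripted or batch use outside the Obsidian CLI, the model can also be called through the standard Ollama HTTP API. A minimal sketch, assuming a default local Ollama server on port 11434 and the model tag shown on this page:

# single non-streamed response (batch mode)
curl http://localhost:11434/api/generate -d '{"model": "granite4-obsidian-tiny-h", "prompt": "how do I check open ports?", "stream": false}'

# token-by-token streaming (the API default)
curl http://localhost:11434/api/generate -d '{"model": "granite4-obsidian-tiny-h", "prompt": "how do I check open ports?"}'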

🛡️ Safety

  • Text-only model — does not execute commands
  • Always verify suggested commands before running
  • Built for professional and educational environments

📦 Example Prompts

how do I find the IP address of my device?
show me how to kill a process by name
what’s the command to check CPU info on macOS and Linux?
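
Any of these prompts can be passed straight to the model using the same invocation shown in Quick Start, for example:

ollama run granite4-obsidian-tiny-h "show me how to kill a process by name"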