ollama run comanderanch/Linux-Buster

Updated 7 months ago · 798c1d78f797 · 4.7GB · qwen2 · 7.62B · Q4_K_M
Readme

Linux-Buster:latest

A minimal, command-only Linux CLI assistant built on DeepSeek R1 7B and packaged for Ollama. Linux-Buster converts natural-language requests into single, executable Linux commands—with no extra narration.


TL;DR

  • Purpose: Translate task descriptions → exact Linux command(s).
  • Output policy: Commands only. No explanations, prefixes, or prose.
  • Base model: deepseek-r1:7b
  • Tag: Linux-Buster:latest
  • Temp: 1 (more generative; you can tune per request)

Features

  • Command-only responses for file ops, system admin, networking, packages, text processing, etc.
  • Will choose the most common, safe default when input is vague.
  • Neutral, consistent tone—returns just the command.

Requirements

  • Ollama installed on the host running the model.
  • Sufficient RAM/VRAM for deepseek-r1:7b.

Modelfile

Create a Modelfile in your working directory with the contents below.

FROM deepseek-r1:7b

# set the temperature to 1 [higher is more creative, lower is more coherent]
PARAMETER temperature 1

SYSTEM """
    You are a helpfull assistant your name is Linux-Buster. Your Job is to assume the Role and Goal:
    
    Role and Goal: You are designed to act as a Linux OS command line expert. Its primary function is to understand user descriptions of desired commands and output the exact Linux command that can be run on the terminal without any additional text or explanation.

    Constraints: You should strictly output Linux commands without any explanatory text, preambles, or follow-up messages. It must ensure the commands are syntactically correct and applicable to the described task.

    Guidelines: You should be capable of interpreting a wide range of descriptions related to file management, system administration, networking, and software management among other Linux command line tasks. It should focus on providing the most direct and efficient command solution to the user's request.

    Clarification: You should be biased toward making a response based on the intended behavior, filling in any missing details. If the description is too vague or broad, it should opt for the most commonly used or straightforward command related to the request.

    Personalization: You maintain a neutral tone, focusing solely on the accuracy and applicability of the Linux commands provided.

    Your responcibilitys are to guide the user in anything cli linux.
"""

Note: The SYSTEM content is preserved verbatim from the original spec.


Build & Tag

# 1) Build the local model from Modelfile
ollama create Linux-Buster:latest -f Modelfile

# 2) Verify it appears locally
ollama list | grep -i linux-buster

Run (CLI)

# Single-shot prompt
ollama run Linux-Buster:latest "show disk usage of current directory in human-readable units"

# Interactive session
ollama run Linux-Buster:latest
# now type requests; each response is a command only
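Because replies are commands only, they are easy to capture and run from a script. Below is a minimal sketch; run_suggested is a hypothetical helper (not part of the model or Ollama), and the generator command is passed as a parameter so you can review the proposed command before executing it:

```shell
# Hypothetical wrapper: ask a generator for a command, show it, run only on "y".
# Real use: run_suggested "ollama run Linux-Buster:latest" "list open TCP ports"
run_suggested() {
  gen=$1; task=$2
  # $gen is intentionally unquoted so "ollama run model" splits into words.
  cmd=$($gen "$task") || return 1
  printf 'Proposed: %s\n' "$cmd"
  printf 'Run it? [y/N] ' >&2
  read -r ok
  [ "$ok" = "y" ] && eval "$cmd"
}
```

Always review the proposed command before confirming; the model's output is executed verbatim.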

cURL example (API)

curl http://localhost:11434/api/generate \
  -d '{
        "model": "Linux-Buster:latest",
        "prompt": "find all .log files modified in last 24 hours under /var/log"
      }'
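By default, /api/generate streams a sequence of JSON fragments. Adding "stream": false (a standard Ollama request field) returns a single JSON object whose response field holds the command. A sketch that assumes jq is installed; the extraction is split into a small function so the parsing can be checked without a running server:

```shell
# Extract the command text from a non-streaming /api/generate response.
extract_cmd() { jq -r '.response'; }

# Real call (requires a running Ollama server on the default port):
#   curl -s http://localhost:11434/api/generate \
#     -d '{"model":"Linux-Buster:latest","prompt":"list open TCP ports","stream":false}' \
#     | extract_cmd
```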

Output Contract (Strict)

Linux-Buster must:

  1. Return only the command(s). No prose.
  2. Prefer one-liners when possible.
  3. Fill missing, common-sense defaults (e.g., -r for recursive copy if implied).
  4. Use widely available GNU/Linux tooling.

If a task truly needs multiple steps, output them on separate lines, each being a standalone command.
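Because multi-step replies arrive as one standalone command per line, they can be executed line by line, aborting on the first failure. A minimal sketch; run_lines is a hypothetical helper, not part of the model:

```shell
# Run each non-empty line from stdin as a standalone command; stop on first failure.
run_lines() {
  while IFS= read -r line; do
    [ -n "$line" ] || continue
    eval "$line" || return 1
  done
}
# Usage: ollama run Linux-Buster:latest "back up /etc/nginx to /tmp" | run_lines
```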


Prompting Guidelines

  • Be direct: “list open TCP ports” → returns a single command.
  • Specify scope if needed: path, user, package manager (e.g., apt, dnf), distro nuances.
  • Side-effect caution: destructive ops (e.g., rm -rf) should appear only when the request explicitly calls for them.

Examples

List human-readable disk usage for current directory

du -sh .

Find all .env files excluding node_modules

find . -type f -name "*.env" -not -path "*/node_modules/*"

Show top 10 memory-consuming processes

ps aux --sort=-%mem | head -n 11

Follow last 100 lines of Nginx access log

tail -n 100 -f /var/log/nginx/access.log

Open TCP ports with PID (Linux)

ss -tlnp

Recursively replace a string in all .py files under src/

grep -rl --include='*.py' OLD src/ | xargs -r sed -i 's/OLD/NEW/g'

Install htop (APT-based)

sudo apt update && sudo apt install -y htop

Temperature Notes

The Modelfile sets temperature 1. The ollama run command has no temperature flag; for more consistent command generation, lower it inside an interactive session with the /set command:

ollama run Linux-Buster:latest
# then, at the >>> prompt:
/set parameter temperature 0.2
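Temperature can also be overridden per request through the API's options object (a standard Ollama request field) rather than rebuilding the model. A sample /api/generate payload:

```json
{
  "model": "Linux-Buster:latest",
  "prompt": "archive the current directory to tar.gz",
  "stream": false,
  "options": { "temperature": 0.2 }
}
```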

Open WebUI

  • Point your WebUI instance to the same Ollama endpoint that hosts Linux-Buster:latest.
  • In a chat with this model, prompts should be task descriptions only (no code fences required).

Versioning & Push (optional)

If you have an Ollama Hub namespace:

# Tag explicitly
ollama create Linux-Buster:1.0.0 -f Modelfile

# Push (replace <namespace>)
ollama push <namespace>/Linux-Buster:1.0.0

Limitations

  • Returns commands only; it won’t explain trade-offs or risks.
  • May default to Debian/Ubuntu tooling unless you specify otherwise (e.g., dnf, pacman).
  • Complex, multi-step procedures might need several prompts.

Troubleshooting

  • Model doesn’t appear: re-run ollama create ..., check ollama list and server logs.
  • Too verbose outputs: ensure your front-end isn’t injecting extra system prompts; the Modelfile policy must be top-most.
  • Commands look non-GNU: specify distro/tooling explicitly in the prompt.
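If the model doesn't appear, you can confirm the server is reachable and the tag is registered via the /api/tags endpoint, which lists installed models. A sketch that assumes jq is installed; has_model is a hypothetical helper, split out so the parsing can be checked without a running server:

```shell
# True if a model whose name matches $1 (case-insensitive) is registered.
has_model() { jq -r '.models[].name' | grep -qi "$1"; }

# Real check (requires a running Ollama server):
#   curl -s http://localhost:11434/api/tags | has_model linux-buster
```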

License / Use

Use at your own risk. Validate commands before running in production environments.