Vyxa is an open-source large language model (LLM) designed for Web3 developers, AI researchers, and decentralized applications. It provides cutting-edge capabilities for smart contract auditing, blockchain analytics, and DeFi automation.
Vyxa is available via Ollama, allowing you to run it locally with minimal setup.
Ollama is required to run Vyxa. On Linux, install it with the following command:
curl -fsSL https://ollama.com/install.sh | sh
On macOS or Windows, download and install Ollama from https://ollama.com/download.
Verify the installation:
ollama --version
Once Ollama is installed, pull the Vyxa model:
ollama pull VyxaFoundation/Vyxa
Run the model:
ollama run VyxaFoundation/Vyxa
Vyxa can also be accessed through the Ollama HTTP API, letting you integrate its capabilities into your own applications. Start the Ollama server:
ollama serve
Send a generation request to the /api/generate endpoint ("stream": false returns a single JSON object instead of a token stream):
curl -X POST "http://localhost:11434/api/generate" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "VyxaFoundation/Vyxa",
        "prompt": "Write a smart contract",
        "stream": false
      }'
Example response:
{
  "response": "pragma solidity ^0.8.0; contract Vyxa { ... }"
}
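The same endpoint can be called from any HTTP client. Below is a minimal Python sketch using the requests library; the model name comes from the commands above, and the prompt is only an illustration.

import requests

# Ask the local Ollama server (default port 11434) for a single,
# non-streaming completion from the Vyxa model.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "VyxaFoundation/Vyxa",
        "prompt": "Summarize common reentrancy risks in Solidity.",
        "stream": False,  # return one JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])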
To chat with Vyxa in real time, run the model without a prompt; this opens an interactive session:
ollama run VyxaFoundation/Vyxa
To get a single answer instead, pass the prompt directly on the command line:
ollama run VyxaFoundation/Vyxa "Explain the impact of AI in Web3."
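For multi-turn conversations from application code, Ollama also exposes an /api/chat endpoint that takes the conversation as a list of messages. A minimal Python sketch, with illustrative message content:

import requests

# Multi-turn chat against the local Ollama server. The messages list holds
# the whole conversation; append each reply before sending the next turn.
messages = [
    {"role": "user", "content": "What does an ERC-20 approve call do?"},
]
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={"model": "VyxaFoundation/Vyxa", "messages": messages, "stream": False},
    timeout=120,
)
resp.raise_for_status()
reply = resp.json()["message"]["content"]
messages.append({"role": "assistant", "content": reply})
print(reply)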
You can adjust sampling settings such as temperature, top_p, and the maximum number of generated tokens. Inside an interactive session, use the /set command:
/set parameter temperature 0.7
/set parameter top_p 0.9
When calling the API, pass the same settings in the options field, as in the sketch below.
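A sketch of the same settings over the API, again assuming the Python requests library; num_predict is Ollama's name for the maximum number of generated tokens.

import requests

# Generation request with explicit sampling options. "num_predict" caps how
# many tokens Vyxa may generate for this request.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "VyxaFoundation/Vyxa",
        "prompt": "Generate Solidity code for an ERC-20 token.",
        "stream": False,
        "options": {"temperature": 0.7, "top_p": 0.9, "num_predict": 512},
    },
    timeout=180,
)
resp.raise_for_status()
print(resp.json()["response"])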
✅ Vyxa-Next – Lightweight Model for Mobile & Edge AI
✅ Vyxa Chat – AI-Powered Web3 Conversational Model
✅ Decentralized AI Compute – Vyxa x Akash, Bittensor, Solana
✅ On-Chain AI Oracles for Smart Contracts
🔹 Model download is slow?
➡️ Use a VPN or a faster mirror.
🔹 Model runs out of memory?
➡️ Try running a lower quantization version (Q4, Q6).
🔹 Ollama is not recognized?
➡️ Ensure Ollama is installed and added to your system’s PATH.
💻 Website: Vyxa.org
📂 GitHub: Vyxa GitHub
𝕏 Twitter/X: @VyxaFoundation