6 Downloads · Updated 1 year ago
ollama run murtsu/marko
curl http://localhost:11434/api/chat -d '{
  "model": "murtsu/marko",
  "messages": [{"role": "user", "content": "Hello!"}]
}'
from ollama import chat

response = chat(
    model='murtsu/marko',
    messages=[{'role': 'user', 'content': 'Hello!'}],
)
print(response.message.content)
import ollama from 'ollama'

const response = await ollama.chat({
  model: 'murtsu/marko',
  messages: [{ role: 'user', content: 'Hello!' }],
})
console.log(response.message.content)
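By default the `/api/chat` endpoint streams its reply as newline-delimited JSON objects, each carrying an incremental `message.content` chunk until a final object with `"done": true`. A minimal sketch of assembling such a stream (the sample lines below are illustrative, not actual model output):

```python
import json

def assemble_stream(lines):
    # Each line is one JSON object from the streaming /api/chat response;
    # concatenate the incremental content chunks until "done" is true.
    parts = []
    for line in lines:
        obj = json.loads(line)
        if not obj.get("done"):
            parts.append(obj["message"]["content"])
    return "".join(parts)

# Illustrative sample of what the server might stream back:
sample = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": false}',
    '{"done": true}',
]
print(assemble_stream(sample))  # prints "Hello!"
```

To receive a single complete response instead, pass `"stream": false` in the request body.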
f5429c64af36 · 4.7GB
A Llama 3 based model with chat that you can ask about me. See if you can get it to hallucinate or give errors. Let me know!