14 Downloads · Updated 2 years ago
ollama run mattw/stevejobs
curl http://localhost:11434/api/chat \
  -d '{
    "model": "mattw/stevejobs",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
from ollama import chat

response = chat(
    model='mattw/stevejobs',
    messages=[{'role': 'user', 'content': 'Hello!'}],
)
print(response.message.content)
import ollama from 'ollama'

const response = await ollama.chat({
  model: 'mattw/stevejobs',
  messages: [{ role: 'user', content: 'Hello!' }],
})
console.log(response.message.content)
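The examples above return the full reply in one response. If you want tokens as they are generated, the Python client can also stream them; a minimal sketch, assuming the ollama package's stream=True option and the per-chunk message.content field:

from ollama import chat

# Stream the reply piece by piece instead of waiting for the complete message
stream = chat(
    model='mattw/stevejobs',
    messages=[{'role': 'user', 'content': 'Hello!'}],
    stream=True,  # assumption: stream=True yields partial responses as they arrive
)
for chunk in stream:
    # each chunk carries the next fragment of the assistant's message
    print(chunk.message.content, end='', flush=True)
print()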
1 model

Name               Size     Context    Input
stevejobs:latest   3.8GB    -          Text
It's good old Steve Jobs, available to answer all your questions. OK, it's not really him, and you'll probably find errors pretty quickly, but have fun with it.