109 Downloads Updated 1 year ago
ollama run lwk/r1
curl http://localhost:11434/api/chat -d '{
  "model": "lwk/r1",
  "messages": [{"role": "user", "content": "Hello!"}]
}'
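By default `/api/chat` streams its response as newline-delimited JSON, one object per line, with `"done": true` marking the final chunk. A minimal sketch of accumulating such a stream (the sample lines below are illustrative in shape, not captured from a real run):

```python
import json

# Illustrative NDJSON lines in the shape /api/chat streams by default;
# each line is a JSON object carrying one fragment of the reply.
stream = [
    '{"model":"lwk/r1","message":{"role":"assistant","content":"Hel"},"done":false}',
    '{"model":"lwk/r1","message":{"role":"assistant","content":"lo!"},"done":false}',
    '{"model":"lwk/r1","message":{"role":"assistant","content":""},"done":true}',
]

# Concatenate the content fragments to recover the full reply.
reply = "".join(json.loads(line)["message"]["content"] for line in stream)
print(reply)  # Hello!
```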
from ollama import chat

response = chat(
    model='lwk/r1',
    messages=[{'role': 'user', 'content': 'Hello!'}],
)
print(response.message.content)
import ollama from 'ollama'

const response = await ollama.chat({
  model: 'lwk/r1',
  messages: [{role: 'user', content: 'Hello!'}],
})
console.log(response.message.content)
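The chat endpoint is stateless: each request must carry the full conversation so far, so multi-turn chats are built client-side by appending every exchange to the `messages` list. A minimal sketch (the assistant reply here is a hypothetical placeholder, not model output; in practice it would come from `chat(model='lwk/r1', ...)`):

```python
# Start the conversation with the first user turn.
messages = [{'role': 'user', 'content': 'Hello!'}]

# Hypothetical reply standing in for the model's streamed answer.
assistant_reply = 'Hi there! How can I help?'
messages.append({'role': 'assistant', 'content': assistant_reply})

# The next user turn is sent together with the accumulated history.
messages.append({'role': 'user', 'content': 'Summarize our chat.'})

print(len(messages))                      # 3
print([m['role'] for m in messages])      # ['user', 'assistant', 'user']
```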
ollama launch claude --model lwk/r1
ollama launch codex --model lwk/r1
ollama launch opencode --model lwk/r1
ollama launch openclaw --model lwk/r1
1 model

Name       | Size  | Context | Input
r1:latest  | 2.0GB | 128K    | Text