19 Downloads Updated 1 year ago
```shell
ollama run goat/fhw
```
```shell
curl http://localhost:11434/api/chat \
  -d '{
    "model": "goat/fhw",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
```python
from ollama import chat

response = chat(
    model='goat/fhw',
    messages=[{'role': 'user', 'content': 'Hello!'}],
)
print(response.message.content)
```
```javascript
import ollama from 'ollama'

const response = await ollama.chat({
  model: 'goat/fhw',
  messages: [{role: 'user', content: 'Hello!'}],
})
console.log(response.message.content)
```
1 model

| Name       | Size  | Context | Input |
|------------|-------|---------|-------|
| fhw:latest | 4.1GB | 32K     | Text  |
This model replies to each prompt in the following JSON format:

```json
{
  "title": {"type": "string"},
  "artist": {"type": "string"},
  "material": {"type": "string"},
  "description": {"type": "string"}
}
```
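Because the model's reply is itself a JSON string, it can be parsed back into a structured object on the client side. A minimal Python sketch: the sample reply below is hypothetical stand-in text; in practice it would come from `response.message.content` after calling `chat` as shown above.

```python
import json

# Hypothetical reply text, standing in for response.message.content.
reply = (
    '{"title": "Starry Night", "artist": "Vincent van Gogh", '
    '"material": "Oil on canvas", '
    '"description": "A swirling night sky over a village."}'
)

record = json.loads(reply)

# Check that all four fields promised by the model's format are present.
expected = {"title", "artist", "material", "description"}
missing = expected - record.keys()
if missing:
    raise ValueError(f"model reply missing fields: {missing}")

print(record["title"], "by", record["artist"])
```

A `json.JSONDecodeError` from `json.loads` would indicate the model strayed from the advertised format, so wrapping the parse in a `try`/`except` is a reasonable safeguard in real use.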