A Llama 2 model fine-tuned to answer medical questions, trained on an open-source medical dataset.
7b
40K Pulls Updated 13 months ago
23a9f20d4a00 · 4.7GB
model (4.7GB)
arch: llama · parameters: 6.74B · quantization: Q5_K_S

params (31B)
{"stop":["User:","Assistant:"]}

template (45B)
{{ .System }}
User: {{ .Prompt }}
Assistant:
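The template above is how Ollama assembles the final prompt from the system message and the user's input, and the stop parameters keep generation from running into the next turn. As a rough illustration only (plain Python string formatting, not Ollama's actual Go template engine), the rendering amounts to:

```python
def render_prompt(system: str, prompt: str) -> str:
    """Approximate the Modelfile template:
    {{ .System }} / User: {{ .Prompt }} / Assistant:"""
    return f"{system}\nUser: {prompt}\nAssistant:"

# The stop tokens ("User:", "Assistant:") cut generation off before the
# model starts writing a new conversational turn on its own.
print(render_prompt("You are a helpful medical assistant.",
                    "What are common causes of a dry cough?"))
```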
Readme
MedLlama2 by Siraj Raval is a Llama 2-based model trained on the MedQA dataset to answer medical questions. It is not intended to replace a medical professional, but to provide a starting point for further research.
CLI
Open a terminal and run:

ollama run medllama2
API
Example:
curl -X POST http://localhost:11434/api/generate -d '{
  "model": "medllama2",
  "prompt": "A 35-year-old woman presents with a persistent dry cough, shortness of breath, and fatigue. She is initially suspected of having asthma, but her spirometry results do not improve with bronchodilators. What could be the diagnosis?"
}'
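By default, /api/generate streams its reply as newline-delimited JSON objects, each carrying a "response" fragment and a "done" flag. A minimal sketch of calling it from the Python standard library (assumes an Ollama server listening on localhost:11434):

```python
import json
import urllib.request

def join_stream(lines) -> str:
    """Join the "response" fragments from Ollama's newline-delimited
    JSON stream into one string, stopping at the final "done" chunk."""
    parts = []
    for line in lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

def generate(prompt: str, model: str = "medllama2",
             host: str = "http://localhost:11434") -> str:
    """POST a prompt to /api/generate and return the full reply."""
    payload = json.dumps({"model": model, "prompt": prompt}).encode()
    req = urllib.request.Request(f"{host}/api/generate", data=payload,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return join_stream(resp)  # one JSON object per streamed line
```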
Memory requirements
- 7b models generally require at least 8GB of RAM
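The 4.7GB download and the 8GB RAM figure follow from the quantization: Q5_K_S stores weights at roughly 5.5 bits each on average (an approximation; the exact rate varies by tensor type), so 6.74B parameters work out to about the listed file size, with the remaining headroom needed for the KV cache and runtime overhead:

```python
params = 6.74e9          # parameter count from the model card
bits_per_weight = 5.5    # rough average for Q5_K_S (assumption)

# File size in GB: parameters × bits per weight, converted to bytes
size_gb = params * bits_per_weight / 8 / 1e9
print(f"~{size_gb:.1f} GB")  # in the neighborhood of the listed 4.7GB
```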