A fine-tuned Llama 2 model that answers medical questions, based on an open-source medical dataset.

8,715 Pulls · Updated 4 months ago

MedLlama2 by Siraj Raval is a Llama 2-based model trained on the MedQA dataset to provide answers to medical questions. It is not intended to replace a medical professional, but to serve as a starting point for further research.


Open the terminal and run:

    ollama run medllama2



curl -X POST http://localhost:11434/api/generate -d '{
  "model": "medllama2",
  "prompt": "A 35-year-old woman presents with a persistent dry cough, shortness of breath, and fatigue. She is initially suspected of having asthma, but her spirometry results do not improve with bronchodilators. What could be the diagnosis?"
}'
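By default, the /api/generate endpoint streams its reply as newline-delimited JSON: one object per line, each carrying a `response` text fragment, with the final object marked `"done": true`. A minimal Python sketch of collecting such a stream into one string (the helper name and the sample chunks below are illustrative, not part of the API):

```python
import json

def collect_stream(ndjson_lines):
    """Join the 'response' fragments from Ollama's streamed NDJSON chunks."""
    parts = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        parts.append(chunk.get("response", ""))
        if chunk.get("done"):  # the final chunk is flagged done=true
            break
    return "".join(parts)

# Example with chunks shaped like the API's streaming output:
sample = [
    '{"model": "medllama2", "response": "Possible ", "done": false}',
    '{"model": "medllama2", "response": "diagnoses include...", "done": false}',
    '{"model": "medllama2", "done": true}',
]
print(collect_stream(sample))  # Possible diagnoses include...
```

To get the whole reply in a single JSON object instead, add `"stream": false` to the request body.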

Memory requirements

  • 7B models generally require at least 8 GB of RAM
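As a rough back-of-the-envelope check (assuming 4-bit quantization, which Ollama commonly ships as the default), the weights alone of a 7B-parameter model occupy about 3.5 GB; the rest of the 8 GB figure is headroom for the context cache and runtime overhead:

```python
def approx_weight_gb(n_params_billions: float, bits_per_weight: float) -> float:
    """Approximate size of the model weights alone, in GB.

    Ignores the KV cache and runtime overhead, so actual RAM use is higher.
    """
    return n_params_billions * bits_per_weight / 8

print(approx_weight_gb(7, 4))   # 7B params at 4-bit -> 3.5 GB of weights
print(approx_weight_gb(7, 16))  # unquantized fp16 -> 14.0 GB
```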