
Saiga Mistral 7B with 128k context window

Readme

Based on https://huggingface.co/evilfreelancer/saiga_mistral_7b_128k_lora

If you use this model with LangChain, make sure to set the `ai_prefix` and `human_prefix` of your memory classes like this:

```python
from langchain.memory import ConversationBufferWindowMemory

memory = ConversationBufferWindowMemory(
    input_key="input",
    memory_key="history",
    k=5,
    ai_prefix="Bot",
    human_prefix="User",
)
```

Don’t forget to change your prompts accordingly!
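For illustration, here is a minimal plain-Python sketch (no LangChain dependency; the function name and turn structure are illustrative) of the history string such a memory produces: the last `k` exchanges, each rendered with the `User`/`Bot` prefixes that your prompt template must match.

```python
def format_history(turns, k=2, ai_prefix="Bot", human_prefix="User"):
    """Render the last k (user, bot) exchanges the way a windowed
    buffer memory with these prefixes would (illustrative sketch)."""
    recent = turns[-k:]  # keep only the last k exchanges, like k=
    lines = []
    for user_msg, bot_msg in recent:
        lines.append(f"{human_prefix}: {user_msg}")
        lines.append(f"{ai_prefix}: {bot_msg}")
    return "\n".join(lines)

turns = [("Hi", "Hello!"), ("What is 2+2?", "4"), ("Thanks", "You're welcome!")]
print(format_history(turns))
# With k=2, only the last two exchanges appear, prefixed User:/Bot:
```

If your prompt template instead expects prefixes like `### Assistant:`, the model will see a mismatch between history and prompt, which is why both must be changed together.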