Text-Based Consumer Interaction Simulator
Overview
This project simulates text-based consumer interactions with a focus on churn risk. It integrates demographics, historical purchasing data, psychographics, and emotional/cognitive signals to produce rich, realistic conversation logs. These logs can be used for analysis, model training, or to enhance customer engagement strategies.
Key Features
- Context-Rich Scenarios: Incorporates user demographics, purchase history, and psychological traits.
- Churn Risk Modeling: Focuses on detecting and exploring potential churn indicators in the conversation flow.
- Scalable Generation: Easily expandable to produce large volumes of conversation data.
- Multi-Purpose Outputs: Ideal for analytics, training chatbots, and testing recommendation systems.
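For a concrete sense of the output, one generated log entry might look like the record below; the field names are illustrative only and are not the repository's actual schema.

```python
# Hypothetical shape of one generated conversation log entry
# (field names are illustrative, not the repository's actual schema).
sample_record = {
    "customer": {
        "age": 34,
        "region": "US-West",
        "tenure_months": 18,
        "recent_purchases": ["annual_plan_renewal", "add_on_storage"],
        "psychographics": {"price_sensitivity": "high", "loyalty": "medium"},
    },
    "churn_risk_score": 0.72,
    "conversation": [
        {"role": "customer", "text": "I'm thinking about cancelling before the next billing cycle."},
        {"role": "agent", "text": "I'm sorry to hear that. Can I ask what's driving the decision?"},
    ],
}
```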
Installation & Usage
1. Ollama
Install Ollama: Follow the official Ollama installation guide for your environment.
Clone the Repository:
git clone https://github.com/your-username/text-based-consumer-churn-simulator.git
cd text-based-consumer-churn-simulator
Load into Ollama:
- Use the Ollama CLI or API to point to the conversation simulation scripts.
- Example commands (assuming a Modelfile in the repository root that packages the churn model):
ollama create consumer-churn-model -f Modelfile
ollama run consumer-churn-model "$(cat prompts/sample_prompt.txt)"
Customize: Tweak the prompt, model parameters, or data sources in the config/ directory to match your churn scenarios and demographic/psychographic details.
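If you prefer to drive the model from a script instead of the CLI, Ollama also exposes a local HTTP API. The sketch below assumes the Ollama server is running on its default port and that a model named consumer-churn-model (as created above) is available.

```python
import requests  # pip install requests

# Minimal sketch: send one prompt to a locally running Ollama server.
# Assumes a model named "consumer-churn-model" has been created via `ollama create`.
with open("prompts/sample_prompt.txt", "r", encoding="utf-8") as f:
    prompt = f.read()

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "consumer-churn-model", "prompt": prompt, "stream": False},
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])
```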
2. GitHub
Clone the Repository:
git clone https://github.com/skylerseeg/text-based-consumer-churn-simulator.git
Dependencies: Install required Python packages:
pip install -r requirements.txt
Run Simulations:
python simulate_conversations.py
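As a rough sketch of what a generation loop of this kind can look like (the actual simulate_conversations.py may be organized differently), the snippet below fills a conversation template per persona and writes the results to a JSONL log; all names and fields here are hypothetical.

```python
import json

# Hypothetical sketch of a generation loop; names and fields are illustrative
# and may not match the repository's actual simulate_conversations.py.
personas = [
    {"name": "Dana", "tenure_months": 18, "price_sensitivity": "high"},
    {"name": "Lee", "tenure_months": 3, "price_sensitivity": "low"},
]
template = (
    "Simulate a support chat with {name}, a customer of {tenure_months} months "
    "with {price_sensitivity} price sensitivity who may be about to churn."
)

with open("conversations.jsonl", "w", encoding="utf-8") as out:
    for persona in personas:
        prompt = template.format(**persona)
        # Swap in a real model call here (e.g. the Ollama API example above).
        record = {"persona": persona, "prompt": prompt, "conversation": []}
        out.write(json.dumps(record) + "\n")
```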
Configuration:
- Modify config/settings.json to change demographics, purchase history patterns, or churn thresholds.
- Create new conversation templates in the templates/ folder to expand scenario variations.
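For orientation, the kinds of knobs such a settings file typically exposes are sketched below; the keys here are hypothetical, so check config/settings.json for the real schema.

```python
import json

# Hypothetical illustration of settings a churn simulator might expose;
# the actual keys in config/settings.json may differ.
example_settings = {
    "demographics": {"age_range": [25, 65], "regions": ["US-West", "EU"]},
    "purchase_history": {"min_orders": 1, "max_orders": 40},
    "churn_threshold": 0.6,          # personas scoring above this count as at-risk
    "conversations_per_persona": 3,
}
print(json.dumps(example_settings, indent=2))
```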
Development:
- We welcome pull requests! See the Contributing section for guidelines.
3. Hugging Face
Hugging Face Hub: The model and scripts can be published to Hugging Face for easy sharing and collaboration.
Installation:
pip install huggingface_hub
Usage:
- Log in to Hugging Face:
huggingface-cli login
- Push your model or dataset:
huggingface-cli repo create consumer-churn-simulator
git clone https://huggingface.co/your-username/consumer-churn-simulator
- Once pushed, you can load it in scripts or notebooks directly:
```python
from huggingface_hub import hf_hub_download

file_path = hf_hub_download(
    repo_id="your-username/consumer-churn-simulator",
    filename="simulate_conversations.py",
)
```
Inference:
- Use the scripts and config files in your local or cloud environment, or leverage the Hugging Face Inference API if you convert this into a hosted model.
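If you do convert the project into a hosted text-generation model, the huggingface_hub client can query it directly. The snippet below is a sketch that assumes such an endpoint exists under your-username/consumer-churn-simulator.

```python
from huggingface_hub import InferenceClient

# Sketch only: assumes the repo has been deployed as a hosted text-generation model.
client = InferenceClient(model="your-username/consumer-churn-simulator")
reply = client.text_generation(
    "Customer: I'm considering cancelling my subscription next month.\nAgent:",
    max_new_tokens=200,
)
print(reply)
```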
Contributing
We appreciate contributions of any kind. Whether you discover a bug, have an idea for improvement, or want to add new conversation templates:
- Fork the Repository
- Create a New Branch (feature/your-feature-name)
- Commit Changes and push the branch to your fork
- Open a Pull Request describing your changes
License
This project is licensed under the MIT License. You’re free to use, modify, and distribute the code, provided you include proper attribution.
Contact
For questions or feedback, please open an issue on GitHub, or reach out via the repository’s discussion board.
Happy simulating and exploring churn risk scenarios! If you find this project useful, consider giving the repository a star on GitHub or sharing your fork on Hugging Face.