TinyLLaMA-Sugarcane is a fine-tuned small language model (SLM) focused on sugarcane production. πŸŒ±πŸ‡§πŸ‡· Built on top of TinyLLaMA-1.1B, it was trained on 2,000+ real-world Q&A examples from the sugar-energy industry.


🌱 TinyLLaMA-Sugarcane

Welcome to the first open-source LLM fine-tuned for sugarcane production! 🧠🌾

https://huggingface.co/infinitestack

This model is a fine-tuned version of TinyLLaMA, trained specifically on sugarcane-focused data. Developed by SciCrop as part of its commitment to open innovation in agriculture, this is one of the first domain-specific small language models (SLMs) created for the agribusiness sector.


🚜 Why Sugarcane?

Sugarcane is one of the most important crops in Brazil and globally β€” but most LLMs know very little about its specific production cycle, challenges, and terminology.

By fine-tuning TinyLLaMA on 2,000+ question/answer pairs from real-world sugarcane use cases, we aim to deliver:

  • βœ… Better accuracy
  • βœ… Clearer answers
  • βœ… Local deployment capabilities for agricultural experts, cooperatives, and researchers

πŸ” Model Details

  • Base model: TinyLLaMA-1.1B-Chat
  • Fine-tuned on: Domain-specific QA pairs related to sugarcane
  • Architecture: Causal LM with LoRA + QLoRA
  • Tokenizer: LLaMATokenizer
  • Model size: ~1.1B parameters
  • Format: Available in both HF standard and GGUF for local/Ollama use

πŸ§ͺ Try it locally with Ollama

We believe local models are the future for privacy-sensitive, domain-specific AI.

You can run this model locally using Ollama:

```shell
ollama run infinitestack/tinyllama-sugarcane
```

πŸ‘‰ Or explore the model directly:
https://ollama.com/infinitestack/tinyllama-sugarcane
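Beyond the interactive CLI, a locally running Ollama server exposes a REST API. The sketch below builds a request body for its `/api/generate` endpoint (the endpoint path and field names are Ollama's documented API; the helper function is our own illustration):

```python
import json

def build_generate_request(prompt: str,
                           model: str = "infinitestack/tinyllama-sugarcane") -> str:
    # JSON body for Ollama's /api/generate endpoint.
    # "stream": False asks for a single complete response instead of
    # a stream of partial tokens.
    return json.dumps({"model": model, "prompt": prompt, "stream": False})

payload = build_generate_request("How does soil pH affect sugarcane yield?")
# With `ollama serve` running, POST this payload to
# http://localhost:11434/api/generate to get the model's answer.
```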


🌐 About InfiniteStack

This model is part of InfiniteStack, a platform by SciCrop that helps companies in the agri-food-energy-environment chain create, train, and deploy their own AI and analytics solutions β€” securely and at scale.

πŸ“¦ InfiniteStack offers:

  • A containerized platform that runs on-prem or in private cloud
  • Full support for SLMs and LLMs using your real and private data
  • No/Low-code interfaces to Collect, Automate, Leverage, Catalog, Observe, and Track data pipelines and AI assets

🌐 Learn more: https://infinitestack.ai


🧠 Why Small Language Models (SLMs)?

SLMs are great when:

  • You need local inference (offline, on-device, or private)
  • Your domain is narrow and specific
  • You want full control over fine-tuning and usage
  • You care about speed, size, and cost-efficiency

Big isn’t always better. Sometimes, smart and focused beats giant and generic. πŸ’‘


🀝 Community & Open Innovation

This work reflects SciCrop’s ongoing commitment to the open-source ecosystem, and to creating useful, usable AI for real-world agribusiness.

Feel free to fork, contribute, fine-tune further, or use it in your own ag project.
We’d love to hear how you’re using it!


πŸ“¬ Questions or Contributions?

Ping us at:
πŸ“§ info@scicrop.com
🌐 https://scicrop.com
🌱 https://infinitestack.ai

Made with β˜•, 🌾 and ❀️ in Brazil
by @josedamico and the InfiniteStack team