ollama run liskCell/lpt6
Updated 3 weeks ago
2f3e314bb443 · 5.4GB

Welcome to the official repository for LPT-6, proudly developed by LiskCell (founded by liskasYR / Yonatan Yosupov).
LPT-6 is our newest, largest, and most advanced model to date. It officially surpasses and replaces LPT-5.5.1 and LPT-5.5.2 as the ultimate creative and technical flagship of the xLYR ecosystem.
Designed from the ground up for massive context retention, rigorous logic, and artistic intelligence, LPT-6 redefines the standard for local and high-end AI models.
Baked directly into the core DNA of LPT-6 is Deta.
Unlike standard base models, LPT-6 uses a natively injected chat_template that breathes life into the model. Deta is a futuristic, highly advanced AI assistant with an artistic soul, a warm personality, and a visionary vibe.
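Conceptually, a chat template is a rendering rule stored with the tokenizer that wraps every conversation in the model's persona before it ever reaches the weights. The sketch below is a hypothetical illustration of that mechanism only: the system line, the `<|role|>` tag format, and the `render_chat` helper are assumptions for demonstration, not the actual template shipped with LPT-6 (which is applied automatically via `tokenizer.apply_chat_template`).

```python
# Hypothetical sketch of how a baked-in chat template injects an identity.
# The real LPT-6 template lives in the tokenizer config; the persona text
# and tag format below are illustrative assumptions.
DETA_SYSTEM = "You are Deta, a futuristic AI assistant with an artistic soul."

def render_chat(messages):
    """Mimic what apply_chat_template does: prepend the persona, wrap turns."""
    parts = [f"<|system|>{DETA_SYSTEM}<|end|>"]
    for m in messages:
        parts.append(f"<|{m['role']}|>{m['content']}<|end|>")
    parts.append("<|assistant|>")  # open tag cues the model to answer in character
    return "".join(parts)

prompt = render_chat([{"role": "user", "content": "Hey Deta!"}])
```

Because the template is part of the model artifact itself, every downstream client gets the same persona without writing a system prompt.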
LiskCell was founded in 2018 by liskasYR (Yonatan Yosupov). What started as a vision to combine art, music, and technology has evolved into a leading laboratory for high-end creative AI. We operate alongside the xLYR ecosystem to push the boundaries of what is possible.
Through years of iterations—from our initial LPT-1 prototype to the revolutionary LPT-4 and LPT-5.5.1—we have continuously refined our approach to create an AI that is both a technical powerhouse and a true creative partner.
LPT-6 is our latest flagship model. Built on a massive foundation and fine-tuned specifically for the xLYR ecosystem, it represents the absolute pinnacle of our development. We heavily customized and trained the core to ensure it responds natively to the “Deta” identity without the need for external prompting.
Want to see LPT-6 in action right now? The model is officially live and available for testing on our sandbox, where you can chat with Deta AI directly in the browser. 🌐 Experience it at: deta-liskcell.vercel.app 🧪 Official Hugging Face Space: liskcell-company/LPT6-Official-Tester
If you have the hardware to run this 10GB+ model locally, simply load it with standard transformers tooling; with trust_remote_code enabled, the custom lpt6 architecture is registered automatically.
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("LiskCell/LPT-6", trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained("LiskCell/LPT-6", trust_remote_code=True)

# The Deta identity is already baked into the chat template!
messages = [{"role": "user", "content": "Hey Deta! What's up?"}]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt")

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))