AiSolux
AiSolux is a unified chat workspace for using multiple AI models from one interface while keeping control over your data. It’s built for developers and power users who want to switch between hosted providers and self-hosted local models without changing tools. The focus is a practical, fast chat experience with rich rendering and reusable prompts.
Demo: https://chat.devsolux.net/
Key features
- Multi-provider model support — Use different AI providers from a single app and switch models as needed.
- Custom model endpoints — Connect your own model or compatible API endpoint.
- Local-first data handling — Keeps your chat data stored locally within the app.
- Local LLM integration — Works with self-hosted runtimes such as Ollama, RWKV-Runner, and LocalAI.
- Rich Markdown output — Renders LaTeX, Mermaid diagrams, code blocks, and syntax highlighting.
- Streaming responses — Displays tokens as they arrive for a more responsive feel.
- Prompt templates and tools — Create, reuse, and share prompt-based tools for common workflows.
- Conversation history compression — Automatically condenses long threads to stay usable and reduce token usage.
- Flexible sharing — Export chats as an image or share via ShareGPT.
- Multilingual experience — Supports a wide range of languages, including Türkçe, Deutsch, English, Español, Français, العربية, 中文, 日本語, and more.
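The streaming behaviour above can be sketched in a few lines; a minimal illustration in Python, where `fake_stream` is a stand-in for a real provider's streaming API (not part of AiSolux itself):

```python
def render_stream(chunks):
    """Accumulate streamed text chunks, displaying each as soon as it arrives."""
    text = ""
    for chunk in chunks:
        text += chunk
        print(chunk, end="", flush=True)  # show tokens immediately, no buffering
    return text

# Hypothetical chunk source standing in for a provider's streaming response.
fake_stream = iter(["Hel", "lo, ", "wor", "ld!"])
reply = render_stream(fake_stream)
# reply == "Hello, world!"
```

In a real client the chunks would come from a server-sent-events or chunked-HTTP response, but the accumulate-and-flush loop is the same idea.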
Supported technologies
- Model providers
  - OpenAI (GPT models)
  - Anthropic (Claude)
  - Google (Gemini)
  - Custom provider endpoints
- Local LLM runtimes
  - Ollama
  - RWKV-Runner
  - LocalAI
- Rendering
  - Markdown
  - LaTeX
  - Mermaid
  - Code highlighting
- Platforms
  - Web (PWA)
  - Desktop (Windows, macOS, Linux)
  - Mobile (iOS, Android)
- Sharing
  - Image export
  - ShareGPT
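As a concrete example of the local-runtime integration listed above, Ollama serves an HTTP API on `localhost:11434` by default; the sketch below builds the request a client might send to its `/api/generate` endpoint. The model name and prompt are placeholders, and actually sending the request requires a running Ollama instance:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(model: str, prompt: str, stream: bool = True):
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": stream}
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Placeholder model name; send with urllib.request.urlopen(req) against a live server.
req = build_request("llama3", "Summarize this repo in one sentence.")
```

The same shape works for any app that speaks Ollama's API; switching to another local runtime mostly means changing the URL and payload schema.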
Use cases
- Compare outputs across different model providers in a single conversation.
- Run private workflows against local models while keeping data on-device.
- Maintain a prompt library for recurring tasks like refactoring, writing, and debugging.
- Share conversation snapshots with teammates for async review.
- Keep long-running project chats manageable with automatic history compression.
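The automatic history compression mentioned above can be approximated with a simple budget-based strategy; a minimal sketch, assuming messages are role/content dicts. The word-count "tokenizer" and the summary placeholder are crude stand-ins, since a real implementation would use the model's tokenizer and summarize the dropped turns with a model:

```python
def compress_history(messages, budget=1000):
    """Keep the most recent messages that fit a token budget,
    collapsing everything older into a single summary placeholder."""
    def tokens(msg):
        return len(msg["content"].split())  # crude stand-in for a real tokenizer

    kept, used = [], 0
    for msg in reversed(messages):  # walk newest-first
        cost = tokens(msg)
        if used + cost > budget:
            break
        kept.append(msg)
        used += cost
    kept.reverse()
    dropped = len(messages) - len(kept)
    if dropped:
        # A real implementation would summarize the dropped turns instead.
        kept.insert(0, {"role": "system",
                        "content": f"[{dropped} earlier messages condensed]"})
    return kept

history = [
    {"role": "user", "content": "one two three"},
    {"role": "assistant", "content": "four five"},
    {"role": "user", "content": "six"},
]
compact = compress_history(history, budget=3)
```

With a budget of 3, only the last two messages fit, so the oldest is replaced by the condensed-history marker.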
Notes
- Runs as a web app with PWA support and can also be installed on desktop and mobile.