# Private Chat Hub

Privacy-first AI on your terms. Own your conversations.

## Your AI, Your Data, Your Control
Private Chat Hub is a privacy-first Android app for chatting with self-hosted and on-device AI models. Connect to Ollama, LM Studio, or OpenCode – or run entirely on-device with Gemma/Gemini Nano. Your conversations never leave your control.
## 🎯 The Problem
Cloud AI services lock you into their infrastructure and send your conversations to corporate servers. Private Chat Hub is different: all your conversations stay on your devices and infrastructure, whether you're using a local server or a completely offline on-device model.
## ✨ Why Private Chat Hub?
| Feature | Private Chat Hub | ChatGPT / Claude |
|---|---|---|
| Data Ownership | ✅ 100% User Controlled | ❌ Cloud Storage |
| Model Choice | ✅ Any Model (Ollama, LM Studio, On-Device) | ❌ Provider Locked |
| Works Fully Offline | ✅ On-Device Models | ❌ Internet Required |
| Tool / Function Calling | ✅ Built-in (Web Search, Location, URL Fetch…) | ⚠️ Platform Dependent |
| Vision / Image Input | ✅ Supported | ✅ Supported |
| Text-to-Speech | ✅ Built-in TTS | ❌ Not Included |
| Cost | ✅ Free (self-hosted) | 💰 Subscription |
| Open Source | ✅ MIT License | ❌ Proprietary |
## 🤖 Supported AI Backends
Connect to the backend that fits your setup – or combine several at once:
- Ollama – Any model from the Ollama library (Llama 3, Mistral, Gemma, Qwen, DeepSeek, and more). Tool calling fully supported.
- LM Studio – Run any GGUF model locally via LM Studio's desktop app. Connect over your local network.
- OpenCode – Route requests through the OpenCode server to cloud providers (OpenAI, Anthropic, Google). Tool calling handled by the provider.
- On-Device (LiteRT / Gemma) – Completely offline inference using Gemma or Gemini Nano running directly on your Android device. No server needed.
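Server backends like Ollama stream their replies as newline-delimited JSON, one small content fragment per line, which is what makes token-by-token display possible. A minimal sketch of reassembling such a stream (the sample lines are illustrative, not captured from a real server):

```python
import json

def collect_stream(ndjson_lines):
    """Reassemble a streamed chat reply from newline-delimited JSON.

    Each line carries a small "content" fragment; a final object
    with "done": true marks the end of the reply.
    """
    text = []
    for line in ndjson_lines:
        chunk = json.loads(line)
        if chunk.get("done"):
            break
        text.append(chunk["message"]["content"])
    return "".join(text)

# Illustrative sample of a streamed reply (hypothetical fragments).
sample = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": false}',
    '{"done": true}',
]
print(collect_stream(sample))  # prints "Hello!"
```

In a real client the lines would arrive incrementally over HTTP, so the UI can render each fragment as soon as it is parsed rather than waiting for the full reply.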
## 🚀 Core Features
- Real-Time Streaming – Responses stream token by token as the model generates them
- Tool / Function Calling – AI can call built-in tools: web search, current time, device location, URL fetch, and notifications
- Web Search Integration – Powered by Jina AI; results include clickable source links
- Vision & Image Input – Attach photos or files; vision-capable models can see and describe them
- Text-to-Speech – Have any AI response read aloud with built-in TTS streaming
- Model Comparison – Run the same prompt side-by-side across two or more models
- System Prompt per Conversation – Set a custom persona or instructions for each chat
- Project Workspaces – Organize conversations into named projects
- Offline Message Queue – Messages sent while offline are queued and delivered when reconnected
- Conversation Export – Export chats as JSON or Markdown; you own your history
- Markdown + LaTeX Rendering – Rich formatted responses with math support
- Thinking / Extended Reasoning – Support for reasoning-focused models with visible thought traces
- Multiple Model Switching – Swap models instantly within any conversation
- Material Design 3 – Beautiful dark UI with adaptive theming
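To make the export feature concrete, here is a sketch of what rendering a conversation to Markdown might look like. The message schema (`role`/`content` dictionaries) is an assumption for illustration; the app's actual export format is not shown here:

```python
def export_markdown(title, messages):
    """Render a conversation as a Markdown document (illustrative sketch)."""
    lines = [f"# {title}", ""]
    for msg in messages:
        # One bolded speaker label per turn, followed by the message body.
        lines.append(f"**{msg['role'].capitalize()}:** {msg['content']}")
        lines.append("")
    return "\n".join(lines)

chat = [
    {"role": "user", "content": "What is Ollama?"},
    {"role": "assistant", "content": "A local LLM runtime."},
]
print(export_markdown("Demo", chat))
```

Because the export is plain Markdown or JSON, the history stays readable in any editor with no proprietary tooling required.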
## 👥 Built For
- Privacy Advocates – Run your own infrastructure; no data leaves your network
- AI Developers – Test and compare models rapidly, experiment with tool calling, inspect streaming responses
- Power Users – Advanced features: tool toggles, model comparison, project workspaces, TTS, LaTeX
- Offline Users – Fully on-device inference with Gemma/Gemini Nano, no internet required
## ⚡ Get Started
Download the APK, connect to your preferred backend, and start chatting. No cloud sign-up required.
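When connecting the app to a server backend, you point it at the host running your model server. As a rough guide, Ollama's server defaults to port 11434 and LM Studio's local server defaults to port 1234; the helper below is a hypothetical sketch, not code from the app:

```python
# Default local-server ports (check your own setup; these are the
# upstream defaults, not something the app dictates).
DEFAULT_PORTS = {"ollama": 11434, "lmstudio": 1234}

def backend_url(backend, host):
    """Build the base URL the app would be pointed at (illustrative)."""
    return f"http://{host}:{DEFAULT_PORTS[backend]}"

print(backend_url("ollama", "192.168.1.50"))  # http://192.168.1.50:11434
```

On a typical home network the host is the LAN IP of the machine running the server, and the phone must be on the same network to reach it.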
## 📚 Learn More
All Features • About the Project • Installation Guide • API Comparison • GitHub