Private Chat Hub

Privacy-first AI on your terms. Own your conversations.

Your AI, Your Data, Your Control

Private Chat Hub is a privacy-first Android app for chatting with self-hosted and on-device AI models. Connect to Ollama, LM Studio, or OpenCode — or run entirely on-device with Gemma/Gemini Nano. Your conversations never leave your control.

🎯 The Problem

Cloud AI services lock you into their infrastructure and send your conversations to corporate servers. Private Chat Hub is different — all your conversations stay on your devices and infrastructure, whether you're using a local server or a completely offline on-device model.

✨ Why Private Chat Hub?

| Feature | Private Chat Hub | ChatGPT / Claude |
|---|---|---|
| Data Ownership | ✅ 100% User Controlled | ❌ Cloud Storage |
| Model Choice | ✅ Any Model (Ollama, LM Studio, On-Device) | ❌ Provider Locked |
| Works Fully Offline | ✅ On-Device Models | ❌ Internet Required |
| Tool / Function Calling | ✅ Built-in (Web Search, Location, URL Fetch…) | ⚠️ Platform Dependent |
| Vision / Image Input | ✅ Supported | ✅ Supported |
| Text-to-Speech | ✅ Built-in TTS | ❌ Not Included |
| Cost | ✅ Free (self-hosted) | 💰 Subscription |
| Open Source | ✅ MIT License | ❌ Proprietary |

🤖 Supported AI Backends

Connect to the backend that fits your setup — or combine several at once:

  • Ollama — Any model from the Ollama library (Llama 3, Mistral, Gemma, Qwen, DeepSeek, and more). Tool calling fully supported.
  • LM Studio — Run any GGUF model locally via LM Studio's desktop app. Connect over your local network.
  • OpenCode — Route requests through the OpenCode server to cloud providers (OpenAI, Anthropic, Google). Tool calling handled by the provider.
  • On-Device (LiteRT / Gemma) — Completely offline inference using Gemma or Gemini Nano running directly on your Android device. No server needed.
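To make the server-backed options concrete, here is a minimal sketch (not the app's actual client code) of building a streaming chat request for two of the backends above. It assumes the upstream default ports — 11434 for Ollama's native `/api/chat` endpoint and 1234 for LM Studio's OpenAI-compatible server — which may differ in your setup.

```python
import json

# Default endpoints; ports are the upstream defaults and are assumptions
# about your local setup, not values baked into Private Chat Hub.
BACKENDS = {
    "ollama": "http://localhost:11434/api/chat",
    "lmstudio": "http://localhost:1234/v1/chat/completions",  # OpenAI-compatible
}

def build_chat_request(backend: str, model: str, prompt: str) -> tuple[str, bytes]:
    """Return (url, JSON body) for a streaming chat request.

    Both backends accept an OpenAI-style messages array; Ollama's
    native /api/chat uses the same shape for this simple case.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": True,  # ask the server to stream tokens as they are generated
    }
    return BACKENDS[backend], json.dumps(payload).encode("utf-8")

url, body = build_chat_request("ollama", "llama3", "Hello!")
```

POST the body to the returned URL with `Content-Type: application/json` and the server streams the reply incrementally.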

🚀 Core Features

  • Real-Time Streaming — Responses stream token by token as the model generates them
  • Tool / Function Calling — AI can call built-in tools: web search, current time, device location, URL fetch, and notifications
  • Web Search Integration — Powered by Jina AI; results include clickable source links
  • Vision & Image Input — Attach photos or files; vision-capable models can see and describe them
  • Text-to-Speech — Have any AI response read aloud with built-in TTS streaming
  • Model Comparison — Run the same prompt side-by-side across two or more models
  • System Prompt per Conversation — Set a custom persona or instructions for each chat
  • Project Workspaces — Organize conversations into named projects
  • Offline Message Queue — Messages sent while offline are queued and delivered when reconnected
  • Conversation Export — Export chats as JSON or Markdown; you own your history
  • Markdown + LaTeX Rendering — Rich formatted responses with math support
  • Thinking / Extended Reasoning — Support for reasoning-focused models with visible thought traces
  • Multiple Model Switching — Swap models instantly within any conversation
  • Material Design 3 — Beautiful dark UI with adaptive theming
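To illustrate what token-by-token streaming looks like on the wire, here is a hedged sketch (not the app's actual code) of reassembling a reply from Ollama-style NDJSON chunks, where each line is a JSON object carrying one content fragment and a `done` flag:

```python
import json
from typing import Iterable

def assemble_stream(ndjson_lines: Iterable[str]) -> str:
    """Concatenate content fragments from an Ollama-style NDJSON stream.

    Each line looks like {"message": {"content": "..."}, "done": false};
    the final chunk carries "done": true.
    """
    parts = []
    for line in ndjson_lines:
        if not line.strip():
            continue  # skip keep-alive blank lines
        chunk = json.loads(line)
        parts.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(parts)

sample = [
    '{"message": {"content": "Hel"}, "done": false}',
    '{"message": {"content": "lo!"}, "done": false}',
    '{"message": {"content": ""}, "done": true}',
]
print(assemble_stream(sample))  # prints "Hello!"
```

In a real client each fragment would be appended to the UI as it arrives, which is what makes the response appear word by word instead of all at once.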

👥 Built For

Privacy Advocates — Run your own infrastructure; no data leaves your network

AI Developers — Test and compare models rapidly, experiment with tool calling, inspect streaming responses

Power Users — Advanced features: tool toggles, model comparison, project workspaces, TTS, LaTeX

Offline Users — Fully on-device inference with Gemma/Gemini Nano, no internet required

⚡ Get Started

Download the APK, connect to your preferred backend, and start chatting. No cloud sign-up required.

📥 Download Latest Release →

📚 Learn More

All Features • About the Project • Installation Guide • API Comparison • GitHub