# ChatForge
A modern AI chat application with advanced tool orchestration, an OpenAI-compatible API proxy, and unique features such as a judge system, conversation forking, and Playwright-powered browsing.

ChatForge is a full-stack AI chat application with a Next.js 15 frontend and a Node.js backend. It acts as an OpenAI-compatible API proxy with enhanced capabilities: server-side tool orchestration, conversation forking, model comparison with judge evaluation, Playwright-powered browser automation, and cross-platform desktop app support.
While OpenWebUI and LibreChat are excellent options, ChatForge distinguishes itself through unique features they don’t offer.
## Why ChatForge?
| Feature | ChatForge | OpenWebUI | LibreChat |
|---|---|---|---|
| Architecture | Node.js + SQLite (simple, single binary) | Python + complex stack | Node.js + MongoDB |
| Browser Automation | ✅ Playwright with SPA support | ❌ Basic fetch | ❌ Basic fetch |
| Model Comparison | ✅ Side-by-side with judge/evaluation | ❌ | ❌ |
| Conversation Forking | ✅ Fork at any message | ❌ | ❌ |
| Cross-Conversation Memory | ✅ Built-in Journal tool | ❌ | ❌ |
| Prompt Caching | ✅ Automatic cache breakpoints | ❌ | ❌ |
| Desktop App | ✅ Native Electron with auto-login | ❌ | ❌ |
| Checkpoint Persistence | ✅ Resume aborted streams | ❌ | ❌ |
## What Makes ChatForge Unique
- ⚖️ Judge/Evaluation System: Built-in automated model comparison with configurable judge models that provide numerical scores and reasoning for objective model evaluation. No other open-source chat UI offers this.
- 🌐 Enhanced WebFetch with Playwright: Unlike competitors' basic HTTP fetching, ChatForge uses real browser automation with specialized content extractors for Reddit, StackOverflow, and SPAs that block standard crawlers.
- 🔀 True Model Comparison Mode: Compare multiple models side by side with completely isolated conversation histories, not just message-by-message switching.
- 🍴 Conversation Forking: Fork a conversation at any message to explore alternative paths without losing the original thread.
- 📓 Journal Tool: Persistent cross-conversation memory that lets the AI store and retrieve notes across different chat sessions.
- 💾 Prompt Caching Optimization: Automatic cache breakpoint insertion (especially for Anthropic models) to reduce token costs and latency.
- 🔄 Streaming with Checkpoint Persistence: Abort a streaming response at any time with automatic state preservation, then resume or branch from any point.
- 🖥️ Native Desktop App: Cross-platform Electron app with auto-login and native packaging, not just a web wrapper.
- ⚡ Zero-Config Deployment: Single Docker image with SQLite (no external database required) instead of complex MongoDB/PostgreSQL setups.
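To make the prompt-caching feature concrete, here is a minimal sketch of what automatic cache breakpoint insertion can look like against the Anthropic Messages API. The `cache_control: { type: "ephemeral" }` field is real Anthropic API surface; the helper name `insertCacheBreakpoint` and its placement policy (mark the end of the stable prefix before the latest user turn) are illustrative assumptions, not ChatForge's actual implementation.

```typescript
type ContentBlock = {
  type: "text";
  text: string;
  cache_control?: { type: "ephemeral" };
};
type Message = { role: "user" | "assistant"; content: ContentBlock[] };

// Mark the last content block of the message just before the latest user
// turn, so the provider can cache and reuse that stable prefix.
function insertCacheBreakpoint(messages: Message[]): Message[] {
  if (messages.length < 2) return messages; // nothing stable to cache yet
  const prefixEnd = messages.length - 2;
  return messages.map((m, i) => {
    if (i !== prefixEnd) return m;
    const content = m.content.map((block, j) =>
      j === m.content.length - 1
        ? { ...block, cache_control: { type: "ephemeral" as const } }
        : block
    );
    return { ...m, content };
  });
}
```

Only the breakpoint placement changes per request; the cached prefix itself is managed entirely by the provider.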
## Core Capabilities
- 🤖 Server-Side Tool Orchestration: Unified tool calling with iterative workflows, thinking support, parallel execution, and intelligent error handling
- 💬 Real-Time Streaming: Server-Sent Events (SSE) with tool execution visibility and abort support
- 💾 Conversation Persistence: SQLite-backed storage with automatic retention cleanup and migration system
- 🔌 Multi-Provider Support: OpenAI-compatible interface supporting OpenAI, Anthropic, and Gemini providers
- 🎨 Modern UI: React 19 with markdown rendering, syntax highlighting, code wrapping, HTML preview, and responsive design
- 🗂️ Prompt Management: Built-in and custom system prompts with conversation-aware selection
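As a rough sketch of what consuming the SSE stream involves, the helper below splits a raw Server-Sent Events buffer into individual `data:` payloads. The blank-line event separator and `data:` prefix come from the SSE format itself, and `[DONE]` is the usual OpenAI-style stream terminator; the exact event payloads ChatForge emits (deltas, tool execution events) are not assumed here.

```typescript
// Split a raw SSE buffer into its data payloads, dropping the
// OpenAI-style "[DONE]" terminator.
function parseSseChunk(buffer: string): string[] {
  return buffer
    .split("\n\n") // SSE events are separated by a blank line
    .flatMap((event) =>
      event
        .split("\n")
        .filter((line) => line.startsWith("data: "))
        .map((line) => line.slice("data: ".length))
    )
    .filter((payload) => payload !== "[DONE]");
}
```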
## AI Capabilities
- 🖼️ Image Upload & Vision Support: Multimodal vision support with drag-and-drop UI
- 🎙️ Audio Upload Support: Upload and send audio files for voice-enabled models
- 📎 File Attachment Support: Text file upload with content extraction
- 🧠 Reasoning Controls: Support for reasoning effort and extended thinking modes
- 💾 Prompt Caching Optimization: Automatic cache breakpoints to reduce token costs
- 📓 Journal Tool: Persistent memory tool for cross-conversation AI memory
## Quick Start
One-line Docker deployment:

```bash
docker run -d --name chatforge -p 3000:3000 \
  -v chatforge_data:/data -v chatforge_logs:/app/logs \
  -e DB_URL=file:/data/prod.db qduc/chat:latest
```
Visit http://localhost:3000, register, and configure your API keys in Settings → Providers & Tools.
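If you prefer Docker Compose, the same deployment can be sketched as a compose file. The image name, port, volumes, and `DB_URL` are taken directly from the command above; the `restart` policy is an added assumption, not a documented requirement.

```yaml
services:
  chatforge:
    image: qduc/chat:latest
    container_name: chatforge
    ports:
      - "3000:3000"
    volumes:
      - chatforge_data:/data
      - chatforge_logs:/app/logs
    environment:
      DB_URL: file:/data/prod.db
    restart: unless-stopped

volumes:
  chatforge_data:
  chatforge_logs:
```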
## Tech Stack
- Frontend: Next.js 15, React 19, TypeScript
- Backend: Node.js, Express, SQLite
- Desktop: Electron for cross-platform support
Check out the GitHub repository for detailed documentation and setup instructions.