Multi-provider, streaming, embeddable in one script tag. OpenAI, Anthropic, and Gemini: switch with one env var.
<!-- That's it. One tag. -->
<script
  src="https://your-server.com/js/widget.js"
  data-server="https://your-server.com"
  data-title="Chat with us"
  data-greeting="Hi! How can I help?"
  data-stream="true"
></script>
OpenAI, Anthropic, and Google Gemini out of the box. Switch providers with a single environment variable. Add custom providers by implementing one interface.
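A custom provider might look something like the sketch below. The method names (`sendMessage`, `stream`) and the `AI_PROVIDER` env var are assumptions for illustration, not the project's actual contract:

```javascript
// Sketch of a custom provider. The interface shape (sendMessage / stream)
// and the registry keyed by env var are assumptions for illustration.
class EchoProvider {
  constructor(config) {
    this.config = config; // e.g. API key, model name
  }

  // Non-streaming: return the full reply at once.
  async sendMessage(messages) {
    const last = messages[messages.length - 1];
    return `echo: ${last.content}`;
  }

  // Streaming: yield tokens one at a time.
  async *stream(messages) {
    const reply = await this.sendMessage(messages);
    for (const token of reply.split(' ')) {
      yield token + ' ';
    }
  }
}

// A registry keyed by the env var value could then resolve the provider:
const providers = { echo: EchoProvider };
const Provider = providers[process.env.AI_PROVIDER ?? 'echo'];
```

Because each provider hides its own API behind the same two methods, adding a new one means writing a class, not touching the service layer.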
Real-time Server-Sent Events for instant, token-by-token responses. Users see the AI thinking in real time, with no waiting for the full response.
Vanilla JavaScript, zero dependencies. One script tag to embed. CSS custom properties for theming: match any brand in minutes.
Helmet security headers, CORS configuration, rate limiting, and Zod input validation on every endpoint. Production-hardened by default.
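The project validates with Zod; as a dependency-free sketch of the kind of shape check such a schema enforces on an incoming chat request (the field names `message` and `history` are assumptions):

```javascript
// Dependency-free illustration of the shape check a Zod schema would
// perform on a chat request body. The real project uses Zod; the field
// names here are assumptions.
function validateChatRequest(body) {
  const errors = [];
  if (typeof body?.message !== 'string' || body.message.length === 0) {
    errors.push('message must be a non-empty string');
  }
  if (body?.history !== undefined && !Array.isArray(body.history)) {
    errors.push('history must be an array if present');
  }
  return { success: errors.length === 0, errors };
}
```

With Zod this collapses to roughly `z.object({ message: z.string().min(1) }).safeParse(req.body)`, with the same accept/reject outcome.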
33 tests covering unit logic, middleware, config validation, and full API integration. Coverage reporting with Vitest + V8.
Routes → Controllers → Services → Providers. Each layer has a single responsibility. Extend without touching existing code.
The widget sends messages to your Express server. ChatService resolves the configured provider and forwards the request. Responses stream back via SSE.
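The server side of that flow might look roughly like the sketch below: resolve a provider, stream its tokens, and write each one as an SSE frame. The function and marker names are assumptions; `res` is any Express-style response object:

```javascript
// Sketch of the server-side streaming path. Names are assumptions;
// `res` is any Express-style response (writeHead / write / end).
async function streamChat(provider, messages, res) {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  for await (const token of provider.stream(messages)) {
    res.write(`data: ${JSON.stringify({ token })}\n\n`);
  }
  // End-of-stream marker is an assumption (a common convention, not
  // necessarily what this project emits).
  res.write('data: [DONE]\n\n');
  res.end();
}
```

Because the provider is resolved once per request, swapping OpenAI for Anthropic changes nothing in this function.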
Copy the example env and add your API key.
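Something like the following, assuming the example file is named `.env.example` (the filename and variable names here are assumptions, not the project's actual keys):

```shell
# Filenames and variable names below are assumptions for illustration.
cp .env.example .env
# Then edit .env and set your provider and key, e.g.:
#   AI_PROVIDER=openai
#   OPENAI_API_KEY=sk-...
```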
Open http://localhost:3000; the demo page has a working chat widget.