# Chat UI - Self-hosted chat interface

A web interface for chatting with LLMs and using MCP (Model Context Protocol) servers.
## Prerequisites
- Docker Engine
- Docker Desktop (for Windows and macOS)
- An API key for LiteLLM (ia.limos.fr → "My keys" tab)
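Before deploying, you can sanity-check your key against the LiteLLM endpoint. This is a sketch: it assumes the `api.ia.limos.fr` base URL used in the compose file below and LiteLLM's OpenAI-compatible `/models` route.

```shell
# Endpoint and key placeholder matching the compose file below;
# replace API_KEY with your own key
BASE_URL="https://api.ia.limos.fr/v1"
API_KEY="API_KEY"

# A successful response is a JSON list of the models your key can access
curl -sS -H "Authorization: Bearer $API_KEY" "$BASE_URL/models" \
  || echo "request failed; check network and key"
```

If the call returns an authentication error, regenerate the key in the "My keys" tab before continuing.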
## Quick deployment with Docker
Create a `docker-compose.yml` file:

```yaml
services:
  chat-ui:
    image: ghcr.io/huggingface/chat-ui-db:latest
    container_name: chat-ui
    ports:
      - "127.0.0.1:3000:3000" # Local access only
    environment:
      - OPENAI_BASE_URL=https://api.ia.limos.fr/v1
      - OPENAI_API_KEY=API_KEY # Replace with your key
    volumes:
      - chat-ui-data:/data

volumes:
  chat-ui-data:
```
Then run:

```shell
docker compose up -d
```

The interface will be available at http://localhost:3000.
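Once the stack is up, a few quick checks can confirm it started correctly. This is a sketch: `chat-ui` is the container name from the compose file above, and the commands assume you run them in the directory containing `docker-compose.yml`.

```shell
URL="http://localhost:3000"

# Container should report a "running" state (errors ignored here in case
# docker is not on PATH in the current shell)
docker compose ps || true

# Recent logs; look for startup errors such as a rejected API key
docker compose logs --tail 20 chat-ui || true

# The UI should answer with an HTTP status line once it has started
curl -sI "$URL" | head -n 1 || echo "no response yet; check the logs above"
```

If the container restarts in a loop, the logs usually point at the cause (most often a wrong `OPENAI_API_KEY` or an unreachable `OPENAI_BASE_URL`).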