Chats

Provider Setup

Quick Presets
Connecting Ollama / LM Studio from a hosted site
Browsers block plain-HTTP requests to localhost from HTTPS pages (mixed content). Fix it in seconds with a free Cloudflare Tunnel — no account required.
1. Install cloudflared (free, one-time).
2. Start Ollama, then run in a terminal: cloudflared tunnel --url http://localhost:11434 --http-host-header="localhost:11434"
3. Copy the https://xxxx.trycloudflare.com URL printed in your terminal and paste it below.
Tunnel URL (paste here)
Pasting a tunnel URL auto-fills the API Base URL above. A new URL is generated each time you restart cloudflared.
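A minimal sketch of how a pasted tunnel URL could be normalized into an API base URL (the function name and normalization rules here are assumptions for illustration, not OpenClaude's actual code):

```javascript
// Hypothetical helper: turn a pasted trycloudflare URL into a clean API base URL.
// Assumed rules: trim whitespace, force HTTPS (hosted pages require it),
// and drop any trailing path/slash so API paths can be appended cleanly.
function tunnelToApiBase(raw) {
  const url = new URL(raw.trim()); // throws on malformed input
  url.protocol = "https:";
  return url.origin;
}
```

For example, `tunnelToApiBase(" https://abcd-1234.trycloudflare.com/ ")` returns `https://abcd-1234.trycloudflare.com`, which is what the API Base URL field expects.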
API Base URL
API Key
Model
System Prompt (optional)
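As a sketch of how the four fields above typically combine into a request: both Ollama and LM Studio expose an OpenAI-compatible chat endpoint, so a client can POST to `/v1/chat/completions` under the API Base URL. The helper name and field names below are assumptions for illustration:

```javascript
// Hypothetical sketch: assemble a chat request from the settings fields.
// Endpoint path and payload shape follow the OpenAI-compatible API that
// Ollama and LM Studio both expose; everything else is an assumption.
function buildChatRequest({ apiBaseUrl, apiKey, model, systemPrompt }, userText) {
  const messages = [];
  if (systemPrompt) messages.push({ role: "system", content: systemPrompt });
  messages.push({ role: "user", content: userText });
  return {
    url: `${apiBaseUrl}/v1/chat/completions`,
    options: {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        // Local servers often need no key; send the header only when one is set.
        ...(apiKey ? { Authorization: `Bearer ${apiKey}` } : {}),
      },
      body: JSON.stringify({ model, messages }),
    },
  };
}
```

Usage would then be `const { url, options } = buildChatRequest(settings, "Hello"); fetch(url, options);`.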
OpenClaude Dev Edition
Hosted free on GitHub Pages / Cloudflare Pages. Credentials are saved locally only.
Theme Presets
Animated Snow
Render soft falling snow in the background.
Background Blur
Add a slight blur veil over imported backgrounds.
Imported Background
No custom background loaded.
OpenClaude
Local AI Workspace
Welcome back
What can I help with?
Connect any AI provider to start chatting. Code responses are automatically sent to the compiler panel on the right.
Artifact
Live Code Workspace
Open a reply as an artifact, refine the source, and preview the result beside your conversation.
untitled
Source
Edit the active artifact directly.
Live
Output
No artifact yet
Run something from the editor or send a code block from chat to open a live preview here.