A local development proxy that uses AI to automatically route *.localhost domains to your running services. No config files, no port numbers to remember -- just visit myapp.localhost and the proxy figures out the rest.
Supports any OpenAI-compatible API (OpenRouter, Ollama, LM Studio, vLLM, etc.).
- Dynamic hostname resolution using any OpenAI-compatible LLM API
- Automatic service discovery:
  - Local processes with open ports (Linux: `ss`/`/proc`, macOS: `lsof`)
  - Docker containers (via Docker API)
- Cross-platform: works on Linux and macOS
- On-demand TLS certificates for `*.localhost` domains
- Persistent mapping cache (JSON file)
- Debug dashboard at `proxy.localhost`
- Inter-service proxy for service-to-service communication (`/_proxy/serviceName/path`)
- REST API for managing mappings (`/_api/mappings/`)
- CLI with `setup`, `status`, `start`, `stop`, `restart`, `trust` commands
- macOS menu bar app for quick access
```
brew tap contember/tudy https://github.com/contember/tudy
brew install tudy
tudy setup
```

The `setup` command walks you through configuring your API key, trusting the HTTPS certificate, and starting the proxy.
You'll need an OpenRouter API key (or any OpenAI-compatible API).
Note: When running in Docker on macOS, the proxy cannot discover local processes outside the container. Native installation is required for full process discovery.
```
export LLM_API_KEY=your-key
docker compose up -d
```

Then test with:

```
curl -k https://myapp.localhost
```

Install script (macOS/Linux)
```
curl -fsSL https://raw.githubusercontent.com/contember/tudy/main/install.sh | bash
```

Handles macOS Gatekeeper automatically.
Manual download
Download from Releases, then:
```
tar xzf caddy-darwin-arm64.tar.gz
sudo LLM_API_KEY=your-key ./caddy run --config Caddyfile
```

Build from source
```
go install github.com/caddyserver/xcaddy/cmd/xcaddy@latest
xcaddy build --with github.com/contember/tudy/llm_resolver=./llm_resolver
LLM_API_KEY=your-key ./caddy run --config Caddyfile
```

Using a local LLM (Ollama, etc.)
```
export LLM_API_KEY=your-key
export LLM_API_URL=http://localhost:11434/v1/chat/completions
export MODEL=llama3.2
```

Start a dev server on any port:
```
cd ~/projects/myapp
npm run dev   # listening on port 5173
```

Then open https://myapp.localhost in your browser. The proxy matches the hostname to your running process based on the project directory name, command, and port.
| Hostname | Matches |
|---|---|
| `myapp.localhost` | Process running in `~/projects/myapp` |
| `api.myproject.localhost` | Backend service in the `myproject` directory |
| `postgres-app.localhost` | Docker container named `postgres-app` |
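For instance, the subdomain pattern gives a backend its own hostname with zero configuration. A hypothetical session (the command and port are examples, not requirements):

```
cd ~/projects/myproject
npm run server   # a backend listening on, say, port 8000

# The resolver matches the "api" subdomain to the backend process
curl -k https://api.myproject.localhost/
```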
| Parameter | Description |
|---|---|
| `?force` | Force re-resolution (bypass cache) |
| `?prompt=text` | Provide additional context to the LLM |
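A quick sketch of both parameters in use (the hint text is purely illustrative):

```
# Bypass the cache and resolve again
curl -k "https://myapp.localhost/?force"

# Give the LLM extra context when several services could match
curl -k "https://myapp.localhost/?prompt=the%20vite%20dev%20server"
```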
For frontend apps that need to reach a related backend:
`https://myapp.localhost/_proxy/api/endpoint`

This resolves `api` as a related service to `myapp` (e.g., a backend in the same project directory) and proxies the request.
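For example, assuming the related backend exposes a `/users` route (the path is hypothetical):

```
curl -k https://myapp.localhost/_proxy/api/users
```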
Visit https://proxy.localhost to see all current route mappings, discovered processes, and Docker containers. You can delete stale mappings from here.
| Endpoint | Method | Description |
|---|---|---|
| `/_api/mappings/` | GET | List all mappings |
| `/_api/mappings/{hostname}` | GET | Get a specific mapping |
| `/_api/mappings/{hostname}` | PUT | Set a manual mapping |
| `/_api/mappings/{hostname}` | DELETE | Delete a mapping |
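The read endpoints are plain GETs, e.g.:

```
# List all mappings
curl -k https://any.localhost/_api/mappings/

# Inspect a single mapping
curl -k https://any.localhost/_api/mappings/myapp.localhost
```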
```
# Set a manual mapping
curl -X PUT https://any.localhost/_api/mappings/myapp.localhost \
  -d '{"type":"process","target":"localhost","port":3000}'

# Delete a mapping
curl -X DELETE https://any.localhost/_api/mappings/myapp.localhost
```

The `tudy` command handles proxy management and delegates all other commands to the underlying Caddy binary.
```
tudy setup     # Interactive first-time setup
tudy status    # Show proxy status
tudy start     # Start the proxy
tudy stop      # Stop the proxy
tudy restart   # Restart the proxy
tudy trust     # Trust the HTTPS certificate
```
All other commands (`run`, `version`, `list-modules`, etc.) are passed through to Caddy:

```
tudy version   # Shows Caddy version
tudy run       # Runs Caddy in foreground (env file sourced automatically)
```

On macOS, a menu bar app is installed alongside the proxy. It shows proxy status and lets you start/stop the service, configure your API key, and trust the certificate, all from the menu bar.
The app is installed at $(brew --prefix)/opt/tudy/Tudy.app. Add it to System Settings > General > Login Items to start automatically.
| Variable | Default | Description |
|---|---|---|
| `LLM_API_KEY` | (required) | API key for the LLM provider |
| `LLM_API_URL` | `https://openrouter.ai/api/v1/chat/completions` | OpenAI-compatible chat completions endpoint |
| `MODEL` | `anthropic/claude-haiku-4.5` | Model to use for routing decisions |
| `COMPOSE_PROJECT` | | Own Docker Compose project name (filtered from discovery) |
Homebrew installations store config in $(brew --prefix)/etc/tudy/:
| File | Purpose |
|---|---|
| `env` | Environment variables (`LLM_API_KEY`, etc.) |
| `Caddyfile` | Caddy configuration |
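A minimal `env` file, assuming the plain KEY=value format that `tudy run` sources (the key is a placeholder):

```
LLM_API_KEY=your-key
LLM_API_URL=https://openrouter.ai/api/v1/chat/completions
MODEL=anthropic/claude-haiku-4.5
```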
```
llm_resolver {
    api_key {env.LLM_API_KEY}
    api_url {env.LLM_API_URL}
    model anthropic/claude-haiku-4.5
    cache_file /data/mappings.json
    compose_project myproject
}
```

```
# Via brew services (recommended)
sudo brew services start tudy
sudo brew services stop tudy

# Or via the CLI
tudy start
tudy stop
```

Logs: `~/Library/Logs/Homebrew/tudy.log` (macOS)
1. A request arrives with a hostname (e.g., `api.myproject.localhost`)
2. The module checks the mapping cache
3. If not cached, it:
   - discovers local processes with open ports
   - discovers running Docker containers
   - calls the LLM with the hostname and service list
4. The LLM returns the best matching target
5. The result is cached
6. The request is proxied to the resolved target
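Cached entries presumably mirror the payload shape used by the REST API above. A sketch of what `mappings.json` might contain (only the `process` shape is confirmed by the API example; the `docker` entry shape is a guess):

```
{
  "myapp.localhost": { "type": "process", "target": "localhost", "port": 5173 },
  "postgres-app.localhost": { "type": "docker", "target": "postgres-app", "port": 5432 }
}
```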
```
# Run with hot reload (requires xcaddy)
xcaddy run --config Caddyfile

# Build the Caddy binary with the plugin
xcaddy build --with github.com/contember/tudy/llm_resolver=./llm_resolver

# Build the CLI
cd cmd/cli && go build -o tudy .

# Build the menubar app (macOS only)
cd cmd/menubar && go build -o menubar .

# Run tests
go test ./...

# Build Docker image
docker build -t tudy .
```

```
llm_resolver/        # Caddy module (Go package)
  module.go          # Caddy module registration
  handler.go         # HTTP middleware, dashboard, API
  resolver.go        # LLM resolution logic
  cache.go           # Persistent mapping storage
  discovery/         # Service discovery
    docker.go        # Docker container discovery
    processes.go     # Local process discovery
cmd/cli/             # CLI binary (tudy command)
cmd/menubar/         # macOS menu bar app
Formula/             # Homebrew formula
```
MIT