Tailscale + LLM
The Problem
Your local LLM (LM Studio, Ollama, etc.) runs on your desktop at home. But you want to use Sunflare on your laptop at a coffee shop or from your iPad on the couch. The LLM isn’t reachable outside your home network.
The Solution: Tailscale
Tailscale creates a private encrypted network between your devices. It’s free for personal use (up to 100 devices), takes a few minutes to set up, and uses WireGuard encryption.
Once connected, your LLM machine has a stable address (like 100.x.x.x or your-machine.tailnet.ts.net) that works from anywhere in the world — coffee shop, hotel, airplane wifi. No port forwarding, no dynamic DNS, no VPN servers to maintain.
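Those 100.x addresses come from the carrier-grade NAT range 100.64.0.0/10, which Tailscale reserves for tailnet devices. As a quick illustration, here is a minimal sketch of checking whether an address falls in that range (`is_tailscale_ip` is an illustrative helper, not part of Tailscale's CLI):

```shell
# Return success if the address is inside Tailscale's CGNAT
# range, 100.64.0.0/10 (i.e. 100.64.x.x through 100.127.x.x).
is_tailscale_ip() {
  case "$1" in
    100.*)
      second=${1#100.}
      second=${second%%.*}
      [ "$second" -ge 64 ] && [ "$second" -le 127 ]
      ;;
    *) return 1 ;;
  esac
}

is_tailscale_ip 100.101.102.103 && echo "tailscale address"
is_tailscale_ip 192.168.1.10 || echo "not a tailscale address"
```

So if Sunflare shows an LLM URL starting with a 100.64–100.127 address, you are looking at the Tailscale address rather than a LAN one.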
Setup Steps
1. Install Tailscale on your LLM machine
- Download from tailscale.com
- Sign in (Google, Apple, or email)
- Done — the machine gets a Tailscale IP
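On Linux, step 1 can be done entirely from a terminal. The install script below is the one-liner published at tailscale.com/download, and `tailscale ip -4` prints the address the machine was assigned:

```shell
# Install Tailscale (official Linux install script)
curl -fsSL https://tailscale.com/install.sh | sh

# Join your tailnet (prints a login URL to authenticate)
sudo tailscale up

# Show this machine's stable Tailscale IPv4 address
tailscale ip -4
```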
2. Install Tailscale on your other devices
- Mac: Mac App Store → Tailscale
- iOS/iPad: App Store → Tailscale
- Linux/Windows: tailscale.com/download
- Sign in with the same account on each device
- Your devices can now reach each other
3. Configure your LLM server to listen on all interfaces
Your LLM server must listen on 0.0.0.0 (not just 127.0.0.1) to accept connections from your Tailscale network.
LM Studio: Settings → Server → Listen on 0.0.0.0
Ollama: Set OLLAMA_HOST=0.0.0.0 in environment (see also Alternative: Ollama below)
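For Ollama on Linux, the environment variable needs to reach the service process, not just your shell. One common way, following Ollama's documented systemd setup (paths assume the default `ollama.service` unit):

```shell
# Open an override file for the Ollama service
sudo systemctl edit ollama.service
# In the editor, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"

# Apply the change
sudo systemctl restart ollama

# Or, for a one-off foreground run instead:
OLLAMA_HOST=0.0.0.0 ollama serve
```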
4. In Sunflare settings
- LLM URL: http://your-machine.tailnet.ts.net:1234
- Or use the Tailscale IP: http://100.x.x.x:1234
- Click Test Connection — should succeed from anywhere
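You can also verify reachability outside Sunflare. Assuming LM Studio's server is running on port 1234 (it serves an OpenAI-compatible API), a request from any device on the tailnet should list the loaded models — the hostname here is an example:

```shell
# From a laptop/iPad terminal on the same tailnet
curl http://your-machine.tailnet.ts.net:1234/v1/models
```

If this times out, check that the server is listening on 0.0.0.0 (step 3) and that both devices show as connected in the Tailscale app.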
Security
- Tailscale traffic is WireGuard encrypted end-to-end
- Only YOUR devices on YOUR tailnet can reach the LLM
- No public endpoints, no open ports
- Even if someone knows the Tailscale IP, they can’t connect without being on your tailnet
Alternative: Ollama
The same setup works with Ollama:
- Ollama default port: 11434
- Set OLLAMA_HOST=0.0.0.0 in the environment
- Sunflare URL: http://your-machine.tailnet.ts.net:11434
- OpenAI-compatible endpoint: http://your-machine.tailnet.ts.net:11434/v1
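As a quick end-to-end check of Ollama's OpenAI-compatible endpoint from another tailnet device (the hostname is an example, and "llama3" is a placeholder — use a model you have actually pulled):

```shell
curl http://your-machine.tailnet.ts.net:11434/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello from the tailnet!"}]
  }'
```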