
Tailscale + LLM

Your local LLM (LM Studio, Ollama, etc.) runs on your desktop at home. But you want to use Sunflare on your laptop at a coffee shop or from your iPad on the couch. The LLM isn’t reachable outside your home network.

Tailscale creates a private encrypted network between your devices. It’s free for personal use (up to 100 devices), takes 2 minutes to set up, and uses WireGuard encryption.

Once connected, your LLM machine has a stable address (like 100.x.x.x or your-machine.tailnet.ts.net) that works from anywhere in the world — coffee shop, hotel, airplane wifi. No port forwarding, no dynamic DNS, no VPN servers to maintain.

1. Install Tailscale on your LLM machine

  • Download from tailscale.com
  • Sign in (Google, Apple, or email)
  • Done — the machine gets a Tailscale IP
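On Linux, the steps above can be done from a terminal with Tailscale's official install script (macOS and Windows use the GUI installers from tailscale.com):

```shell
# Install Tailscale (Linux; official one-line installer)
curl -fsSL https://tailscale.com/install.sh | sh

# Bring the machine onto your tailnet — opens a browser window to sign in
sudo tailscale up

# Print this machine's Tailscale IPv4 address (100.x.x.x)
tailscale ip -4

# Show the machine's tailnet status, including its MagicDNS name
tailscale status
```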

2. Install Tailscale on your other devices

Repeat the install on the devices you’ll use Sunflare from — laptop, phone, iPad — and sign in with the same account so they all join the same tailnet.

3. Configure your LLM server to listen on all interfaces


Your LLM server must listen on 0.0.0.0 (not just 127.0.0.1) to accept connections from your Tailscale network.

  • LM Studio: Settings → Server → Listen on 0.0.0.0
  • Ollama: set OLLAMA_HOST=0.0.0.0 in the environment (see Alternative: Ollama below)
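For Ollama, where you set OLLAMA_HOST depends on how the server is launched. A sketch for a one-off shell session and for a systemd-managed install (the default on Linux):

```shell
# One-off: run the server bound to all interfaces for this session
OLLAMA_HOST=0.0.0.0 ollama serve

# Persistent (systemd installs): add an environment override, then restart.
# `systemctl edit` opens an editor — add the two commented lines below:
sudo systemctl edit ollama.service
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama
```

On macOS, the documented equivalent is `launchctl setenv OLLAMA_HOST "0.0.0.0"` followed by restarting the Ollama app.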

4. Point Sunflare at your LLM

  • LLM URL: http://your-machine.tailnet.ts.net:1234
  • Or use the Tailscale IP: http://100.x.x.x:1234
  • Click Test Connection — it should succeed from any of your devices, anywhere
Security

  • Tailscale traffic is WireGuard encrypted end-to-end
  • Only YOUR devices on YOUR tailnet can reach the LLM
  • No public endpoints, no open ports
  • Even if someone knows the Tailscale IP, they can’t connect without being on your tailnet
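A quick way to confirm the server is reachable over the tailnet from another device (assuming LM Studio’s default port 1234; use 11434 for Ollama):

```shell
# List the models the server exposes — any JSON response means both the
# tailnet route and the 0.0.0.0 bind are working
curl http://your-machine.tailnet.ts.net:1234/v1/models
```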

Alternative: Ollama

The same setup works with Ollama:

  • Ollama default port: 11434
  • Set OLLAMA_HOST=0.0.0.0 in environment
  • Sunflare URL: http://your-machine.tailnet.ts.net:11434
  • OpenAI-compatible endpoint: http://your-machine.tailnet.ts.net:11434/v1
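As a sketch, a minimal Python client for the OpenAI-compatible endpoint above, using only the standard library — the hostname and the model name (llama3.2) are placeholders for your own:

```python
import json
import urllib.request

# Placeholders: "your-machine" is your Tailscale MagicDNS name and
# "llama3.2" is whatever model you have pulled in Ollama.
BASE_URL = "http://your-machine.tailnet.ts.net:11434/v1"

def build_chat_request(base_url: str, model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for an OpenAI-compatible /chat/completions endpoint."""
    payload = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def send_chat(base_url: str, model: str, prompt: str) -> str:
    """Send the request over the tailnet and return the assistant's reply text."""
    req = build_chat_request(base_url, model, prompt)
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]

# Usage (requires the server to be reachable over your tailnet):
# print(send_chat(BASE_URL, "llama3.2", "Say hello"))
```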