Local AI Setup

Sunflare can use AI for session summaries, terminal assist, and chat — all running locally, nothing sent to the cloud.

Run your own models on your Mac or PC using tools like LM Studio or Ollama.

LM Studio example:

  1. Install LM Studio and load a model
  2. Start the local server (default: http://localhost:1234)
  3. In Sunflare: Settings → AI → Provider → OpenAI-Compatible
  4. URL: http://localhost:1234/v1
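To check the setup outside Sunflare, you can hit the same endpoint yourself. A minimal sketch of the chat completions request Sunflare would send to the URL above; the model name is an assumption — use whichever model you loaded in LM Studio:

```python
import json

# Base URL from step 4 above (LM Studio's default local server).
BASE_URL = "http://localhost:1234/v1"

def build_chat_request(prompt, model="llama-3.2-3b-instruct"):
    """Return the endpoint URL, headers, and JSON body for one chat turn.

    The model name is illustrative; LM Studio accepts whatever model
    identifier you have loaded.
    """
    url = f"{BASE_URL}/chat/completions"
    headers = {"Content-Type": "application/json"}
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request("Summarize my last session.")
print(url)  # http://localhost:1234/v1/chat/completions
```

Send that body as a POST (e.g. with curl or `urllib.request`) while the LM Studio server is running and you should get a JSON completion back — no API key needed for a local server.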

Sunflare works with any API that speaks the OpenAI chat completions protocol:

  • Ollama — http://localhost:11434/v1
  • vLLM — your deployment URL
  • Together AI — https://api.together.xyz/v1 + API key
  • OpenAI — https://api.openai.com/v1 + API key
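The providers above differ only in base URL and whether an API key is required. A hypothetical helper sketching that distinction (the names and structure here are illustrative, not Sunflare internals):

```python
# Each entry: (base URL, whether an API key is required).
# Local servers (LM Studio, Ollama) need no key; hosted APIs do.
PROVIDERS = {
    "lmstudio": ("http://localhost:1234/v1", False),
    "ollama":   ("http://localhost:11434/v1", False),
    "together": ("https://api.together.xyz/v1", True),
    "openai":   ("https://api.openai.com/v1", True),
}

def provider_config(name, api_key=None):
    """Return (base_url, headers) for a provider.

    Hosted providers authenticate with a Bearer token in the
    Authorization header, per the OpenAI-compatible convention.
    """
    base_url, needs_key = PROVIDERS[name]
    headers = {"Content-Type": "application/json"}
    if needs_key:
        if not api_key:
            raise ValueError(f"{name} requires an API key")
        headers["Authorization"] = f"Bearer {api_key}"
    return base_url, headers
```

For vLLM you would add an entry with your own deployment URL; whether it needs a key depends on how you deployed it.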

Running a local LLM at home but using Sunflare from a different network? Use Tailscale to securely access your home LLM from anywhere.