Question: I’m curious about what GUIs or applications y’all are using to interact with your local LLMs—whether on mobile, desktop, or web.
I’m currently running LM Studio and experimenting with a few models. LM Studio has a server mode, and I’m looking into using Tailscale or cloudflared (Cloudflare Tunnel) to connect to my local LLM remotely. Ideally, I’d love a setup that lets me:
✅ Access my local LLM remotely
✅ Switch models easily
✅ Have a smooth GUI or API interaction
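For context on the API side of this: LM Studio's server mode exposes an OpenAI-compatible HTTP API (on port 1234 by default), so once a Tailscale or tunnel connection is up, remote access is just an HTTP request to the machine's tailnet hostname. A minimal sketch of what that request looks like (the hostname and model name here are placeholders, not real endpoints):

```python
# Sketch: building a request to LM Studio's OpenAI-compatible
# /v1/chat/completions endpoint, reachable over a Tailscale hostname.
# The hostname and model name below are illustrative placeholders.
import json
from urllib import request

def build_chat_request(host: str, model: str, prompt: str) -> request.Request:
    """Build an OpenAI-style chat completion request (no network I/O here)."""
    url = f"http://{host}:1234/v1/chat/completions"  # 1234 = LM Studio default port
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

# Example: point at a machine on your tailnet (hostname is hypothetical)
req = build_chat_request("my-desktop.tailnet.ts.net", "local-model", "Hello!")
print(req.full_url)
```

Sending the request with `request.urlopen(req)` would return the usual OpenAI-style JSON body, which is why most existing chat UIs that speak the OpenAI API can be pointed at an LM Studio backend.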
For those already running a local LLM, what’s your setup?
• Do you use Ollama, LM Studio, Text Generation WebUI, or something else?
• How do you connect to it remotely?
• What’s your preferred interface—web UI, CLI, mobile app, or a custom API?
Source: https://qa.irregulars.io/q/10