The hardest thing for me is that I'm a mobile maxi. It gets tough to stay locked to one location, which is why I've been rocking a laptop with peripherals for so many years now. I suppose I could probably just access it remotely though... Lots to think about. Maybe a server rack is the way for me to go.
I hear you. My paranoia prevents me from having any persistent data on any of my client/workstation devices. Ollama + Open WebUI is the easiest way to do this IMO. You get a really nice web UI similar to chatgpt.com with way more features. It also ships with an authenticated OpenAI-compatible API you can hit from curl or other CLI clients if you want.
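Rough sketch of what that looks like, assuming Open WebUI is running on localhost:3000, you've generated an API key in Settings, and you have a model pulled in Ollama (the "llama3" name here is just an example):

```
# Chat completion against Open WebUI's OpenAI-compatible endpoint.
# Assumes Open WebUI on localhost:3000, an API key from Settings,
# and a model named "llama3" available in Ollama -- adjust to taste.
curl http://localhost:3000/api/chat/completions \
  -H "Authorization: Bearer $OPEN_WEBUI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "llama3",
        "messages": [{"role": "user", "content": "Hello from curl"}]
      }'
```

Since it speaks the OpenAI format, most CLI clients that let you override the base URL and API key should work pointed at it too.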