Neat, but what do you mean when you say "local AIs?" Like you run your own perplexity this way?
Running various models locally on Ollama + Open WebUI. Some I sandbox and use RAG.
Really digging the DeepSeek-R1-Distill-Llama-8B right now.
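For anyone curious, a minimal sketch of that kind of Ollama setup (the `deepseek-r1:8b` tag is an assumption for the Llama-8B distill; check `ollama list` or the Ollama model library for the exact name):

```shell
# Pull a local model, then chat with it from the terminal.
# Model tag below is an assumption -- verify against the Ollama library.
ollama pull deepseek-r1:8b
ollama run deepseek-r1:8b "Explain RAG in two sentences."
```

Open WebUI then just points at the local Ollama server for a browser UI on top of the same models.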
I tried DeepSeek. It's TL;DR.
I need to mess with it some more. 😂
What have you done with it?