Neat, but what do you mean when you say "local AIs"? Like you run your own Perplexity this way?

Discussion

Running various models locally on Ollama + Open WebUI. Some I sandbox and use with RAG.

Really digging the DeepSeek-R1-Distill-Llama-8B right now.
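
For anyone curious what's under the hood: Open WebUI just sits on top of Ollama's local HTTP API, so you can also script against it directly. A minimal Python sketch, assuming Ollama is serving on its default port 11434 and you've already pulled the model (the `deepseek-r1:8b` tag here is my assumption for the Llama-8B distill):

```python
import requests

# Minimal sketch: chat with a locally served Ollama model over its HTTP API.
OLLAMA_URL = "http://localhost:11434/api/chat"
MODEL = "deepseek-r1:8b"  # assumed Ollama tag for the DeepSeek-R1-Distill-Llama-8B build

def ask(prompt: str) -> str:
    resp = requests.post(
        OLLAMA_URL,
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # one JSON response instead of a token stream
        },
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

print(ask("Summarize retrieval-augmented generation in two sentences."))
```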

I tried DeepSeek. It's TL;DR.

I need to mess with it some more. 😂

What have you done with it?

I fed a StarCoder model about 50 books on programming to be my own personal dev.

The DeepSeek model I fed about 35 books on hacking, OPSEC, and counterintelligence. Some interesting conversations with that one.
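
In case it helps anyone trying the same thing: the "feeding books" part is basically RAG. You chunk the texts, embed them, and pull the most relevant chunks into the prompt at question time. Here's a rough sketch of that flow against Ollama's local API; the `./books/*.txt` paths, the chunk sizes, and the `nomic-embed-text` embedding model are all assumptions, so swap in whatever you actually use:

```python
import glob
import math
import requests

OLLAMA = "http://localhost:11434"
EMBED_MODEL = "nomic-embed-text"   # assumed embedding model pulled into Ollama
CHAT_MODEL = "deepseek-r1:8b"      # assumed chat model tag

def embed(text: str) -> list[float]:
    # Ollama's embeddings endpoint returns {"embedding": [...]}
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": EMBED_MODEL, "prompt": text}, timeout=120)
    r.raise_for_status()
    return r.json()["embedding"]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

# 1. Chunk the books (hypothetical ./books/*.txt) into overlapping pieces and embed them.
chunks = []
for path in glob.glob("./books/*.txt"):
    text = open(path, encoding="utf-8", errors="ignore").read()
    for i in range(0, len(text), 1500):
        piece = text[i:i + 2000]
        chunks.append((piece, embed(piece)))

# 2. Retrieve the chunks closest to the question and stuff them into the prompt.
question = "How should I structure OPSEC for a small team?"
q_vec = embed(question)
top = sorted(chunks, key=lambda c: cosine(q_vec, c[1]), reverse=True)[:4]
context = "\n\n---\n\n".join(piece for piece, _ in top)

r = requests.post(f"{OLLAMA}/api/chat", json={
    "model": CHAT_MODEL,
    "messages": [{"role": "user",
                  "content": f"Use only this context:\n{context}\n\nQuestion: {question}"}],
    "stream": False,
}, timeout=600)
print(r.json()["message"]["content"])
```

Open WebUI's document upload does roughly this for you behind the scenes with a proper vector store, but the manual version makes it easier to sandbox a model with a specific pile of books.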

Great idea.

Oh, that's good to know! I have the same setup, but I didn't know I could connect SearXNG to it. I've been using my self-hosted SearXNG for a couple of years already and love it!
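
For anyone else who missed this: Open WebUI's web-search option can point at a SearXNG instance's query URL, and the same JSON endpoint is easy to hit yourself. A small sketch, assuming SearXNG is on localhost:8888 and `json` is enabled under `search.formats` in its settings.yml (both are assumptions about your install):

```python
import requests

# Minimal sketch of querying a self-hosted SearXNG instance's JSON API,
# the same kind of endpoint Open WebUI's web-search integration points at.
SEARXNG_URL = "http://localhost:8888/search"

def web_search(query: str, limit: int = 5):
    r = requests.get(SEARXNG_URL,
                     params={"q": query, "format": "json"},
                     timeout=30)
    r.raise_for_status()
    results = r.json().get("results", [])[:limit]
    return [(hit["title"], hit["url"]) for hit in results]

for title, url in web_search("Ollama Open WebUI RAG"):
    print(f"{title}\n  {url}")
```

In Open WebUI you'd paste the matching `/search?q=<query>` URL into the web-search settings; the exact menu location shifts between versions, so treat that as a pointer rather than a recipe.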

Lol yeah. Of course I had to build a script that launches both from an icon on my desktop.
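
Nothing fancy, but if anyone wants the same convenience, here's a sketch of the launcher idea in Python. It assumes a native Ollama install plus Open WebUI installed via pip so the `open-webui serve` command exists; a Docker setup would look different:

```python
import subprocess
import time

# Rough sketch of a desktop-icon launcher: start Ollama and Open WebUI together.
# Assumes Ollama is installed natively and Open WebUI was installed via pip
# (so the `open-webui serve` command exists); a Docker setup would differ.
procs = [
    subprocess.Popen(["ollama", "serve"]),
    subprocess.Popen(["open-webui", "serve"]),
]

try:
    # Keep the launcher alive so closing it tears both services down.
    while all(p.poll() is None for p in procs):
        time.sleep(2)
finally:
    for p in procs:
        p.terminate()
```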