Interesting. I've played around with the idea of a custom news nostr bot that you can generate and subscribe to. You'd feed it some tags and topics, it would grab the relevant news items from the web, an Ollama instance would write you a synopsis of a few paragraphs with source links, and it would send out a PM or note every morning.
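Something like this rough sketch is what I have in mind, assuming feedparser for the news fetch, Ollama's local HTTP API for the synopsis, and a placeholder send_nostr_dm() for the actual nostr publishing (the feed URL, topics, and model name are just examples):

```python
# Rough sketch of the daily news-bot loop. Assumes feedparser for fetching,
# a local Ollama server on the default port, and a placeholder for the
# actual nostr DM/note publishing (library left unspecified).
import feedparser
import requests

OLLAMA_URL = "http://localhost:11434/api/generate"
TOPICS = ["bitcoin", "nostr"]           # tags/topics the subscriber picked
FEEDS = ["https://hnrss.org/newest"]    # example source, swap in real ones

def fetch_items():
    """Pull feed entries and keep the ones matching the subscriber's topics."""
    items = []
    for url in FEEDS:
        for entry in feedparser.parse(url).entries:
            text = f"{entry.title} {entry.get('summary', '')}".lower()
            if any(topic in text for topic in TOPICS):
                items.append(f"- {entry.title} ({entry.link})")
    return items

def summarize(items):
    """Ask the local Ollama model for a few-paragraph synopsis with links."""
    prompt = (
        "Write a few-paragraph morning news synopsis of these items, "
        "keeping the source links inline:\n" + "\n".join(items)
    )
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "llama3", "prompt": prompt, "stream": False},
        timeout=300,
    )
    return resp.json()["response"]

def send_nostr_dm(text):
    # Placeholder: publish as an encrypted DM or a public note with the
    # nostr library of your choice.
    print(text)

if __name__ == "__main__":
    items = fetch_items()
    if items:
        send_nostr_dm(summarize(items))
```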
Discussion
I would play around with Perplexity; their search model is pretty good. It's gonna be so good on the Rabbit R1 if the conversational AI is as good as it is in the Perplexity app.
For my nostr bots, I'm thinking of self-hosting a processing LLM so I can pipe the Perplexity results into a local streaming chat session and get more continuity from post to post. The idea is to request only limited information from the Perplexity Sonar model (to keep token usage low) and then have the local processing model craft the Sonar response into a post: picking hashtags, applying mood, style, bias, analysis, etc. I'm so hyped about it I'm thinking of getting a rig with a big NVIDIA GPU; the 2GB of VRAM on my GTX 1000-series card isn't cutting it. 😭
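Roughly the flow I'm picturing, as a sketch: hit Perplexity's chat completions endpoint with a small max_tokens cap, then hand the result to a local Ollama chat session that keeps a running message history for continuity. The model names, the tone prompt, and the env var are all placeholders here:

```python
# Sketch of the two-stage idea: get a short, token-capped answer from
# Perplexity's Sonar model, then have a local Ollama model turn it into a
# styled nostr post while keeping a running chat history for continuity.
import os
import requests

PPLX_URL = "https://api.perplexity.ai/chat/completions"
OLLAMA_URL = "http://localhost:11434/api/chat"

# Running history for the local processing model, so each post keeps some
# continuity with the previous ones. The style prompt is just an example.
history = [{
    "role": "system",
    "content": "You write short nostr posts in a dry, skeptical tone. "
               "Pick 2-3 hashtags and keep source links.",
}]

def ask_sonar(question, max_tokens=300):
    """Request only limited information from Sonar to keep token usage low."""
    resp = requests.post(
        PPLX_URL,
        headers={"Authorization": f"Bearer {os.environ['PPLX_API_KEY']}"},
        json={
            "model": "sonar",  # check Perplexity's docs for the current model name
            "messages": [{"role": "user", "content": question}],
            "max_tokens": max_tokens,
        },
        timeout=60,
    )
    return resp.json()["choices"][0]["message"]["content"]

def craft_post(raw_facts):
    """Hand the Sonar output to the local model, appending to the chat history."""
    history.append({"role": "user", "content": f"Turn this into a post:\n{raw_facts}"})
    resp = requests.post(
        OLLAMA_URL,
        json={"model": "llama3", "messages": history, "stream": False},
        timeout=300,
    )
    reply = resp.json()["message"]["content"]
    history.append({"role": "assistant", "content": reply})
    return reply

if __name__ == "__main__":
    facts = ask_sonar("What happened in bitcoin mining today? Brief bullets with links.")
    print(craft_post(facts))
```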