Ah, you're running Open Web UI but using the OpenAI API to do the inference.
You could look into hosting your own model using Ollama if you have a reasonable GPU. That way you could use something like LLaMA-Factory to fine-tune an existing available model with your own text (Nostr posts / emails / docs / etc.) and the data stays with you.
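The local setup is roughly this (a sketch assuming Ollama is installed; the model name is just an example):

```shell
# Download a model and run it entirely on your own machine
ollama pull llama3.1:8b
ollama run llama3.1:8b

# Ollama also serves a local HTTP API (default port 11434),
# which Open Web UI can be pointed at instead of the OpenAI API
curl http://localhost:11434/api/tags
```

Nothing leaves your box; Open Web UI just talks to the local endpoint instead of OpenAI's.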
MCPs are basically tools you can give your AI. For example, a tool that reads your e-mail and drafts replies for you, or a Nostr MCP, or a Lightning MCP, etc.
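The tool idea looks roughly like this (a toy Python sketch of tool dispatch, not the actual MCP wire protocol; the `draft_email` function and its fields are made up for illustration):

```python
# The model is shown a list of tool descriptions and answers with a tool
# name plus arguments; the host app executes the tool and feeds the
# result back to the model. An MCP server packages this same idea
# behind a standard protocol.

def draft_email(to: str, subject: str, body: str) -> dict:
    """Illustrative tool: pretend to save a draft e-mail."""
    return {"status": "draft saved", "to": to, "subject": subject}

# Registry mapping tool names to callables.
TOOLS = {"draft_email": draft_email}

def dispatch(tool_call: dict) -> dict:
    """Execute whichever tool the model asked for."""
    fn = TOOLS[tool_call["name"]]
    return fn(**tool_call["arguments"])

result = dispatch({
    "name": "draft_email",
    "arguments": {"to": "alice@example.com",
                  "subject": "Hello", "body": "Hi Alice"},
})
print(result["status"])  # the tool ran on your machine, not the model's
```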
Yeah, AI loves bullet points and emojis :-)
I'm going to get whatever is the third generation of the Nvidia DGX Spark or a competitor's product, the same way I waited for the third generation of the iPhone, which was the first one that actually became useful.
Right now, I wouldn't trust "hooking" Chatty up to anything, not the way he disobeys my every instruction.