Is it possible to feed it new data? I mean, to keep it up to date?
Today many people are talking about AI. Remember, you can self-host this Llama 2 13-billion-parameter model in just one click, for free.
Open the link, click the Play icon, and it will only take a few minutes. If it fails the first time, simply click the Play icon again, and it will work.
I will also create a shortcut for Stable Diffusion, so you can self-host a decent instance of it in one click for free.
Remember, you are not calling some OpenAI API; you are literally running the real model.
One click self-host link: https://colab.research.google.com/github/realiefan/NostrAi/blob/main/llama-2-13b-chat.ipynb
Demo: https://labs.perplexity.ai/
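For context, here is a minimal sketch of what "running the real model" looks like in code. This is not the notebook's exact contents, just an assumed example of loading the same Llama 2 13B chat weights with Hugging Face transformers (you need to accept Meta's license on the model page first):

```python
# Sketch (assumption, not the notebook's code): load Llama 2 13B chat and generate a reply.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-2-13b-chat-hf"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so 13B fits on a single large GPU
    device_map="auto",          # spread layers across available GPUs/CPU
)

prompt = "[INST] Explain what Nostr is in one sentence. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The point is that the weights sit on your own (or Colab's) hardware, so nothing goes through a third-party API.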

Discussion
So, we can feed it data related to Nostr; that is known as fine-tuning, and it is not that complicated.
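As a rough illustration of what that fine-tuning could look like, here is a hypothetical sketch using a LoRA adapter (parameter-efficient fine-tuning) so it can run on modest hardware. The dataset path, hyperparameters, and corpus format are placeholders, not a tested recipe:

```python
# Hypothetical sketch: LoRA fine-tuning of Llama 2 13B chat on Nostr-related text.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)

model_id = "meta-llama/Llama-2-13b-chat-hf"
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32,
                                         target_modules=["q_proj", "v_proj"],
                                         task_type="CAUSAL_LM"))

# Placeholder path: a JSONL file of {"text": "..."} notes and articles about Nostr.
data = load_dataset("json", data_files="nostr_corpus.jsonl")["train"]
data = data.map(lambda x: tokenizer(x["text"], truncation=True, max_length=512),
                remove_columns=data.column_names)

Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama2-nostr-lora", num_train_epochs=1,
                           per_device_train_batch_size=1,
                           gradient_accumulation_steps=8,
                           learning_rate=2e-4, fp16=True),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
).train()
```

Only the small adapter weights get trained, which is why this is far cheaper than retraining the whole model.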
However, making it "up to date" is an entirely different matter. You would need an immense amount of the latest data and enormous computing power to train it on that data. But we can make it up to date in certain fields.
We should also somehow make it decentralized, I mean the data feed, just like the Bitcoin blockchain. If we share the needed compute power and storage, then maybe it will take more time to answer, but it won't be biased and it will be unstoppable.
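To make the data-feed part concrete, here is a hypothetical sketch that pulls recent text notes straight from a few public Nostr relays over websockets, so the training corpus is not controlled by any single party. The relay URLs and filter are just examples:

```python
# Hypothetical sketch: collect recent kind-1 (text note) events from several Nostr relays.
import asyncio
import json
import websockets  # pip install websockets

RELAYS = ["wss://relay.damus.io", "wss://nos.lol"]  # example public relays

async def fetch_notes(relay_url, limit=100):
    notes = []
    async with websockets.connect(relay_url) as ws:
        # Nostr REQ message: subscription id plus a filter for recent text notes
        await ws.send(json.dumps(["REQ", "feed", {"kinds": [1], "limit": limit}]))
        while True:
            msg = json.loads(await ws.recv())
            if msg[0] == "EVENT":
                notes.append(msg[2]["content"])
            elif msg[0] == "EOSE":  # end of stored events for this subscription
                break
    return notes

async def main():
    results = await asyncio.gather(*(fetch_notes(r) for r in RELAYS))
    corpus = [note for relay_notes in results for note in relay_notes]
    print(f"collected {len(corpus)} notes from {len(RELAYS)} relays")

asyncio.run(main())
```

Anyone can run the same collector against whichever relays they trust, which is the sense in which the feed itself stays decentralized.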