Today many people are talking about AI. Remember, you can self-host this Llama 2 13-billion-parameter chat model in just one click for free.

Open the link, click the Play icon, and it will only take a few minutes. If it fails the first time, simply click the Play icon again, and it will work.

I will also create a shortcut for Stable Diffusion, so you can self-host a decent instance of it in one click for free.

Remember, you are not using some OpenAI API; you are literally running the real model.
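Since you are running the raw chat model rather than a hosted API, you have to format prompts yourself. Llama 2's chat variants expect a specific instruction wrapper; here is a minimal sketch of building one (the system prompt is a placeholder, not anything from the notebook above):

```python
# Build a prompt in Llama 2's chat instruction format.
# The system prompt below is an illustrative placeholder.

def build_llama2_prompt(user_message: str,
                        system_prompt: str = "You are a helpful assistant.") -> str:
    """Wrap a user message in the Llama 2 chat instruction format."""
    return (
        f"[INST] <<SYS>>\n{system_prompt}\n<</SYS>>\n\n"
        f"{user_message} [/INST]"
    )

prompt = build_llama2_prompt("What is Nostr?")
print(prompt)
```

Whatever string this returns is what you would pass to the model's generate call; hosted APIs normally do this wrapping for you behind the scenes.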

One click self-host link: https://colab.research.google.com/github/realiefan/NostrAi/blob/main/llama-2-13b-chat.ipynb

Demo: https://labs.perplexity.ai/


Discussion

Not a tech guy, but does this basically mean that anyone can self-host ChatGPT or another AI tool and use it without asking permission?

It's a little complicated, but yes. With the link provided above, it's easier to host your own ChatGPT-style language model with 13-70 billion parameters than to host your own relay.

This is incredibly cool. Thanks!

Thank you 🙏

I’ve been using ChatGPT and have been looking for a better option; I’ll definitely look into this! Are you training it with Nostr and Bitcoin GitHub repos?!

Yes, I am training this Llama model on Nostr repos, and it also requires additional fine-tuning. I am working on a mechanism through which everyone can self-host it for free on Colab or a Pi.

Clicks play - "please sign in with Google" - shaking head

So right now it's just a testing environment; you can copy it into your own Jupyter notebook and run it locally. In the future it will not rely on Google Colab.


Is it possible to feed it new data? I mean, to keep it up to date?

So, we can feed it data related to Nostr; that is known as fine-tuning, and it is not that complicated.

However, making it "up to date" is an entirely different matter. You would need an immense amount of the latest data and an enormous amount of computing power to train it on that data. But we can make it up to date in certain fields.
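To give a rough sense of why fine-tuning is tractable while full retraining is not: parameter-efficient methods like LoRA update only two small low-rank factors instead of each full weight matrix. A sketch of the arithmetic, with illustrative dimensions (not the exact Llama 13B configuration):

```python
# Compare trainable parameters for full fine-tuning of one d x k weight
# matrix versus a rank-r LoRA adapter (two factors: d x r and r x k).
# Dimensions are illustrative, roughly at 13B-model scale.

def lora_trainable_params(d: int, k: int, r: int) -> int:
    """Trainable parameters for a rank-r LoRA adapter on a d x k matrix."""
    return r * (d + k)

d = k = 5120                      # hypothetical hidden size
full = d * k                      # params touched by full fine-tuning
lora = lora_trainable_params(d, k, r=8)

print(full, lora, full // lora)   # LoRA trains 320x fewer params here
```

At rank 8 the adapter is a few hundredths of a percent of the matrix it modifies, which is why fine-tuning can run on a single Colab GPU while pretraining on fresh data cannot.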

We should somehow make it decentralized. I mean the data feed, just like the Bitcoin blockchain. If we share the needed compute power and storage, then maybe it will take more time to answer, but it won't be biased and it will be unstoppable.