You can run it on your PC or even in a phone browser using Colab. You will have your own personal AI chat, hosted by you.
I mean you can give it a try:
Click the link below, and in the Runtime tab in the sidebar there will be a 'Run all' button. Click it; it may fail the first time, but restart it and run it again and it will work.
I will post a more refined process later; I am creating some shortcuts for this and also doing some fine-tuning.
https://colab.research.google.com/github/realiefan/NostrAi/blob/main/llama-2-13b-chat.ipynb
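For anyone curious what running the model yourself roughly looks like under the hood, here's a minimal sketch of a self-hosted chat loop. It assumes a quantized GGUF build of llama-2-13b-chat and the llama-cpp-python library; the model filename and settings below are placeholders, and the actual notebook may do things differently:

```python
# Minimal sketch of a self-hosted Llama 2 13B chat loop (assumptions:
# llama-cpp-python is installed and a quantized GGUF model file has been
# downloaded; the model path below is a placeholder, not the notebook's
# actual file).
from llama_cpp import Llama

llm = Llama(
    model_path="llama-2-13b-chat.Q4_K_M.gguf",  # hypothetical quantized model file
    n_ctx=2048,        # context window size
    n_gpu_layers=-1,   # offload all layers to the GPU if one is available
)

print("Personal AI chat - type 'quit' to exit")
while True:
    user_msg = input("You: ")
    if user_msg.strip().lower() == "quit":
        break
    # Llama 2 chat models expect the [INST] ... [/INST] prompt format
    prompt = f"[INST] {user_msg} [/INST]"
    out = llm(prompt, max_tokens=256, stop=["[INST]"])
    print("AI:", out["choices"][0]["text"].strip())
```

The same loop works anywhere the quantized model fits in memory, which is why a Colab GPU runtime, a PC, or (with a smaller quantization) a low-spec device can all host it.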
Way cool work. If I'm not misunderstanding, Colab is hosted by Google, so it may be "locally hosted", but on a Google-owned computer. Users of this notebook wouldn't really be calling it locally hosted then, right?
It's on Colab just for testing purposes, meant to demonstrate how easy it is to deploy and run on low-spec devices. It will not rely on Colab in the future.
Currently it's in the blueprint phase, as this project is going to be massive. I also want the community to be involved in the development process.
You should be able to host it anywhere from a Pi to a phone in the future.