Running a 13-billion-parameter Llama 2 language model.

I have a surprise for you today: you may be able to run this model from your phone, 'locally', using Colab with just one click, even if you have no coding knowledge.

I am serious about democratizing AI.

Discussion

Dayum. That's incredible.

I could possibly make stuff? Am I reading that right?

Yes, you can integrate this with anything; not only that, you can offer it as an API like OpenAI does. It's an extremely smart model.

Interesting, thank you. I'm anxious to see what you come up with.

That’s sick! Too bad our Orchard Overlords will probably not allow such a democratic and force-equalizing thing in the hands of peasants!

I mean, they will do their best, but we will not stay quiet.

“Rise and rise again, until lambs become Lions!”

Although we are civil in our lives, we will fight for what is ours and support our fellow man in times of need.

#GreatestMovieQuotes

I like the decentralized aspect

https://github.com/bigscience-workshop/petals
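
(For the curious: Petals runs big models across a swarm of volunteer machines, BitTorrent-style, so no single company hosts your chat. Here is a minimal sketch based on the Petals README; the model name is just an example and only works if the public swarm happens to be serving it, and gated Meta repos also require a Hugging Face access token:)

    # pip install petals
    from transformers import AutoTokenizer
    from petals import AutoDistributedModelForCausalLM

    # Example model; whatever the public swarm is serving will vary.
    model_name = "meta-llama/Llama-2-13b-chat-hf"

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    # Loads a small shard locally; volunteer peers host the other blocks.
    model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

    inputs = tokenizer("Nostr is", return_tensors="pt")["input_ids"]
    outputs = model.generate(inputs, max_new_tokens=32)
    print(tokenizer.decode(outputs[0]))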

Great!!!

We need more ancap AI!!

I’ve been on Linux since '98 and could install almost anything, but it's the research that takes up too much time…

Once it's at this level of usability, with clear how-tos, the real game is ON :-)

Sooooo how complex would it be to set up a working AI with Nostr logins paid by zaps?

Could we get that going on a scalable VPS?

Also, could it have a theme that everyone then can feed text into?

Figuring that out.

What do you mean by “locally”?

You can run it on your PC, or even in your phone's browser, using Colab. You will have your own personal AI chat, hosted by you.

I mean, you can give it a try:

Click this link, open the Runtime menu, and click 'Run all'. It may fail the first time; if so, restart the runtime and run it again, and it should work.

I will post a more refined process later; I am creating some shortcuts for this and also doing some fine-tuning.

https://colab.research.google.com/github/realiefan/NostrAi/blob/main/llama-2-13b-chat.ipynb
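
(For anyone who wants to see what's under the hood rather than just click 'Run all': the notebook boils down to loading the chat model with the transformers library, quantized so the 13B weights fit on Colab's free GPU. A rough sketch under those assumptions, not the notebook's exact code; the repo is gated, so Meta's access approval and a Hugging Face token are required:)

    # pip install transformers accelerate bitsandbytes
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Gated repo: needs access approval and a Hugging Face token.
    model_id = "meta-llama/Llama-2-13b-chat-hf"

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    # 4-bit quantization squeezes the 13B weights onto a free-tier T4 GPU.
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        device_map="auto",
        load_in_4bit=True,
    )

    # Llama 2 chat expects the [INST] ... [/INST] prompt format.
    prompt = "[INST] Explain Nostr in one sentence. [/INST]"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))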

Way cool work. If I'm not misunderstanding, Colab is hosted by Google. So it may be "locally hosted," but it's locally hosted on a Google-owned computer. Seems like users of this notebook wouldn't call it locally hosted then, right?

It's on Colab just for testing purposes, to demonstrate how easy it is to deploy and run on low-spec devices. It will not rely on Colab in the future.

Currently it's in the blueprint phase, as this project is going to be massive. I also want the community to be involved in the development process.

You should be able to host it anywhere, from a Pi to a phone, in the future.
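
(For a genuinely local route today, one common approach, though not necessarily what this project will ship, is llama.cpp with quantized weights, via the llama-cpp-python bindings. A sketch, assuming you have downloaded a 4-bit GGUF conversion of the chat model yourself; the filename below is illustrative. On a Pi or a phone you would want a much smaller model than 13B:)

    # pip install llama-cpp-python
    from llama_cpp import Llama

    # Illustrative filename; point this at quantized weights you have
    # downloaded yourself (a 4-bit GGUF build of Llama-2-13b-chat).
    llm = Llama(model_path="./llama-2-13b-chat.Q4_K_M.gguf", n_ctx=2048)

    out = llm("[INST] What is a zap on Nostr? [/INST]", max_tokens=64)
    print(out["choices"][0]["text"])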

Oooooo of course. 💯💯💯

Oh snap. I'd love a local AI.

Very cool, very fucking cool. I asked it to explain subjective value theory and it overflowed LOL

Thank you for what you are doing to help us all

Where can I download the foundation model?

I’m gonna run this on my Raspberry Pi

How?

Wow 😮

Important caveat if you use Colab for such activities: your data is on Google's servers, and it stays there.