Running local AI models on a PC as a native app is already resource intensive. I think this will be even more so.
If I did everything correctly, when you open this website you should be able to run any AI language model locally in your browser, on any device.
It will utilize your native hardware with the new WebGPU API to run these models.
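As a rough sketch of what that means in practice: a page like this would first feature-detect WebGPU before downloading a model, then hand the GPU adapter to an inference runtime. The snippet below only shows the detection step; the fallback message and the comment about the engine are illustrative, not taken from the actual site.

```javascript
// Feature-detect WebGPU. The `typeof` guard lets this also run outside a
// browser (e.g. Node), where `navigator` may not exist or lacks `gpu`.
const hasWebGPU = typeof navigator !== "undefined" && !!navigator.gpu;

if (hasWebGPU) {
  // `navigator.gpu.requestAdapter()` is the standard WebGPU entry point;
  // a real page would pass the adapter on to its inference engine.
  navigator.gpu.requestAdapter().then((adapter) => {
    console.log("WebGPU adapter:", adapter ? "available" : "none");
  });
} else {
  console.log("No WebGPU here; the site would need a fallback or error message.");
}
```

If the check fails (older browsers, or WebGPU disabled), the models cannot run on the GPU at all, which is why browser support matters for this demo.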
This is just a demo; I am working on a proper interface for NostrNet.work users.
Demo: https://ai.nostrnet.work
Discussion
Yes, it runs entirely on your hardware. In the upcoming NostrNet PWA, you should even be able to turn off your internet and it will still work. There are numerous options, including 1-billion-parameter models that can also run on your phone. Click on the model name at the top centre and it will show you the list of all the models.
Thanks. On my mobile device it defaulted to one model. Still, it lags quite a bit, haha.