If I did everything correctly, when you open this website you should be able to run AI language models locally in your browser, on any device.
It uses the new WebGPU API to run these models on your native hardware.
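For anyone curious how a page like this decides whether it can run a model, the usual first step is feature-detecting WebGPU before downloading any weights. A minimal sketch (the function name and messages here are my own, not from the demo; in a real browser you would pass in `navigator.gpu`):

```typescript
// Hedged sketch: feature-detect WebGPU before trying to load a model.
// Taking the candidate `navigator.gpu` value as a parameter keeps the
// logic testable outside a browser environment.
function webgpuSupportMessage(maybeGpu: unknown): string {
  if (maybeGpu) {
    return "WebGPU available: the model can run on native hardware";
  }
  return "WebGPU not available: update your browser or enable the WebGPU flag";
}

// In the browser:
// console.log(webgpuSupportMessage((navigator as any).gpu));
```

If the check fails, a site can fall back to a server-side model or show an upgrade prompt instead of stalling during initialization.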
This is just a demo; I am working on a proper interface for NostrNet.work users.
Demo: https://ai.nostrnet.work
Getting stuck on a stock Pixel 6 using Chrome/Brave at: [System Initalize] Finish loading on WebGPU - arm
Any chance of using the Tensor TPU in the future for quicker responses?
I think Google is building a Gemini model for it. If they open-source it, we will see multiple variants of it.