So the plan is to have SD in Amethyst, but as a remote connection to Colab or something? I'm trying to understand it in less techie terms and more in plain Christian lol
No, the plan is to run SD locally, entirely on-device, like we do with translations.
That may be a bit much for a phone to handle. But you can probably run Tailscale and make calls to your remote PC running SD pretty easily.
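As a minimal sketch of that idea: if the PC runs AUTOMATIC1111's stable-diffusion-webui launched with the `--api` flag, the phone (or anything else on the tailnet) can just POST to its `/sdapi/v1/txt2img` endpoint over the Tailscale IP. The IP address, port, prompt, and output file below are placeholder assumptions, not anything Amethyst actually does.

```kotlin
import java.io.File
import java.net.HttpURLConnection
import java.net.URL
import java.util.Base64

// Hypothetical Tailscale IP of the PC running stable-diffusion-webui with --api enabled.
const val SD_HOST = "http://100.101.102.103:7860"

// Escape a string for embedding in a JSON body (quotes and backslashes only, for brevity).
fun jsonString(s: String) = "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\""

fun txt2img(prompt: String): ByteArray {
    val conn = URL("$SD_HOST/sdapi/v1/txt2img").openConnection() as HttpURLConnection
    conn.requestMethod = "POST"
    conn.setRequestProperty("Content-Type", "application/json")
    conn.doOutput = true

    // Minimal request body; the webui fills in defaults for everything else.
    val body = """{"prompt": ${jsonString(prompt)}, "steps": 20, "width": 512, "height": 512}"""
    conn.outputStream.use { it.write(body.toByteArray()) }

    val response = conn.inputStream.bufferedReader().readText()
    // The API responds with {"images": ["<base64 png>", ...], ...}; grab the first image.
    val b64 = Regex("\"images\"\\s*:\\s*\\[\\s*\"([^\"]+)\"").find(response)
        ?.groupValues?.get(1) ?: error("no image in response")
    return Base64.getDecoder().decode(b64)
}

fun main() {
    val png = txt2img("a cozy cabin in the woods, watercolor")
    File("out.png").writeBytes(png)
    println("wrote out.png (${png.size} bytes)")
}
```

With Tailscale the PC's 100.x address is reachable from the phone without any port forwarding, which is what makes the "remote PC" route so easy compared to exposing the webui to the open internet.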
Yisas 👀 SD alone is something like 9 to 12 GB just for the base, and most recent phones have decent storage space, so no problem there. LoRAs and extra models depend a lot on the user. But how do you plan to get the computational power? 8 GB of VRAM is barely the minimum.
Me personally, that's why I depend a lot on Paperspace, Colab and the like. I still hope someone makes SD for truly potato PCs lol, at least something that runs on an Nvidia 730GT.