If you had a $300 budget for a GPU or any PCI-E coprocessor SPECIFICALLY FOR #AI / machine learning, no gaming, new or used, what would you buy? My RTX 3060 Ti with its paltry 8 GB of VRAM just doesn't cut the mustard for Stable Diffusion / llama.cpp.
#machinelearning #deeplearning #stablediffusion #llama #chatgpt
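For a rough sense of why 8 GB is tight for llama.cpp, here's a back-of-the-envelope sketch in Python. The bits-per-weight figures are approximate and the model sizes are just common examples, not measurements; KV cache and activation memory come on top of the weights.

```python
# Rough VRAM estimate for holding LLM weights at different quantization levels.
# Approximate bits-per-weight values; real files vary slightly by format.

def weight_vram_gb(params_billion: float, bits_per_weight: float) -> float:
    """Return approximate gigabytes needed just for the model weights."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

for name, params in [("7B", 7.0), ("13B", 13.0)]:
    for quant, bits in [("FP16", 16.0), ("Q8_0", 8.5), ("Q4_K_M", 4.5)]:
        print(f"{name} @ {quant}: ~{weight_vram_gb(params, bits):.1f} GB for weights")
```

By this estimate a 13B model at a 4-bit quant already sits around 7 GB of weights, which leaves almost no headroom on an 8 GB card once the KV cache is added, while a 12 GB card fits it comfortably.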
I've got a 3060 too, but with 12 GB of VRAM. Since the 40 series is out, maybe you could pick one up cheap.