Replying to Kajoozie Maflingo

If you had a $300 budget for a GPU or any PCIe coprocessor SPECIFICALLY FOR #AI / machine learning, no gaming, new or used, what would you buy? My RTX 3060 Ti with its paltry 8 GB of VRAM just doesn't cut the mustard for Stable Diffusion / llama.cpp.

#machinelearning #deeplearning #stablediffusion #llama #chatgpt
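For context, here's a rough back-of-the-envelope sketch of why 8 GB gets tight: llama.cpp loads quantized weights, so the VRAM needed to fully offload a model is roughly parameter count times bits per weight, plus room for the KV cache and runtime overhead. The function name and the 1.5 GB overhead figure below are my own illustrative assumptions, not official numbers.

# Rough VRAM estimate for fully offloading a quantized model in llama.cpp.
# The overhead allowance is an assumption for illustration only.

def estimate_vram_gb(n_params_billion, bits_per_weight, overhead_gb=1.5):
    """Weights size plus a rough allowance for KV cache and runtime overhead."""
    weights_gb = n_params_billion * bits_per_weight / 8  # billions of params * bits / 8 ~= GB
    return weights_gb + overhead_gb

# A 13B model at 4-bit quantization is about 6.5 GB of weights alone,
# so it leaves little or no headroom on an 8 GB card once the KV cache is added.
for params, bits in [(7, 4), (13, 4), (13, 8)]:
    print(f"{params}B @ {bits}-bit: ~{estimate_vram_gb(params, bits):.1f} GB")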

asdfg 1y ago

I've got a 3060 too, but with 12 GB of VRAM. Since the 40 series is out, maybe you can pick one up cheap.

