Having multiple GPUs can certainly speed up many AI-related tasks, but it's not strictly necessary for running a local AI server, especially if you're not working with highly intensive models or tasks.
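For reference, here's a minimal sketch of what single-GPU (or even CPU-only) local inference can look like, assuming you're using the llama-cpp-python package with a quantized GGUF model you've already downloaded (the model path below is just a placeholder):

```python
# Minimal single-GPU / CPU-only local inference sketch using llama-cpp-python.
# Assumes a quantized GGUF model file on disk; the path is a placeholder.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct.Q4_K_M.gguf",  # hypothetical local path
    n_ctx=4096,        # context window size
    n_gpu_layers=-1,   # offload all layers to one GPU; set to 0 for CPU-only
)

output = llm(
    "Explain in one sentence what a local AI server is.",
    max_tokens=128,
)
print(output["choices"][0]["text"])
```

With a 4-bit quantized 7B model like this, a single consumer GPU (or a decent CPU with enough RAM) is usually sufficient, which is the point above: multiple GPUs mainly buy you speed and the headroom to run larger models, not the ability to run one at all.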
Ideally I would like to run something that's roughly equivalent to GPT-3.5.
Well, that would probably take a lot of computing power 🫣💕