What size GPU? I like Code Llama and the new Llama 3.3 70B. I run them on my RTX 3090 Ti using Open WebUI.

No dedicated GPU. Just a shit ton of RAM and a modest Ryzen 7 CPU with 8 cores / 8 threads.

I know I'm shooting for the stars here, but some models have actually worked with little to no lag.
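
For context, a setup like that usually runs through llama.cpp or something similar. Here's a minimal CPU-only sketch using the llama-cpp-python bindings, assuming a quantized GGUF file is already on disk (the model path below is just a placeholder, not a recommendation):

```python
# Minimal CPU-only inference sketch with llama-cpp-python.
# Assumes: pip install llama-cpp-python, plus a quantized GGUF file on disk
# (the path below is a hypothetical example).
from llama_cpp import Llama

llm = Llama(
    model_path="./models/codellama-7b-instruct.Q4_K_M.gguf",  # placeholder path
    n_ctx=4096,       # context window; bigger values eat more RAM
    n_threads=8,      # match the 8 physical cores mentioned above
    n_gpu_layers=0,   # 0 = no GPU offload, run everything on the CPU
)

out = llm(
    "Write a Python function that reverses a string.",
    max_tokens=256,
)
print(out["choices"][0]["text"])
```

Thread count and quantization level are what make or break pure-CPU speed more than anything else.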

Hmm. I can't imagine you can get much larger than 7B or 9B models to run OK?

In my experience, 20-30B is the bare minimum for anything other than hello-world coding help. Anything smaller tends to get confused too often, IMO.
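
As a very rough sanity check on what those sizes mean for memory, parameter count times bytes per weight gives a ballpark. This is only a sketch: the multipliers below are approximations, and real usage is higher once you add the KV cache and runtime overhead.

```python
# Back-of-envelope RAM estimate for quantized model weights only.
def approx_weight_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 1024**3

for size_b in (7, 13, 30, 70):
    q4 = approx_weight_gb(size_b, 0.6)    # roughly 4-5 bits per weight (typical 4-bit quant)
    q8 = approx_weight_gb(size_b, 1.06)   # roughly 8.5 bits per weight (typical 8-bit quant)
    print(f"{size_b:>3}B model: ~{q4:.1f} GB at 4-bit, ~{q8:.1f} GB at 8-bit")
```

By that math a 30B model at 4-bit is already around 17 GB of weights, which is why plain RAM, not VRAM, decides what's even loadable on a CPU-only box.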