Who makes the best FOSS LLM for coding locally on Linux?

Right now I'm using Open WebUI with various LLMs and getting a lot of mixed results. Want to see if there's anything I'm missing out on.

#asknostr


Discussion

What size GPU? I like Code Llama and the new Llama 3.3 70B. I run them on my RTX 3090 Ti using Open WebUI.

No dedicated GPU. Just a shit ton of RAM and a modest Ryzen 7 CPU with 8 cores / 8 threads.

I know I'm shooting for the stars here, but some have actually worked with little to no lag.

Hmm. I can't imagine you can get much larger than 7B or 9B models to run okay?
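A quick back-of-envelope check on that ceiling: on CPU, generation speed is roughly memory bandwidth divided by the size of the quantized weights, since each generated token has to stream the whole model through RAM. The numbers below are illustrative assumptions (Q4 quantization at ~4 bits per weight, ~50 GB/s dual-channel DDR4), not measurements of any particular setup:

```python
# Back-of-envelope sizing for CPU-only LLM inference.
# Assumptions (not measured): ~4-bit (Q4) quantized weights, and that
# generation is memory-bandwidth-bound, so tok/s ~ bandwidth / model size.

def model_size_gb(params_billion: float, bits_per_weight: float = 4.0) -> float:
    """Approximate in-RAM size of quantized weights in GB."""
    return params_billion * bits_per_weight / 8  # 1B params @ 8 bits = 1 GB

def est_tokens_per_sec(bandwidth_gbs: float, size_gb: float) -> float:
    """Crude upper bound: each generated token streams all weights once."""
    return bandwidth_gbs / size_gb

DDR4_DUAL_CHANNEL = 50.0  # GB/s, typical desktop figure (assumption)

for params in (7, 13, 34, 70):
    size = model_size_gb(params)
    tps = est_tokens_per_sec(DDR4_DUAL_CHANNEL, size)
    print(f"{params:>3}B @ Q4 ~ {size:5.1f} GB -> ~{tps:4.1f} tok/s")
```

By this rough estimate a 7B model at Q4 (~3.5 GB) manages double-digit tokens per second on a bandwidth-limited desktop, while a 70B (~35 GB) fits in a big RAM pool but drops to a crawl, which lines up with the 7-9B ceiling suggested above.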

In my experience, 20-30B is the bare minimum for anything beyond hello-world coding help. Smaller models tend to get confused often, IMO.