Replying to Daniel D’Aquino

Phi-3 Mini (3.8B) is the best local LLM for programming questions that I have seen. It’s quite capable, yet it doesn’t use as many computing resources as other models of similar capability, and it runs on an M1 chip without making the computer sluggish.

https://ollama.com/library/phi3
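If anyone wants to script against it, here’s a minimal sketch that queries Ollama’s local HTTP API (assuming the default port 11434 and that you’ve already pulled the model with `ollama pull phi3`):

```python
# Minimal sketch: ask a locally running Ollama instance a programming question.
# Assumes the default Ollama port (11434), the `requests` package, and that
# the "phi3" model tag has already been pulled.
import requests

response = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "phi3",
        "prompt": "Write a Python function that reverses a linked list.",
        "stream": False,  # return the whole completion as a single JSON object
    },
    timeout=120,
)
response.raise_for_status()
print(response.json()["response"])
```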

Andreas Griffin 1y ago

https://ollama.com/library/wizardlm2

It also works well.
