Phi3 3B is the best local LLM for programming questions that I have seen. It's quite capable, yet uses fewer computing resources than other models of similar capability; it runs on an M1 chip without making the machine sluggish.
Discussion
https://ollama.com/library/wizardlm2
It also works well.