But a local model that’s been fine-tuned for coding? Perfection.
It’s not that you use the LLM to do everything, but it’s excellent for pseudocode. You can quickly get a useful answer to most of the questions you’d otherwise search Stack Overflow for, but tailored to your own code. It’s also handy when you’re picking up a newer programming language and trying to port over some code, or when you want to compare different ways of achieving the same result.
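As a toy illustration of that last use case, here’s the kind of side-by-side answer you might ask for (the function names and example are mine, purely for illustration): counting word frequencies with a hand-rolled loop versus Python’s standard-library `collections.Counter`.

```python
from collections import Counter

def word_counts_loop(text):
    """Count word frequencies with a plain dictionary and a loop."""
    counts = {}
    for word in text.split():
        counts[word] = counts.get(word, 0) + 1
    return counts

def word_counts_counter(text):
    """Same result using the standard library's Counter."""
    return dict(Counter(text.split()))

text = "the cat sat on the mat"
# Both approaches produce identical counts
assert word_counts_loop(text) == word_counts_counter(text)
```

Neither version is wrong; seeing both in one response is exactly the kind of quick comparison that used to take several searches.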
It’s just another tool in your belt, not something to rely on for everything.