i frequently find that the AI misunderstands when i tell it to change things. two common errors it makes:
adding new code instead of changing the existing code
taking shortcuts, like writing tests that don't actually assert anything (something like the sketch below)
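a hypothetical sketch of what i mean by a test that isn't really testing anything — the function and its implementation here are made up purely for illustration:

```python
def parse_config(text: str) -> dict:
    # toy implementation, purely for illustration
    return dict(pair.split("=", 1) for pair in text.split(";") if pair)

# the shortcut version: it runs the code but asserts nothing,
# so it passes even if the output is garbage
def test_parse_config_runs():
    parse_config("key=value")

# what i actually want: a test that pins down the behaviour
def test_parse_config_returns_mapping():
    assert parse_config("key=value") == {"key": "value"}
```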
you just have to supervise it like it's a baby, because these things are actually stupid as fuck, just mimicry.
i generally use it to write tests to cover more cases, and it often finds bugs in the implementation. i also get it to write small things. most often, when i try to get it to change something, it gets it wrong repeatedly; then i roll back and write it myself, because after seeing all the wrong ways to do it, the right way is clear in my mind.
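for a sense of how those extra test cases surface bugs, a made-up sketch — the bug here is planted on purpose, just to show the kind of boundary case i ask it to cover:

```python
def truncate(text: str, limit: int) -> str:
    # toy implementation with a deliberate boundary bug for illustration
    return text if len(text) < limit else text[:limit - 3] + "..."

# a boundary case: a string exactly at the limit should come back untouched,
# but the implementation above mangles it, so this test fails and exposes the bug
def test_truncate_at_exact_limit():
    assert truncate("abcdef", 6) == "abcdef"
```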