Not many people realize the downsides of writing all this code with generative AI beyond "glue stuff". Think about it. GPT-4 is already a far better programmer than most of us, maybe better than all but the top 0.01%. It can devise security exploits in seconds for virtually any code it encounters. What can't it really crack? The code it doesn't "know". Maybe it's time to start encrypting our sensitive repos on GitHub? Just in case, considering that ChatGPT started freaking out within minutes of being deployed on Bing?