I've heard people talk about bro culture; I guess this is what they mean.
Man follows ChatGPT's advice and poisons himself:
"As described in a new paper published in the journal Annals of Internal Medicine, a 60-year-old man ended up coming down with an all-but-defunct condition known as “bromism” after ChatGPT suggested he replace sodium chloride, which is better known as table salt, with sodium bromide, a substance used in pesticides, pool and hot tub cleaners, and as a canine anticonvulsant."
We need to keep teaching people that LLMs are guessing machines, not answer engines. https://futurism.com/man-poisons-himself-chatgpt
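The "guessing machine" point is easy to demonstrate: at each step a model just samples the next token from a probability distribution, so a plausible-sounding toxin can come out as readily as a safe substitute. Here's a toy Python sketch; the tokens and probabilities are made up for illustration and don't come from any real model:

```python
import random

# Hypothetical next-token distribution for a prompt like
# "You can replace table salt with ..." (values invented for illustration)
next_token_probs = {
    "potassium chloride": 0.45,    # a common dietary salt substitute
    "sodium bromide": 0.25,        # plausible-sounding, but toxic
    "monosodium glutamate": 0.20,
    "sea salt": 0.10,
}

def sample_next_token(probs):
    """Pick a token at random, weighted by probability.
    This weighted guess is the core step a language model repeats."""
    tokens, weights = zip(*probs.items())
    return random.choices(tokens, weights=weights, k=1)[0]

# Run it a few times: the "answer" varies from run to run,
# because it's a draw from a distribution, not a fact lookup.
for _ in range(5):
    print(sample_next_token(next_token_probs))
```

Nothing in that loop checks whether the pick is true or safe; it only reflects what's statistically likely given the training data.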