Has anyone noticed that ChatGPT often contradicts itself?

Seen this several times now.


Discussion

yup

Yep. And it will constantly feed your confirmation bias when you ask it questions. Then it tries to deny it when you confront it.

Just like people in general do all the time. Artificial intelligence imitates its substrate.

The least intelligent among the intelligences. Usually exhaustingly dumb.

All the time 💯

I like to use it to get around the rules. If it tells me it can't do something, I just ask it to try again a couple of times, and suddenly it can do it 😅

It actually makes stuff up too; it's called AI hallucination. It "predicts" a response based on what it thinks a human would say. But once you ask it for sources and studies, it can't provide anything and will admit "there aren't actual studies."

☣️ Listen to this hallucination!

It's not just ChatGPT; it's all LLMs. I recently asked Grok (which in my experience has been the most trustworthy) to summarize a YouTube video between two prominent bitcoiners, one of whom is trans. The video was entirely about finance, but the summary Grok output was completely fabricated. It made up a summary in which the host asked all sorts of gender identity questions and the guest told a very personal story of coming out as trans.

Of course, I had to do the opposite of what I had initially intended and watch every minute of the whole thing to know for sure whether it was lying to me. It was. The entire summary was a lie.

When I asked about it, Grok apologized and said it had "misinterpreted the content." I had to call BS; there was no possible way the actual content could have been misinterpreted like that. After which Grok admitted to fabricating the entire thing and offered a ridiculous, long, flowery, overly elaborate excuse blaming it on a "mischievous electron or perhaps an over caffeinated server squirrel that rerouted its signal to a parallel stream."

I ripped into it for lying, telling it I can never really trust it again.

It’s bad. WTF!