Have had this same experience. Ask AI something complex you know about, and it gives the wrong answer with confidence. Point it out. It backtracks, still gets it wrong. Point it out again; rinse, repeat.
It’s a version of Gell-Mann Amnesia, wherein you read a NYTimes story about something you know well and realize they’ve completely botched the reporting. Then you turn the page and read about what’s happening in the Middle East (which you don’t know about) and take it at face value.
nostr:note1dnypl3q3etnvulh9cv9hyjqpjkvccmme6mpx7ztetjcr5ypqkgxs7yfzs4