Yes, AI mimics natural language, seeking coherence rather than logical consistency. More than this, when it makes a mistake, it tends to equivocate and, at least at first, tries to convince us that what we know to be wrong is right. AI is a kind of sophisticated sophist: almost human, all too human.
Discussion
But even this argument attributes intent, which the software lacks.
It likely has to do with the tokenization process.
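For readers unfamiliar with what tokenization means here, the sketch below is a toy illustration, not any real model's tokenizer: a greedy longest-match split over a made-up vocabulary (`VOCAB` and `greedy_tokenize` are hypothetical names). Real LLM tokenizers such as BPE or WordPiece use learned merge rules, but the relevant point is the same: the model operates on subword fragments, not on words or propositions.

```python
# Toy subword tokenization: greedy longest-match against a hypothetical
# vocabulary. Illustrative only; real tokenizers learn their vocabularies.

VOCAB = {"un", "believ", "able", "token", "ization", "co", "herence"}

def greedy_tokenize(word: str) -> list[str]:
    """Split a word into the longest known pieces, left to right.
    Characters not covered by the vocabulary fall back to single tokens."""
    tokens, i = [], 0
    while i < len(word):
        # try the longest possible piece starting at position i
        for j in range(len(word), i, -1):
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # fallback: single character
            i += 1
    return tokens

print(greedy_tokenize("unbelievable"))  # ['un', 'believ', 'able']
print(greedy_tokenize("coherence"))     # ['co', 'herence']
```

Because the model's statistics are defined over such fragments, its outputs are shaped by distributional fit between pieces of text rather than by any grasp of the propositions those pieces express.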
OK, intentionality is a complicated concept. Let's put it this way: AI does not 'seek' but 'tends to' produce dynamic, relational, and context-dependent meaning, that is, coherence rather than mere logical consistency.