one of the things that turned me off about llms is that they can still misspell words. google's gemini, for example, spelled _lableing_ instead of _labeling_.
i get that it's still learning, but which english dictionary did it ingest for it to misspell that word?
i asked it how this was possible. one of the reasons it gave is that it sometimes _hallucinates_:
"Like all large language models, I can sometimes "hallucinate" information, including incorrect spellings. This means that I can produce text that seems correct but is actually wrong."