Replying to 0b817937...

One of the things that turned me off about LLMs is that they can still misspell words. Google's Gemini, for example, spelled _lableing_ instead of _labeling_.

I get that it's learning, but which English dictionary did it ingest for it to misspell that word?

I asked it how this is possible. One of the reasons it gave is that it sometimes _hallucinates_:

"Like all large language models, I can sometimes "hallucinate" information, including incorrect spellings. This means that I can produce text that seems correct but is actually wrong."

Petr 9mo ago

They're great as long as you understand their limitations. https://www.youtube.com/watch?v=7xTGNNLPyMI
