Replying to 93ef0185...

nostr:npub1traay5jdde50ps7y3mqdullw29a0pncqsg9vy637c8x7uyrwnvsq0wnpqw nostr:npub1juyh2l587qygmuupdjqr6wj300k6g7m0utnn9jtxr2wdf25qh90s7svh5e "Hallucination" has become a term of art. Personally, I'm fine with it as long as we recognize -- as the term obscures -- that chat AI *always* hallucinates, for the reason you say. It's just that more often than not, its hallucinations turn out to be true.

nostr:npub1j5a6tz6a2ttelsdzdl3lnpqq2q8hyfluz0c7kpgntt548q7f5snsrhp848 nostr:npub1juyh2l587qygmuupdjqr6wj300k6g7m0utnn9jtxr2wdf25qh90s7svh5e I know it's a term of art, and I hate it. Exactly for the reason you say: it obscures the fact that LLMs are detached from reality, always and by design. The term suggests that this is abnormal or exceptional behavior.
