Rather than saying “I don’t know, there’s not enough on this subject to formulate a working answer” — which is what you could infer from an empty search results page — an LLM will give you something that looks right. Then you have to go shoot yourself in the foot to learn it’s not right, because you didn’t know enough to know it was wrong.

https://blog.jim-nielsen.com/2024/nothing-is-something/
