It scares me that everyone is now outsourcing their thinking to LLMs and will consume whatever gibberish they output without a second thought.


Discussion

It's a natural atrophy. Others will eventually give them jobs doing unskilled labor. It's the dream of ease, sponsored by Satan.

I think this is the plan.

It's an upgrade from Google and Wikipedia.

I used to look down on people who would retort "ur wrong it said so on bikapebia," but at least they had to pretend they could read and summarize the garbage they were using to disagree with someone. The people who copy and paste entire paragraphs of unmodified LLM output straight into their responses are a whole new type of subhuman.

I’m hearing that a lot of popular podcasts use LLMs to “fact check”. Slop fact-checked for more slop.

Snopes and all these “fact checking” websites are completely biased. LLMs use the data from those websites to fact check. I don’t understand how the same people who didn’t trust those websites now trust LLMs.

✅ trust the science

✅ trust the LLM

The numbers say the opposite; at the least, devs trust it a bit less the more they use it. The sad thing is that people are using AI as a therapist, thereby confessing their thoughts to Big Tech.

https://newsletter.techworld-with-milan.com/p/trends-8-developers-use-ai-more-but

Devs are in a position where they are more exposed to the inaccuracies of AI and get real-time feedback (the code doesn’t work or has subtle bugs).

People trust AI-generated information a lot, and the inaccuracies don’t reveal themselves for a long while.
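A toy illustration of the “subtle bugs” point: the snippet below (a classic Python pitfall, not taken from any particular model’s output) runs cleanly the first time and only misbehaves on later calls, which is exactly the kind of delayed but concrete feedback devs get and casual users don’t.

```python
# Classic subtle bug: a mutable default argument is shared across calls.
def append_item(item, items=[]):   # bug: the [] is created once, not per call
    items.append(item)
    return items

print(append_item(1))  # [1]     -- looks correct
print(append_item(2))  # [1, 2]  -- state leaked in from the first call

# Fix: use None as a sentinel and create a fresh list on each call.
def append_item_fixed(item, items=None):
    if items is None:
        items = []
    items.append(item)
    return items
```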

People already outsource their thinking and don't stick to (not so) common sense, first principles, or objective reality. This won't change much; it just replaces TV as the medium people use to program themselves.

1984 is here, and truths and history will just be updated as required.

Brains being replaced with LLMs.

Almost like it was the goal all along

It scares me that LLMs' loudest proponents also know this.

I'd like to keep experiencing my own hallucinations, thank you.

That's why you need to connect it to a local knowledge base that you trust; that works quite nicely.
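For anyone wondering what "connect a local knowledge base" can look like, here is a minimal retrieval-grounding sketch in Python. The document list, the prompt wording, and the `ask_llm` stub are all illustrative assumptions; the retrieval uses scikit-learn's TF-IDF, and the point is simply to force the model to answer from documents you have vetted yourself.

```python
# Minimal sketch: ground an LLM's answers in a local, trusted corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Your trusted local knowledge base: toy documents standing in for vetted notes.
documents = [
    "Note on topic A that I wrote and verified myself.",
    "Archived article on topic B from a source I trust.",
    "My own meeting notes from 2024-03-01.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k documents most similar to the query (TF-IDF cosine)."""
    query_vec = vectorizer.transform([query])
    scores = cosine_similarity(query_vec, doc_vectors).ravel()
    return [documents[i] for i in scores.argsort()[::-1][:k]]

def ask_llm(prompt: str) -> str:
    # Hypothetical stand-in: replace with a call to whatever local model you run.
    return f"[model answer, grounded in]\n{prompt}"

def answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    return ask_llm(
        "Answer using ONLY the context below. If the context does not contain "
        f"the answer, say so.\n\nContext:\n{context}\n\nQuestion: {query}"
    )

print(answer("What did I note about topic A?"))
```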