LLMs do way less thinking and way more hallucinating than people think.

Discussion

Well, with its help I'm close to reaching my nutrition and fitness goals. I've been prompting it for about 4 months. They will get better, no doubt.

They can generate good responses for things that are low stakes and require no new thinking. Fitness and nutrition are fairly well established (although I'm sure I could get it to give me bad nutrition advice), so the responses can be pretty accurate in those fields.

I also think you should give yourself a lot of credit for hitting your goals. For 95% of people, no new information is needed to lose weight or change their diet. People naturally have a good idea of what is good for them and how to lose weight. What is needed is consistency and motivation. It seems like the AI was a great accountability buddy, but you're still the one who ultimately did the work!

True, very true. What I particularly value is that it keeps track of what I eat and links all the nutrients and benefits. We also developed a grading system and such; I guess it's all about the prompt and how you adapt your agent to yourself. But I do agree.

Hallucinating isn't even the right word; it's just putting words together based on probability.
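
To be fair, that is roughly what happens mechanically: the model assigns a score to every candidate next token and samples from the resulting probability distribution, one token at a time, with no fact-checking step in between. A toy sketch of that sampling step (invented vocabulary and scores, not any real model) would look something like this:

```python
import math
import random

# Toy vocabulary and raw scores (logits) a model might assign
# after a prompt like "Protein helps you build" -- values are invented.
vocab = ["muscle", "bridges", "strength", "websites"]
logits = [4.1, -2.0, 3.0, -3.5]

def softmax(xs):
    """Convert raw scores into a probability distribution."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)

# Pick the next token in proportion to its probability --
# a weighted random choice, not a lookup of facts.
next_token = random.choices(vocab, weights=probs, k=1)[0]
print(list(zip(vocab, [round(p, 3) for p in probs])))
print("sampled:", next_token)
```

Most of the time the plausible token is also the true one, which is why it works so well for established topics like nutrition, and why it can confidently produce nonsense when it doesn't.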