Replying to Vitor Pamplona

Every time I ask an AI to make a statement "better", without further instructions, the result is often a weaker, less precise, more ambiguous, fuzzier version of the original.

It raises the question of why. What makes the model think fuzzier is "better"? Is it because most of the text it was trained on was imprecise and fuzzy? Or is it because it is "averaging" words toward the lowest common denominator?

GM.

36e29152... 10mo ago

"better" is subjective.

