Goodhart's Law states that “when a measure becomes a target, it ceases to be a good measure.” In other words, when we use a measure to reward performance, we provide an incentive to manipulate the measure in order to receive the reward.
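To make that concrete, here is a minimal Python sketch of the dynamic. Everything in it (the function names, the effort budget, the coefficients) is invented for illustration, not taken from the post: an agent with a fixed effort budget can spend on real work or on gaming the metric, and once the metric is rewarded, gaming wins.

```python
# Toy illustration of Goodhart's Law. All names and numbers here are
# hypothetical, chosen only to make the incentive shift visible.

def true_quality(real_effort: float) -> float:
    """The thing we actually care about: driven only by real effort."""
    return 10.0 * real_effort

def measured_score(real_effort: float, gaming_effort: float) -> float:
    """The proxy metric: reflects real effort, but can also be inflated
    by effort spent manipulating the measurement itself."""
    return 10.0 * real_effort + 25.0 * gaming_effort

BUDGET = 1.0  # total effort is fixed; the only choice is how to split it

# Before the measure becomes a target: all effort goes into real work,
# and the score tracks true quality.
before_real, before_gaming = BUDGET, 0.0

# After the measure becomes a target: a reward-maximizing agent shifts
# effort to whichever activity raises the score fastest (here, gaming).
after_real, after_gaming = 0.0, BUDGET

print(f"before targeting: score={measured_score(before_real, before_gaming):.0f}, "
      f"true quality={true_quality(before_real):.0f}")
print(f"after targeting:  score={measured_score(after_real, after_gaming):.0f}, "
      f"true quality={true_quality(after_real):.0f}")
# The score improves (10 -> 25) while true quality collapses (10 -> 0):
# once rewarded, the measure stops tracking what it was meant to measure.
```

The point of the sketch is only that the correlation between the score and the underlying quality holds while nobody is optimizing the score, and breaks the moment someone is.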
Uh huh. I'm drunk; let's just party without the complicated references, k thnx, bye.