Goodhart's Law states that “when a measure becomes a target, it ceases to be a good measure.” In other words, when we use a measure to reward performance, we provide an incentive to manipulate the measure in order to receive the reward.
For example, grain and steel production quotas in Maoist China during the Great Leap Forward led local officials to report inflated grain harvests and to turn out unusable backyard-furnace steel: the measures were satisfied while the underlying goals failed. https://en.wikipedia.org/wiki/Great_Leap_Forward
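As a rough illustration of the mechanism (a minimal sketch, not from the original post, assuming a toy model where the "measure" is a noisy proxy for the real quality we care about): when candidates are picked at random the proxy tracks quality reasonably well, but once we select whichever candidate scores highest on the proxy, we preferentially pick the cases where the proxy overstates quality, so the realized quality falls well short of what the measure promises.

```python
import random

# Hypothetical toy model: true quality ~ N(0, 1); the proxy (the "measure")
# is quality plus independent noise, so it is correlated with quality but gameable.
random.seed(0)

def candidate():
    quality = random.gauss(0, 1)
    proxy = quality + random.gauss(0, 1)
    return quality, proxy

def average(xs):
    return sum(xs) / len(xs)

trials, pool_size = 10_000, 100
random_quality, targeted_quality, targeted_proxy = [], [], []

for _ in range(trials):
    pool = [candidate() for _ in range(pool_size)]
    # No optimization pressure: pick a candidate at random.
    random_quality.append(random.choice(pool)[0])
    # Target the measure: pick the candidate with the highest proxy score.
    best = max(pool, key=lambda c: c[1])
    targeted_quality.append(best[0])
    targeted_proxy.append(best[1])

print(f"random choice      -> mean quality {average(random_quality):+.2f}")
print(f"maximize the proxy -> mean proxy   {average(targeted_proxy):+.2f}, "
      f"mean quality {average(targeted_quality):+.2f}")
```

Under these assumptions the selected candidates show a proxy score roughly twice their actual quality: optimizing the measure widens the gap between what the measure reports and what we actually get.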