Goodhart's law

Goodhart’s law is the informal principle that when a statistical measure becomes a target (i.e., is used as an indicator or as the basis for practical evaluation), it ceases to be a useful measure. That is, metrics adopted for widespread evaluation, especially those with powerful downstream effects, will be “gamed” by the people being measured. The causal independence between the metric and the behavior it evaluates, which the metric needs in order to be informative, breaks down as agents alter their behavior to maximize the measure.
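A toy simulation can make the mechanism concrete. The sketch below is a hypothetical illustration, not drawn from any source: the function names, the `gaming_effort` parameter, and the assumed 3x payoff for gaming the metric are all made up for the example. Each agent produces real output (which determines true value) and can also divert effort into gaming the metric; once the metric becomes a target and agents shift effort toward gaming, the correlation between metric and true value collapses.

```python
import random

random.seed(0)

def correlation(xs, ys):
    """Pearson correlation of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

def simulate(gaming_effort, n_agents=2000):
    """Each agent splits effort between real work and gaming the metric.

    True value comes only from real work; the metric also rewards gaming
    (assumed here to pay off 3x per unit of effort) plus measurement noise.
    Gaming skill is independent of real ability.
    """
    true_values, metrics = [], []
    for _ in range(n_agents):
        ability = random.gauss(1.0, 0.3)       # drives real output
        gaming_skill = random.gauss(1.0, 0.3)  # drives gamed output, independent of ability
        real_work = ability * (1.0 - gaming_effort)
        gamed = gaming_skill * gaming_effort
        true_value = real_work
        metric = real_work + 3.0 * gamed + random.gauss(0.0, 0.05)
        true_values.append(true_value)
        metrics.append(metric)
    return correlation(true_values, metrics)

# Metric observed passively: agents do only real work,
# so the metric tracks true value almost perfectly.
print("not a target:", round(simulate(gaming_effort=0.0), 2))

# Metric used as a target: agents divert most effort into gaming it,
# and the metric stops tracking true value.
print("a target:    ", round(simulate(gaming_effort=0.9), 2))
```

Under these assumptions the first correlation is close to 1 and the second falls toward 0, which is the pattern the law describes: the measure was informative only so long as no one was optimizing it.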

Sources