Goodhart's Law

From Weekly I/O#56

Goodhart's Law: when a measure becomes a target, it ceases to be a good measure.


What happens when a government offers a bounty on cobra skins to reduce the cobra population? The cobra population increases, because people start breeding cobras to sell their skins to the government.

When we use a metric to reward performance, we also create an incentive to manipulate the metric to gain the reward. Charles Goodhart summarized this as: "when a measure becomes a target, it ceases to be a good measure."

The phenomena resulting from this law are ubiquitous because measures of performance (MOPs) are easier to define and quantify than measures of effectiveness (MOEs). However, MOPs are also easier to manipulate, leading to actions that improve the performance measure (cobra skins collected) while paradoxically reducing actual effectiveness (the cobra population grows).

Here's another example. An engineer's effectiveness is much harder to measure than their performance on a simple metric like lines of code written. If that metric alone determines promotions, engineers will strive to produce more lines of code, which tends to make the code worse and harder to maintain.
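The divergence between the proxy metric and the real goal can be sketched in a few lines. The toy model below is purely illustrative (the functions, scenarios, and the 0.5 maintenance-cost factor are made-up assumptions, not data from any real promotion system): it shows how padding out the same useful work inflates the proxy while the true value falls.

```python
# Toy illustration of Goodhart's Law: optimizing a proxy metric
# (lines of code) diverges from the true goal (useful, maintainable code).
# All functions and numbers here are hypothetical, for illustration only.

def true_value(lines_written, useful_lines):
    """Real effectiveness: useful work minus a maintenance cost for bloat."""
    bloat = lines_written - useful_lines
    return useful_lines - 0.5 * bloat  # assumed cost: 0.5 per bloated line

def proxy_metric(lines_written, useful_lines):
    """What the promotion system measures: raw line count."""
    return lines_written

# The same task, solved two ways:
scenarios = {
    "honest": (100, 100),  # 100 lines, all useful
    "gamed":  (500, 100),  # same useful work, padded to 500 lines
}

for label, (lines, useful) in scenarios.items():
    print(f"{label}: proxy = {proxy_metric(lines, useful)}, "
          f"true value = {true_value(lines, useful)}")
```

Once line count becomes the target, the "gamed" strategy wins on the proxy (500 vs. 100) while its true value goes negative, exactly the cobra-bounty dynamic.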

Want to learn 5 bite-sized cool things like this every week to understand the world better? Sign up below for my free weekly newsletter and learn together!

