Goodhart's Law
Goodhart's Law reminds us that when a measure becomes a target, it ceases to be a good measure. Much like the observer effect in physics (often conflated with Heisenberg's Uncertainty Principle), the act of measuring something changes the thing being measured. So, in an age where we increasingly rely on quantitative metrics to define success, it's worth remembering that a metric that was insightful in the past is not guaranteed to stay insightful once people start optimizing for it.
For example, suppose I observe that when my team writes more emails, more work gets done. If I then set a target for the number of emails the team must send, people will start sending irrelevant emails just to hit the number. As a result, email volume will no longer correlate with team productivity.
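The dynamic above can be sketched with a toy simulation. Every number here is invented for illustration: before the email count becomes a target, it tracks productivity; after a (hypothetical) quota is imposed, everyone pads their count up to the quota, and the correlation collapses.

```python
import random

random.seed(0)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Before the target: busier weeks naturally produce more email,
# so email count is a decent proxy for productivity.
productivity = [random.gauss(50, 10) for _ in range(500)]
emails_before = [0.5 * p + random.gauss(0, 3) for p in productivity]

# After the target: everyone pads their count up to a quota of 40,
# regardless of how much real work got done.
QUOTA = 40
emails_after = [max(QUOTA, e) + random.gauss(0, 1) for e in emails_before]

print(f"correlation before target: {pearson(productivity, emails_before):.2f}")
print(f"correlation after target:  {pearson(productivity, emails_after):.2f}")
```

Running this, the pre-target correlation is strong (around 0.85 with these made-up parameters), while the post-target correlation is near zero: the measure stopped measuring anything once it became a target.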