When KPIs Drive the Wrong Result

June 13, 2017 by Stacey Barr

When we use KPIs, their impact goes beyond our intention of monitoring our goals and reaching targets. Measures influence people’s behaviour and the results of other measures. These influences can be unexpected and undesirable, so we must either mitigate them or choose another measure.

In Queensland, where I live, we have cane toads. They were introduced to kill sugar cane beetles. But rather than curtailing the cane beetle damage, they headed off on a wildlife-killing rampage with their poison-producing glands at the back of their heads. It’s an Australian horror story. (I might be dramatising this a tad.)

Every organisation has its KPI horror stories.

Some are true, some are false, and some are… well, who really knows? One story I recall from my days working in the rail sector is a perfect illustration of a KPI driving the wrong result. The KPI was the percentage of trains on time, and the unintended consequence was the cancellation of train services that were running “too” late.

The cancellation of the trains furthest behind schedule kept the on-time running KPI looking okay. But at a cost. Customers are much more unhappy if their ride home is cancelled than if it’s late. That’s ironic, because on-time running was measured for the outcome of making customers happier. But it was driving the result in the wrong direction.
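The arithmetic behind this gaming is easy to see in a few lines. This is a hypothetical sketch (the threshold, function name, and data are illustrative, not from the rail operator): if cancelled services simply drop out of the denominator, cancelling the latest trains pushes the KPI up while customers are worse off.

```python
def on_time_pct(delays_min, threshold=5):
    """Percentage of *run* services arriving within the threshold (minutes).
    Cancelled services are recorded as None and drop out of the denominator,
    which is exactly the loophole that rewards cancelling very late trains."""
    run = [d for d in delays_min if d is not None]
    on_time = sum(d <= threshold for d in run)
    return round(100 * on_time / len(run), 1)

# Six scheduled services, two of them badly late:
print(on_time_pct([0, 2, 3, 4, 30, 45]))      # → 66.7

# Cancel the two latest services instead of running them late:
print(on_time_pct([0, 2, 3, 4, None, None]))  # → 100.0
```

Counting cancellations as failures (or pairing the KPI with a cancellations measure) closes the loophole.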

KPIs drive the wrong result because we didn’t think it through.

When we design a KPI, it’s vitally important to think through the possible unintended consequences of measuring it. If we don’t, we run the risk of the KPI causing more harm than good.

In the PuMP Measure Design technique, thinking through the unintended consequences of a potential measure is a deliberate step before we choose the ultimate measure for our goal. We essentially scan for ways in which the measure could be a real problem:

  1. Could the measure threaten, making people feel judged and creating fear and defensiveness that might cause them to game the measure?
  2. Could the measure trivialise, giving people a tunnel-vision focus on too small a piece of the business outcome?
  3. Could the measure sabotage, where actions aimed at improving it inadvertently force other areas of performance to decline?
  4. Could the measure confuse, so that people might implement or interpret it incorrectly without realising?

It’s not always all bad.

Not all unintended consequences are drawbacks, like those above. Some are benefits. Like finding that a measure has unexpected leverage to improve several other business results. Or that it has predictive power that makes it a powerful lead indicator for a higher-level business result. Or that it drives more of the behaviour we need or want.

It doesn’t necessarily mean the KPI must die.

Checking for the unintended consequences of a measure gives us the opportunity to do something before it’s too late. We could decide not to measure it, but only as a last resort, if none of these mitigation methods will work:

  1. When the measure threatens, can you focus people on collaborating to improve the business process, rather than worrying about being judged?
  2. When the measure trivialises, can you accompany it with one or two companion measures that collectively give a fuller picture of the business result?
  3. When the measure sabotages, can you monitor the other measures together with this one, and set their targets to find balance rather than conflict?
  4. When the measure confuses, can you simplify it or clarify what it means, or provide more background information on its formulation?

When the risks of a measure are quite severe, you may also want to give the measure a trial period. You’d be wise to monitor the unintended consequences to find out how bad they really are, and how realistically you can mitigate them.

TAKE ACTION:

Choose a KPI you don’t feel confident is driving the outcome you want. Use the framework above to scan for possible unintended consequences. Where you can, gather some evidence to test their prevalence.



  1. An optimum performance can be reached by consistency of commitment only.

  2. Elke says:

    Great article (as usual). But I particularly like the clear questions you formulate to help us check for those potential consequences. Having been a PuMP adept for the past 7 or 8 years, I’m slowly making headway building the insight that measures are not just a brainstorm of what data do we have… outcome. But having people sit down and think through all angles of a potential measure before diving headlong into data collection and reporting is still a weak spot. Thanks for the great (yet simple) narrative and advice. I’ll let you know how I get on :-).

    • Stacey Barr says:

      Always keen to hear from you Elke! Well done, sticking with the vision of practical performance measurement for all those years. It’s a sticky set of bad KPI habits that we are trying to change, and patience eventually pays off.
