Measuring What You Need Versus What You Can
by Stacey Barr
Too many organisations measure what they can, because the data is available. But how do you measure what you need, even if the data isn’t yet available?
There are all kinds of reasons why so many organisations have performance reports bursting at the seams with measures that mean nothing, impact nothing or lead to nothing. We’re talking about the measures that prompt some to say “that’s interesting” and others to bark “that’s a waste of time”. But no-one is asking “what is this measure saying about performance, and what do I need to do to improve it?”
Often it’s because they are the measures that have always been reported. Or a manager once wanted the measure for a project that ended five and a half years ago, but it’s still being reported just in case. Or because something is better than the nothing that would exist if we left it to decision makers to decide what should be in the reports.
Irrespective of why, this is for certain:
If a measure is not informing a decision or choice, it’s wasting space and time and, most certainly, money.
But how do you start the move from measuring what’s easy to measuring what matters?
You may not be able, right now, to engage all your report users in a process to decide the most meaningful measures to include. It’s not as simple as a measures brainstorming exercise! So try these simple tactics in the meantime:
Sort the measures that matter from those that don’t.
Start asking report users which measures or pages they always look at first, and why. Chances are their attention will go to what matters most, first. Similarly, ask report users which measures they virtually never look at or never use to inform their decision making. Listen out for whether they use the word “interesting” versus “useful”!
Drop the measures that don’t matter.
Start dropping the measures that report users rarely use or describe as merely interesting. Don’t ask their permission if you don’t have to; their reactions will tell you whether the measures need to be there. Often, people neither notice nor miss what they don’t value.
Look for gaps in the information they need.
Ask report users what decisions they use the report to inform or assist with. I’ll bet it’s a question they’ve never asked themselves before. Giving them the opportunity to become conscious of that may just help them get clearer about what information they do need.
Trigger measure design to fill the gaps.
If report users come up with decisions that the current measures don’t inform, start the process of defining the results they need to monitor, designing measures for those results, and planning the implementation of those new measures. Even if the data isn’t available, the sooner you start getting it, the sooner those decisions will be informed.
Don’t hesitate; be ruthless.
Make changes to the reports quickly and surgically. Highlight the measures that matter. Remove the distracting measures that don’t matter. Show the gaps where new measures are needed. Let action-learning speed up the process to a performance report that really does what a performance report should: decisively initiate performance-improving action.
But be ready, just in case it’s too much too soon.
If you have to molly-coddle your report users, because they’ll throw a wobbly if their reports change, then next month try producing two versions of the report. One as per usual, and the other a pared-down version without the least valued measures or information. Give them the latter first, and only use the original version as a backup pacifier.
This is a quick fix to the problem of measuring just because you can rather than measuring what you really need. But it’s not a lasting solution. Proper thinking is needed, going back to basics and designing the measures that really are strategically and operationally important. This quick fix can, however, be a trigger for that proper approach. So make sure to ask report users if they’re ready to commit to another iteration of improvement to the report, one that moves it closer to measuring what they need.
A ruthless culling of useless measures from your performance reports could be the catalyst to get users to move away from measuring what they can to measuring what they need.