Seven Mistakes With Performance Dashboards That Prevent Evidence-Based Decisions
by Stacey Barr
The purpose of a KPI performance dashboard is to interpret quickly and accurately how an organisation’s top priority performance results are doing compared to target, and initiate action to close gaps between actual and target performance. But there are some KPI dashboard design mistakes that derail this purpose.
Thanks to data visualization expert, Stephen Few, we have a set of well-founded criteria to guide an assessment of any data visualization, like a dashboard. Stephen’s seven criteria will provide the palette for the top seven mistakes I see people make when they design performance dashboards.
Mistake #1 – Filling the KPI dashboard with the easy and available measures.
Performance dashboards are faster to build than the measures they will display. Consequently, too many dashboards are filled with easy-to-get measures, not the meaningful measures that are truly needed.
This is Stephen’s ‘usefulness’ criterion: make sure the only measures included in the dashboard are measures of the priority performance results.
Mistake #2 – Leaving information gaps in the performance dashboard.
Some goals can be hard to measure. So, some goals have never been measured. The data doesn’t exist yet. And if this is not addressed, the performance dashboard will always only tell part of the story.
This is Stephen’s ‘completeness’ criterion: make sure that all the priority performance results have measures in the performance dashboard.
Mistake #3 – Using the typical dashboard dials, gauges and other poorly designed graphs.
Dashboard technology has made almost any visualisation possible (except the most useful graph for performance monitoring, which we’ll come to later). Just because it’s there, doesn’t mean you should use it. Graphs are for helping the data answer your question. They’re not for decoration.
This is Stephen’s ‘perceptibility’ criterion: make sure you choose the right graph for the question you’re asking of your measures, and keep the graphs as simple as possible.
Mistake #4 – Focusing comparisons on this month compared to last month.
Comparing this month to last month distorts the story. Performance will always vary from month to month, so any difference between two consecutive months is most likely random noise. It can’t tell us what performance is truly doing.
This is Stephen’s ‘truthfulness’ criterion: make sure each measure’s chosen graph and its design tells the data’s story without bias or distortion. That’s why we use XmR charts for our performance measures – they tell the truth. But no performance dashboard app seems to include them.
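Since dashboard apps rarely offer XmR charts out of the box, the limits are easy enough to compute yourself. The sketch below shows the standard XmR calculation: a centre line at the average, plus natural process limits derived from the average moving range using the conventional scaling factors 2.66 and 3.268. The monthly data values are illustrative only, not from the article.

```python
def xmr_limits(values):
    """Return the centre line and natural process limits for an XmR chart."""
    if len(values) < 2:
        raise ValueError("need at least two data points")
    # Moving ranges: absolute difference between consecutive points.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    centre = sum(values) / len(values)
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    return {
        "centre": centre,
        "lower_limit": centre - 2.66 * avg_mr,  # lower natural process limit
        "upper_limit": centre + 2.66 * avg_mr,  # upper natural process limit
        "upper_range_limit": 3.268 * avg_mr,    # limit for the moving-range chart
    }

# Example: 12 months of a hypothetical measure. Points outside the limits
# signal a real change worth investigating; points inside are routine
# month-to-month variation, not a trend.
monthly = [52, 49, 55, 51, 48, 53, 50, 54, 47, 52, 51, 49]
limits = xmr_limits(monthly)
```

With this data every point falls inside the natural process limits, which is exactly the XmR chart’s point: the month-to-month differences are noise, not signal.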
Mistake #5 – Assuming people understand how to read the KPI dashboard.
People rarely feel comfortable admitting out loud that they don’t understand a data visualisation. Instead, they will attack it and insist on shifting to something they are comfortable with, even if what you’ve chosen (like the XmR chart) is far more appropriate and insightful.
This is Stephen’s ‘intuitiveness’ criterion: make sure that the graphs and layout you use are clearly explained to users before they have to use the dashboard.
Mistake #6 – Ignoring or underestimating the power of visual design.
Colour is not decoration. Variety is not entertainment. Too much of either and your dashboard will scare its users like many children are scared by carnival clowns.
This is Stephen’s ‘aesthetics’ criterion: make sure the dashboard is pleasing to the eye. That usually means using colour and font and other design features deliberately and sparingly. And this takes skill; skill you might have to hire.
Mistake #7 – Building the KPI dashboard without user involvement.
It’s easier to use something that we feel ownership of. And ownership comes from involvement. But most users of dashboards don’t really know much about the science (or art) of effective data visualisation. So involving them doesn’t mean incorporating all their personal opinions.
This is Stephen’s ‘engagement’ criterion: make sure that you focus users on the purpose of the dashboard, and educate them just enough to appreciate the deliberate design of it.
Evidence-based decisions need powerful KPI performance dashboards.
A performance dashboard is powerful to evidence-based leaders when it gives them truthful answers to all the questions they have about the gaps between current performance and targeted performance, aligned to their strategic goals. If the dashboard can’t do all that, then it’s doing more harm than good.
Have you seen any of these mistakes happen in performance dashboards in your organisation? What were the consequences?
Under the heading of ‘Confused Terms’, I would like to hear if you have strong opinions on the differences between Dashboards and Scorecards. From this past blog, you may not feel so strongly or make the distinction so clearly. Before I sound off at work, I thought I’d get your take.
I felt this LinkedIn post (https://www.linkedin.com/pulse/dashboard-vs-scorecard-clarifying-differences-data-driven-khawaja/) stated it well, and this is the principal distinction I’ve made in the past. But, always open to different views and being a massive fan, I wanted to see what the PuMP peeps think. I know you’ll agree that the most important distinction is monitoring vs management. Maybe what format or report you use doesn’t concern the PuMP folks that much.
Best regards, David D