Seven Mistakes With Performance Dashboards That Prevent Evidence-Based Decisions

by Stacey Barr |

The purpose of a KPI performance dashboard is to show, quickly and accurately, how an organisation's top-priority performance results are tracking against target, and to initiate action to close the gaps between actual and target performance. But some KPI dashboard design mistakes derail this purpose.

Thanks to data visualisation expert Stephen Few, we have a set of well-founded criteria for assessing any data visualisation, such as a dashboard. Stephen's seven criteria provide the palette for the top seven mistakes I see people make when they design performance dashboards.

Mistake #1 – Filling the KPI dashboard with the easy and available measures.

Performance dashboards are faster to build than the measures they will display. Consequently, too many dashboards are filled with easy-to-get measures, not the meaningful measures that are truly needed.

This is Stephen’s ‘usefulness’ criterion: make sure the only measures included in the dashboard are measures of the priority performance results.

Mistake #2 – Leaving information gaps in the performance dashboard.

Some goals can be hard to measure. So, some goals have never been measured. The data doesn’t exist yet. And if this is not addressed, the performance dashboard will always only tell part of the story.

This is Stephen’s ‘completeness’ criterion: make sure that all the priority performance results have measures in the performance dashboard.

Mistake #3 – Using the typical dashboard dials, gauges and other poorly designed graphs.

Dashboard technology has made almost any visualisation possible (except the most useful graph for performance monitoring, which we’ll come to later). Just because it’s there, doesn’t mean you should use it. Graphs are for helping the data answer your question. They’re not for decoration.

This is Stephen’s ‘perceptibility’ criterion: make sure you choose the right graph for the question you’re asking of your measures, and keep the graphs as simple as possible.

Mistake #4 – Focusing comparisons on this month compared to last month.

Comparing how this month performed to last month is a distortion. Performance always varies from month to month, so any difference between two months is most likely random noise. It can’t tell us what performance is truly doing.

This is Stephen’s ‘truthfulness’ criterion: make sure each measure’s chosen graph and its design tells the data’s story without bias or distortion. That’s why we use XmR charts for our performance measures – they tell the truth. But no performance dashboard app seems to include them.
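For readers curious what an XmR chart actually computes, here is a minimal sketch in Python of the standard XmR (individuals and moving range) arithmetic. The data and function name are made up for illustration; this is not code from PuMP. The constants 2.66 and 3.27 are the standard XmR scaling factors.

```python
def xmr_limits(values):
    """Return the centre line and natural process limits for an XmR chart.

    Points outside the natural process limits are signals; points inside
    are routine variation -- which is why month-vs-month comparisons mislead.
    """
    n = len(values)
    centre = sum(values) / n
    # Moving ranges: absolute differences between consecutive points.
    moving_ranges = [abs(values[i] - values[i - 1]) for i in range(1, n)]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    lower = centre - 2.66 * avg_mr   # lower natural process limit
    upper = centre + 2.66 * avg_mr   # upper natural process limit
    upper_mr = 3.27 * avg_mr         # upper limit for the moving-range chart
    return centre, lower, upper, upper_mr

# Hypothetical monthly measure values, purely for illustration.
monthly = [52, 48, 50, 53, 47, 51, 49, 54, 50, 48]
centre, lower, upper, upper_mr = xmr_limits(monthly)
print(f"centre={centre:.1f}, limits=({lower:.1f}, {upper:.1f})")
```

For this made-up series, every point sits inside the natural process limits, so the month-to-month ups and downs are just routine variation and warrant no reaction.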

Mistake #5 – Assuming people understand how to read the KPI dashboard.

People don’t feel comfortable saying out loud that they don’t understand a data visualisation. Instead they will attack it, and insist on shifting to something they are comfortable with, even if what you’ve chosen (like the XmR chart) is far more appropriate and insightful.

This is Stephen’s ‘intuitiveness’ criterion: make sure that the graphs and layout you use are clearly explained to users before they have to use the dashboard.

Mistake #6 – Ignoring or underestimating the power of visual design.

Colour is not decoration. Variety is not entertainment. Too much of either and your dashboard will scare its users like many children are scared by carnival clowns.

This is Stephen’s ‘aesthetics’ criterion: make sure the dashboard is pleasing to the eye. That usually means using colour and font and other design features deliberately and sparingly. And this takes skill; skill you might have to hire.

Mistake #7 – Building the KPI dashboard without user involvement.

It’s easier to use something that we feel ownership of. And ownership comes from involvement. But most users of dashboards don’t really know much about the science (or art) of effective data visualisation. So involving them doesn’t mean incorporating all their personal opinions.

This is Stephen’s ‘engagement’ criterion: make sure that you focus users on the purpose of the dashboard, and educate them just enough to appreciate the deliberate design of it.

Evidence-based decisions need powerful KPI performance dashboards.

A performance dashboard is powerful to evidence-based leaders when it gives them truthful answers to all the questions they have about the gaps between current performance and targeted performance, aligned to their strategic goals. If the dashboard can’t do all that, then it’s doing more harm than good.


Have you seen any of these mistakes happen in performance dashboards in your organisation? What were the consequences?


Speak Your Mind


  1. David D says:

Under the heading of ‘Confused Terms’, I’d like to hear if you have strong opinions on the differences between Dashboards and Scorecards. From this past blog, you may not feel so strongly or make any distinction so clearly. Before I sound off at work, thought I’d get your take.

I felt this LinkedIn post stated it well and this is the principal distinction I’ve made in the past, BUT, always open to different views and being a massive fan, I wanted to see what the PuMP peeps think. I know you’ll agree that the most important distinction is monitoring vs management. Maybe what format/report you use doesn’t concern the PuMP folks that much.


    Best regards, David D

    • Stacey Barr says:

David, thanks for sharing the article. Definitions of terms like these vary so much that, in all honesty, I’ve just avoided being specific about it. In PuMP, as you know, we focus more on the design of the way performance measures are displayed, so they can answer those 3 questions (what is performance doing, why is it doing that, and what should we do about it?). This often requires more than one style or design of ‘report’, to facilitate more depth where needed. I don’t particularly agree with the distinction made in the article, since a dashboard most certainly can show strategic measures and their current level or status of performance. Stephen Few would probably say that a dashboard is just a summarised display designed for fast scanning. A scorecard, to me at least, feels more like a judgmental term about how well we scored, and that’s not the vibe I wanted to design into PuMP. My suggestion is to keep searching for a better definition of these two, and other, terms used to describe how measures are visually displayed. You might like to try Nick Desbarats’ work, which is quite definitive.
