CASE STUDY: 3 Common Mistakes to Avoid With Your XmR Charts (Part 1)

by Stacey Barr

Some legacy habits in how we analyse our data often sneak into our XmR charts, and these habits make our XmR charts incapable of highlighting the real signals in our performance measures.

XmR charts are based on very specific and deliberate statistical theory, even though they are quite easy to interpret and understand. They are also easy to create, when you have the correct knowledge and procedure to build them.

Without this correct knowledge and procedure, a few mistakes creep into our XmR charts that cause them to be invalid: they don’t highlight the right signals.

MISTAKE #1: Failing to correctly remove seasonal or cyclical effects

Juanita works for an Australian city council that has a performance measure of Total Sick and Carers Leave.

This is calculated as the number of days taken as sick leave or carers leave in the period, divided by the number of employees (or full-time equivalents, FTE).
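For example, if 150 days of sick and carers leave were taken in a month across an organisation of 200 FTE, the value of the measure for that month would be 150 ÷ 200 = 0.75 days per FTE (illustrative numbers only, not Juanita's actual data).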

The graph below reports the actual values of this measure for each month.

You can see the seasonal effect: high points around the middle of the year and low points around Christmas.

In Australia, the middle of the year is usually big for school holidays and, being winter, it’s also big for people getting sick. You’d expect Total Sick and Carers Leave to be high at this time.

In contrast, Christmas is popular for recreation leave so Total Sick and Carers Leave would naturally be lower.

This seasonal effect makes it hard to work out if Total Sick and Carers Leave is getting better or worse, or not changing at all. We need to remove the seasonal effect to find the real signals.

Using a 12-month moving average is a common way that people deseasonalise data, but it gives the moving average data points high 'autocorrelation': the value of each point depends heavily on the values of the points before it.

For XmR charts, low autocorrelation is important, so we need another way to remove the seasonal effect from our measures.
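To make the autocorrelation problem concrete, here's a minimal sketch in Python with made-up numbers. Each new 12-month moving average shares 11 of its 12 underlying values with the previous one, so consecutive points can barely differ from each other:

    # Minimal sketch with made-up monthly values (not Juanita's data):
    # why a 12-month moving average is highly autocorrelated.
    monthly = [0.9, 0.8, 0.7, 0.6, 0.7, 0.9, 1.1, 1.0, 0.9, 0.8, 0.6, 0.5,
               0.8, 0.7, 0.7, 0.6, 0.7, 0.8, 1.0, 1.0, 0.8, 0.7, 0.6, 0.5]

    # Each moving-average point is the mean of the latest 12 monthly values.
    moving_avg = [sum(monthly[i - 11:i + 1]) / 12 for i in range(11, len(monthly))]

    # Consecutive averages share 11 of their 12 values, so the change
    # between neighbouring points is tiny compared with the raw data.
    for prev, curr in zip(moving_avg, moving_avg[1:]):
        print(f"{prev:.3f} -> {curr:.3f} (change {curr - prev:+.3f})")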

Correcting this mistake…

This is the way to deseasonalise data that I learned from Donald Wheeler, the expert who taught me XmR charts. It makes sure that there is little autocorrelation in our measure values.
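As a rough illustration of the general idea only (a simplified sketch with made-up numbers, not necessarily the exact procedure), one common seasonal-factor approach works like this: average the values for each calendar month across the years, divide each monthly average by the overall average to get that month's seasonal factor, then divide each actual value by its month's factor.

    # Simplified sketch of a seasonal-factor approach to deseasonalising
    # monthly data. Assumption: this illustrates the general idea only,
    # not necessarily the exact procedure. Values are made up.
    from collections import defaultdict

    # (year, month, value) -- illustrative data, not Juanita's
    data = [(2011, m, v) for m, v in enumerate(
        [0.55, 0.60, 0.70, 0.75, 0.85, 0.95, 1.00, 0.90, 0.80, 0.70, 0.60, 0.50], start=1)]
    data += [(2012, m, v) for m, v in enumerate(
        [0.50, 0.55, 0.65, 0.70, 0.80, 0.90, 0.95, 0.85, 0.75, 0.65, 0.55, 0.45], start=1)]

    grand_average = sum(v for _, _, v in data) / len(data)

    # Average the values for each calendar month across the years
    by_month = defaultdict(list)
    for _, month, value in data:
        by_month[month].append(value)
    monthly_average = {m: sum(vals) / len(vals) for m, vals in by_month.items()}

    # Seasonal factor = how far that calendar month sits above or below the overall level
    seasonal_factor = {m: avg / grand_average for m, avg in monthly_average.items()}

    # Deseasonalised value = actual value divided by its month's seasonal factor
    for year, month, value in data:
        print(f"{year}-{month:02d}: {value / seasonal_factor[month]:.3f}")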

After we deseasonalise Juanita’s measure values, we can create a valid XmR chart that will highlight the right signals. And here it is:

Now it’s much easier to see the signal! Can you see it?

Identifying the signal from the deseasonalised measure

The signal is called a long run below the central line. It starts at January 2012, when we get 10 out of 12 consecutive points below the central line. Isn't it now obvious to ask what happened in January 2012?
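If you want to check for this kind of long run in a spreadsheet or script, here's a minimal sketch of the '10 out of 12 consecutive points below the central line' test, again with made-up values:

    # Minimal sketch (made-up values) of the long-run test:
    # at least 10 out of any 12 consecutive points below the central line.
    central_line = 0.79
    values = [0.76, 0.74, 0.80, 0.72, 0.71, 0.75,
              0.73, 0.70, 0.74, 0.72, 0.69, 0.73]

    def long_run_below(points, centre, window=12, needed=10):
        """True if any window of consecutive points has at least
        `needed` points below the centre line."""
        return any(
            sum(1 for p in points[start:start + window] if p < centre) >= needed
            for start in range(len(points) - window + 1)
        )

    print(long_run_below(values, central_line))  # True for this illustrative run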

Recalculating the central line and upper and lower natural process limits shows us the size of the improvement in the measure of Total Sick and Carers Leave: the central line moved from 0.79 to 0.73. It's a small improvement, but it's a real improvement.

This improvement seemed to hold until October 2012, when we see another signal, a long run above the central line. What happened in October 2012?

We can also see a special cause signal at April 2013. And of course we’d want to know the reason for that.
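For readers who want to see the arithmetic behind the central line and the natural process limits, here's a minimal sketch using the standard XmR formulas (central line = average of the values; limits = average ± 2.66 × the average moving range), with made-up values rather than Juanita's actual data:

    # Minimal sketch of the standard XmR (individuals chart) calculations,
    # using made-up values. Central line = average of the X values;
    # natural process limits = average +/- 2.66 x the average moving range.
    values = [0.78, 0.81, 0.74, 0.72, 0.70, 0.73,
              0.75, 0.71, 0.69, 0.74, 0.72, 0.76]

    central_line = sum(values) / len(values)

    # Moving ranges: absolute differences between consecutive points
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    average_moving_range = sum(moving_ranges) / len(moving_ranges)

    upper_limit = central_line + 2.66 * average_moving_range
    lower_limit = central_line - 2.66 * average_moving_range

    print(f"Central line: {central_line:.3f}")
    print(f"Natural process limits: {lower_limit:.3f} to {upper_limit:.3f}")

    # A point outside these limits (like the April 2013 signal above)
    # would be flagged as a special cause.
    outside = [x for x in values if x < lower_limit or x > upper_limit]
    print(f"Points outside the limits: {outside}")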

Can you see how XmR charts, when done correctly, naturally encourage us to ask the right questions?

Coming up next is 3 Common Mistakes to Avoid With Your XmR Charts (Part 2)…

TAKE ACTION:

If you have a performance measure with a seasonal or cyclical effect in it, try this method to deseasonalise your measure, then create an XmR chart to discover whether there are any signals that everyone has so far missed!

Speak Your Mind


  1. Nice work Stacey! I’m wondering two things though. The first is about the “signal.” I understand the concept (I think), but why is it only that we ask what happened in the first month (Jan 2012 for example)? Couldn’t the signal be indicating a longer change over the span of the 12 points? And when the signal changed, the same again? Since we don’t count it as a signal until there are multiple consecutive points below (or above) the center line, couldn’t the cause of the change be of a wider breadth?

    My second question is a philosophical one. I'm not sure from the example why Juanita was collecting/reporting this measure? I understand your use of it as an example of the mistake with seasonal data…but I can't help but wonder what the intent of the measure was…or what the root question was. That helps me have context to how to use the results.

    Great stuff! I love getting your newsletter, thanks…
    Marty

  2. Stacey Barr says:

    G’day Marty! Great comments…

    When we see a long run in our measure, it generally means that something changed at the start of the run, and that change stayed in effect. In other words, it wasn't a change due to a factor that popped up just in January 2012 and then went away again. A change that could cause what we're seeing in this measure is that in January 2012, management started paying more attention to Sick and Carer's Leave and everyone started being more cautious in taking it. Then after a while we see signs of it returning to its previous level, which usually happens when we try to improve something just by saying "we need to be better at this", rather than fixing a fundamental cause. Perhaps a fundamental cause that might reduce Sick and Carer's Leave is inflexible work hours, or too-high workloads. I don't know the truth here, but just wanted to illustrate interpretation of a long run with an example! I did ask Juanita about this signal, and she was going to look into it (she's not the performance owner for this measure).

    Secondly, you’re right (and I’m not surprised that you, of all people, would bring this up Marty!): it’s important to understand the context of the measure. Does management want to reduce the level of Sick and Carer’s Leave, and if so, why and to what level (without sabotaging other performance results, like employee wellbeing and loyalty)? When I asked Juanita if there was a target for this measure, she said there wasn’t really, so my guess is maybe she wanted to grab an example that was easy to get the data for.

    Thanks as always for the conversation, Marty.

  3. Prahlad Bhugra says:

    Dear Stacey, thank you for introducing the methodology for filtering seasonality effect from KPI signals. This has the potential to provide clear detection of improvements or deterioration in signals.
    I had one observation: the filter coefficients depend on previous values of the signal as well as on future values, which makes the XmR chart dynamic in nature. This raises a question: will the improvements or deteriorations detected today remain unchanged in the future, given that the XmR chart will change as new data is added?
    regards
    Prahlad

  4. Frank says:

    Dear Stacey
    Great blog, thank you. I recently read some of Wheeler’s books, where he describes the rule concerning runs about the central line. He describes 8+ consecutive points above/below the central line as a signal. I haven’t found the rule with 10 points out of 12. On the other hand it seems plausible too. May I ask you where you found the source for the “10 out of 12” rule?
    Regards, Frank

    • Stacey Barr says:

      Frank, I learned this rule many years ago in my own study of XmR charts (likely from books I read). But in developing my 'Using Smart Charts' course, Donald Wheeler reviewed the material I created and confirmed the 10 out of 12 rule.
