What Lululemon Probably Isn’t Measuring (But Should Be If They Want to Retain Their Customers)

by Stacey Barr |

As a self-confessed Lululemon addict, I’ve been following their website and blogs for a couple of years now. My interest is in their product (for my running and yoga wardrobe) but also in their business model. I noticed a few lead indicators of their recent public relations catastrophe that apparently they didn’t.

Lululemon recently responded with wit to the bad PR they received for see-through athletic pants: “We sinSHEERly apologize” featured in one of their store windows, with exaggerated examples of sheer pants.

This was followed by a big product recall on all pants that customers deemed too sheer. And they fired the executive in charge of the debacle.

And only a couple of weeks later, another product recall was announced, on new pants released AFTER the first recall! How can Lululemon have let this happen? Didn’t they learn?

I wonder if either of these PR nightmares would have happened if Lululemon measured and used a few very simple lead indicators.

Lead Indicator #1: Number of Online Reviews by Star Rating

Anecdotally I’ve noticed stirrings of dissatisfaction in the product reviews on Lululemon’s websites. More and more of the reviews are complaining about the same things. Sheerness. Poor fit. Thinner and cheaper feeling fabrics. And the star ratings are lower, too.

Tracking the average star ratings, week by week in a Smart Chart, would quickly pick up shifts in product satisfaction before it got out of hand. Tracking this overall, and also for their flagship products (e.g. Wunder Unders) and their newer, edgy products, would be easy. The data is right there.
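To make that concrete, here's a rough sketch of the weekly roll-up in Python. The review dates and star ratings are made up for illustration; the real data would come from Lululemon's own review system.

```python
from collections import defaultdict
from datetime import date

# Hypothetical review records: (date posted, star rating 1-5)
reviews = [
    (date(2013, 2, 4), 5), (date(2013, 2, 6), 4),
    (date(2013, 2, 12), 3), (date(2013, 2, 14), 2),
    (date(2013, 2, 20), 2), (date(2013, 2, 21), 1),
]

# Group ratings by ISO week number, then average each week
weekly = defaultdict(list)
for posted, stars in reviews:
    weekly[posted.isocalendar()[1]].append(stars)

weekly_avg = {week: sum(r) / len(r) for week, r in sorted(weekly.items())}
print(weekly_avg)  # a week-on-week slide is the early-warning signal
```

The same roll-up could be repeated per product (e.g. Wunder Unders) just by adding the product name to the grouping key.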

Lead Indicator #2: % Product Purchases Returned

Lululemon always welcomes customers to return products that don't fit or perform as promised. You don't need to wait for a product recall to return a product.

Product recalls are the lag effect of customer dissatisfaction. Because Lululemon are bringing out new products every week, as limited editions, they have plenty of opportunity to learn quickly about potential problems by tracking something like the percentage of products that are returned (either online sales or store sales).

If they saw a signal in the Smart Chart showing an increase, and did some very simple analytics to find the primary reasons for returns, Lululemon should be able to figure out which products they need to take back to the drawing board BEFORE a series of product recalls is required.
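A Smart Chart is essentially an XmR chart: a central line at the mean, with natural process limits set at the mean plus or minus 2.66 times the average moving range. The weekly return rates below are invented, but they show how a genuine signal (a point outside the limits) would surface:

```python
# Hypothetical weekly return rates (% of purchases returned); the last week jumps
return_pct = [2.1, 2.4, 1.9, 2.2, 2.0, 2.3, 2.1, 4.8]

mean = sum(return_pct) / len(return_pct)

# Average moving range: mean of absolute differences between consecutive weeks
moving_ranges = [abs(b - a) for a, b in zip(return_pct, return_pct[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Natural process limits (2.66 is the standard XmR chart constant)
upper = mean + 2.66 * avg_mr
lower = max(0.0, mean - 2.66 * avg_mr)

# Any week outside the limits is a signal worth investigating
signals = [(week + 1, x) for week, x in enumerate(return_pct) if x > upper or x < lower]
print(signals)  # the jump in the final week shows up here
```

In practice you would compute the limits from a stable baseline period rather than from data that includes the suspect point; this sketch keeps it to one pass for brevity.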

Lead Indicator #3: % Online Customers Posting Product Reviews

Lululemon has enjoyed an almost cult-like following. With this unusually high amount of customer engagement, their websites are jam-packed with detailed and honest customer reviews for all their products.

But as this engagement is chipped away by bad PR and poor quality and service, customers are less likely to give their time and effort to feedback. Particularly if, like me, they are seeing Lululemon do little in response to that feedback.

So another potentially useful thing for Lululemon to measure is the percentage of online customers who post product reviews. A signal of decline in this measure would be a proxy indicator of customer engagement waning.

Can Lululemon win back its once-loyal but now-ticked-off addicts?

Of course they can. But they’d need to show a lot more commitment to listening to their customers’ generous feedback. And instead of just firing someone, they need to fix the core business processes that impact the quality of the fit and function of their products. When Lululemon measures the right results that drive customer loyalty, and quickly acts on the signals in those measures, we’ll see them return to their former glory.


In your own industry, what are the drivers and possible lead indicators of customer loyalty? Share your suggestions on the blog.

Speak Your Mind


  1. Lisa Hill says:

    Hi Stacey, I read with interest your blog on Lululemon. Something that struck me was that they fired the executive responsible for the debacle, but then they made the same mistake. Surely the executive responsible learned a few valuable lessons which would have helped inform a better relaunched product – hardly surprising that it happened again. It was probably a costly mistake, but in this brave new world aren't mistakes things we learn from so that we don't repeat them? Along with losing an exec, they possibly lost a leader with the knowledge of what not to do.

    • Stacey Barr says:

      Lisa, those are wise words!

      It’s one VERY good reason we need to stop pointing the finger at people, assuming that people are the cause of performance problems. They just aren’t (most of the time, anyway). The problems are in the systems and processes, and you’re right: this executive has left the company with the knowledge they no doubt need to fix the flaw in their systems & processes for product design and launch.

      • Stephen N says:

        Who was it who said “if you take good people and put them into bad processes, the process wins”? :o)

        • Stacey Barr says:

          Stephen, I had a quick search as I suspected it was something that W. Edwards Deming might have said. But didn’t come up with anything. Anyone else know? It’s a powerful quote and would be worth dropping into the odd performance conversation now and again!

          • Eileen says:

            I first heard the expression at the Lean Learning Center in Novi, MI, while at a session led by Andy Carlino and Jamie Flinchbaugh (authors of “The Hitchhiker’s Guide to Lean, Lessons from the Road”). Not saying they coined the phrase. Just saying that’s where I heard it first. “Bad processes beat good people every time.”

          • Stacey Barr says:

            Eileen, it still amazes me how many companies just don't understand this, and turn their attention to fixing people instead of fixing their processes.

  2. J. Humphrey says:

    I work with an analyst who previously worked at Lululemon, so I forwarded your article. He mentioned they do measure all three of the lead indicators you mentioned, but their systems don't work together very well. Consequently, the departments responsible didn't interact as much as they should. They are quick to learn though, so I'm sure they are working on it.

    • Stacey Barr says:

      Thanks J – it’s kind of no surprise that the problem is lack of communication between systems or teams. It’s just so common! It would be wonderful to hear from them about how they resolve problems like this. Like I mentioned in the article, I’m so interested in their business model and so very curious to learn more about the role of measurement in their decision-making.
