Product update: campaign feedback & NPS

Stephanie Price, 2 min read

Introduction: the value of insights

For a lot of companies, training is a ‘one and done’ part of their business: they identify a few skills they think their employees need, either buy or create some training on those topics, and they’re set for another year.


There is another way though (hallelujah), and that’s what we’re trying to build at Yarno – consistent, effective, and targeted training based on data and insights. Valuable insights to help streamline training can come from lots of different sources, such as:

  • On-the-job employee or team performance against defined skill criteria
  • Performance on Yarno campaigns, including performance on individual topics
  • Improvements on questions in Yarno embed campaigns (primer uplift, question uplift)
  • Confidence survey performance
  • Campaign feedback survey ratings and responses

In this article, we’ll dive deeper into campaign feedback specifically, and touch on the recent product changes we’ve made to improve it.

What is campaign feedback, and how has it changed?

Firstly, campaign feedback is exactly that: feedback from learners on their experience with the Yarno campaign they have just completed. Learners are prompted at the end of a campaign, and their feedback tells us whether they found the campaign useful and which aspects they thought could be improved.

We recently made some changes to how we gather campaign feedback in Yarno, aimed at increasing the volume of feedback received and prompting more targeted comments from learners. These changes included:

  1. Prompting learners to give feedback directly in-app following the completion of a campaign, rather than sending the prompt via email
  2. Changing from a Net Promoter Score (NPS) rating to an average rating
  3. Adding an additional question to the feedback survey asking for recommendations for future learning


Why the move from NPS to an average rating? Isn’t NPS a market standard?

We have been conducting research with customers over the last couple of years on the value of campaign feedback and insights in Yarno. Several recurring themes came up around NPS, which ultimately convinced us to move away from it as a metric:

  • Lack of understanding: Across many non-retail industries, customers were often unfamiliar with the NPS metric and how it is calculated, which made it difficult for them to interpret feedback summaries.
  • Differing benchmarks across use cases: Other customers were highly familiar with NPS, but found that a “good” score for learning differs significantly from a “good” score in other use cases, such as NPS in a retail store environment, which could lead to internal confusion when sharing learning metrics.
  • The value of 7s and 8s: The NPS calculation treats ratings of 7/10 or 8/10 as neutral (‘passives’), so they don’t count towards the score at all. In a learning context, where there are more factors that might lead learners to slightly discount their overall rating, discarding these ratings can create an inaccurate picture of the campaign (see the worked example after this list).

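To make the difference concrete, here’s a quick sketch comparing the two metrics on the same set of ratings. The numbers are hypothetical, purely for illustration, and use the standard NPS buckets (9–10 promoters, 7–8 passives, 0–6 detractors):

    # Illustrative sketch only: hypothetical ratings, not real campaign data.
    # Standard NPS buckets: 9-10 promoters, 7-8 passives (ignored), 0-6 detractors.

    def nps(ratings):
        promoters = sum(1 for r in ratings if r >= 9)
        detractors = sum(1 for r in ratings if r <= 6)
        return round(100 * (promoters - detractors) / len(ratings))

    def average_rating(ratings):
        return round(sum(ratings) / len(ratings), 1)

    # A campaign where most learners rate 7 or 8:
    ratings = [8, 7, 8, 9, 7, 8, 6, 8, 7, 9]

    print(nps(ratings))             # 10  -> the 7s and 8s don't count at all
    print(average_rating(ratings))  # 7.7 -> every rating contributes

Seven of the ten learners here rated the campaign 7 or 8, yet they have no effect on the NPS of 10, while the average rating of 7.7 reflects every response.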
To read more about these changes, check out our previous blog post.

How have these changes impacted campaign feedback in practice?

We have seen significant shifts in the patterns of campaign feedback received since the new approach went live in March:

Significant growth in feedback volume:

Comparing campaigns before and after the changes, response rates to campaign feedback surveys have jumped significantly; in some cases, customers have gathered more than 10x the number of responses they received on past campaigns. In fact, one of our customers, Macpac, went from a response rate of just under 5% on campaigns at the start of the year to 37% on their most recent campaign!

"We are really happy with the recent changes Yarno has made to campaign feedback, with the average response rate increasing by over 30%. The feedback provided valuable insights into the type of training the team enjoys, and where we should focus our future training. Not only that, the comments were overwhelmingly positive and highlighted how engaged the team was with the learning."

Simon Randall

Experts Lead, Macpac

Ease of review:

We are noticing customers and their stakeholders have an immediate understanding of the average ratings provided, reducing the need to provide explanations of what ratings metrics ‘mean’ up the line.

Interesting insights and recommendations:

We are seeing the new questions elicit interesting ideas and insights from learners, including:

  • Calling out specific topics where they think further training would be useful
  • Requests for particular styles of training, such as video tutorials on certain topic areas, instruction books or visual examples

We’re aware it’s still early days, with only three months since these updates came into effect, but the signs so far are overwhelmingly positive about their impact on campaign feedback overall. We’re excited by the benefits these additional insights can bring our customers!

Stephanie Price

Steph is our product manager extraordinaire. She's turning Yarno into a world-class product one feature at a time. She keeps us on track while hearing out our big, and even wild, ideas.

