How I Used the Spotify Squad Health Check Model

As an Agile Coach & Trainer at Prowareness, I use different kinds of games, tools and practices every week during meetings, workshops and training sessions. Some of these I’ve invented myself, but most already existed and have only been tweaked into a format that suits me best. I’ll share the what, why, how and when, along with what worked well and what didn’t. By sharing my experiences I hope to inspire you to give these tools and practices a try yourself, improve them and, hopefully, share the perfected version in return.

Yesterday I facilitated a retrospective at a medium-sized (±30 people) web agency. I truly love these kinds of companies. Most of them are full of passionate, energetic and creative people, so getting an interactive, fun and collaborative session isn’t difficult. Everyone is eager to learn and to help each other out, and yesterday’s session proved to be a great example!

Beforehand, I only knew that the team had some issues, didn’t do retrospectives very often and was curious about which areas it could improve. I was therefore looking for a retrospective format that focused on the team as a whole and would cover all the relevant areas. Given this context, the Spotify Squad Health Check Model seemed appropriate. I had wanted to use this model for a long time, and yesterday I finally found the opportunity!

What is the ‘Squad Health Check Model’ about?

It’s an approach that visualises the ‘health’ of a team. It covers areas like teamwork, fun, ease of release, learning, and the health of the codebase. While discussing the different health indicators, the team builds up self-awareness about what’s working and what’s not. The broad selection of questions helps expand their perspective: perhaps they were well aware of the code-quality issues, but hadn’t really thought about the customer-value perspective, or about how fast they learn. It also provides a balanced picture, showing the good stuff as well as the pain points.


What’s the source?

Henrik Kniberg and Kristian Lindwall shared this approach via this website. It contains a comprehensive description of how to use it, so I’ll only share my own approach, takeaways and lessons learned.


Time needed

2 hours.

What you need

  • A printed version of
    • The indicators with the different topics
    • The ‘traffic lights’ (green, yellow, red)
    • The instructions
  • Markers and pens
  • A large whiteboard or flip chart sheets
  • Sticky notes

How I’ve used it

For the entire session I used the standard approach for a retrospective:

Setting the stage (15 minutes)

  • I briefly explained the format and instructions and shared the different topics we would cover.
  • If the team finds a topic irrelevant or they miss a specific area, this is a good moment to remove or add it.


Gather data (30 minutes)

  • Per indicator, I read out the positive and negative descriptions that are written on each card.
  • Every participant decided for themselves which colour was most suitable: green = good, yellow = some problems, red = really bad. When everyone had made a choice, they showed their cards to each other at the same time.
  • I didn’t discuss the results in detail at this point but only recorded them on the whiteboard.
  • The next step was to determine the trend: is this generally improving or getting worse? A question mark indicates that, although the trend might seem stable on average, there were a lot of differences between the team members.

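For illustration, the tallying and trend steps above can be sketched in a few lines of Python. The indicator names, the votes, and the majority/average rules here are my own assumptions for the sketch, not part of the official model:

```python
from collections import Counter

# One vote per participant per indicator ("green", "yellow" or "red").
# Indicator names and votes are made up for illustration.
votes = {
    "Teamwork": ["green", "green", "yellow", "green"],
    "Health of codebase": ["red", "red", "yellow", "red"],
    "Fun": ["green", "yellow", "yellow", "green"],
}

def summarize(colours):
    """Majority colour for one indicator, or '?' when there is no clear majority."""
    top = Counter(colours).most_common(2)
    if len(top) > 1 and top[0][1] == top[1][1]:
        return "?"  # split vote -> worth discussing in detail
    return top[0][0]

# One simple way to derive a trend: compare average scores between two sessions.
score = {"red": 0, "yellow": 1, "green": 2}

def trend(previous, current):
    """Is an indicator improving, declining, or stable compared to last time?"""
    prev_avg = sum(score[c] for c in previous) / len(previous)
    curr_avg = sum(score[c] for c in current) / len(current)
    if curr_avg > prev_avg:
        return "improving"
    if curr_avg < prev_avg:
        return "declining"
    return "stable"

for indicator, colours in votes.items():
    print(f"{indicator}: {summarize(colours)}")
```

On the whiteboard this is of course just coloured dots and arrows; the point of the sketch is only that a split vote (the ‘?’) is a discussion trigger, not noise.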

Generate insights (30 minutes)

  • Because of the group size (12 participants), I created two sub-teams. Each group studied the results and discussed them within their team, taking a few minutes per topic.
  • The insights were written down on sticky notes and placed on the whiteboard.
  • Each team then presented its insights to the other, and we briefly discussed the questions and peculiarities that came up.


Decide what to do (30 minutes)

  • The next step was defining concrete improvements, agreements and actions. Dot-voting on all the specific insights would have been too much, so we decided to dot-vote on the eleven topics instead.
  • ‘Health of the codebase’ and ‘pawns or players’ turned out to be the most urgent topics to discuss in more detail.
  • Again the group was divided into two sub-teams; this time people could join the team whose topic they found most interesting.
  • Each of the teams discussed the topic in depth with each other and defined concrete improvements and agreements.
  • After 20 minutes the results were shared and presented. An important step here was to rewrite the results into actionable improvements and clear agreements.

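The dot-voting step is simple enough to sketch as well. The topics and dot counts below are hypothetical (only the two winning topics come from the session above):

```python
# Hypothetical dot counts per topic after everyone has placed their dots.
dots = {
    "Health of codebase": 9,
    "Pawns or players": 7,
    "Easy to release": 4,
    "Fun": 2,
}

# The two topics with the most dots become the deep-dive subjects.
top_two = sorted(dots, key=dots.get, reverse=True)[:2]
print(top_two)
```

Voting on the eleven topics rather than on every individual insight keeps the choice manageable for a group of twelve.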

Close the retrospective (15 minutes)

  • To close the retrospective, we discussed how we could ensure the improvements and agreements would actually be realised. Defining ‘topic owners’ and planning a follow-up session were some of the ideas for this.
  • The ROTI (Return On Time Invested) check was the last part of the retrospective. The result was positive, so this format proved valuable enough to use more often!



I consider the Squad Health Check Model a very fun and valuable approach for assessing a team’s ‘health’. Especially the opportunity to add or remove topics ensures it can be made suitable for all kinds of teams. Using this approach periodically will give the team great insight into its current state and the areas in which it can improve.

I hope this blog post inspires you to give this workshop a try yourself. If so, I would really like to hear about your experiences! And of course, I’m always open to helping you out by facilitating this workshop.

8 thoughts to “How I Used the Spotify Squad Health Check Model”

  1. Thanks a lot for this post! The ‘generate insights’ & ‘decide what to do’ parts are what the Spotify pdf is missing! They seem obvious, but can easily be forgotten. Cheers!

  2. Thanks a lot for your post.
    May I translate this post and publish it on my personal blog? It is non-commercial and non-organizational 🙂
    It would be much appreciated if you would allow me to do so.
    Regards 🙂

  3. Barry, thank you very much for this detailed description!
    From what I understood, Spotify uses this also to see the trend of changes over time. My question is, whether you plan to repeat this session with the same team? If so, how do you plan to visualize the results so that the trend over months is clear? I mean, to compare the current state with the previous one you need to simplify the results (e.g. select mean or average) so that the comparison is easy to grasp. Any thoughts on this?

  4. How do you prevent, or handle, every member voting ‘yellow’ ?
    It’s never ‘perfect’, there are always some problems visible. Can you steer people away from voting for the safe middle ground? Would you want to?
