Spend time fully understanding your feedback

Some people love data and will want to spend weeks dissecting and digesting every possible number and comparison available. However, this probably isn't the most useful approach to improving your culture. You can actually get a long way by taking a very simple approach to your results and saving more time for discussion and for generating ideas for action. With that in mind, here is the flow you can follow when making sense of your results.

Participation

Question: Do the results represent the views of most people?

The first thing to understand is participation. If we use the analogy of voting, participation lets you know how many people within your team took the time to cast their vote. We start here because it gives us some context with which to view the rest of the results. On the left side of your reports, you should see participation information like the example below.

Participation_Example.png

In this example, the manager leads a group of 9 employees and has achieved an 89% response rate. This is a good response rate and exceeds our global average of around 80%. For most purposes you can relax once you hit around 75%. For a more detailed guide, take a look at our post on what makes a good response rate. Try not to get too fixated on response rates: the people who responded still want their voices to be heard, and you can always remind everyone that you can only act on the feedback you receive. Once people see you acting on the feedback, you'll often find your response rate goes up over time. Remember, you have 100% of the people who took the time to respond, so the most important thing is to consider, discuss and act on their feedback.
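If it helps to see the arithmetic behind that number, here is a minimal sketch (the response count is our assumption for illustration; 8 of 9 people responding rounds to the 89% shown above):

    # Response rate is simply responses received divided by people invited.
    invited = 9     # people in the group, from the example above
    responded = 8   # assumed count; 8 of 9 rounds to 89%
    response_rate = responded / invited
    print(f"Response rate: {response_rate:.0%}")  # -> 89%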

If you want to dig a little further into participation, you can click on the participation tab in the top left. From this view, you can see participation based on demographics. Which demographics you can see depends on how your report was configured. This page allows you to determine whether the majority of responses are coming from a single demographic. For example, if your team is split across two (or more) locations and only one location participated in a meaningful way (approaching that desirable 75%), this is something to keep in mind while reviewing the results.

Engagement Score (or other Key Factor)

Question: What is the outcome we’re driving towards and how are we doing?

We often use the average score from a few key questions to create a combined outcome factor such as Engagement, as shown above. This factor represents the key outcome your organization wants to improve. Typically, this is Engagement, and we invite you to read more about it here. The scores themselves are percent favorable scores: they represent the percentage of people who rated the questions using either of the top two rating options (typically Agree and Strongly Agree). You can read more about our response formats and how we calculate favorability here.
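For a concrete feel for that calculation, here is a minimal sketch of percent favorable for a single question (the responses are made up for illustration):

    # Percent favorable: the share of respondents choosing the top two options.
    responses = ["Strongly Agree", "Agree", "Neutral", "Agree",
                 "Strongly Agree", "Disagree", "Agree", "Agree"]  # made-up answers
    favorable = sum(r in ("Agree", "Strongly Agree") for r in responses)
    favorability = favorable / len(responses)
    print(f"Percent favorable: {favorability:.0%}")  # 6 of 8 -> 75%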

Comparisons

Question: How are we tracking compared to others?

You’ll notice in the example above that, to the right of the Engagement factor, there is a green line with +12. This shows that the 75% Engagement factor score is 12 percentage points higher than the comparison, so the comparison Engagement score is 63%. To check what the results are being compared to, look at the “Compared To” toggle in the top right.
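If you like to see that difference written out, here is a minimal sketch (the numbers come straight from the example above):

    # The +12 badge is the gap, in percentage points, between your score
    # and whatever you are comparing against.
    team_engagement = 75              # your Engagement score (percent favorable)
    difference = 12                   # the "+12" shown next to the factor
    comparison_score = team_engagement - difference
    print(f"Comparison Engagement score: {comparison_score}%")  # -> 63%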

Insight__.png

Currently, the results are being compared to the Company Overall. You might also have other comparison options, such as historical surveys or a benchmark score. If you’ve gone through the survey process before, comparing to previous results will be most enlightening, as it shows whether your (or your organization's) actions have made an impact on previous focus areas. If this is your first time, comparing to the Company Overall will help you understand where your team is having a different experience (positive or negative) from others in the organization.