Survey Audit
We all know that looooooong surveys make it difficult to get adequate responses. I work for a university, and our system office requires us to send out several surveys per year, each with a long list of mandated questions. The surveys take about 30 minutes to complete and response rates are dismal. However, this year they changed the policy to allow us to select our own questions. My plan is to conduct a survey audit over the course of the next month. I have gone through each survey, determined which departments on campus would use each question, and will send the lists of questions out for departments to review. They have to indicate whether or not they use each question for assessment and, if so, tell us how. This way, we can exclude the questions that are not useful.
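
(If it's useful to anyone, here's a minimal sketch of how that tabulation could be scripted in Python once the department reviews come back. The CSV file name and column names are assumptions, not anything official — swap in whatever your review form actually exports.)

```python
import csv
from collections import defaultdict

# Hypothetical export from the department review form:
# columns question_id, department, uses_for_assessment (yes/no), how_used
AUDIT_FILE = "audit_responses.csv"

used_by = defaultdict(list)   # question_id -> departments that use it
all_questions = set()

with open(AUDIT_FILE, newline="") as f:
    for row in csv.DictReader(f):
        qid = row["question_id"]
        all_questions.add(qid)
        if row["uses_for_assessment"].strip().lower() == "yes":
            used_by[qid].append(row["department"])

# Questions no department claimed are candidates for the cut list.
cut_list = sorted(all_questions - set(used_by))

print(f"{len(cut_list)} of {len(all_questions)} questions unused by any department:")
for qid in cut_list:
    print(f"  {qid}")
```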

Does anyone else regularly audit their surveys and/or survey questions? If so, how do you go about doing that? Is the process useful?

I will post a follow-up on this once the audit is complete...
I find that survey audits are really useful. Obviously if you're doing a tracking study, that's not the best time. But for most other surveys, I think they can really help to streamline the experience. I tend to look at each question and decide what the answer to that question is intended to inform. Then I look to see if there are redundancies.
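
(That redundancy pass can be as simple as mapping each question to the decision it's supposed to inform and flagging any decision covered more than once. A rough sketch — the questions and decision labels below are made up for illustration:)

```python
from collections import defaultdict

# Map each question to the decision its answer is intended to inform.
# The questions and decision labels here are made-up examples.
question_informs = {
    "How satisfied are you with advising?": "advising staffing",
    "How easy was it to schedule an advising appointment?": "advising staffing",
    "How often do you use the library?": "library hours",
}

# Group questions by the decision they inform; any decision covered
# by more than one question is a candidate for consolidation.
by_decision = defaultdict(list)
for question, decision in question_informs.items():
    by_decision[decision].append(question)

for decision, questions in by_decision.items():
    if len(questions) > 1:
        print(f"Possible redundancy around '{decision}':")
        for q in questions:
            print(f"  - {q}")
```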



I also sometimes ask someone who hasn't taken the survey to take it while I watch them. That way, I'm doing a bit of usability testing on the survey and seeing what works and what doesn't. This may then help clarify whether some options or language need to change.
This may not be practical for you, but oftentimes I am lucky enough to be in meetings with the staff and faculty who have their long lists of "wants" that they consider "needs." I have become a bit of a nightmare for people by always pushing the questions: what will you use that data for? What makes it more than nice to know? What action can you take based on that question? If they can't answer, they are told it goes on the cut list if the survey starts going over 10 minutes, and they are asked to prioritize those nice-to-know questions. I think it has been really helpful for building a culture of asking why we are asking these questions and what we can do with the data. It's been helpful in reducing the stress on respondents. It doesn't work with everyone, but I am starting to see a shift toward more compassion for what we are putting our learners through when it comes to evaluation.
@bstrahin I love this idea! Will certainly incorporate it into some of our assessment meetings 🙂
> @bstrahin said:

> This may not be practical for you, but oftentimes I am lucky enough to be in meetings with the staff and faculty who have their long lists of "wants" that they consider "needs." [...]



This! Exactly this!
And this is a side point, but still an important one: the ordering of the questions can make or break it! I was taught to keep demographic information (things that people are often sensitive about) at the _very_ end. For example, if you start a survey by asking a person about their income, it might turn them off.



Also, make sure you have an indicator of percent complete. It may be discouraging to some towards the beginning, but it helps them know that they are, indeed, making progress.
To piggyback on Michael's point, a few other best practices for long surveys are:

1) Use encouraging segmentation language as the survey progresses. (Ex. "Thanks for your input! We are almost done, but we will now ask you a couple of questions about XYZ.")

2) Lead with low-effort questions to ease the survey taker in. This may seem counterintuitive, but putting in a few easy "yes" questions can increase the completion rate, even though it makes the survey longer. (Ex. "We have a few questions to verify who you are. Is your name BLAH BLAH? Are you a student/customer of BLAH BLAH?") - The theory behind this is that time already invested compounds with the perceived effort of the task (i.e., "I already spent 8 minutes on this survey, but I have completed 77 questions! I can do 23 more because the previous questions tell me the rest will be easy/fast.")
