Survey Audit

VirginiaM
edited June 23 in Survey Platform

We all know that looooooong surveys make it difficult to get adequate responses. I work for a university, and our system office requires us to send out several surveys per year, each with a long list of mandated questions. The surveys take about 30 minutes to complete, and response rates are dismal. However, this year they changed the policy to allow us to select our own questions. My plan is to conduct a survey audit over the course of the next month. I have gone through each survey, determined which departments on campus would use each question, and will send the lists of questions out for departments to review. They have to indicate whether or not they use each question for assessment and, if so, tell us how. This way, we can exclude the questions that are not useful.
Does anyone else regularly audit their surveys and/or survey questions? If so, how do you go about doing that? Is the process useful?
I will post a follow-up on this once the audit is complete...
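The tallying step in the audit described above could be sketched in a few lines of Python. Everything here is hypothetical: the question IDs, department names, and review responses are made up for illustration.

```python
# Illustrative sketch of a survey-audit tally: each department reviews each
# question and reports whether it uses that question for assessment.
# Questions no department uses become candidates for the cut list.
# All question IDs and department names below are hypothetical.
from collections import defaultdict

# Each tuple: (question_id, department, uses_for_assessment)
reviews = [
    ("Q1", "Admissions", True),
    ("Q1", "Housing", False),
    ("Q2", "Admissions", False),
    ("Q2", "Housing", False),
    ("Q3", "Advising", True),
]

# Map each question to the set of departments that actually use it.
used_by = defaultdict(set)
for question, dept, uses in reviews:
    if uses:
        used_by[question].add(dept)

all_questions = {q for q, _, _ in reviews}
cut_list = sorted(all_questions - set(used_by))
print("Candidates for removal:", cut_list)  # prints: Candidates for removal: ['Q2']
```

In practice the review data would come from a spreadsheet or survey export rather than a hard-coded list, but the keep/cut logic stays the same.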

Comments

  • KatC

    I find that survey audits are really useful. Obviously, if you're doing a tracking study, that's not the best time. But for most other surveys, I think they can really help to streamline the experience. I tend to look at each question and decide what the answer to that question is intended to inform. Then I look to see if there are redundancies.

    I also sometimes ask someone who hasn't taken that survey to take it while I watch them. In this way, I'm doing a bit of "usability" on that survey and seeing what works/doesn't work. This may then help clarify if some options or language need to change.

  • bstrahin

    This may not be practical for you, but oftentimes I am lucky enough to be in meetings with the staff and faculty who have their long lists of "wants" that they consider "needs." I have become a bit of a nightmare for people by always pushing the questions: what will you use that data for? What makes it more than nice to know? What action can you take based on that question? If they can't answer, they are told it is on the cut list if they start going over 10 minutes for a survey, and they are asked to prioritize these nice-to-know questions. I think it has been really helpful in building a culture of asking why we are asking these questions and what we can do with the data. It's been helpful in reducing the stress on respondents. It doesn't work with everyone, but I am starting to see a shift toward more compassion for what we are putting our learners through when it comes to evaluation.

  • VirginiaM

    @bstrahin I love this idea! Will certainly incorporate it into some of our assessment meetings :)

  • Akdashboard

    @bstrahin said:
    I have become a bit of a nightmare for people by always pushing the question - what will you use that data for? What makes it more than nice to know? What action can you take off of that question? [...]

    This! Exactly this!

  • Michael_Campbell_RedPepper

    This is a side point, but still an important one: the ordering of the questions can make or break it!! I was taught to keep demographic information (things that people are often sensitive about) at the very end. For example, if you start a survey by asking a person about their income, it might turn them off.

    Also, make sure you have an indicator of percent complete. It may be discouraging to some towards the beginning, but it helps them know that they are, indeed, making progress.

  • Akdashboard
    edited March 2018

    To piggyback off of Michael, a few other best practices for long surveys are:
    1) Use encouraging segmentation language as the survey progresses. (Ex. "Thanks for your input! We are almost done, but now we will ask you a couple of questions about XYZ.")
    2) Lead with low-effort questions to ease the survey taker in. This may seem counterintuitive, but putting in a few easy "yes" questions can increase the completion rate, even though it makes the survey longer. (Ex. "We have a few questions to verify who you are. Is your name BLAH BLAH? Are you a student/customer of BLAH BLAH?") The principle behind this is time investment as it compounds with the level of effort of the given task (i.e., "I already spent 8 minutes on this survey, but I have completed 77 questions! I can do 23 more, because the history of the previous questions tells me the rest will be easy/fast.")