Distinguish between "no response" and "not applicable" on slider scale? | XM Community
Solved

  • 29 July 2020
  • 4 replies
  • 772 views

Hello,
I hope I'm not posting something that's already been answered - I've looked through previous posts to check.

I have sliding scales that also give respondents the option of selecting the not-applicable box. I now realize that Qualtrics automatically excludes not-applicable selections and records them as blanks in the data. However, a not-applicable selection and a skipped question are distinct, and that distinction is exactly what I'm interested in. Is there a way to capture from the responses whether someone selected NA or truly skipped the question, either automatically or with a bit of script? Or will I need to manually look at every response and sliding scale to see whether they checked the box or skipped the question?

Thanks in advance!


Best answer by ChristianB 23 September 2020, 19:59


4 replies


Hi cjenkins!
Not-applicable responses are assigned the "Exclude from Analysis" setting by default. You can change this setting, however, so that the NA response does get recorded. I'd recommend following the linked support page for a step-by-step guide.

Hello,
I don't believe this response actually answers the question. The hyperlinked page only covers "Exclude from Analysis" for non-slider question types.
Any help with differentiating non-responses from respondents who selected the checkbox in the recorded data for slider questions would be very much appreciated!
Clarity on the implications of forcing a response when the checkbox is used for a meaningful value rather than NA would also be very helpful. For example, I use the slider as a scale of 1-50 and label the checkbox "more than 50."
Thank you!


Hi,
I am having the same issue as the original poster, cjenkins, and wondered if anyone had a solution. I don't think the response above works for my situation and I can't find an answer elsewhere in the Qualtrics Community. In short, I need to be able to distinguish 'not applicable' from 'no response'.
Of note, I have already collected my data so cannot change the way the questions were formatted.
I have a slider where participants rate the importance of a variable from 0-100, or they could check a 'Not applicable' box (which I renamed 'Unable to respond'). If they opted not to move the slider, this was taken to mean 0 (i.e. the variable is of 0 importance out of 100).
The study team realised we needed to distinguish someone intentionally not moving the slider (i.e. leaving it on 0 to say the variable isn't important) from people who just skipped the question. As such, at the end of each question we had a checkbox saying: "When we analyse the survey results, we will need to be able to distinguish between missing data or 'unable to respond' and risk factors that have been marked as 0/100 in terms of importance. All of these are indicated by a ‘0’. To support our interpretation of the results, please select 'Yes' to confirm that you understand that if you mark a risk factor as zero on importance, it will be interpreted as 'Not important' unless you have checked 'Unable to respond'."
However, when I download the data, I see that 'Not Applicable/Unable to Respond' is treated as a blank cell, as is any slider that has not been moved from the default of 0. I need to be able to distinguish between people rating something as 0 importance and those who selected 'Unable to Respond'. Is there a way to download the 'Unable to Respond' results so I can tell these apart? (NB: I have renamed it to this, but it would be called 'Not Applicable' in Qualtrics.)
Thanks


I ended up contacting the Qualtrics support team at my university and got useful support for this, which I hope can help others. In my case, because I'd already run the survey and couldn't change the settings, I had to use a workaround. (Ideally, you would set the survey up from the beginning to do what you want.)

Both methods are below:

  • Workaround (i.e. when data have already been collected): View the raw responses from the Data & Analysis tab in Qualtrics (which does show when the N/A box is checked). I then had to manually clean my downloaded data file, distinguishing the two types of blanks (N/As and sliders not moved from 0); in my case I wrote 'UTR' for Unable to Respond and '0' where the slider was left at 0 on purpose (see the cleaning sketch after the quoted answer below).
  • Setting up the survey correctly (before data collection - Answer courtesy of support staff at Qualtrics):

“Going forward, I thought of a workaround for you that you can use on surveys utilizing slider questions along with the N/A feature.

- When setting up the slider question, click 'custom start position' and move the slider for each category to 50. I suggest 50 as respondents are more likely to move left or right from there (as compared to setting it to 1, for example). 

- Toggle on 'add requirements' and add 'force response'. This will prevent respondents from advancing without moving the slider or checking 'not applicable'. You can also add "please move slider or click N/A for each category" in the question header. 

- Once you have all of your responses and export the data file, click 'more options' at the bottom of the menu. Check the box to 'recode seen but unanswered questions as -99'. Each -99 response will be your N/A responses, rather than a blank cell, which will make them easier for you to filter.”
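
For anyone who would rather script this cleanup than edit the exported file by hand, here is a minimal sketch (my own, not from Qualtrics support) of how the downloaded CSV could be post-processed in Python with pandas. It assumes the 'recode seen but unanswered questions as -99' option described above was ticked at export, that the file has the two extra header rows Qualtrics usually includes, and it uses made-up column names (Q1_importance, Q2_importance) that you would swap for your own:

    # Sketch only: separate -99 (the N/A / 'Unable to respond' checkbox)
    # from true 0-100 answers and from genuinely skipped questions.
    # Column names and file names are hypothetical.
    import pandas as pd

    # Standard Qualtrics CSV exports carry two extra header rows
    # (question text and ImportId) below the column names; skip them.
    df = pd.read_csv("survey_export.csv", skiprows=[1, 2])

    slider_cols = ["Q1_importance", "Q2_importance"]  # replace with your slider columns

    for col in slider_cols:
        values = pd.to_numeric(df[col], errors="coerce")
        df[col + "_clean"] = values.where(values != -99)   # keep 0-100 answers, blank out -99
        df[col + "_status"] = "answered"
        df.loc[values == -99, col + "_status"] = "unable_to_respond"  # N/A box checked
        df.loc[values.isna(), col + "_status"] = "skipped"            # never answered

    df.to_csv("survey_export_clean.csv", index=False)

Note that this separation is only unambiguous when force response was used as described above; with data like mine (slider defaulting to 0, no forced response), a -99 or blank can still mean either an N/A or an untouched slider, so the manual check against the raw responses in Data & Analysis is still needed for those rows.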
