@dduarte none that I am aware of. When I raised this with Qualtrics previously, the answer was that it largely depends on the quality of your sample/audience, and their recommendation for improving our response rate was to be more targeted in who gets surveyed (send fewer invites).
If you're seeing a substantial difference between programs, it would be good to consider what differs between them. For example, across our programs we see response rates between 3% and 30%, depending on the program.
A/B testing changes within the same survey can be one way to identify opportunities to improve the response rate for that survey's objective. By the sounds of it, this is what you are likely doing anyway.
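If it helps, one quick way to read an A/B result like that is a two-proportion z-test on the two response rates. Below is a minimal, self-contained sketch in Python (standard library only); the function name and the invite/response counts are made up for illustration, not taken from anyone's real data.

```python
import math

def two_prop_ztest(responses_a, invites_a, responses_b, invites_b):
    """Two-proportion z-test: is variant B's response rate different from A's?"""
    p_a = responses_a / invites_a
    p_b = responses_b / invites_b
    # Pooled proportion under the null hypothesis of equal rates
    pooled = (responses_a + responses_b) / (invites_a + invites_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / invites_a + 1 / invites_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Illustrative numbers: 10,000 invites per variant, 800 vs 880 responses
p_a, p_b, z, p = two_prop_ztest(800, 10_000, 880, 10_000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z={z:.2f}  p={p:.4f}")
```

With small invite volumes the test has little power, so a seemingly large lift can still be noise; worth checking before rolling a change out.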
Hi @dduarte,
I can only support what @ScottG said: response rates depend heavily on the type of feedback programme you are running and on the audience.
In our programmes I have response rates in the range of 0.1% to 25%, and I would consider all of them "good".
I see basically two different ways of improving response rates:
- Improve the invitation → More respondents starting the survey
- Improve the questionnaire → Fewer respondents drop out in the middle of the survey
A shorter questionnaire pays off on both points.
You could also post some information about your survey and your response rate here. Some people here have quite some experience with response rates and can judge whether your numbers are high or low. This qualitative judgement will most probably be way better than an average statistic from Qualtrics.
Hi @dduarte
We recently had a similar question about our response rates but were unable to benchmark with any great certainty for our industry.
We decided to try to improve them nonetheless and made the following changes:
- Redrafted the 'from name' and subject line, making the language less formal and more inviting. Interestingly, we did not see any change (+/-) in open rates.
- We piped our first survey question into the email invite itself. This really worked and significantly drove up our 'start rate' for the survey (the sketch after this list shows the general idea).
- Generally, restructured and improved the survey flow and removed excess/redundant questions. We positioned our key research metrics to the top of the survey.
- We played around with whether the distribution time of the survey invite would have an impact, but we found it didn’t.
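On the piped-question point: one common way this works (a generic sketch, not Qualtrics' actual invite markup, which is configured through their invite editor) is to render each answer option in the email as its own survey link, so clicking an option both starts the survey and records the first answer. The URL and parameter names below are hypothetical.

```python
from urllib.parse import urlencode

# Hypothetical survey URL and query parameters, for illustration only
SURVEY_URL = "https://example.com/survey/abc123"
OPTIONS = {1: "Very satisfied", 2: "Satisfied", 3: "Neutral",
           4: "Dissatisfied", 5: "Very dissatisfied"}

def invite_links(recipient_id: str) -> dict[int, str]:
    """Build one survey link per answer option, so that clicking an
    option in the email both opens the survey and pre-fills Q1."""
    return {
        value: f"{SURVEY_URL}?{urlencode({'rid': recipient_id, 'q1': value})}"
        for value in OPTIONS
    }

for value, url in invite_links("r-0042").items():
    print(f"{OPTIONS[value]}: {url}")
```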
We performed an A/B test and overall we saw an 8% increase in response rates.
Hope this helps,
Harry