Hello - We are long-time users of the Discover platform. We started by loading surveys (hosted on non-Qualtrics platforms) and later added call interactions. We run operational use cases in Qualtrics Discover, such as monitoring views and closed-loop follow-up. Today we manually export classification data for analysis outside Qualtrics so we can stitch call segments together (transfers) and append additional customer, policy, and interaction details. Longer term, we envision automating analysis once the data lands in our Snowflake environment.
We have struggled to use the Export API with AWS to copy records from Qualtrics to our analytics environment. The main challenge is the volume of data combined with AWS payload restrictions, to the point that it is blocking our IT team from scaling the service.
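To illustrate the kind of workaround we have been attempting, here is a rough sketch of batching exported records so each chunk stays under a fixed payload size before it is pushed to AWS. The size cap, record shape, and function name are all illustrative, not the actual Discover export schema:

```python
import json

# Illustrative cap only; real AWS limits depend on the service in the path
# (e.g., Lambda or API Gateway payload ceilings).
MAX_BYTES = 5 * 1024 * 1024

def chunk_records(records, max_bytes=MAX_BYTES):
    """Yield lists of records whose combined serialized size stays under max_bytes."""
    batch, size = [], 0
    for rec in records:
        rec_size = len(json.dumps(rec).encode("utf-8")) + 1  # +1 for a separator
        if batch and size + rec_size > max_bytes:
            yield batch
            batch, size = [], 0
        batch.append(rec)
        size += rec_size
    if batch:
        yield batch

# Example: small records with a tiny cap split into multiple batches.
demo = [{"id": i, "text": "sentence"} for i in range(10)]
batches = list(chunk_records(demo, max_bytes=120))
```

This keeps each request within limits, but at our volumes the number of round trips is exactly what doesn't scale - hence the questions below.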
I’m curious how other clients have implemented similar processes.
- Are you having Qualtrics batch export (e.g., scheduled file delivery to AWS S3 or another cloud storage location) as a staging point for Snowflake?
- Are you using a direct integration or a native Snowflake connector for exporting data?
- If not, are there any capabilities—given that Qualtrics uses Snowflake in its own infrastructure—that could facilitate a more efficient export pattern (e.g., Snowflake data sharing or similar mechanisms)?
- If neither of the above is currently available, what alternative patterns (e.g., S3 delivery, ETL partner solutions) do you recommend for scalable exports to Snowflake?
We’ve hit critical mass with 1 billion sentences in our Discover instance. I’m eager to understand how others have managed an initial full-history copy and then established a regular cadence for incremental loading. A third aspect is how you are managing updates to classification models for records already present in your Snowflake environment.
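To make the incremental-plus-reclassification question concrete, here is a minimal sketch of the pattern I have in mind: keep a high-water mark of the last loaded update timestamp, pull only newer records, and upsert by record id so reclassified records overwrite their earlier copies (the equivalent of a MERGE in Snowflake). The field names (`id`, `updated_at`, `classification`) are assumptions, not the actual export schema:

```python
def incremental_upsert(warehouse: dict, records, watermark):
    """Apply records newer than `watermark`; return the advanced watermark.

    `warehouse` stands in for the Snowflake target table, keyed by record id,
    so an updated classification replaces the prior row rather than duplicating it.
    """
    new_mark = watermark
    for rec in records:
        ts = rec["updated_at"]
        if ts <= watermark:
            continue  # already loaded in a previous run
        warehouse[rec["id"]] = rec  # upsert: newer classification wins
        new_mark = max(new_mark, ts)
    return new_mark
```

The open question for me is whether model updates in Discover actually bump a per-record timestamp we can watermark against, or whether reclassified history has to be re-exported wholesale.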
Kind regards, Justin Meyers