Hi, is there a way to export audit events in batch?
Currently this can only be done per userId and eventName: https://api.qualtrics.com/fde4005e2fc53-create-a-new-export-job
The same applies to logs, which can also only be listed per userId and activityType: https://api.qualtrics.com/821e93ec17a62-list-events
Scalability
This approach does not really scale, as one needs to:
- trigger an export job per userId and eventName: POST https://yul1.qualtrics.com/API/v3/audit-exports
- check the export status per exportId: GET /audit-exports/{exportId}
- finally retrieve the audit logs in NDJSON format: GET /audit-exports/{exportId}/files/{fileId}
- ...and repeat all of this per userId and eventName (see the sketch below)
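Put together, the loop looks roughly like this. A minimal Python sketch, assuming the yul1 datacenter base URL from above; the request/response field names (`exportId`, `status`, `fileId`) are illustrative and may not match the real payloads exactly:

```python
# Per-userId / per-eventName export loop (illustrative field names).
import time
import requests

BASE = "https://yul1.qualtrics.com/API/v3"
HEADERS = {"X-API-TOKEN": "..."}  # token of a technical user

def export_audit_events(user_id: str, event_name: str) -> bytes:
    # 1) trigger the export job for one userId / eventName combination
    job = requests.post(
        f"{BASE}/audit-exports",
        headers=HEADERS,
        json={"userId": user_id, "eventName": event_name},
    ).json()["result"]
    export_id = job["exportId"]

    # 2) poll the job status until it reports COMPLETED
    while True:
        status = requests.get(
            f"{BASE}/audit-exports/{export_id}", headers=HEADERS
        ).json()["result"]
        if status["status"] == "COMPLETED":
            break
        time.sleep(5)

    # 3) download the NDJSON file produced by the job
    return requests.get(
        f"{BASE}/audit-exports/{export_id}/files/{status['fileId']}",
        headers=HEADERS,
    ).content

# ...and all of this repeated for every userId x eventName pair:
# for user_id in user_ids:
#     for event_name in event_names:
#         ndjson = export_audit_events(user_id, event_name)
```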
Also, due to the current Qualtrics HTTP rate limits, this process can take a very long time.
Currently, for 1 userId with 66 eventNames it gives:
- 66 POSTs = ~3 minutes
- 66 GETs = ~3 minutes (assuming all exports are already COMPLETED, which is the best-case scenario and rarely what you get on PROD, I would say)
- 66 GETs = ~3 minutes
In general, let's say 10 minutes per userId.
That is 8.33 h for 50 userIds... 16.66 h for 100 userIds... 33.33 h for 200 userIds → it gets tricky if you want to export audit events daily, because the whole process takes more than a day.
And everything could be scaled horizontally... but that is impossible because the limits apply per brand.
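For reference, those numbers are just the 30-requests-per-minute limit applied to 3 calls per eventName; a quick back-of-the-envelope (the 10-minutes-per-user figure is my own assumption from above, nothing official):

```python
# Back-of-the-envelope for the estimate above.
REQUESTS_PER_MINUTE = 30
CALLS_PER_COMBINATION = 3          # POST + status GET + file GET
EVENT_NAMES = 66

minutes_per_user = CALLS_PER_COMBINATION * EVENT_NAMES / REQUESTS_PER_MINUTE
# ≈ 6.6 min in the ideal case; ~10 min once polling overhead and
# not-yet-COMPLETED exports are factored in.

for users in (50, 100, 200):
    print(users, "userIds ->", round(users * 10 / 60, 2), "hours")  # 8.33 / 16.67 / 33.33
```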
Other Questions
- If I have two (or N) X-API-TOKENs for one brand, for two (or N) technical users... do they share the Qualtrics limits or not?
- As for a back-off policy: how long should I wait before retrying if I get an HTTP 429 (TOO_MANY_REQUESTS) response? (see the back-off sketch below)
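For now, in the absence of documented guidance, I honour a Retry-After header if one is sent and otherwise fall back to exponential backoff with jitter; this is a sketch of that assumption, not documented Qualtrics behaviour:

```python
# Retry policy for HTTP 429 responses (assumed behaviour, not documented).
import random
import time
import requests

def get_with_backoff(url: str, headers: dict, max_retries: int = 5) -> requests.Response:
    for attempt in range(max_retries):
        resp = requests.get(url, headers=headers)
        if resp.status_code != 429:
            return resp
        # Prefer the server's hint if it sends one (assumed to be seconds),
        # otherwise back off exponentially: 2s, 4s, 8s, ... plus jitter.
        retry_after = resp.headers.get("Retry-After")
        delay = float(retry_after) if retry_after else float(2 ** (attempt + 1))
        time.sleep(delay + random.uniform(0, 1))
    raise RuntimeError(f"Still rate-limited after {max_retries} retries: {url}")
```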
Current limits:
https://api.qualtrics.com/a5e9a1a304902-limits
| Method | Endpoint | Limit |
| --- | --- | --- |
| POST | /audit-exports | 30 requests per minute |
| GET | /audit-exports/{exportId} | 30 requests per minute |
| GET | /audit-exports/{exportId}/files/{fileId} | 30 requests per minute |
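Given these limits, the only workaround I see today is client-side throttling so that each endpoint gets at most 30 calls per minute (i.e. one call every 2 seconds); a minimal sketch, assuming the limits are counted per endpoint as the table suggests:

```python
# Client-side throttle that spaces calls to stay under 30 requests/minute.
import time

class Throttle:
    def __init__(self, requests_per_minute: int = 30):
        self.interval = 60.0 / requests_per_minute
        self.last_call = 0.0

    def wait(self) -> None:
        # Sleep just long enough so consecutive calls are `interval` apart.
        now = time.monotonic()
        sleep_for = self.last_call + self.interval - now
        if sleep_for > 0:
            time.sleep(sleep_for)
        self.last_call = time.monotonic()

# One Throttle instance per endpoint, since each one is listed with its own limit:
# post_throttle.wait()
# requests.post(f"{BASE}/audit-exports", headers=HEADERS, json=payload)
```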