
Building ETL Workflows



About Building ETL Workflows

The Qualtrics Workflows platform contains a series of tasks for importing data from third-party sources into Qualtrics and exporting data from Qualtrics to third-party destinations. These tasks follow the Extract, Transform, Load (ETL) framework, letting you build automated, scheduled workflows that move data into and out of Qualtrics.

To create an ETL workflow, you must add 1 or more extractor tasks and 1 or more loader tasks. The only constraint is the overall limit on the number of tasks in 1 workflow.

Qtip: ETL tasks can take up to 24 hours to execute. If a task doesn’t succeed within 24 hours, it will fail.
Attention: An ETL workflow can process multiple files as part of one workflow. Each file may be up to 5GB in size, but the total size of all files combined must be less than or equal to 10GB. Make sure to check the documentation for the tasks you’re using as some tasks have a smaller file size limit.
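The limits above can be checked before a run. Below is a minimal sketch of such a pre-flight check; the 5 GB per-file and 10 GB combined ceilings come from this page, but confirm each task's own (possibly smaller) limit in its documentation. The function and its inputs are illustrative, not part of the Qualtrics platform:

```python
# Platform-wide ceilings from the documentation; individual tasks may
# enforce smaller limits, so treat these as an upper bound only.
MAX_FILE_BYTES = 5 * 1024**3    # 5 GB per file
MAX_TOTAL_BYTES = 10 * 1024**3  # 10 GB for all files combined

def check_etl_file_sizes(sizes):
    """sizes: mapping of file name -> size in bytes.

    Returns a list of limit violations; an empty list means the batch fits.
    """
    problems = []
    for name, size in sizes.items():
        if size > MAX_FILE_BYTES:
            problems.append(f"{name}: {size} bytes exceeds the 5 GB per-file limit")
    total = sum(sizes.values())
    if total > MAX_TOTAL_BYTES:
        problems.append(f"combined {total} bytes exceeds the 10 GB total limit")
    return problems
```

Running this against your source files before the workflow fires can save a failed 24-hour run.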

General Setup for Extractor and Loader Tasks

Qtip: For a general guide on using workflows, see Workflows basic overview.
  1. From the stand-alone Workflows page or the Workflows tab in a project, click Create a workflow.
  2. Select Extract, transform, and load (ETL) data.
  3. ETL workflows usually run on a recurring schedule. Choose a schedule for your workflow. See Scheduled Workflows for more information about setting a workflow’s schedule.
  4. Click Save.
  5. Click Data source (ETL extractor) to choose an extractor task to use. See Available Extractor Tasks for a list of tasks you can use. You can add multiple extractors in 1 workflow.
  6. If you’d like to transform data before you load it, click Data transformation. This step is optional. See Basic Transform Task for more details.
  7. Click Add a data destination to choose a loader task to use. See Available Loader Tasks for a list of tasks you can use. You can add multiple loaders in 1 workflow.
  8. This step is optional, but recommended: go to Settings to set up workflow notifications so you’re alerted if your workflow ever fails.
  9. Don’t forget to turn your workflow on.
Qtip: After setting up your workflow, click Run Immediately (in the top-right, next to the workflow toggle) to test your ETL workflow and ensure it’s working.
Qtip: You can add additional tasks to your ETL workflows. For example, add a web service task to post to a webhook.
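If you do add a web service task that posts to a webhook, the receiving end can be very small. Here is a hedged sketch of a receiver using only Python's standard library; the payload field names (`workflowName`, `status`) are hypothetical, since the actual body depends entirely on what you configure in the web service task:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def summarize_event(payload: dict) -> str:
    # Field names are illustrative, not a Qualtrics schema; they match
    # whatever JSON body you configure the web service task to send.
    return f"{payload.get('workflowName', '?')}: {payload.get('status', '?')}"

class WebhookHandler(BaseHTTPRequestHandler):
    """Accepts the POST sent by a web service task and acknowledges it."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        print(summarize_event(payload))
        self.send_response(200)  # a 2xx tells the sender delivery succeeded
        self.end_headers()
        self.wfile.write(b"ok")

    def log_message(self, fmt, *args):
        pass  # suppress per-request logging

def serve(port: int = 8000) -> HTTPServer:
    """Bind the receiver; call .serve_forever() on the result to block."""
    return HTTPServer(("127.0.0.1", port), WebhookHandler)
```

For production use you would put this behind HTTPS and validate the sender, but the shape of the handler stays the same.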

Reloading Data

When there’s been a configuration change between ETL tasks that depend on each other, a button will appear to reload all data with the new configuration. Click Reload the data if you’d like to reprocess the old data.

Attention: This will reload all data from the task that was edited. Normally, ETL workflows only process new data since the last run. However, if this option is selected, then the next time the workflow runs, all data since the workflow was created will be processed.
Example: When an extract responses from survey task normally runs, only new responses since the last run will be processed. However, if data is reloaded, then the next time the workflow runs, all responses in the survey will be processed.

Available Extractor Tasks

Here are some of the extractor tasks available at this time:

Qtip: You can also view a list of extractors using the site menu to the left.

Available Loader Tasks

Here are some of the loader tasks available at this time:

Qtip: You can also view a list of loaders using the site menu to the left.

Available Data Transformation Tasks

The following tasks are available to you for transforming the data you process in your ETL workflows:

  • Basic transform task: Change the format of strings and dates, calculate the difference between dates, perform math operations on numeric fields, and more.
  • Merge task: Combine multiple datasets into a single dataset.
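To make the two transform tasks concrete, here is a rough, outside-the-platform illustration of the kinds of operations they perform. Inside Qualtrics you configure these in the task UI rather than writing code, and the field names below are made up for the example:

```python
from datetime import date

def basic_transform(row: dict) -> dict:
    """Mimics basic transform operations: string and date formatting,
    date differences, and math on numeric fields."""
    out = dict(row)
    out["name"] = out["name"].strip().title()        # reformat a string
    start = date.fromisoformat(out["startDate"])
    end = date.fromisoformat(out["endDate"])
    out["daysOpen"] = (end - start).days             # difference between dates
    out["score"] = round(out["score"] * 100, 1)      # math on a numeric field
    return out

def merge(left: list, right: list, key: str) -> list:
    """Mimics a merge task: combine two datasets on a shared key (inner join)."""
    lookup = {row[key]: row for row in right}
    return [{**row, **lookup[row[key]]} for row in left if row[key] in lookup]
```

The real tasks handle these operations declaratively and at scale; this sketch only shows what the configured operations do to each record.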

Troubleshooting Data Extracting and Loading Tasks

Incomplete Workflows

Data extractor and loader tasks must be used together. If part of this setup is missing, the workflows editor will alert you.

Example: In this example, we are missing our data loader task. The workflows editor alerts us that our “import task is currently missing a destination.”
Example: In this example, we are missing our data extractor task. The task editor alerts us that “you need to add at least one data source task.”

Workflows Failing

If your tasks are failing or not firing properly, the first place you should look is Workflows reporting & history. This will contain information about every time your workflow fired and the results of that workflow.

In reporting & history, each piece of your workflow will have its own entry, making it easy to pinpoint where things went wrong.

Example: In the below example, we are using an extract data from SFTP files task and a load B2B account data into XM Directory task. We can see that the extractor task failed but the loader task succeeded. This means our extractor task is misconfigured, but our loader task is OK.

After identifying the problem, click View under Details for more information about the failure. This pulls up the JSON payload for the task; scroll down to the Task Output section to find any errors.

Example: In this example, we see that the task failed because the connection had invalid parameters for an extract data from SFTP files task.
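If the payload is large, a small script can surface error fields faster than scrolling. The sketch below is a generic walker, not the actual Qualtrics payload schema; the key names it looks for (`error`, `errorMessage`, `errors`) are assumptions, so inspect a real payload from reporting & history and adjust the list:

```python
import json

def find_task_errors(payload_json: str):
    """Walk a workflow-run JSON payload and collect anything that looks
    like an error field, along with the path where it was found."""
    def walk(node, path=""):
        if isinstance(node, dict):
            for key, value in node.items():
                child = f"{path}.{key}" if path else key
                # Assumed error-field names; the real schema varies by task.
                if key.lower() in ("error", "errormessage", "errors") and value:
                    yield (child, value)
                yield from walk(value, child)
        elif isinstance(node, list):
            for i, item in enumerate(node):
                yield from walk(item, f"{path}[{i}]")
    return list(walk(json.loads(payload_json)))
```

Paste the JSON from the Details view into a file and run this over it to list every error message and where in the payload it lives.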

After editing your workflow and fixing the problem, you can return to reporting & history and click Retry to rerun the workflow.

FAQs