Zuora Bulk Query Component

This component uses the Zuora AQuA API to retrieve data and load it into a table. The data is staged, so the table is reloaded each time the component runs. You may then use transformations to enrich and manage the data in permanent tables.

The component offers both a Basic and Advanced mode (see below) for generating the Zuora AQuA API query.
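
Because the staging table is rebuilt on every run, any data you want to keep should be persisted to a permanent table in a downstream transformation. Conceptually, that transformation might merge the staged rows along the following lines (a sketch only; the table and column names are hypothetical, and in practice you would build the equivalent logic with transformation components):

    -- Illustrative only: merge freshly staged Zuora accounts into a permanent table.
    MERGE INTO zuora_account_history AS tgt
    USING zuora_account_stage AS src
      ON tgt.id = src.id
    WHEN MATCHED THEN UPDATE SET
      name = src.name,
      balance = src.balance,
      updated_date = src.updated_date
    WHEN NOT MATCHED THEN INSERT (id, name, balance, updated_date)
      VALUES (src.id, src.name, src.balance, src.updated_date);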

Note: Some columns cannot be queried alongside other, specific columns:
When using the ProductRatePlanChargeTier data source, only one of the following columns can be queried at a time: Price, DiscountAmount, DiscountPercentage.
When using the RatePlanCharge data source, only one of the following columns can be queried at a time: OveragePrice, Price, IncludedUnits, DiscountAmount, DiscountPercentage.
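
For example, if you need both Price and DiscountAmount from the ProductRatePlanChargeTier data source, the restriction above means running the component twice (or issuing two Advanced-mode queries), roughly as follows (a sketch; the accompanying columns shown are illustrative):

Query 1 (Price only):

    SELECT Id, ProductRatePlanChargeId, Price
    FROM ProductRatePlanChargeTier

Query 2 (DiscountAmount only):

    SELECT Id, ProductRatePlanChargeId, DiscountAmount
    FROM ProductRatePlanChargeTier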

Warning: This component is destructive as it truncates or recreates its target table on each run. Do not modify the target table structure manually.

Properties

Property Setting Description
Name Text The descriptive name for the component.
Basic/Advanced Mode Choice Basic - This mode will build a Zuora AQuA query for you using settings from the Data Source, Data Selection, and Data Source Filter properties. In most cases, this will be sufficient.
Advanced - This mode will require you to write an SQL-like query which is translated into one or more Zuora AQuA API calls.
Endpoint Select Select the endpoint for the Zuora service of choice.
Authentication Choice Select an authentication method, which must be set up in advance. Zuora uses a variation of the OAuth standard for authenticating third-party applications. More help is provided in the Zuora 3rd Party OAuth documentation.
Max Wait Minutes Integer The number of minutes to wait for the query to return before timing out.
Entity ID Text The ID of the Zuora entity to pull data from.
Data Source Choice Select a data source from the pre-configured list.
Data Selection Choice Select one or more columns to return from the query.
Data Source Filter Input Column Select a column on which to base your filter.
Qualifier Is - Compares the column to the value using the comparator.
Not - Reverses the effect of the comparison, so "equals" becomes "not equals", "less than" becomes "greater than or equal to", etc.
Comparator Choose a method of comparing the column to the value. Possible comparators include: 'Equal To', 'Greater than', 'Less than', 'Greater than or equal to', 'Less than or equal to', 'Null'.
'Equal To' can match exact strings and numeric values, while other comparators such as 'Greater than' work only with numeric values. The 'Null' comparator matches only null values, ignoring whatever is entered in the Value field.
Not all data sources support all comparators, so only a subset of the above comparators may be available to choose from.
Value The value to compare the column against.
Combine Filters Select Choose whether the defined filters are combined with one another using "and" or "or".
ZOQL Query Text This is a query written in ZOQL, according to the Zuora data model. An example is given after the Properties list. (Property only available in 'Advanced' Mode.)
Warehouse Select Choose a Snowflake warehouse that will run the load.
Database Select Choose a database to create the new table in.
Schema Select Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, see this article.
Note: An external schema is required if the 'Type' property is set to 'External'.
Table Text Provide a new table name.
Warning: This table will be recreated and will drop any existing table of the same name.
Staging Select (AWS Only) Snowflake Managed: Allow Matillion ETL to create and use a temporary internal stage on Snowflake for staging the data. This stage, along with the staged data, will cease to exist after loading is complete.
Existing Amazon S3 Location: Selecting this option exposes additional properties for specifying a custom staging area on S3.
S3 Staging Area Text (AWS Only) The name of an S3 bucket for temporary storage. Ensure your access credentials have S3 access and permission to write to the bucket. See this document for details on setting up access. The temporary objects created in this bucket will be removed after the load completes; they are not kept.
This property is available when using an Existing Amazon S3 Location for Staging.
Encryption Select (AWS Only) Decide how the files are encrypted inside the S3 bucket. This property is available when using an Existing Amazon S3 Location for Staging.
None: No encryption.
SSE KMS: Encrypt the data using a key managed by AWS KMS.
SSE S3: Encrypt the data using S3-managed encryption keys.
KMS Key ID Select (AWS Only) The ID of the KMS encryption key you have chosen to use in the 'Encryption' property.
Load Options Multiple Selection Comp Update: Apply automatic compression to the target table (if ON). Default is ON.
Stat Update: Automatically update statistics when filling a table (if ON). Default is ON. In this case, it is updating the statistics of the target table.
Clean S3 Objects: Automatically remove UUID-based objects from the S3 bucket (if ON). Default is ON. Effectively, this decides whether the staged data is kept in the S3 bucket.
String Null is Null: Converts any strings equal to "null" into a null value. This is case-sensitive and only works with entirely lower-case strings. Default is ON.
Recreate Target Table: Choose whether the component recreates its target table before the data load. If OFF, the existing table will be used. Default is ON.
Load Options Multiple Select Clean Cloud Storage Files: (If On) Destroy staged files on Cloud Storage after loading data. Default is On.
Cloud Storage File Prefix: Give staged file names a prefix of your choice. Default is empty (no prefix).
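
As an example of the ZOQL Query property, the following is roughly what Basic mode would generate for a Data Source of Account, a Data Selection of a handful of columns, and a Data Source Filter of Status 'Equal To' 'Active' (a sketch; the exact fields available depend on your Zuora data model):

    SELECT Id, Name, AccountNumber, Balance, Status
    FROM Account
    WHERE Status = 'Active'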

Variable Exports

This component makes the following values available to export into variables:

Source Description
Time Taken To Stage The amount of time (in seconds) taken to fetch the data from the data source and upload it to storage.
Time Taken To Load The amount of time (in seconds) taken to execute the COPY statement to load the data into the target table from storage.
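
If you map these exports to job variables (for example, hypothetical variables named time_to_stage and time_to_load), a later SQL component could record them for auditing, along these lines (a sketch; the audit table, columns, and variable names are all illustrative):

    -- Illustrative only: log the exported timings to a hypothetical audit table.
    INSERT INTO etl_audit_log (component_name, seconds_to_stage, seconds_to_load, logged_at)
    VALUES ('Zuora Bulk Query', ${time_to_stage}, ${time_to_load}, CURRENT_TIMESTAMP());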

Strategy

Connect to the Zuora AQuA API and issue the query. Stream the results into objects on S3. Then create or truncate the target table and issue a COPY command to load the S3 objects into the table. Finally, clean up the temporary S3 objects.
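
In Snowflake terms, the load step amounts to ordinary DDL plus a COPY statement. The following is a simplified sketch of what the component effectively issues when staging to an existing S3 location (table, bucket, credentials, and file-format details are illustrative; the real statements are generated by Matillion ETL):

    -- Illustrative only: recreate the target table, then load the staged files.
    CREATE OR REPLACE TABLE zuora_account_stage (
      id VARCHAR,
      name VARCHAR,
      balance NUMBER(38, 2),
      status VARCHAR
    );

    COPY INTO zuora_account_stage
    FROM 's3://my-staging-bucket/zuora/tmp/'
    CREDENTIALS = (AWS_KEY_ID = '<your-key-id>' AWS_SECRET_KEY = '<your-secret-key>')
    -- NULL_IF below loosely corresponds to the 'String Null is Null' load option.
    FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"' NULL_IF = ('null'));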