Table Output Component
The Table Output component empowers users to write an input data flow out to an existing output table.
Successful validation of this component ensures the following:
- The target table exists.
- The target columns have been found.
Data is only written to the table when the Transformation Job containing the Table Output component is actually run.
Many potential problems are avoided by a successful validation; however, run-time errors can still occur during execution. For example, your Redshift cluster may run out of disk space, or your Snowflake warehouse may hit its resource monitor cap.
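The validation checks above (the target table exists, the target columns have been found) can be sketched against a generic SQL database. This is an illustrative approximation in Python using sqlite3, not Matillion's actual validation logic; the table and column names are hypothetical.

```python
import sqlite3

def validate_table_output(conn, target_table, target_columns):
    """Approximate Table Output validation: the target table must exist
    and every mapped output column must be present in it."""
    cur = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' AND name=?",
        (target_table,),
    )
    if cur.fetchone() is None:
        return False, f"table {target_table!r} does not exist"
    # PRAGMA table_info returns one row per column; index 1 is the name.
    existing = {row[1] for row in conn.execute(f'PRAGMA table_info("{target_table}")')}
    missing = [c for c in target_columns if c not in existing]
    if missing:
        return False, f"columns not found: {missing}"
    return True, "ok"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
print(validate_table_output(conn, "sales", ["id", "amount"]))  # (True, 'ok')
print(validate_table_output(conn, "missing_table", ["id"]))
```

As in the component, passing validation only means the target is reachable and the mapping is sound; the actual write happens later, at job run time.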
Properties
Snowflake Properties

| Property | Setting | Description |
|---|---|---|
| Name | String | A human-readable name for the component. |
| Warehouse | Select | Choose a Snowflake warehouse that will run the load. The special value, [Environment Default], will use the warehouse defined in the Matillion ETL environment. |
| Database | Select | Choose the database containing the target table. The special value, [Environment Default], will use the database defined in the Matillion ETL environment. |
| Schema | Select | Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, see this article. |
| Target Table | Select | Select the output table. The available tables depend on the selected Matillion ETL environment. |
| Fix Data Type Mismatches | Select | Select whether to cast input types when they do not match the target column's data type. |
| Column Mapping | Input Column | The source field from the input flow. |
| | Output Column | The target table's output column to insert data into. |
| Order By | Column | Select the output column(s) to sort by. |
| | Sort Order | Set the corresponding column to be ordered ascending or descending. The default sort order is ascending. |
| Truncate | Select | Select how new data is handled: whether the target table is truncated before the load or the new rows are appended to the existing data. |
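The Column Mapping and Order By properties together define a renaming and a sort that are applied to the input flow before the insert. A minimal sketch in Python follows; the field names and the dict-of-rows representation are illustrative assumptions, not Matillion's internal model.

```python
def map_and_order(rows, mapping, order_by):
    """rows: list of dicts keyed by input field names.
    mapping: {input_column: output_column} (the Column Mapping property).
    order_by: list of (output_column, 'ASC' | 'DESC') pairs."""
    # Rename each input field to its mapped output column.
    mapped = [{out: row[src] for src, out in mapping.items()} for row in rows]
    # Apply the sort keys in reverse so the first pair is the primary key.
    for col, direction in reversed(order_by):
        mapped.sort(key=lambda r: r[col], reverse=(direction == "DESC"))
    return mapped

rows = [{"amt": 5, "cust": "b"}, {"amt": 9, "cust": "a"}]
print(map_and_order(rows, {"amt": "amount", "cust": "customer"},
                    [("amount", "DESC")]))
```

Columns left out of the mapping are simply not written, which is why validation insists that every mapped output column exists in the target table.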
Redshift Properties

| Property | Setting | Description |
|---|---|---|
| Name | String | A human-readable name for the component. |
| Schema | Select | Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, see this article. |
| Target Table | Select | Select the output table. The available tables depend on the selected Matillion ETL environment. |
| Fix Data Type Mismatches | Select | Select whether to cast input types when they do not match the target column's data type. |
| Column Mapping | Input Column | The source field from the input flow. |
| | Output Column | The target table's output column to insert data into. |
| Truncate | Select | Select how new data is handled: whether the target table is truncated before the load or the new rows are appended to the existing data. |
BigQuery Properties

| Property | Setting | Description |
|---|---|---|
| Name | String | A human-readable name for the component. |
| Target Project | Select | Select the Google Cloud Platform project. The special value, [Environment Default], will use the project defined in the Matillion ETL environment. |
| Target Dataset | Select | Select the Google Cloud Platform dataset. The special value, [Environment Default], will use the dataset defined in the Matillion ETL environment. |
| Target Table | Select | Select the output table. The available tables depend on the selected Matillion ETL environment. |
| Column Mapping | Input Column | The source field from the input flow. |
| | Output Column | The target table's output column to insert data into. |
| Truncate | Select | Select how new data is handled: whether the target table is truncated before the load or the new rows are appended to the existing data. |
| Cast Input | Select | Select whether to CAST input types when they do not match the target column's data type. |
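Cast Input (and the Fix Data Type Mismatches property on the other platforms) converts a mismatched input value to the target column's type rather than failing the load. A hedged sketch of that behaviour, using Python types as a stand-in for the warehouse's column types:

```python
def cast_row(row, target_types, cast_input=True):
    """Illustrative only: cast each value whose type does not match the
    target column's declared type; pass matching values through unchanged.
    target_types maps column name to a Python type (an assumption standing
    in for the warehouse's type system)."""
    out = {}
    for col, value in row.items():
        expected = target_types[col]
        if isinstance(value, expected):
            out[col] = value
        elif cast_input:
            out[col] = expected(value)  # e.g. str "42" -> int 42
        else:
            raise TypeError(f"{col}: expected {expected.__name__}")
    return out

print(cast_row({"id": "42", "amount": 9.5}, {"id": int, "amount": float}))
# {'id': 42, 'amount': 9.5}
```

With the property disabled, a mismatch surfaces as an error instead, which is the run-time failure mode the validation caveat earlier in this page alludes to.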
Synapse Properties

| Property | Setting | Description |
|---|---|---|
| Name | String | A human-readable name for the component. |
| Schema | Select | Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on schemas, please see the Azure Synapse documentation. |
| Target Table | Select | Select the output table. The available tables depend on the selected Matillion ETL environment. |
| Column Mapping | Input Column | The source field from the input flow. |
| | Output Column | The target table's output column to insert data into. |
| Truncate | Select | Select how new data is handled: whether the target table is truncated before the load or the new rows are appended to the existing data. |
| Fix Data Type Mismatches | Select | Select whether to cast input types when they do not match the target column's data type. |
Delta Lake Properties

| Property | Setting | Description |
|---|---|---|
| Name | String | A human-readable name for the component. |
| Catalog | Select | Select a Databricks Unity Catalog. The special value, [Environment Default], will use the catalog specified in the Matillion ETL environment setup. Selecting a catalog will determine which databases are available in the next parameter. |
| Database | Select | Select the Delta Lake database. The special value, [Environment Default], will use the database specified in the Matillion ETL environment setup. |
| Target Table | Select | Select the output table. The available tables depend on the selected Matillion ETL environment. |
| Column Mapping | Input Column | The source field from the input flow. |
| | Output Column | The target table's output column to insert data into. |
| Truncate | Select | Select how new data is handled: whether the target table is truncated before the load or the new rows are appended to the existing data. |
| Fix Data Type Mismatches | Select | Select whether to cast input types when they do not match the target column's data type. |
Variable Exports
A full list of common component exports can be found here.
| Source | Description |
|---|---|
| Automatic Compression | If the Automatic Compression property is set to "If not already", then this variable will be set to "Yes" when compression has been applied where it had not been before. Otherwise, this will be "No". |
Strategy
Performs a bulk insert into the target table, optionally preceded by a TRUNCATE.
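The strategy can be sketched end to end: optionally clear the target, then bulk-insert the mapped rows. This is a toy approximation using sqlite3 (which has no TRUNCATE statement, so DELETE stands in); the table and column names are illustrative.

```python
import sqlite3

def table_output(conn, target_table, rows, truncate=False):
    """Sketch of the Table Output strategy: bulk-insert the input rows
    into an existing target table, optionally after clearing it first."""
    if truncate:
        # SQLite substitute for TRUNCATE.
        conn.execute(f'DELETE FROM "{target_table}"')
    cols = list(rows[0].keys())
    col_list = ", ".join(f'"{c}"' for c in cols)
    placeholders = ", ".join("?" for _ in cols)
    conn.executemany(
        f'INSERT INTO "{target_table}" ({col_list}) VALUES ({placeholders})',
        [tuple(r[c] for c in cols) for r in rows],
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
table_output(conn, "sales", [{"id": 1, "amount": 3.5}], truncate=True)
print(conn.execute("SELECT COUNT(*) FROM sales").fetchone()[0])  # 1
```

Note the table must already exist before the write, mirroring the component's requirement that it targets an existing output table rather than creating one.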