Table Input Component
Reads chosen columns from an input table or view into the job. This component also works with external tables.
Properties
Snowflake Properties

Property | Setting | Description
---|---|---
Name | String | A human-readable name for the component.
Database | Select | Select the database containing the input table or view.
Schema | Select | Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, see this article.
Table Name | Select | The name of the input table or view. The tables and views found in the currently selected environment are provided to choose from. You can change the currently selected environment in the Environments section.
Column Names | Multi Select | Once the Table Name is set, the columns become available to choose from. Use the Editor to select which columns to pass along. If you don't like the column names, consider using a Rename component to change them.
Offset | Select | Offsets the table contents by the specified number of seconds. This is a function of Snowflake's Time Travel feature, allowing you to see a table as it was X seconds ago.
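A non-zero Offset maps onto Snowflake's Time Travel `AT` clause. A minimal sketch of the kind of query this produces (database, schema, table, and column names are illustrative):

```sql
-- Read two chosen columns as they were 300 seconds (5 minutes) ago,
-- using Snowflake Time Travel. All names here are placeholders.
SELECT "ORDER_ID", "ORDER_TOTAL"
FROM "ANALYTICS"."PUBLIC"."ORDERS" AT(OFFSET => -300);
```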
Redshift Properties

Property | Setting | Description
---|---|---
Name | String | A human-readable name for the component.
Schema | Select | Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on using multiple schemas, see this article.
Table Name | Select | The name of the input table or view. The tables and views found in the currently selected environment are provided to choose from. You can change the currently selected environment in the Environments section.
Column Names | Multi Select | Once the Table Name is set, the columns become available to choose from. Use the Editor to select which columns to pass along. If you don't like the column names, consider using a Rename component to change them.
Trim Columns | Select | Wraps the column names in a BTRIM function, which strips out all leading and trailing spaces. See the Redshift documentation for details.
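With Trim Columns enabled, each selected column is wrapped in BTRIM in the generated query. A sketch of what this looks like (schema, table, and column names are illustrative):

```sql
-- BTRIM removes leading and trailing spaces from each value.
SELECT BTRIM("customer_name") AS "customer_name",
       BTRIM("city") AS "city"
FROM "public"."customers";
```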
BigQuery Properties

Property | Setting | Description
---|---|---
Name | String | A human-readable name for the component.
Target Project | Text | Enter the name of the Google Cloud Platform project that the table belongs to.
Dataset | Text | Enter the name of the Google Cloud Platform dataset that the table belongs to.
Target Table | Select | The name of the input table or view. The tables and views found in the currently selected environment are provided to choose from. You can change the currently selected environment in the Environments section.
Column Names | Multi Select | Once the Target Table is set, the columns become available to choose from. Use the Editor to select which columns to pass along. If you don't like the column names, consider using a Rename component to change them.
Include Partition Time | Yes/No | Opt whether to include the '_PARTITIONTIME' pseudo-column from the partitioned table, which holds timestamps for the data in the table. This property is only visible when a partitioned table is selected in the 'Target Table' property. Partitioned tables can be created through the 'Partitioning' property in the Create Table orchestration component.
Partition Time Alias | Text | Choose a name for a new column that will take on the '_PARTITIONTIME' pseudo-column data. This name cannot be the same as another column already in the table. This property is only visible when a partitioned table is selected in the 'Target Table' property. Partitioned tables can be created through the 'Partitioning' property in the Create Table orchestration component.
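For an ingestion-time partitioned table, enabling Include Partition Time exposes the '_PARTITIONTIME' pseudo-column under the alias given above. An illustrative sketch (the project, dataset, table, column, and alias names are assumptions):

```sql
-- _PARTITIONTIME holds the ingestion timestamp of each row's partition;
-- here it is surfaced under an alias alongside the chosen columns.
SELECT _PARTITIONTIME AS partition_ts, order_id, order_total
FROM `my-project.my_dataset.orders`;
```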
Synapse Properties

Property | Setting | Description
---|---|---
Name | String | A human-readable name for the component.
Schema | Select | Select the table schema. The special value, [Environment Default], will use the schema defined in the environment. For more information on schemas, please see the Azure Synapse documentation.
Table | Select | Select the table. The tables available are those found in the currently selected environment.
Column Names | Multi Select | Once the Table is set, the columns become available to choose from. Use the Editor to select which columns to pass along. If you don't like the column names, consider using a Rename component to change them.
Delta Lake Properties

Property | Setting | Description
---|---|---
Name | String | A human-readable name for the component.
Catalog | Select | Select a Databricks Unity Catalog. The special value, [Environment Default], will use the catalog specified in the Matillion ETL environment setup. Selecting a catalog will determine which databases are available in the next parameter.
Database | Select | Select the Delta Lake database. The special value, [Environment Default], will use the database specified in the Matillion ETL environment setup.
Table | Select | Select the table. The tables available are those found in the currently selected database.
Column Names | Column Select | Select which columns to load from the selected table.
Offset Type | Select | Select the type of offset to apply. The default is "None".
Offset | Select | Select the timestamp or version to offset to. This is a function of Delta Lake's Time Travel feature, allowing you to see a table as it was at a given timestamp or at a defined version. This property is hidden when Offset Type is "None".
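The Offset Type and Offset properties correspond to Delta Lake's time travel clauses. An illustrative sketch of the two forms (the database, table, column names, timestamp, and version number are assumptions):

```sql
-- Offset by timestamp: read the table as of a point in time.
SELECT order_id, order_total
FROM my_database.orders TIMESTAMP AS OF '2023-01-01 00:00:00';

-- Offset by version: read a specific version of the table.
SELECT order_id, order_total
FROM my_database.orders VERSION AS OF 5;
```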
Strategy

Generates an SQL SELECT query.
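For example, with two columns chosen, the generated query might resemble the following (the schema, table, and column names are illustrative):

```sql
-- Only the selected columns are read from the source table or view.
SELECT "col_a", "col_b"
FROM "my_schema"."my_table";
```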