Currently, it's my understanding that the only way Matillion can unload data to S3 is through the "Unload to S3" orchestration component. Is there a way to set up an incremental unload/update operation from Matillion to S3? Not just writing new objects to S3, but also updating existing objects in S3 with changes from the database Matillion is unloading from.
4 Community Answers
Dan D'Orazio —
Hi Matthew -
I don't believe it's possible to update objects in S3 using the UNLOAD command, and external tables in Redshift can't be updated, as far as I'm aware. If you're looking to use S3 as the storage layer for this table, I believe the best approach is to load the table into Redshift, apply the updates/inserts incrementally, unload it back to S3, and truncate the table in Redshift.
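A minimal sketch of that load/merge/unload cycle in Redshift SQL, with hypothetical names throughout (the `orders` and `stg_orders` tables, the bucket path, the IAM role, and the `order_id` key are all placeholders):

```sql
-- Assumes the incremental batch has already been loaded into stg_orders
-- (e.g. via Matillion's S3 Load component or a COPY statement).

-- Apply the updates/inserts: delete-then-insert is the usual Redshift upsert,
-- since Redshift has no native MERGE-style update on unloaded files.
BEGIN;
DELETE FROM orders
USING stg_orders
WHERE orders.order_id = stg_orders.order_id;
INSERT INTO orders SELECT * FROM stg_orders;
COMMIT;

-- Unload the merged result back to S3, overwriting the previous objects.
UNLOAD ('SELECT * FROM orders')
TO 's3://my-bucket/orders/'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-role'
ALLOWOVERWRITE;

-- Clear the staging table (and the working table too, if S3 is meant to be
-- the system of record rather than Redshift).
TRUNCATE stg_orders;
```

Because UNLOAD rewrites whole files rather than patching individual objects, the overwrite step is what makes the S3 copy reflect the incremental changes.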
Hi Dan, Snowflake has a STREAM object that records the inserts, updates, and deletes made to a particular table. Users can unload those changes to S3 just as they would a normal table. I was wondering whether Matillion ETL for Snowflake leverages this feature.
We don't currently support streams inside Matillion. However, you can interact with Snowflake using the SQL Script component, so it may be worth running the STREAM commands through that component.
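For anyone trying that route, here is a sketch of what the SQL Script component might run, with hypothetical names throughout (`src_orders`, `orders_stream`, `orders_consumed`, `my_s3_stage`). One assumption to verify against the Snowflake docs: a stream's offset only advances when the stream is read inside a DML statement, so a separate consuming statement follows the unload.

```sql
-- One-time setup: start change tracking on the source table.
CREATE STREAM IF NOT EXISTS orders_stream ON TABLE src_orders;

-- Per run: unload the pending changes to S3. METADATA$ACTION tells the
-- downstream consumer whether each row was an INSERT or a DELETE
-- (updates appear as a DELETE/INSERT pair, flagged by METADATA$ISUPDATE).
COPY INTO @my_s3_stage/orders_changes/
FROM (SELECT *, METADATA$ACTION, METADATA$ISUPDATE FROM orders_stream)
FILE_FORMAT = (TYPE = CSV)
HEADER = TRUE;

-- Assumption: COPY INTO <location> does not consume the stream, so advance
-- its offset explicitly with a DML read; the next run then sees only
-- changes made after this point.
CREATE OR REPLACE TEMPORARY TABLE orders_consumed AS
SELECT * FROM orders_stream;
```

This still only writes new change files to S3 rather than updating objects in place; applying those change records to existing S3 objects would remain a downstream step.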