
Incremental Updating to S3


Currently, my understanding is that the only way Matillion can unload data to S3 is through the "Unload to S3" orchestration component. I was wondering whether there is a way to set up an incremental unload/update operation from Matillion to S3: not just unloading new objects to S3, but also updating existing S3 objects with changes from the database Matillion is unloading from.

Thank you!

4 Community Answers

Matillion Agent  

Dan D'Orazio —

Hi Matthew -

I don’t believe it is possible to update objects in S3 using the UNLOAD command, and external tables in Redshift can’t be updated, as far as I’m aware. If you’re looking to use S3 as the storage layer for this table, I believe the best approach is to load the table into Redshift, apply the updates/inserts incrementally, unload it back to S3, and then truncate the table in Redshift.
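In raw SQL terms, the cycle Dan describes might look roughly like the sketch below (inside Matillion it would more typically be built from orchestration components, or run via a SQL Script component). The table names, S3 paths, and IAM role here are all placeholders, not anything from the thread:

```sql
-- 1. Load the current S3 snapshot into Redshift (placeholder path/role)
COPY my_table
FROM 's3://my-bucket/snapshot/'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-role'
FORMAT AS PARQUET;

-- 2. Apply incremental changes staged in a hypothetical my_table_updates table
DELETE FROM my_table
USING my_table_updates
WHERE my_table.id = my_table_updates.id;

INSERT INTO my_table
SELECT * FROM my_table_updates;

-- 3. Unload the refreshed table back to S3, overwriting the old objects
UNLOAD ('SELECT * FROM my_table')
TO 's3://my-bucket/snapshot/'
IAM_ROLE 'arn:aws:iam::123456789012:role/my-redshift-role'
FORMAT AS PARQUET
ALLOWOVERWRITE;

-- 4. Truncate so the next cycle starts from an empty working table
TRUNCATE TABLE my_table;
```

The net effect is that the S3 objects are fully rewritten each cycle, which is what makes this a workaround rather than a true in-place update of S3 objects.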

Best -

Matthew Ha —

Hi Dan, Snowflake has a STREAM object that records the insertions and deletions made to a particular table. Users are able to unload these changes to S3 just as they would a normal table. I was wondering if Matillion ETL for Snowflake leverages this feature.

Matillion Agent  

Damian Chan —

Hello Matthew,

Currently we don’t support STREAMS inside Matillion. However, you can interact with Snowflake directly using the SQL Script component, so it may be worth running the STREAM commands through that component.
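As a rough sketch of what Damian suggests, the SQL below could be run from a SQL Script component. All object names and the external stage are placeholders; it assumes an S3 stage has already been configured in Snowflake:

```sql
-- Create a stream that tracks changes to a source table (placeholder names)
CREATE STREAM IF NOT EXISTS my_table_stream ON TABLE my_table;

-- On each run: capture the changes seen since the last read. Consuming the
-- stream in a DML transaction (such as CTAS) advances its offset, so the
-- next run only sees new changes.
CREATE OR REPLACE TEMPORARY TABLE my_table_delta AS
SELECT * FROM my_table_stream;

-- Unload just that delta to the S3 stage (placeholder stage and path)
COPY INTO @my_s3_stage/deltas/
FROM my_table_delta
FILE_FORMAT = (TYPE = CSV);
```

This unloads only the changed rows rather than the whole table, which is the incremental behaviour asked about in the original question; note it still writes new delta files to S3 rather than modifying existing objects in place.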

Best Regards,

Matthew Ha —

Thank you, that worked for me.
