Does anyone know a way to load data into Snowflake directly from a Google Cloud Storage bucket?
It seems to me that the only way is to move the data from Google to S3 and then use the S3 loader, or am I missing something obvious (or even something not obvious)?
3 Community Answers
Kalyan Arangam —
You are right, Matillion cannot load data from files in a Google Cloud Storage bucket; the files need to be in S3.
One option is to use a DATA TRANSFER component to copy the files across and then an S3 LOAD to load them into Snowflake.
The FILE_ITERATOR + DATA_TRANSFER components may be a useful combination to loop over files in Google buckets, copy them to S3, and then load them into Snowflake.
Thanks for your answer. The File Iterator and Data Transfer approach may be satisfactory.
I would suggest, though, that Matillion should consider creating an Azure data loader, given that Snowflake now runs on Azure as well as AWS. A loading solution where we have to transfer all our data to AWS and then pay for egress when we transfer it to Azure is not very attractive.
Thanks for the suggestion. I will raise a ticket on that.
In the meantime, one option to try may be to create a named external stage for the Azure Blob Storage container in Snowflake, storing the URL and CREDENTIALS in Snowflake. Then issue a COPY command using the S3 LOAD component or SQL SCRIPT component in Matillion.
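As a rough sketch of that workaround, the SQL below creates a named external stage over Azure Blob Storage and then loads from it with COPY INTO. The account, container, table, and file-format details are all hypothetical placeholders; you would substitute your own values, and the SAS token would come from your Azure storage account.

```sql
-- Hypothetical stage over an Azure Blob Storage container.
-- "myaccount", "mycontainer", and the SAS token are placeholders.
CREATE OR REPLACE STAGE my_azure_stage
  URL = 'azure://myaccount.blob.core.windows.net/mycontainer/path/'
  CREDENTIALS = (AZURE_SAS_TOKEN = '?sv=...&sig=...')
  FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

-- Load the staged files into a target table, e.g. from a
-- Matillion SQL SCRIPT component.
COPY INTO my_schema.my_table
  FROM @my_azure_stage
  PATTERN = '.*[.]csv';
```

Since the S3 LOAD component is built around S3 stages, the SQL SCRIPT component is probably the safer place to run this kind of COPY statement.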