Snowflake Connector - Required Fields Are Too Restrictive

jeremy_johnston

Hello team!

We’re moving many of the data feeds we used to consume in GS from S3 to a Snowflake database.

In testing the new connection, we’ve encountered some roadblocks to importing our data via the Snowflake connector.


Case 1: The connector requires a last-modified-date reference when setting up a new job. However, not all of our Snowflake data objects contain DateTime fields, so we can’t import that data via the connector and have to fall back on a manual .csv upload to MDA. When the dataset is large, as it always is when establishing a NEW job, we have to split the source data into multiple smaller .csv files to stay under the maximum allowed upload size, or create an ad-hoc rule to import via S3 instead.

Ask: don’t require a last-modified-date value.
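For illustration, the kind of workaround this requirement pushes us toward (a sketch only; the view and table names are hypothetical) is wrapping the source object in a view that fabricates a last-modified column so the connector’s required mapping can be satisfied:

```sql
-- Sketch of a stopgap, not a real fix: expose a synthetic last-modified
-- column so the connector's required field can be mapped.
-- FEED_WITH_LAST_MODIFIED and RAW_FEED are hypothetical names.
CREATE OR REPLACE VIEW FEED_WITH_LAST_MODIFIED AS
SELECT
    t.*,
    -- Every row looks freshly modified on each query, so each connector
    -- run effectively degrades into a full sync.
    CURRENT_TIMESTAMP()::TIMESTAMP_NTZ AS LAST_MODIFIED
FROM RAW_FEED AS t;
```

That defeats the point of an incremental field entirely, which is why simply making the field optional would be cleaner.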

Case 2: We can only specify one field as the unique identifier when setting up a new job with the Snowflake connector. That is not always enough to accurately identify unique records; not all records in our sources expose a unique record id or oid. As a result, when we try an initial import of ALL data from a source Snowflake object, only partial data comes through. In one case I knew there to be 1.6 million records, but only 4,000 were imported, because I couldn’t fully specify what makes them unique.


Ask: allow multiple fields to be selected to define a unique record. This is already possible in other GS modules such as JO and the Rules Engine, so we’d expect the same in the Snowflake connector.
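Until then, one possible workaround (again just a sketch, with hypothetical table and column names) is deriving a single synthetic key from the combination of fields that together identify a record, and pointing the connector at that:

```sql
-- Sketch of a workaround: collapse a composite key into one column the
-- connector can accept as the unique identifier.
-- RAW_FEED, ACCOUNT_ID, PRODUCT_CODE, and REGION are hypothetical names.
CREATE OR REPLACE VIEW FEED_WITH_RECORD_KEY AS
SELECT
    -- MD5 over a delimited concatenation of the identifying fields.
    -- Note: in Snowflake, CONCAT_WS returns NULL if any input is NULL,
    -- so nullable fields should be wrapped in COALESCE(..., '').
    MD5(CONCAT_WS('|', ACCOUNT_ID, PRODUCT_CODE, REGION)) AS RECORD_KEY,
    t.*
FROM RAW_FEED AS t;
```

Maintaining a view like this for every source object doesn’t scale, which is why native multi-field key support in the connector would be the right fix.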



