Enable S3 Archiving for Rules That Create New S3 Files On Next Run

  • March 13, 2023
  • 2 replies
  • 20 views

jenlpro
  • Contributor ⭐️⭐️⭐️⭐️⭐️

I am setting up a handful of rules to export data daily from Gainsight to S3, where our team will pull the data to ingest into our DWH. The rules create TSV files in S3. We want to enable archiving for these rules so that when a rule runs today and creates a file with today's datetime stamp, that file saves in our DWH folder, and the file with the same nomenclature from the same rule but an earlier datetime stamp (i.e., yesterday's) is automatically archived.

For our DWH, we want to archive old files upon creation of new files so we have a better audit trail in the event rules fail.
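Since Gainsight does not expose this directly, the "archive everything but the newest file per rule" logic described above can be sketched outside Gainsight. This is a minimal, hedged example: the filename scheme `<rule_name>_<YYYY-MM-DD>.tsv` and the function name are assumptions for illustration, not the actual export naming, and the actual copy/move to an archive prefix (e.g. via boto3) is left out.

```python
import re
from collections import defaultdict

# Assumed (hypothetical) naming scheme for the exported files:
#   "<rule_name>_<YYYY-MM-DD>.tsv"
FILENAME_RE = re.compile(r"^(?P<rule>.+)_(?P<stamp>\d{4}-\d{2}-\d{2})\.tsv$")

def files_to_archive(filenames):
    """Return the files that are NOT the newest export per rule.

    Groups files by rule name, keeps the file with the latest
    datetime stamp as the active copy, and flags the older ones
    (e.g. yesterday's export) for archiving.
    """
    by_rule = defaultdict(list)
    for name in filenames:
        m = FILENAME_RE.match(name)
        if m:
            by_rule[m.group("rule")].append((m.group("stamp"), name))
    archive = []
    for versions in by_rule.values():
        versions.sort()  # ISO dates sort correctly as strings
        archive.extend(name for _, name in versions[:-1])  # all but newest
    return sorted(archive)
```

For example, `files_to_archive(["usage_2023-03-12.tsv", "usage_2023-03-13.tsv"])` returns `["usage_2023-03-12.tsv"]`, which is the file to move to the archive folder after the 3/13 run.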

2 replies

jenlpro
  • Author
  • Contributor ⭐️⭐️⭐️⭐️⭐️
  • March 13, 2023

Adding more context after some conversation in the Slack community (thanks, @keith_mattes !)

There is a setting in Cyberduck/S3 itself that lets you set up archiving there instead of in Gainsight; however, it does not appear to be available for Gainsight-managed S3 buckets.
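For reference, on a self-managed bucket the S3-side archiving mentioned above is typically an S3 lifecycle rule. A minimal sketch of one is below; note it archives by object age rather than on creation of a new file, and the prefix (`exports/`), rule ID, and 1-day Glacier transition are assumptions, not values from this thread.

```json
{
  "Rules": [
    {
      "ID": "archive-old-gainsight-exports",
      "Filter": { "Prefix": "exports/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 1, "StorageClass": "GLACIER" }
      ]
    }
  ]
}
```

On a bucket you control, a configuration like this can be applied with `aws s3api put-bucket-lifecycle-configuration`; as noted in this thread, Gainsight-managed buckets don't appear to expose this.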

keith_mattes
  • Helper ⭐️⭐️
  • March 13, 2023

To add to this, using a self-managed S3 bucket does allow for archiving at this level, but that bucket is also funded by our org, not GS. It might be a restriction GS put in place to limit the storage needed to support the bucket: they cannot manage an archive population, and archives can get out of hand if not maintained properly, causing extraneous costs :)