EMEA Admin Office Hours - 4/4/19

  • 1 April 2019
  • 4 replies

This thread is for the Admin Office Hours session on Thursday, April 4th, 2019. Submit your questions to this thread and we'll address them during our session from 1:00–3:00 pm GMT.

There is no need to register for the sessions anymore. If you have a question, please post below or join the meeting. I will take questions in the order they are posted below, followed by anyone who joins the session live.

You can also dial in using your phone.

United States: +1 (571) 317-3117

Access Code: 860-695-853

More phone numbers

United Kingdom: +44 330 221 0097


We have a number of REST APIs and would like to set up some GET requests.

Once we have this info where is best to store the data?
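For the GET-request side of this, here is a minimal stdlib sketch of pulling JSON from an API. The base URL, path, and token are all hypothetical placeholders — substitute your own API's details:

```python
import json
import urllib.request

# Hypothetical endpoint -- replace with your actual API base URL.
BASE_URL = "https://api.example.com/v1"

def fetch_records(path, token, query=""):
    """Issue a GET request and return the parsed JSON body."""
    url = f"{BASE_URL}/{path}{query}"
    req = urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {token}"},  # auth scheme varies by API
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)

# Example call (not run here -- the endpoint is fictional):
# accounts = fetch_records("accounts", "YOUR_TOKEN", "?limit=100")
```

Where best to store the results depends on how you want to report on the data, so that part is worth walking through live in the session.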

Hi. I would like some guidance on how to create GRR and NRR reports. Thanks.
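As shared context ahead of the session: GRR and NRR are conventionally the revenue retained over a period, excluding expansion (GRR) or including it (NRR), divided by the period's starting revenue. A minimal sketch of the arithmetic (the dollar figures are made-up examples):

```python
def grr(starting_arr, churn, contraction):
    """Gross Revenue Retention: retained revenue excluding expansion."""
    return (starting_arr - churn - contraction) / starting_arr

def nrr(starting_arr, churn, contraction, expansion):
    """Net Revenue Retention: retained revenue including expansion."""
    return (starting_arr - churn - contraction + expansion) / starting_arr

# Example: $100k starting ARR, $5k churned, $3k contracted, $10k expanded
print(round(grr(100_000, 5_000, 3_000), 2))          # 0.92 -> 92% GRR
print(round(nrr(100_000, 5_000, 3_000, 10_000), 2))  # 1.02 -> 102% NRR
```

Building the report is then a matter of sourcing those four inputs per period, which is a good topic for the session itself.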

I have an issue with updating automatic/system-generated CTAs: when I try to update these, the system hangs. Updating manual CTAs works fine. I am new to Gainsight, so I wanted to check whether this is the correct approach. Thanks

If time allows, can we discuss a couple of items please:

Surveys going to spam — we've just launched our 1st survey (testing it internally), and have set up the domain and enabled the secure connection in SFDC, yet the emails are being caught in spam. Is there another step I may have missed, or a different setting to use? The sending address is support@, so it should be fine.

The 2nd question relates to data management, and how to map usage data to the SFDC Account ID in Gainsight rather than in our data lake.

We have been loading our product usage data for 2 years now, with email address as the unique identifier; the SFDC mapping happens in our own data lake. Now that we have launched Person 360, I'd like to use a map in Gainsight — an object called User Mapping — to determine which email address belongs to which account (this map is fluid, as it is changed by CSMs from the C360). Is there a way we can load the product usage data without the SFDC Account ID, and have it mapped to the SFDC ID using the internal map in an object, rather than this being done in our data lake?
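Conceptually, what's being asked for here is a join at load time from email address to Account ID via the mapping object. Within Gainsight this would be done with its own tooling rather than code, but the lookup logic itself looks like this (all object names, emails, and IDs below are made up for illustration):

```python
# Hypothetical "User Mapping" object: email -> SFDC Account ID,
# maintained by CSMs from the C360.
user_mapping = {
    "anna@acme.example": "0011t00000AAAAA",
    "ben@globex.example": "0011t00000BBBBB",
}

# Incoming usage rows keyed only by email, no Account ID.
usage_rows = [
    {"email": "anna@acme.example", "logins": 14},
    {"email": "ben@globex.example", "logins": 3},
    {"email": "unknown@nowhere.example", "logins": 1},
]

def enrich(rows, mapping):
    """Attach the SFDC Account ID to each usage row; collect unmatched rows."""
    matched, unmatched = [], []
    for row in rows:
        account_id = mapping.get(row["email"])
        if account_id:
            matched.append({**row, "sfdc_account_id": account_id})
        else:
            unmatched.append(row)  # keep these for review, don't drop silently
    return matched, unmatched

matched, unmatched = enrich(usage_rows, user_mapping)
```

The key design point is the unmatched bucket: because CSMs change the mapping over time, some usage rows will not resolve on any given load, and you want those surfaced rather than silently discarded.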