
I am updating records in batches of 50. This is an existing flow, but since yesterday I have been getting the following error for random batches. Some batches update successfully and some fail; on each retry of the same records, it fails on a different batch.

{"result":false,"errorCode":"GSOBJ_1031","errorDesc":"Transaction limit reached. Given criteria impacts more rows than the given limit. ","localizedErrorDesc":null,"requestId":"5a06fb5f-d26c-4636-b518-fad2ce30c18d","data":null,"message":"Transactional Limit (50) reached. Can't impact 51 rows.","localizedMessage":null}

I have checked that the batch contains only 50 records. I have also tried reducing the batch size to 49 and got a similar error.

Can you share your query? I suspect the lookups are matching more rows than just the 50 you expect to update.

It is a PUT request where the batch contains 50 records.

I have also tried reducing the batch size to 49, but then it throws an exception saying it can't impact 52 records. I am not sure how reducing the batch size causes it to impact more records than before.
r = requests.post(url, json={"records": this_batch}, headers=HEADERS, params=parameters)

I found the issue: there are records with duplicate IDs in the batch, so a batch of 50 records was impacting more than 50 rows.
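For anyone hitting the same error: since the transactional limit counts impacted rows rather than submitted records, a single duplicated ID can push a 50-record batch over the limit. A minimal sketch of deduplicating by ID before batching (the `Gsid` field name and `dedupe_by_id`/`make_batches` helpers are assumptions for illustration, not part of the API):

```python
def dedupe_by_id(records, id_field="Gsid"):
    """Keep only the first occurrence of each ID (id_field is an assumed key)."""
    seen = set()
    unique = []
    for rec in records:
        rid = rec[id_field]
        if rid not in seen:
            seen.add(rid)
            unique.append(rec)
    return unique

def make_batches(records, batch_size=50):
    """Split deduplicated records into batches of at most batch_size."""
    unique = dedupe_by_id(records)
    return [unique[i:i + batch_size] for i in range(0, len(unique), batch_size)]
```

Each resulting batch then contains at most `batch_size` distinct IDs, so the request should never impact more rows than the transactional limit.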

Thank you, Jason.
