Solved

Batch update of 50 records fails with "Transaction limit reached"

  • April 17, 2025
  • 3 replies
  • 51 views

I am updating records in batches of 50. This is an existing flow, but since yesterday I have been getting the following error for random batches. Some batches update successfully and some fail; on each retry of the same records, a different batch fails.

{"result":false,"errorCode":"GSOBJ_1031","errorDesc":"Transaction limit reached. Given criteria impacts more rows than the given limit. ","localizedErrorDesc":null,"requestId":"5a06fb5f-d26c-4636-b518-fad2ce30c18d","data":null,"message":"Transactional Limit (50) reached. Can't impact 51 rows.","localizedMessage":null} 3.

I have checked that the batch contains exactly 50 records. I also tried reducing the batch size to 49 and got a similar error.
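For context, the flow chunks the full record set and posts each chunk in one request, roughly like this (a simplified sketch; the endpoint, headers, and parameters below are placeholders, not the real values):

import requests

url = "https://example.com/bulk/update"          # placeholder endpoint
HEADERS = {"Content-Type": "application/json"}   # real auth headers omitted
parameters = {}                                  # query parameters, if any
BATCH_SIZE = 50

def update_in_batches(records):
    # Post the records to the bulk endpoint in fixed-size chunks
    for start in range(0, len(records), BATCH_SIZE):
        this_batch = records[start:start + BATCH_SIZE]
        r = requests.post(url, json={"records": this_batch}, headers=HEADERS, params=parameters)
        r.raise_for_status()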

Best answer by mkujaggi

I found the issue: there are records with duplicate IDs.

Thank you, Jason

3 replies

jason_metzler
  • Helper ⭐️⭐️
  • April 17, 2025

Can you share your query? I’m thinking the lookups are matching on more than just the 50 that you expect to update.

 


mkujaggi
  • Author
  • Helper ⭐️
  • April 23, 2025

It is a PUT request where the batch contains 50 records.

I have also tried reducing the batch size to 49, but then it throws an exception saying it can't impact 52 records. I am not sure how reducing the batch size results in more records being impacted than before.
 

# Post one batch of 50 records to the bulk-update endpoint
r = requests.post(url, json={"records": this_batch}, headers=HEADERS, params=parameters)

 


mkujaggi
  • Author
  • Helper ⭐️
  • Answer
  • April 23, 2025

I found the issue: there are records with duplicate IDs. Because the ID used for matching is not unique, the criteria for a batch can match more rows than the number of records sent, which trips the transactional limit.

Thank you, Jason
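If the duplicates are in the data being sent (rather than in the target object), a client-side check before posting each batch would have caught this earlier. A minimal sketch, assuming each record is a dict whose match key is an "id" field (both the structure and the field name are assumptions):

from collections import Counter

def find_duplicate_ids(batch, key="id"):
    # Return the key values that appear more than once in the batch
    counts = Counter(record[key] for record in batch)
    return [value for value, n in counts.items() if n > 1]

# Example: the second and third records share an id
batch = [{"id": 1, "x": "a"}, {"id": 2, "x": "b"}, {"id": 2, "x": "c"}]
print(find_duplicate_ids(batch))   # [2]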