Hello,

In share.properties, let's say MaxRuntimeTriggers = 100. I have a "Data Maintenance" batch job running Object Script on every record of "object A". The batch job creates "object B" records, and each creation fires about 20 triggers. So if the batch job has to create more than 5 records, MaxRuntimeTriggers is reached and the 6th record is not created; I can see "too many triggers ..." in the logs.

If you confirm it works this way, I think it would be more logical for MaxRuntimeTriggers to be counted per "object A" record rather than for the whole batch job.

I hope this is clear enough. Thank you very much for your help.

Matthieu
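For illustration, the counting behavior described above can be sketched as follows. This is a minimal model, not the product's actual implementation: the constants come from the report, and the single shared per-job counter is the assumption being questioned.

```python
# Hypothetical model of the reported behavior: one trigger budget
# (MaxRuntimeTriggers) shared across the whole batch job, rather than
# a separate budget per processed "object A" record.

MAX_RUNTIME_TRIGGERS = 100   # assumed value from share.properties
TRIGGERS_PER_CREATE = 20     # ~20 triggers fire per "object B" creation

def records_created_before_limit(records_to_create: int) -> int:
    """Count how many records fit in a single shared trigger budget."""
    fired = 0
    created = 0
    for _ in range(records_to_create):
        if fired + TRIGGERS_PER_CREATE > MAX_RUNTIME_TRIGGERS:
            break  # this is where "too many triggers ..." would be logged
        fired += TRIGGERS_PER_CREATE
        created += 1
    return created

# With a shared budget, only 5 of 6 records are created: 5 * 20 = 100.
```

Under a per-record budget, each "object A" record would reset `fired` to 0, and all 6 creations would succeed.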
Please try again with the 3.9.5 release.
Pavel,
It does not work any better with the 3.9.5 release; we get the same issue.
Can you take a look at this?
Thank you for your help.
Matthieu
I just ran a simple test: a Data Maintenance job updating 233 records, with each update invoking 1 trigger. Everything works fine.
Pavel,
Just to be sure: did you let the batch job run at its scheduled time, or did you click the batch job's "Run now" link?
If I click the "Run now" link, the batch job works fine, but if I let it run at its scheduled time, it does not.
If you did let it run at its scheduled time and it worked fine for you, I will investigate further to find out exactly what the problem is.
Thank you !
Matthieu
I double-checked - a Batch Job does not have a limit on the total number of triggers across all records. However, please keep in mind:
- The Query API role must have permission to use the API in your Batch Job
- The number of triggers per record is lower than in the UI case: 20 by default, and it can be changed for Private Cloud customers