I am trying to figure out how this works. Here is the scenario/test I have set up.
I have one job that has the following in its post process:
The variable object looks like this:
The runtime looks like this:
The variable declaration at job level looks like this:
There is a workflow with 2 instances of the job in it:
The other job has RUN_WHAT set to SLOW.
Its runtime tab is set like this:
I have the following row in UC_CLIENT_SETTINGS:
I have executed the workflow more than 30 times to accumulate enough runtime statistics.
What I was expecting to happen: if I amend the FAST value in the variable object from 1 to 30, the runtime deviation would trigger and activate the job JOBS.UNIX.TEST_AI_OVERRUN.
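To make my expectation concrete, here is a minimal sketch of the deviation check I assumed was happening. This is an illustration of the general idea (actual runtime compared against the estimated runtime plus a configured deviation percentage), not Automic's documented algorithm, and the function names are my own:

```python
# Hypothetical sketch of a runtime-deviation check. Assumes the check fires
# when the actual runtime exceeds the ERT by more than a configured
# percentage. Names are illustrative, not Automic's API.

def deviation_triggered(actual_secs: float, ert_secs: float, deviation_pct: float) -> bool:
    """Return True when actual runtime exceeds ERT by more than deviation_pct."""
    threshold = ert_secs * (1 + deviation_pct / 100.0)
    return actual_secs > threshold

# If the variable change lifts the job's runtime from ~1s to ~30s while the
# ERT is still based on the old 1-second runs, the check should fire:
print(deviation_triggered(actual_secs=30, ert_secs=1, deviation_pct=25))  # True
```

Under that assumption, a jump from 1 second to 30 seconds should comfortably exceed any reasonable deviation threshold, which is why I expected JOBS.UNIX.TEST_AI_OVERRUN to be activated.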
Is this the expected behavior? Are there any settings/options I am missing?
Beyond the variables used by the job's execution, is it documented anywhere what else the dynamic adaptive option takes into account? For example, if a job normally takes seconds to run but takes an hour on the last Friday of every month, is the calculation clever enough to identify that pattern?
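My working assumption is that a dynamic ERT is derived statistically from recent runtimes (for example, a weighted average of the last N runs), in which case a calendar-driven monthly spike would just look like an outlier rather than a recognizable pattern. Here is a sketch under that assumption; the weighting scheme is my own illustration, not Automic's documented method:

```python
# Illustrative adaptive ERT: a weighted average of the most recent runs,
# with newer runs weighted more heavily. This is an assumption about the
# general approach, not Automic's actual algorithm.

def adaptive_ert(runtimes: list[float], window: int = 20) -> float:
    """Estimate the next runtime from the last `window` runs (newest weighted highest)."""
    recent = runtimes[-window:]
    weights = range(1, len(recent) + 1)  # 1 .. n, newest run gets weight n
    return sum(r * w for r, w in zip(recent, weights)) / sum(weights)

# A job that runs in ~2s daily but 3600s on the last Friday of the month:
history = [2.0] * 19 + [3600.0]
print(round(adaptive_ert(history), 1))  # 344.7 -- the spike skews the estimate
```

If this is roughly how it works, the estimate only reacts to the hour-long run after the fact (and gets skewed by it) rather than predicting it from the calendar, so I would like to know whether the real calculation is smarter than this.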
Also, is it logged anywhere which ERT calculation was used during the execution?