
Hi list.

I've built a workspace with a loop that searches for information whose exact end I don't know. So I loop with a margin: I save the ID of the last instance found and check whether I've had too many consecutive failures. If too many failures have accumulated, I redirect to a Terminator transformer.
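The loop described above can be sketched outside FME as well. This is a minimal, hypothetical illustration (the `fetch_record` callable and the margin value are assumptions, not anything from the workspace): probe successive IDs, remember the last one found, and stop once too many consecutive lookups fail.

```python
# Hypothetical sketch of the loop described above: probe IDs past the
# last known record, track consecutive misses, and stop gracefully once
# the failure margin is exceeded (rather than terminating hard).

MAX_CONSECUTIVE_FAILS = 10  # the "margin"; value is an assumption

def scan(fetch_record, start_id=1):
    """fetch_record(i) returns a record or None -- an assumed interface."""
    last_found_id = None
    consecutive_fails = 0
    record_id = start_id
    results = []
    while consecutive_fails < MAX_CONSECUTIVE_FAILS:
        record = fetch_record(record_id)
        if record is None:
            consecutive_fails += 1
        else:
            consecutive_fails = 0       # reset on every hit
            last_found_id = record_id   # remember the last found instance
            results.append(record)
        record_id += 1
    return results, last_found_id
```

The point of the `while` condition is that the loop ends by falling through normally, which is the "peaceful halt" the question is after.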

However, when I ran the workspace, expecting some output to a database, no output was produced.

It seems that the Terminator transformer rolls back previous database inserts. When I disabled the Terminator, everything worked as expected.

Is there a way to halt processing of a workspace peacefully, rather than the "pull-the-plug" kind of behaviour that Terminator seems to perform?

I.e., is there an option to make the Terminator transformer less "Terminator"-like ??

Cheers

How are you writing to the database? What is your features-per-transaction setting? Are you using bulk insert?

 

There's no "successful exit early" version of the terminator, but you can probably accomplish what you want by changing some of the writer parameters.


Further to what @jdh​ says, since the Terminator only rolls back the currently open transaction block, you can limit how much the Terminator rolls back by setting the writer transaction size to 1. That way you'll only lose the feature that triggered the Terminator, not the preceding ones.

On the other hand, you'll notice that performance goes down because of the overhead associated with opening and closing that many transactions. But it's a technique that can be useful for debugging purposes.
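To make the transaction-size-1 idea concrete, here's a small sketch using Python's sqlite3 as a stand-in for the database writer (the table and values are made up for illustration): each insert is committed immediately, so a later rollback can only discard whatever is still in the open transaction.

```python
# Sketch: why a transaction size of 1 limits what a rollback undoes.
# sqlite3 stands in for the real database writer here.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE features (id INTEGER)")

# Transaction size 1: commit after every single insert.
for i in range(5):
    conn.execute("INSERT INTO features VALUES (?)", (i,))
    conn.commit()

# A sixth insert is rolled back (the "Terminator fires" moment)...
conn.execute("INSERT INTO features VALUES (?)", (5,))
conn.rollback()

# ...but the five previously committed rows survive.
count = conn.execute("SELECT COUNT(*) FROM features").fetchone()[0]
print(count)  # 5
```

With a bulk/large transaction instead, all six inserts would sit in one uncommitted transaction and the rollback would lose everything, which matches the behaviour described in the original question.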


Could you do something like this? The Terminator will only fire once the FeatureWriter has written to the database. It will still fire even if the process doesn't produce any features to write.

 

[Screenshot 2020-10-27 094711]



Thanks David.

I hadn't thought about it being a bulk insert, so yes, the transaction was never committed before the Terminator wreaked havoc. Turning bulk insert off did the trick.

