Solved

Terminator performs database rollback?


lifalin2016
Contributor

Hi list.

I've built a workspace with a loop that searches for information whose exact end I don't know in advance. So I loop with a margin, save the ID of the last instance found, and check whether there have been too many consecutive failures. If there have, I redirect to a Terminator transformer.

However, when I ran the workspace, expecting some output to a database, no output was produced.

It seems that the Terminator transformer rolls back previous database inserts. When I disabled the Terminator, all worked as expected.

Is there a way to halt the processing of a workspace peacefully, rather than the "pull-the-plug" kind of behaviour that the Terminator seems to perform?

I.e., is there an option to make the Terminator transformer less "Terminator"-like ??

Cheers


4 replies

jdh
Contributor
  • October 23, 2020

How are you writing to the database? What is your features per transaction set to? Are you using bulk insert?

 

There's no "successful exit early" version of the Terminator, but you can probably accomplish what you want by changing some of the writer parameters.


david_r
Celebrity
  • Best Answer
  • October 26, 2020

Further to what @jdh​ says, since the Terminator only rolls back the currently open transaction block, it is possible to limit how much the Terminator rolls back by setting the writer transaction size to 1. That way you'll only lose the feature that triggered the Terminator, not the preceding ones.

On the other hand, you'll notice that performance goes down because of the overhead associated with opening and closing that many transactions. But it's a technique that can be useful for debugging purposes.
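This isn't FME, but the transaction-block behavior david_r describes can be sketched with Python's stdlib `sqlite3`. Here "transaction size" means rows per commit; the function name and row counts are illustrative, not anything from FME:

```python
# Sketch (not FME): how transaction size affects what a mid-run abort loses.
import sqlite3

def write_with_abort(transaction_size, abort_at):
    """Insert rows, committing every `transaction_size` rows, then abort
    (rollback) at row `abort_at`. Returns the number of surviving rows."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE features (id INTEGER)")
    try:
        for i in range(10):
            if i == abort_at:
                raise RuntimeError("Terminator fired")
            conn.execute("INSERT INTO features VALUES (?)", (i,))
            if (i + 1) % transaction_size == 0:
                conn.commit()          # close the current transaction block
    except RuntimeError:
        conn.rollback()                # only the open block is lost
    return conn.execute("SELECT COUNT(*) FROM features").fetchone()[0]

# One big transaction: nothing was committed, so everything rolls back.
print(write_with_abort(transaction_size=1000, abort_at=5))  # 0
# Transaction size 1: each row commits immediately, so only the open one is lost.
print(write_with_abort(transaction_size=1, abort_at=5))     # 5
```

The second call also shows the performance trade-off david_r mentions: committing per row means one commit round-trip per feature.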


hkingsbury
Celebrity
  • October 26, 2020

Could you do something like this? The Terminator will only fire once the FeatureWriter has written to the database. It will still fire even if the process doesn't produce any features to write.

 

[Screenshot 2020-10-27 094711]
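The ordering hkingsbury suggests can be sketched in the same not-FME `sqlite3` terms (the function name is illustrative): once the write has committed, a termination afterwards has nothing left to roll back.

```python
# Sketch (not FME): write first, then terminate -- commits survive the halt.
import sqlite3

def write_then_halt(rows, too_many_fails):
    """Commit all rows, then raise if the fail condition is set.
    Returns the number of rows that survive in the database."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE features (id INTEGER)")
    for i in range(rows):
        conn.execute("INSERT INTO features VALUES (?)", (i,))
    conn.commit()  # the "FeatureWriter" finishes before any termination
    try:
        if too_many_fails:
            raise RuntimeError("halt")  # the "Terminator" fires after the commit
    except RuntimeError:
        conn.rollback()  # nothing uncommitted is left to undo
    return conn.execute("SELECT COUNT(*) FROM features").fetchone()[0]

print(write_then_halt(5, too_many_fails=True))  # 5: committed rows are kept
```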


lifalin2016
Contributor
  • Author
  • October 27, 2020
david_r wrote:

Further to what @jdh​ says, since the Terminator only rolls back the currently open transaction block, it is possible to limit how much the Terminator rolls back by setting the writer transaction size to 1. That way you'll only lose the feature that triggered the Terminator, not the preceding ones.

On the other hand, you'll notice that performance goes down because of the overhead associated with opening and closing that many transactions. But it's a technique that can be useful for debugging purposes.

Thanks David.

I hadn't thought about the writer using bulk inserts, so yes, the transaction was never committed before the Terminator wreaked havoc. Turning bulk insert off did the trick.

