Hopefully I have an interesting Friday question that I would be very grateful for your help with.

We are using a FeatureReader to read all the features/tables in a PostGIS database within a particular schema. There are lots of tables/features, and when they are written to the File Geodatabase Open API writer we get locking issues after approximately 400 tables (sometimes more, sometimes less). We can write out to SpatiaLite etc. with no problem, so the issue doesn't lie with the data.

What we are thinking of doing is to put a condition between the two, perhaps a counter/sampler, so that after, say, 100 tables are transferred we add a Decelerator, letting the writer get its composure back before it kicks off again.

Is this possible? Has anyone done something similar? I guess it would also be useful if, for example, you needed to report progress back to an application: you could report back a counter of some kind showing how many features are complete.

Thanks for the help as always,

Oliver

You can read the tables the database has with a SQLExecutor accessing the metadata.

Count those and create an attribute.

Then you only need to test that attribute and have a Decelerator add some waiting time.
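
For reference, here is a minimal sketch of that metadata lookup in plain Python (assuming the psycopg2 driver and placeholder connection details and schema name); the SELECT statement is the query you would put in the SQLExecutor:

```python
import psycopg2  # assumes the psycopg2 PostgreSQL driver is installed

# Placeholder connection details -- replace with your own.
conn = psycopg2.connect(host="localhost", dbname="gis", user="fme", password="secret")
cur = conn.cursor()

# List every base table in the target schema from the PostgreSQL metadata.
cur.execute(
    """
    SELECT table_name
    FROM information_schema.tables
    WHERE table_schema = %s
      AND table_type = 'BASE TABLE'
    ORDER BY table_name
    """,
    ("my_schema",),  # placeholder schema name
)
tables = [row[0] for row in cur.fetchall()]
print(len(tables), "tables found")

cur.close()
conn.close()
```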


Thank you, I will give that a go. At the moment I don't understand how, after the Tester, to use the Decelerator and stop FME writing through both Tester ports in parallel. I need it to run 99 features through port 1 first, then 1 through port 2 with the Decelerator attached, then 99 through port 1 again, and so on.
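
If routing through two Tester ports proves awkward, one alternative (only a sketch, not the setup described above) is a single PythonCaller that counts features as they pass and pauses after every batch, so nothing has to run in parallel; the batch size and pause length below are placeholders:

```python
import time

class PauseEveryN(object):
    """PythonCaller class: pass every feature through, but sleep after each batch."""

    def __init__(self):
        self.count = 0
        self.batch_size = 100    # placeholder: pause after every 100 features
        self.pause_seconds = 30  # placeholder: time to let the writer settle

    def input(self, feature):
        self.count += 1
        # pyoutput is supplied by the PythonCaller at runtime.
        self.pyoutput(feature)
        if self.count % self.batch_size == 0:
            time.sleep(self.pause_seconds)

    def close(self):
        pass
```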