Hi,

My problem is about a process that updates a PostGIS table from another geographic database. Sometimes the import stops working near the very end, as you can see:

POSTGIS writer: 6589 of 6765 features written

POSTGIS writer: 6601 of 6765 features written

POSTGIS writer: 6606 of 6765 features written

POSTGIS writer: 6628 of 6765 features written

POSTGIS writer: 6649 of 6765 features written

and then no more log output for hours.

The thing is, it doesn't seem to come from the data, because when I run the process in small batches it sometimes works; the slowness seems to be unpredictable, but it always occurs at the end of the process. And yes, the computer is still working and doesn't seem to be saturated: CPU 8%, memory 21%.

 

Does anyone have an idea of what I could try to understand what is happening?

Thanks

Hi @jeanu,

It is quite difficult to find out what is going wrong with the information you provide; it could be on the FME side or the database side.

With FME you can try some different settings for the Bulk Insert and Features Per Transaction writer parameters and see if that helps.
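
On the database side, when the log goes quiet like that it is worth checking whether the writer's session is simply waiting on a lock: another client with an open transaction, or an autovacuum on the same table, can block it silently without any error. A rough sketch of that check in Python with psycopg2 (the connection details are placeholders, and pg_blocking_pids() needs PostgreSQL 9.6 or newer):

```python
# Minimal sketch: list non-idle sessions and what is blocking them.
# The connection details are placeholders -- replace with your own.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="gisdb",
                        user="postgres", password="secret")
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT pid,
               state,
               wait_event_type,
               wait_event,
               pg_blocking_pids(pid) AS blocked_by,  -- PostgreSQL 9.6+
               now() - xact_start    AS xact_age,
               left(query, 80)       AS query
        FROM pg_stat_activity
        WHERE state <> 'idle'
        ORDER BY xact_start;
    """)
    for row in cur.fetchall():
        print(row)
```

If the FME connection shows up with a Lock wait event and a non-empty blocked_by list, the writer is not slow, it is waiting for another transaction to release the table.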


I've tried with Bulk Insert set to Yes and to No, but my process actually has 2 branches: one that has to update records and another that has to insert. I use 2 AttributeCreators to set fme_db_operation to the corresponding operation.

So when I set Bulk Insert to Yes, it automatically changes to No before running the script, exactly as explained here: https://knowledge.safe.com/questions/79015/sql-server-writer-switch-between-bulk-insert-yes-f.html
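
If the PostGIS writer behaves like most bulk loaders, that switch makes sense: the fast path in PostgreSQL is COPY, which can only append new rows, so as soon as one branch has to issue UPDATEs the writer presumably falls back to statement-by-statement mode. A rough illustration of the two paths with psycopg2 (the table roads and its columns gid/name are hypothetical):

```python
# Rough illustration, assuming a hypothetical table
# roads(gid integer primary key, name text).
import io
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="gisdb",
                        user="postgres", password="secret")
with conn, conn.cursor() as cur:
    # Bulk path: COPY streams many new rows in one round trip,
    # but it can only insert.
    buf = io.StringIO("1\tMain St\n2\tHigh St\n")
    cur.copy_expert("COPY roads (gid, name) FROM STDIN", buf)

    # Update path: each changed feature becomes its own statement,
    # which is much slower for large feature counts.
    cur.execute("UPDATE roads SET name = %s WHERE gid = %s",
                ("Main Street", 1))
```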

I've tried with different Features Per Transaction values, but it always stops at the same point with no error message.

Last point: for a while the bottom bar keeps increasing the current item count (in the red circle); in this example it went up to 4429, and then the information simply disappears.


If it is the same feature, then you can start the translation from that feature using the Features to Read parameters:

This way you are efficiently reading the data and can easily debug the feature.
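
Before re-running with Features to Read, it can also help to check how far the failed run actually got on the database side, so you know roughly which feature to start from. A quick sketch, assuming the destination table is called roads and has a numeric key gid (both names are placeholders):

```python
# Count what the previous run managed to commit.
# Table and column names are placeholders for your own schema.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="gisdb",
                        user="postgres", password="secret")
with conn, conn.cursor() as cur:
    cur.execute("SELECT count(*), max(gid) FROM roads;")
    written, last_gid = cur.fetchone()
    print(f"{written} features in the table, highest gid: {last_gid}")
```

Comparing that count with your Features Per Transaction value also tells you whether the last batch was ever committed.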

