For a client we are working on a large FME workspace that has to insert data into multiple Oracle tables that are related through several foreign keys.

In earlier versions we got a lot of 'parent key not found' errors while running, so we moved the writing of the parent tables into separate writers so they would be executed first, which solved that problem.

The problem now is that the system also uses the Oracle Rule Framework to guard several business rules in the database. The Rule Framework tests these rules on every commit to check whether they are all met; if not, it rolls everything back.

This leaves us with a problem: our separate writers each commit when they finish, at which point not all rules are met yet, so everything gets rolled back. To comply with the rules, all feature types should be written by a single writer and committed only once. But then we run into the original foreign key problem again.

Removing the foreign keys, even temporarily, is not an option for our customer.

Does anyone have a solution for this problem?

Regards,

Koen

Hi Koen,

Have you tried using the FME 2016 FeatureWriter instead of the Oracle writer?

You might have more control over the order in which the tables are written.

Erik Jan


You could try setting the transaction size to 0 in the FME writers, so that all the features are written in one single transaction. If you have a very large number of objects to write, however, you will probably see performance take a dive as you fill up the database's transaction space.

An alternative would be to assemble the necessary SQL statements for one single logical object and insert everything into the different tables using a single SQLExecutor (hint: you can insert geometries using WKT/WKB).
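For illustration, a rough sketch of what such a SQLExecutor statement could look like, assuming hypothetical PARCEL (parent) and PARCEL_OWNER (child) tables and a WKT attribute on the feature. Attribute values are injected with FME's @Value() substitution, and running several statements in one SQLExecutor may require FME's multi-statement delimiter directive:

```sql
-- Hypothetical tables and attributes; adjust columns, data types and SRID handling to your model.
-- Insert the parent row first, then the child row, within the same transaction.
INSERT INTO parcel (parcel_id, name, geom)
VALUES (@Value(parcel_id),
        '@Value(name)',
        SDO_UTIL.FROM_WKTGEOMETRY('@Value(geom_wkt)'));  -- geometry supplied as WKT

INSERT INTO parcel_owner (owner_id, parcel_id, owner_name)
VALUES (@Value(owner_id), @Value(parcel_id), '@Value(owner_name)');
```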

My best-practice solution would be to ask for a staging schema in the database that is identical to the production schema, but without any constraints. Then you load your data from FME into the staging schema, validate it and then transfer the entire staging contents over to the production schema using SQL.
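As a minimal sketch of that transfer step (schema and table names are hypothetical): copy the parent tables before the child tables so the foreign keys are satisfied, and commit only once so any rule checking on commit sees the complete data set.

```sql
-- Hypothetical STAGING and PROD schemas with a parent PARCEL and a child PARCEL_OWNER table.
INSERT INTO prod.parcel (parcel_id, name, geom)
  SELECT parcel_id, name, geom FROM staging.parcel;

INSERT INTO prod.parcel_owner (owner_id, parcel_id, owner_name)
  SELECT owner_id, parcel_id, owner_name FROM staging.parcel_owner;

COMMIT;  -- a single commit covering all tables
```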


The only way that springs to mind is to go back to a single writer and control the order in which features arrive at the feature types. You would need an ID number for each table, sort the data by that ID, and then filter it with an AttributeFilter. I can't guarantee that will work, but it's what I would try first.

Failing that, please do contact our support team for more assistance (safe.com/support). You could also quote PR#23356 and PR#60010, which are enhancement requests to allow ordering of writer tables.


Thanks for all the quick responses.

@erik_jan: unfortunately we're bound to FME 2015, so we can't use the FeatureWriter.

@david_r:

We also thought of the staging schema solution, but our customer didn't like the idea of yet another schema, so we marked it as a last resort.

The SQL option could also work, but it sounds like a lot of work and not really the FME way to go. I'll give it a try if the other options fail.

Setting the transaction size to 0 is probably necessary, and I'm afraid we'll just have to accept the reduced performance. But I still need a way to control the order in which the features are written.

@Mark2AtSafe: I like your idea of ordering the features. I'll give that a try.

I'll start experimenting with your solutions and let you know my results.

Regards, Koen


I tried sorting the features with an ID and a Sorter and AttributeFilter transformer, and I am now able to write all features with just one writer.

One strange thing though: this only works when I sort the features in descending order, i.e. when I write the parent records last. That seems very counter-intuitive.

Any idea why this is?



I'm glad you got it working. I have no idea why it only works in descending order. The best way to be sure what is happening is to add a breakpoint on the connection into the AttributeFilter. Then - once the workspace reaches that point and is in inspection mode - click the button to advance one connection at a time. That way you'll see which feature goes out of which connection first. See if the parent features really are last. The only thing I can think of is that you got the id the wrong way around, but I'm sure you would have thought of that, so I really don't know.


We found some other problems with integrating FME and the Oracle Rule Framework into one process.

The Oracle Rule Framework needs all its edits in one transaction, and this transaction needs to be initiated, committed and ended outside of FME. After consulting Safe support, we came to the conclusion that we have too little influence on FME's transaction handling to make this happen.

Our current approach is to create a temporary staging schema (a clone of the production schema) for FME to write to, and to copy the records from this schema to the production schema afterwards, like the solution @david_r suggested.
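A rough sketch of how such an external copy step could be wrapped (schema and table names are hypothetical, reusing the examples above): everything runs in one transaction that is started and committed outside FME, so the Rule Framework evaluates the rules once, on the final commit.

```sql
-- Hypothetical wrapper, run outside FME (e.g. from SQL*Plus or a scheduled job).
BEGIN
  INSERT INTO prod.parcel (parcel_id, name, geom)
    SELECT parcel_id, name, geom FROM staging.parcel;

  INSERT INTO prod.parcel_owner (owner_id, parcel_id, owner_name)
    SELECT owner_id, parcel_id, owner_name FROM staging.parcel_owner;

  COMMIT;  -- the business rules are checked here, on the complete data set
EXCEPTION
  WHEN OTHERS THEN
    ROLLBACK;  -- undo the partial copy if anything fails
    RAISE;
END;
/
```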



I agree, using a staging schema is very common for these scenarios.


We had a similar issue not too long ago and solved it by writing text files with FME and using Oracle's SQL*Loader to load them into the database.

It performed very well, since the text files were stored locally, and it gave us more control over transactions and insert order.
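For reference, a minimal sketch of what such a SQL*Loader setup might look like; the file, table and column names are hypothetical, and FME would produce the delimited text file (e.g. with a CSV writer):

```
-- parcels.ctl: hypothetical SQL*Loader control file for a delimited file written by FME
LOAD DATA
INFILE 'parcels.csv'
APPEND
INTO TABLE parcel
FIELDS TERMINATED BY ';' OPTIONALLY ENCLOSED BY '"'
(parcel_id, name)
```

It would then be loaded with a command along the lines of sqlldr userid=user/password@db control=parcels.ctl log=parcels.log.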



What did you use to write the text files?

It seems like a lot of work to construct the correct SQL for all the feature types.

