Question

How to rollback the changes made by SQLExecutor transformer if any of the Initiator fails?


sui_huang

Hi! I am writing an FME script to insert records into a target table. Before inserting a record, I want to delete any record with the same ID from the database, if it exists. I made an FME workspace (see screenshot) that runs a DELETE T-SQL command for each feature, such that

  • all deletions are performed before any insertion;
  • the translation terminates if any deletion fails.
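The statement the SQLExecutor runs for each feature is along these lines (the table and column names here are placeholders, and @Value(id) is FME's attribute substitution):

```sql
-- Executed once per incoming feature by the SQLExecutor.
-- dbo.TargetTable and the id column are placeholder names;
-- @Value(id) is replaced by FME with the feature's id attribute.
DELETE FROM dbo.TargetTable
WHERE id = @Value(id);
```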

However, I considered the following hypothetical scenarios (in practice I have never seen any of them happen, but they are still possible).

  • The 100th feature fails in the SQLExecutor, so the translation ends with 99 records deleted from the database.
  • All features pass the SQLExecutor, but the first feature fails in the writer, so the translation ends with all deletions done but no insertions.

Considering the importance of data integrity, is it possible for me to change the workspace such that:

  • If any feature fails in the SQLExecutor transformer, then roll back all the changes made by that SQLExecutor?
  • If any feature fails in the writer, then roll back all the deletions that have been done?
  • If any feature fails in the writer, then roll back all the features inserted earlier in the same translation?

Achieving any of these would help. Thank you!

More details: I am working with SQL Server and a non-versioned SDE database.

12 replies

fmelizard
  • Contributor
  • October 13, 2017
Hi @sui_huang

Thanks for asking your question. Are you aware that the SDE Geodatabase writer has an option to update features based on an ID? This should remove the need for the SQLExecutor. Would this help?

sui_huang
  • Author
  • October 13, 2017
fmelizard wrote:
Hi @sui_huang

Thanks for asking your question. Are you aware that the SDE Geodatabase writer has an option to update features based on an ID? This should remove the need for the SQLExecutor. Would this help?

Hi @MattAtSafe, thank you for the suggestion. If I set the writer's "Writer Mode" to "Update" instead of "Insert", I think it will work together with the specified key field. It eliminates the aforementioned SQLExecutor problem. It helps a lot.

Now the remaining question is: if the 100th feature into the writer fails, how can the 99 inserted/updated features be rolled back?

stalknecht
  • Contributor
  • October 14, 2017

I recommend that you write your data into a temporary staging schema first, validate it for consistency, and then transfer it into your production schema. For the final part you can use a SQL script that starts a transaction and rolls back if anything fails.


sui_huang
  • Author
  • October 14, 2017
stalknecht wrote:

I recommend that you write your data into a temporary staging schema first, validate it for consistency, and then transfer it into your production schema. For the final part you can use a SQL script that starts a transaction and rolls back if anything fails.

Hi @stalknecht,

Staging and data validation before inserting can make the workspace robust and avoid most insertion problems, but it increases the workspace complexity and gives me more chances to make a mistake. I am just looking for a safeguard that is as simple as possible.

For the final part, how can I use a SQL script to start a transaction and roll it back?

stalknecht
  • Contributor
  • October 14, 2017
sui_huang wrote:
Hi @stalknecht,

Staging and data validation before inserting can make the workspace robust and avoid most insertion problems, but it increases the workspace complexity and gives me more chances to make a mistake. I am just looking for a safeguard that is as simple as possible.

For the final part, how can I use a SQL script to start a transaction and roll it back?
Using transactions:

 

https://docs.microsoft.com/en-us/sql/t-sql/language-elements/begin-transaction-transact-sql
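For example, a minimal sketch of such a script, assuming a hypothetical staging-to-production transfer (the table and column names are placeholders):

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    -- Remove the rows that are about to be replaced...
    DELETE FROM dbo.Production
    WHERE id IN (SELECT id FROM dbo.Staging);

    -- ...then load the validated staging rows.
    INSERT INTO dbo.Production (id, name)
    SELECT id, name FROM dbo.Staging;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Any error undoes both statements above.
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;
    THROW;  -- re-raise so the failure is visible to the caller
END CATCH;
```

If either statement fails, neither change is committed, which gives the all-or-nothing behaviour you are after.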

 

 


fmelizard
  • Contributor
  • October 16, 2017

Hi @sui_huang, building off of what @MattAtSafe commented, you can set the Features per Transaction (found in the Navigator under Advanced) to be higher than your expected number of features. FME won't commit the changes to the database until that number has been reached. (If you set it to 100 and the 99th feature fails, the previous 98 features will not be committed.)


sui_huang
  • Author
  • October 16, 2017
fmelizard wrote:
Hi @sui_huang

Thanks for asking your question. Are you aware that the SDE Geodatabase writer has an option to update features based on an ID? This should remove the need for the SQLExecutor. Would this help?

Hi @MattAtSafe, I tried the idea with FME 2016.1 by setting the Writer Mode to "Update" and specifying the ID column name. However, I found that it can only update existing records with a matching ID value; it cannot add a new record when there is no existing match.

I also tried Writer Mode = "Insert", but then the writer ignores the ID column name, as documented (https://docs.safe.com/fme/html/FME_Desktop_Documentation/FME_ReadersWriters/geodatabase/Configuration_Parameters.htm, GEODB_UPDATE_KEY_COLUMNS parameter).

My features to write include both updates and new records, so the "row selection" setting may not be a solution for my situation.
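For comparison, the update-or-insert behaviour I need corresponds on the SQL side to a single T-SQL MERGE statement, which is atomic per statement (a sketch with placeholder table and column names, not an FME writer setting):

```sql
MERGE dbo.TargetTable AS t
USING dbo.IncomingRows AS s
    ON t.id = s.id
WHEN MATCHED THEN
    UPDATE SET t.name = s.name
WHEN NOT MATCHED THEN
    INSERT (id, name) VALUES (s.id, s.name);
```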

 

Did I miss anything?

 

Thank you!

 


sui_huang
  • Author
  • October 16, 2017
fmelizard wrote:

Hi @sui_huang, building off of what @MattAtSafe commented, you can set the Features per Transaction (found in the Navigator under Advanced) to be higher than your expected number of features. FME won't commit the changes to the database until that number has been reached. (If you set it to 100 and the 99th feature fails, the previous 98 features will not be committed.)

@TiaAtSafe, even though I found that "row selection" does not fit my situation (I am trying to do record updates and creation in the same writer), changing Features per Transaction is a good idea.

 


  • October 16, 2017
fmelizard wrote:

Hi @sui_huang, building off of what @MattAtSafe commented, you can set the Features per Transaction (found in the Navigator under Advanced) to be higher than your expected number of features. FME won't commit the changes to the database until that number has been reached. (If you set it to 100 and the 99th feature fails, the previous 98 features will not be committed.)

I had a colleague ask about a similar issue last week. Although this is not always a safe option, I suggested using a workspace to count the number of updates, then passing that count into a child workspace as a parameter, using count + 1 to define Features per Transaction. This only really works if the number of features is small and the load process is simple, with little processing that changes the number of features between reading and writing.

 

 

My preferred solution, similar to @stalknecht's, was to use a staging table.

 


sui_huang
  • Author
  • October 17, 2017

I think the answers from @stalknecht, @MattAtSafe, and @TiaAtSafe each help in some aspect. Accepting all answers.


sui_huang
  • Author
  • October 17, 2017
fmelizard wrote:
Hi @sui_huang

Thanks for asking your question. Are you aware that the SDE Geodatabase writer has an option to update features based on an ID? This should remove the need for the SQLExecutor. Would this help?

I was looking for the Accept button to also accept this answer, but couldn't find it. Where can I find it?

 

 



