Solved

Ignore primary key insert errors

  • August 16, 2017
  • 8 replies
  • 430 views

richardsr
Contributor

Hello,

I am curious if there is a setting or ability to ignore "insert errors" related to primary key constraints in a SQL database. The workspace writer is doing its job by not allowing duplicate keys to be created, but I would like to override/ignore these errors to allow the workspace to continue and finish successfully.

I know I could use an UpdateDetector, or a reader on the destination, to prevent these inserts from hitting the writer, but reading all existing records to perform that comparison would really slow down the process.

I was hoping to set a SQL writer or workspace property to allow the workspace to continue when a duplicate key is encountered.

Any tips or techniques for this situation without actually reading the destination and comparing the input records?

Thanks

 

Ronnie

Best answer by courtney_m

Unfortunately, there is no quick way to tell FME to continue if a record fails to write to the database. You would have to previously determine which records are duplicates (using the UpdateDetector or Read from the database and use a duplicate remover based on the primary key).


8 replies

courtney_m
Contributor
  • Contributor
  • 127 replies
  • August 16, 2017

Have you tried adding a SQL statement on the writer under the "SQL To Run Before Write" parameter?

You can use a statement such as:

ALTER TABLE <table_name>
NOCHECK CONSTRAINT <PK_Constraint_Name>;


richardsr
Contributor
  • Author
  • Contributor
  • 7 replies
  • August 16, 2017

Have you tried adding a SQL statement on the writer under the "SQL To Run Before Write" parameter?

You can use a statement such as:

ALTER TABLE <table_name>
NOCHECK CONSTRAINT <PK_Constraint_Name>;

 

Interesting... that is a neat trick. I have not used this before, but to clarify: I do not want to disable any constraints; I just want the writer and/or workspace to continue when primary key errors are encountered.

courtney_m
Contributor
  • Contributor
  • 127 replies
  • August 16, 2017

Have you tried adding a SQL statement on the writer under the "SQL To Run Before Write" parameter?

You can use a statement such as:

ALTER TABLE <table_name>
NOCHECK CONSTRAINT <PK_Constraint_Name>;

If you are writing to a SQL Server database, you may need to try the following syntax:

 

ALTER INDEX constraint_name ON table_name DISABLE;

In that case, you may want to add a SQL statement for the "SQL To Run After Write" to enable the constraints again:

 

ALTER INDEX constraint_name ON table_name REBUILD;

courtney_m
Contributor
  • Contributor
  • 127 replies
  • Best Answer
  • August 16, 2017

Unfortunately, there is no quick way to tell FME to continue if a record fails to write to the database. You would have to previously determine which records are duplicates (using the UpdateDetector or Read from the database and use a duplicate remover based on the primary key).


courtney_m
Contributor
  • Contributor
  • 127 replies
  • August 16, 2017

 

Interesting... that is a neat trick. I have not used this before, but to clarify: I do not want to disable any constraints; I just want the writer and/or workspace to continue when primary key errors are encountered.

Oh, sorry for the misunderstanding. Hopefully no DBAs read this answer ;-)

 

 


erik_jan
Contributor
  • Contributor
  • 2179 replies
  • August 16, 2017

Unfortunately, there is no quick way to tell FME to continue if a record fails to write to the database. You would have to previously determine which records are duplicates (using the UpdateDetector or Read from the database and use a duplicate remover based on the primary key).

Or use the SQLExecutor with the SQL:

SELECT count(*) AS PK_EXISTS FROM table WHERE PK_ID = @Value(PK)

assuming PK_ID is the ID field in the table and PK is the name of the attribute.

 

Then use a Tester to test if PK_EXISTS = 0 and write only if the feature passes the test.
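This check-then-write pattern can also be sketched outside FME. Below is a minimal Python illustration using SQLite as a stand-in database (the thread concerns a SQL Server/Postgres destination, and the table and column names here are made up, but the SQLExecutor/Tester logic is the same):

```python
import sqlite3

# In-memory stand-in for the destination table; names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (pk_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'Microsoft')")

def insert_if_new(pk, name):
    # SQLExecutor step: SELECT count(*) AS PK_EXISTS ... WHERE PK_ID = @Value(PK)
    (pk_exists,) = conn.execute(
        "SELECT count(*) FROM customers WHERE pk_id = ?", (pk,)
    ).fetchone()
    # Tester step: only features with PK_EXISTS = 0 reach the writer.
    if pk_exists == 0:
        conn.execute("INSERT INTO customers VALUES (?, ?)", (pk, name))
        return True
    return False

insert_if_new(1, "Microsoft")  # duplicate key: skipped, no error raised
insert_if_new(2, "Apple")      # new key: written
```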

 


richardsr
Contributor
  • Author
  • Contributor
  • 7 replies
  • August 16, 2017
Or use the SQLExecutor with the SQL:

SELECT count(*) AS PK_EXISTS FROM table WHERE PK_ID = @Value(PK)

assuming PK_ID is the ID field in the table and PK is the name of the attribute.

 

Then use a Tester to test if PK_EXISTS = 0 and write only if the feature passes the test.

 

I like this! Good suggestion: it is dynamic, and while it might be a little slower in the pipeline, it will be faster over time than the UpdateDetector. My worry with the UpdateDetector is that over time this table will be huge, and the memory consumption in that transformer would be very large.

 

 


koenterralytics
Contributor
  • Contributor
  • 33 replies
  • November 1, 2019

I think I found another way to solve your issue which is fast and flexible. Instead of using the SQL writer, you can use the SQLExecutor transformer. When using this transformer you are much more flexible when inserting records.

For example (postgres syntax):

INSERT INTO customers (name, email)
VALUES ('Microsoft', 'hotline@microsoft.com')
ON CONFLICT ON CONSTRAINT customers_name_key
DO NOTHING;

When inserting many records this method can become slow. As a work-around I use an aggregator to insert many records at one time (bulk insert). 
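The batched variant of this idea can be sketched in Python with SQLite as a stand-in (SQLite spells the clause ON CONFLICT (column) DO NOTHING rather than the Postgres ON CONFLICT ON CONSTRAINT form; the table and rows here are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT PRIMARY KEY, email TEXT)")

rows = [
    ("Microsoft", "hotline@microsoft.com"),
    ("Microsoft", "hotline@microsoft.com"),  # duplicate key: silently skipped
    ("Apple", "support@apple.com"),
]
# One batched statement instead of one INSERT per feature; conflicting
# rows are dropped by the database rather than raising an error.
conn.executemany(
    "INSERT INTO customers (name, email) VALUES (?, ?) "
    "ON CONFLICT (name) DO NOTHING",
    rows,
)
```

The DO NOTHING clause is what lets the whole batch succeed even when some rows collide with existing keys, which is exactly the behavior the original question asked for.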
