I have a bunch of workbenches running and one feature class in particular is acting very peculiar.

This data is simply being moved from one GDB to another, it is new data overwriting old data.

Today (for example) it is just five features, the first feature in the list will not overwrite, but the rest do.

I imported the first feature on its own and it worked. I also tested selecting the tables in a different order, and that worked too.

I then deleted the reader and created a new one, and the run succeeded on that feature class. The second time I tested it, it didn't work.

This same thing happened on a different workbench with the same feature list, that first one did not work.

Any ideas why it works only sometimes? The only thing I notice is that the failing feature class is the first in the feature list I'm running, and its name is the only one that shows in the "Feature Class or Table Name" field in the transformer (see second image below), even though the transformer is running the entire list as a single merged feature.

Hopefully that makes sense?

 

Hi @kimburrows1242,

Thanks for your question! To get a better picture of what might be happening, I'd be curious to see the writer parameters. In particular, I'd want to look for the Feature Operation and Table Handling, as well as whether Dynamic Schema Definition is checked.

Try setting Feature Operation to Insert, and Table Handling to Drop and Create if you are writing to an existing geodatabase. I think this article might provide a bit more information on working with Dynamic workflows, and this documentation also covers a bit more. Give that a try and if none of these changes work for you, it'd be helpful if you could share your workspace so far and a sample of your problem dataset as I'm not able to reproduce the same behaviour. Hope that helps a bit!


If I used Drop and Create, would it still create if there were items that were not already there?


Drop and Create vs Truncate

Drop and Create

Remove the feature class and all its records, then create the feature class again and insert the new records.

Truncate

Keep the feature class, but remove all its records, then insert the new records.

Extra notes

If the feature class is used in a service, you need to stop the service before you can remove the feature class. In that situation you need to truncate instead.

If the feature class has any extras, such as indexes on specific attributes, you need to recreate those after Drop and Create has been used.

Truncate only works when the feature class already exists. This might be problematic in a dynamic workflow.

I think the indexes are not rebuilt when using Truncate, which means performance with Drop and Create should be better. But someone should confirm this, as I'm not sure.
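The difference between the two modes can be sketched in plain Python, using sqlite3 as a stand-in for a geodatabase table (the table and index names here are made up for illustration). Note how the attribute index survives a truncate but is lost after a drop and create:

```python
# Sketch of the two Table Handling modes, using sqlite3 as a stand-in
# for a geodatabase. Table/index names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Initial state: a "feature class" with an attribute index and some records.
cur.execute("CREATE TABLE roads (id INTEGER, name TEXT)")
cur.execute("CREATE INDEX idx_roads_name ON roads (name)")
cur.executemany("INSERT INTO roads VALUES (?, ?)",
                [(1, "Main St"), (2, "Oak Ave")])

def index_exists(cur, name):
    cur.execute("SELECT count(*) FROM sqlite_master "
                "WHERE type='index' AND name=?", (name,))
    return cur.fetchone()[0] == 1

# Truncate: keep the table, remove all records, insert the new ones.
cur.execute("DELETE FROM roads")
cur.execute("INSERT INTO roads VALUES (3, 'Elm St')")
assert index_exists(cur, "idx_roads_name")   # index survives a truncate

# Drop and Create: remove the table entirely, then recreate it and insert.
cur.execute("DROP TABLE roads")
cur.execute("CREATE TABLE roads (id INTEGER, name TEXT)")
cur.execute("INSERT INTO roads VALUES (4, 'Pine Rd')")
assert not index_exists(cur, "idx_roads_name")  # index must be recreated
```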

Also see the documentation. The people at Safe have written a lot about working with databases, for example:

https://knowledge.safe.com/articles/43885/data-loading-updating-and-deleting.html

https://www.safe.com/blog/2018/10/beginners-database-updates-evangelist180/


Thank you, that's what I thought. My issue now is that multiple feature types are coming in: some already exist in the target, some are new, and there are lots of different geometries too (I've used the Geom filter).

So does FME not have a way to write both existing and new features through a single writer?


You are probably looking for the ChangeDetector, but the question is still too broad to answer well. Different situations I can think of:

  • features for not existing featureclass
    • create featureclass
  • features for existing featureclass, same geometry
    • truncate existing
  • features for existing featureclass, different geometry
    • drop and create
    • or create featureclasses with the geometrytype in the name
  • and so on

Are both source and target schemas (feature classes, attributes, geometry types) always the same?

Do all feature classes in the source always have at least one record? (If no record is read from the source, no record is written to the target, so cleaning up a feature class that has no features is not easy.)
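The ChangeDetector's core idea can be sketched in a few lines of Python: compare incoming features against what already exists, keyed on a hypothetical "id" attribute, and route each feature into an inserted, updated, deleted, or unchanged bucket. The function and attribute names below are made up for illustration; the real transformer matches on whatever key attributes you configure.

```python
# Minimal sketch of ChangeDetector-style logic. Features are plain dicts
# and "id" is an assumed key attribute, not a real FME name.
def detect_changes(existing, incoming, key="id"):
    old = {f[key]: f for f in existing}
    new = {f[key]: f for f in incoming}
    return {
        # in the incoming set but not in the target: needs Insert
        "inserted": [new[k] for k in new.keys() - old.keys()],
        # in the target but no longer in the source: needs Delete
        "deleted": [old[k] for k in old.keys() - new.keys()],
        # present in both but with different attributes: needs Update
        "updated": [new[k] for k in new.keys() & old.keys()
                    if new[k] != old[k]],
        # present in both and identical: nothing to do
        "unchanged": [new[k] for k in new.keys() & old.keys()
                      if new[k] == old[k]],
    }

existing = [{"id": 1, "name": "Main St"}, {"id": 2, "name": "Oak Ave"}]
incoming = [{"id": 2, "name": "Oak Avenue"}, {"id": 3, "name": "Elm St"}]
changes = detect_changes(existing, incoming)
# id 3 is new, id 2 changed, and id 1 disappeared from the source
```

In FME the analogous pattern is to route each bucket to a writer feature type with the matching Feature Operation (Insert/Update/Delete), so one workspace handles both existing and new features.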

