We have the following stack:
- FME reads XML, inserts in and deletes from SDE.
- Data lives in a non-versioned SDE geodatabase on Azure SQL Server.
- ArcGIS Enterprise serving MapService based on SDE.
- Internal WebGIS using the MapService.
Most of this works fine, but when we clean up large datasets it takes a long time to complete. What I know is mostly self-taught, so I wonder if I missed something vital.
The task is to delete records from 20-ish fairly big feature classes where projectname = 'projectname'.
What I do now:
- Using a FeatureReader, I read from all feature classes with a WHERE clause, reading only one feature per feature class.
- Now I know which feature classes contain data for this specific project. Next, I feed these features into a FeatureWriter, where I set Row Selection to match on the projectname column.
I think this is the right approach: let the database do the work. I expect it issues DELETE FROM featureclass WHERE projectname = 'projectname'.
The thing is, the projectname column has an attribute index to speed up queries in the web viewer. So now I wonder: does it delete a row, update the index, delete another row, and so on, and is that what's killing performance?
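To make the expectation above concrete, here is a minimal sketch of the set-based delete I'm hoping the FeatureWriter sends. It uses Python's sqlite3 as a stand-in for the SDE tables on SQL Server; the table and column names are assumptions for illustration, not the real schema.

```python
import sqlite3

# In-memory stand-in for one feature class with an attribute index
# on projectname (names are illustrative only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE featureclass (objectid INTEGER PRIMARY KEY, projectname TEXT)"
)
cur.execute("CREATE INDEX ix_projectname ON featureclass (projectname)")
cur.executemany(
    "INSERT INTO featureclass (projectname) VALUES (?)",
    [("projectA",)] * 1000 + [("projectB",)] * 1000,
)
conn.commit()

# One set-based statement: the engine finds the rows via the index and
# maintains that index within the same statement, instead of a client
# (FME) sending one DELETE per feature over the wire.
cur.execute("DELETE FROM featureclass WHERE projectname = ?", ("projectA",))
conn.commit()
print(cur.execute("SELECT COUNT(*) FROM featureclass").fetchone()[0])  # 1000
```

Whether the FeatureWriter actually issues one statement like this, or one DELETE per feature, is exactly what I'd like to confirm.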
Alternatives I can think of:
- Use SQLExecutors instead of FeatureWriters; maybe more control, but probably downsides as well? Not sure about this.
- Remove the attribute indexes before deleting the records, then delete, then recreate the indexes.
- Get the objectids of the features to be deleted using a SQLExecutor and a SELECT statement, then delete those records using a SQLExecutor.
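The second and third alternatives above could be combined into one pattern: drop the index, delete in objectid batches, rebuild the index. A rough sketch, again with sqlite3 standing in for SQL Server and all names assumed:

```python
import sqlite3

# Stand-in table with an attribute index (illustrative names only).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE featureclass (objectid INTEGER PRIMARY KEY, projectname TEXT)"
)
cur.execute("CREATE INDEX ix_projectname ON featureclass (projectname)")
cur.executemany(
    "INSERT INTO featureclass (projectname) VALUES (?)",
    [("projectA",)] * 5000 + [("projectB",)] * 5000,
)
conn.commit()

# 1. Drop the index so it is not maintained during the bulk delete.
cur.execute("DROP INDEX ix_projectname")

# 2. Collect the objectids to delete (the SQLExecutor SELECT step).
ids = [row[0] for row in cur.execute(
    "SELECT objectid FROM featureclass WHERE projectname = ?", ("projectA",)
)]

# 3. Delete in fixed-size batches so each transaction stays small.
BATCH = 1000
for i in range(0, len(ids), BATCH):
    chunk = ids[i:i + BATCH]
    placeholders = ",".join("?" * len(chunk))
    cur.execute(
        f"DELETE FROM featureclass WHERE objectid IN ({placeholders})", chunk
    )
    conn.commit()

# 4. Rebuild the index for the web viewer queries.
cur.execute("CREATE INDEX ix_projectname ON featureclass (projectname)")
conn.commit()
```

Whether the drop/rebuild step actually wins on SQL Server presumably depends on how many rows are deleted relative to the table size, which is part of what I'm asking.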