Hi all,

My question is about performance, actually...

I'm going through a process of reading data (SHP), trimming it, doing change detection, and writing it to an SDE geodatabase. Some of my data contains 5,000,000 objects, so it takes forever to run, which is not a problem in production.

In the testing phase, when tweaking and changing settings, it is a big hassle though. I have added a Tester to both my original data source and the revised data source to filter out a small portion of the data, but it still has to go through every single object when reading the data...

So my question is: Is there a way to avoid having to read everything from both datasets?

Thanks :-)

One way to do this is to use a polygon to select only a part of the data while testing.

This can be done with FeatureReaders: first read the polygon, then read the data with a spatial filter.

This will work by default for SDE. The shapefile will need a spatial index though, but you can create one using FME (Esri Shapefile writer parameters: Write Spatial Index).
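If you want to see the same principle outside of Workbench, here is a minimal sketch using GDAL/OGR's Python bindings (assuming GDAL is installed; the file names big_data.shp and test_area.shp are placeholders, not anything from this thread). It builds the same kind of .qix spatial index that FME's Write Spatial Index option produces, then applies the test polygon as a spatial filter at read time:

```python
from osgeo import ogr

# Open the big shapefile with write access so we can build its index
# (paths are placeholders for this sketch).
ds = ogr.Open("big_data.shp", update=1)

# Build the .qix spatial index; the OGR Shapefile driver supports this
# SQL statement, and it plays the same role as FME's Write Spatial Index.
ds.ExecuteSQL("CREATE SPATIAL INDEX ON big_data")

# Read the test-area polygon; keep a reference to the feature so its
# geometry stays valid while we use it.
poly_ds = ogr.Open("test_area.shp")
poly_feat = poly_ds.GetLayer().GetNextFeature()
poly = poly_feat.GetGeometryRef()

# Apply the polygon as a spatial filter: with the index in place, the
# reader only touches features whose bounding boxes intersect it.
layer = ds.GetLayer()
layer.SetSpatialFilter(poly)

# Count the features that fall inside the test area.
print(sum(1 for _ in layer))
```

The key point in both cases is that the filter is applied while reading, unlike a Tester, which only discards features after they have already been read.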

Hi Niels,

Thank you for your answer - sounds like a sensible solution. I will give it a try :-)

Thanks
