Question

Reading large dataset

  • 5 September 2017
  • 2 replies


Hi,

I am trying to fine-tune my model (it takes 4-5 hours to run). When I look at the logs, it seems to be taking a long time to run some larger datasets through the SpatialRelator. These are datasets that have been created from rasters.

For my output, I only need to tag the features that are spatially related with a new attribute, so I would like to dissolve and simplify the datasets before running them through the SpatialRelator. I tested the Dissolver and the Aggregator. The Dissolver takes too long. The Aggregator seems to work, but the dataset is still hard to work with due to the number of vertices (see pic). I tried the AreaAmalgamator, but it sends the output out the rejected port.

Does anyone know the best way to simplify a big dataset?


2 replies


I don't know if this is possible in your workflow, but in general it is often faster if you can "prep" the raster before extracting polygons/vectors. If this is done in FME, you can try resampling to larger cell sizes and/or reducing the number of possible values in the cells to make the extracted polygons a bit "smoother".
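
For what it's worth, the same idea outside FME looks roughly like the sketch below, using GDAL's Python bindings. The file name "classified.tif", the 4x resampling factor, and the value binning step of 50 are all placeholder assumptions, not anything from your workspace:

```python
# Sketch: coarsen and quantize a classified raster before polygonizing,
# so the extracted polygons come out with far fewer vertices.
from osgeo import gdal

gdal.UseExceptions()

src = gdal.Open("classified.tif")  # hypothetical input raster
gt = src.GetGeoTransform()

# Resample to 4x the original cell size; "mode" keeps majority class values.
resampled = gdal.Warp(
    "resampled.tif", src,
    xRes=gt[1] * 4, yRes=abs(gt[5]) * 4,
    resampleAlg="mode",
)

# Reduce the number of distinct cell values (assumes an 8-bit raster;
# binning to steps of 50 leaves at most 6 classes).
band = resampled.GetRasterBand(1)
data = band.ReadAsArray()
band.WriteArray((data // 50) * 50)
resampled.FlushCache()
```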


If that is out of the question, you might have a look at the Generalizer to reduce the number of vertices.
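
To give a feel for how much a Douglas-Peucker-style generalization can save, here is a minimal shapely 2.x sketch; the buffered circle is just a stand-in for a dense raster-derived polygon, and the 1.0 tolerance is a value you would tune in dataset units:

```python
from shapely.geometry import Point

# A dense polygon standing in for a raster-derived feature (~1025 vertices).
dense = Point(0, 0).buffer(100, quad_segs=256)

# Simplify with a 1.0-unit tolerance while keeping the geometry valid.
slim = dense.simplify(1.0, preserve_topology=True)

print(len(dense.exterior.coords), "->", len(slim.exterior.coords))
```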

You can try the Tiler to process your data in smaller units.
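
A rough illustration of what tiling buys you, again in plain Python with shapely (the grid size and the buffered circle standing in for a big polygon are made up): each tile can be simplified or spatially related on its own, so no single operation has to chew through the full vertex count at once.

```python
from shapely.geometry import Point, box

# Stand-in for one huge raster-derived polygon.
big_geom = Point(500, 500).buffer(400, quad_segs=512)

tile = 250  # assumed tile size, in dataset units
minx, miny, maxx, maxy = big_geom.bounds

pieces = []
y = miny
while y < maxy:
    x = minx
    while x < maxx:
        # Clip the big geometry to one grid cell; process each piece separately.
        clipped = big_geom.intersection(box(x, y, x + tile, y + tile))
        if not clipped.is_empty:
            pieces.append(clipped)
        x += tile
    y += tile

print(len(pieces), "tiles to process independently")
```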
