Question

My scenario is that I have to ensure that all polygons going through a process have fewer than 2000 vertices. These polys can vary in size and complexity quite significantly. Districts, towns, counties...

  • October 5, 2020
  • 3 replies
  • 10 views


I'm trying to build an expression in Generalizer's 'tolerance' setting that addresses this in every scenario (I have a Tester for node count beforehand, so this will only have to deal with polys with 2000+ nodes). I assume it's related to the perimeter, but I'm not getting very far.
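One hedged starting point for such an expression (an assumption, not a known-good formula): if the vertex budget were spread evenly along the ring, the average spacing would be perimeter divided by the budget, which could serve as a first guess for a simplification tolerance. `tolerance_guess` is a hypothetical name for illustration only.

```python
def tolerance_guess(perimeter, vertex_count, max_vertices=2000):
    """Rough tolerance heuristic: the average vertex spacing the ring
    would have if it were resampled down to the vertex budget.

    Returns 0.0 when the polygon is already under budget, i.e. no
    generalisation is needed.
    """
    if vertex_count <= max_vertices:
        return 0.0
    return perimeter / max_vertices
```

A dense 5000-vertex ring with a 1000-unit perimeter would get a guess of 0.5 units; whether that removes enough vertices depends entirely on the shape, which is why a single expression is hard to get right in every scenario.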

 

Any suggestions gratefully received

 

Thanks

 

Paul


3 replies

  • October 5, 2020

@paulc1 Chopper has the option to break polygons by vertex count. There is also a GridChopper on FME Hub that is useful for areas with a large extent.
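A rough sketch of the breaking-by-vertex-count idea (an assumption for illustration, not Chopper's actual algorithm): split a long vertex list into pieces that each stay under the budget, sharing one vertex between consecutive pieces so they reconnect cleanly.

```python
def chop_by_vertex_count(vertices, max_vertices=2000):
    """Split a vertex list into pieces of at most max_vertices each.

    Consecutive pieces share an endpoint, so concatenating the pieces
    (dropping each piece's duplicated first vertex) rebuilds the original.
    """
    pieces = []
    i = 0
    while i < len(vertices) - 1:
        pieces.append(vertices[i : i + max_vertices])
        # Step back one vertex so each piece starts where the last ended.
        i += max_vertices - 1
    return pieces
```

For polygons the real transformer has to deal with splitting areas rather than vertex runs, but the count-budget logic is the same.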


virtualcitymatt

You could use the looping function in a custom transformer to slowly increase the generalisation? This might avoid trying to build a 'smart' expression.
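The looping idea can be sketched outside FME as well. Below is a minimal, hypothetical Python version: simplify with Douglas-Peucker (a common generalisation algorithm, used here as a stand-in for whatever Generalizer applies), and double the tolerance each pass until the ring fits the vertex budget. All names here are illustrative assumptions, not FME code.

```python
def douglas_peucker(points, tol):
    """Simplify a polyline: keep a vertex only if it deviates from the
    chord between the current endpoints by more than tol."""
    if len(points) < 3:
        return points[:]
    (x1, y1), (x2, y2) = points[0], points[-1]
    dx, dy = x2 - x1, y2 - y1
    seg_len = (dx * dx + dy * dy) ** 0.5
    # Find the vertex farthest from the chord between the endpoints.
    max_d, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        px, py = points[i]
        if seg_len == 0:
            d = ((px - x1) ** 2 + (py - y1) ** 2) ** 0.5
        else:
            d = abs(dy * px - dx * py + x2 * y1 - y2 * x1) / seg_len
        if d > max_d:
            max_d, idx = d, i
    if max_d <= tol:
        return [points[0], points[-1]]
    left = douglas_peucker(points[: idx + 1], tol)
    right = douglas_peucker(points[idx:], tol)
    return left[:-1] + right

def simplify_to_budget(ring, max_vertices=2000, start_tol=0.1):
    """Loop like a custom transformer would: re-simplify the original
    ring with a doubled tolerance until it fits the vertex budget."""
    tol = start_tol
    out = ring
    while len(out) > max_vertices:
        out = douglas_peucker(ring, tol)
        tol *= 2  # coarsen and retry
    return out
```

Doubling rather than adding a fixed step keeps the number of loop iterations small even when the starting tolerance is far too fine, which matters when districts and counties need very different tolerances.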


  • Author
  • October 6, 2020

You could use the looping function in a custom transformer to slowly increase the generalisation? This might avoid trying to build a 'smart' expression.

ooh....good call.....