Solved

Response paging alternative for a WFS 1.0.0 / 1.1.0 service

  • August 10, 2017
  • 3 replies
  • 89 views

Hi,

I'm trying to get a workbench to load 1000000+ features from a WFS version 1.1.0 service that has a download restriction of 10.000 features. Version 2.0.0 is currently not supported by this service.

I would like to know if it's possible to download those features in a way similar to how pagination works for version 2.0.0. I understand this is not supported in version 1.1.0.

I have tried setting the 'Max Features', 'Start Index' and 'Count' parameters within my WFS reader, with no luck; the same features are downloaded every single time.

Is there an alternative way to achieve this?

Edit:

What I'm now trying to do is achieve this with an HTTPCaller + a custom 'Loop' transformer.

However, I'm stuck trying to configure this looper...



3 replies

redgeographics
  • Celebrity
  • Best Answer
  • August 10, 2017

I'm assuming you're loading features within a certain area. If you tile that area up into small parts, send a request for each part separately (using a 2nd workspace through a WorkspaceRunner and storing the data temporarily if you have to), then combine all of the data and filter out duplicates, it might work. You'll have to make sure the tiles are small enough that they all stay under the 10.000-feature limit though.

It might also be wise to slow things down (Decelerator) so you don't hit the server too hard; otherwise you might violate the terms of service of that WFS :)
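The tiling-and-dedupe idea above can be sketched in plain Python (outside FME). This is only an illustration of the approach, not the poster's actual workspace: the endpoint URL, type name, and feature-id key below are placeholders, and fetching/parsing the responses is left out.

```python
import urllib.parse

WFS_URL = "https://example.com/wfs"   # placeholder endpoint
TYPE_NAME = "ns:Buildings"            # placeholder feature type
LIMIT = 10_000                        # server-side download cap

def make_tiles(xmin, ymin, xmax, ymax, nx, ny):
    """Split a bounding box into an nx-by-ny grid of smaller boxes."""
    dx = (xmax - xmin) / nx
    dy = (ymax - ymin) / ny
    return [
        (xmin + i * dx, ymin + j * dy,
         xmin + (i + 1) * dx, ymin + (j + 1) * dy)
        for i in range(nx)
        for j in range(ny)
    ]

def tile_request_url(tile):
    """Build a WFS 1.1.0 GetFeature request restricted to one tile."""
    params = {
        "service": "WFS",
        "version": "1.1.0",
        "request": "GetFeature",
        "typeName": TYPE_NAME,
        "maxFeatures": str(LIMIT),
        "bbox": ",".join(f"{c:.6f}" for c in tile),
    }
    return WFS_URL + "?" + urllib.parse.urlencode(params)

def dedupe(features, key="gml_id"):
    """Features crossing tile edges come back more than once; keep one copy."""
    seen = set()
    unique = []
    for f in features:
        if f[key] not in seen:
            seen.add(f[key])
            unique.append(f)
    return unique
```

Each tile URL would then be fetched (with a pause between requests, per the Decelerator advice) and the combined results passed through `dedupe` before writing.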


  • Author
  • August 10, 2017


Thanks redgeographics, that's what I want to do.

As the features are buildings, it's quite difficult to design a 'regular' tile scheme that keeps every tile under 10.000 features (rural vs. urban areas...)
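One way around the uneven building density is to split tiles adaptively instead of using a regular grid: quarter any tile whose estimated feature count exceeds the cap, and recurse. A minimal sketch, assuming some way to estimate the count for a box; `count_estimate` is a placeholder that could, for example, be backed by a GetFeature request with resultType=hits on servers that support it, or by a pre-computed density grid:

```python
def split_until_under_limit(bbox, count_estimate, limit=10_000):
    """Recursively quarter a (xmin, ymin, xmax, ymax) box until every
    leaf tile is expected to hold at most `limit` features.
    count_estimate(bbox) -> int is supplied by the caller."""
    if count_estimate(bbox) <= limit:
        return [bbox]
    xmin, ymin, xmax, ymax = bbox
    xmid, ymid = (xmin + xmax) / 2, (ymin + ymax) / 2
    quads = [
        (xmin, ymin, xmid, ymid),  # lower left
        (xmid, ymin, xmax, ymid),  # lower right
        (xmin, ymid, xmid, ymax),  # upper left
        (xmid, ymid, xmax, ymax),  # upper right
    ]
    tiles = []
    for q in quads:
        tiles.extend(split_until_under_limit(q, count_estimate, limit))
    return tiles
```

Dense urban areas end up with many small tiles and rural areas with a few large ones, so no tile should blow the 10.000-feature limit.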


redgeographics
  • Celebrity
That would be my next suggestion (the HTTPCaller in a loop), but I don't have any hands-on experience with that, I'm afraid.