Solved

Response Paging alternative for WFS 1.0.0, 1.1.0 -service


Hi,

I'm trying to get a workbench to load 1000000+ features from a WFS version 1.1.0 service with a download restriction of 10.000 features. Version 2.0.0 is currently not supported by this service.

I would like to know if it's possible to download those features in a way similar to what 'pagination' does in version 2.0.0. I understand this is not supported in version 1.1.0.

I have tried to set the parameters 'Max Features', 'Start Index' and 'Count' in my WFS reader, with no luck: the same features are downloaded every single time.

Is there an alternative way to achieve this?
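For context, this sketch shows why 'Start Index' and 'Count' have no effect here: result paging was only standardized in WFS 2.0.0, while 1.1.0 only has maxFeatures, which caps the response but always returns the same first N features. The endpoint and type name below are placeholders, not the actual service:

```python
from urllib.parse import urlencode

BASE = "https://example.com/wfs"  # placeholder endpoint, for illustration only

# WFS 1.1.0: maxFeatures caps the response size, but there is no
# standard parameter to request "the next page" of results.
params_110 = {
    "service": "WFS", "version": "1.1.0", "request": "GetFeature",
    "typeName": "ns:Buildings", "maxFeatures": 10000,
}

# WFS 2.0.0: count + startIndex give real paging - not available on
# this service, since it only speaks 1.0.0/1.1.0.
params_200 = {
    "service": "WFS", "version": "2.0.0", "request": "GetFeature",
    "typeNames": "ns:Buildings", "count": 10000, "startIndex": 20000,
}

print(BASE + "?" + urlencode(params_110))
print(BASE + "?" + urlencode(params_200))
```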

**Edit

What I am now trying to do is achieve this with an HttpCaller + a custom 'Loop' transformer.

However, I'm stuck trying to configure this looper...

Best answer by redgeographics

I'm assuming you're loading features within a certain area. If you tile up that area into small parts, send a request for each part separately (using a 2nd workspace through a WorkspaceRunner and storing the data temporarily if you have to), then combine all of the data and filter out duplicates, it might work. You'll have to make sure the tiles are small enough that they're all going to stay under the 10.000 feature limit though.

It might also be wise to slow things down (Decelerator) so you don't hit the server too hard; otherwise you might violate the terms of service of that WFS :)
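The tiling idea above can be sketched outside FME as plain HTTP requests (in FME, the same requests would come from an HttpCaller or a WFS reader in the WorkspaceRunner-driven child workspace). Endpoint, layer name, extent, and grid size below are all made-up placeholders:

```python
from urllib.parse import urlencode

BASE = "https://example.com/wfs"   # placeholder endpoint
LAYER = "ns:Buildings"             # placeholder feature type
XMIN, YMIN, XMAX, YMAX = 4.0, 52.0, 5.0, 53.0  # full area of interest
COLS = ROWS = 10                   # grid size; tune so each tile stays under 10.000

def tile_bboxes(xmin, ymin, xmax, ymax, cols, rows):
    """Yield (xmin, ymin, xmax, ymax) for each cell of a cols x rows grid."""
    dx = (xmax - xmin) / cols
    dy = (ymax - ymin) / rows
    for i in range(cols):
        for j in range(rows):
            yield (xmin + i * dx, ymin + j * dy,
                   xmin + (i + 1) * dx, ymin + (j + 1) * dy)

def getfeature_url(bbox):
    """Build a WFS 1.1.0 GetFeature request restricted to one tile."""
    params = {
        "service": "WFS", "version": "1.1.0", "request": "GetFeature",
        "typeName": LAYER, "maxFeatures": 10000,
        "bbox": ",".join(str(v) for v in bbox),
    }
    return BASE + "?" + urlencode(params)

# One request per tile. Features crossing tile edges can be returned by
# several requests, hence the duplicate-filtering step afterwards
# (e.g. matching on gml:id).
urls = [getfeature_url(b) for b in tile_bboxes(XMIN, YMIN, XMAX, YMAX, COLS, ROWS)]
```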


3 replies

redgeographics
Celebrity
  • Celebrity
  • Best Answer
  • August 10, 2017



  • Author
  • August 10, 2017
redgeographics wrote:

I'm assuming you're loading features within a certain area. If you tile up that area into small parts, send a request for each part separately (using a 2nd workspace through a WorkspaceRunner and storing the data temporarily if you have to), then combine all of the data and filter out duplicates, it might work. You'll have to make sure the tiles are small enough that they're all going to stay under the 10.000 feature limit though.

It might also be wise to slow things down (Decelerator) so you don't hit the server too hard; otherwise you might violate the terms of service of that WFS :)

Thanks redgeographics, that's what I want to do.

As the features are buildings, it's quite difficult to make a 'regular' tile scheme fit the 10.000 feature limit (rural vs. urban areas...)
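One way around the uneven density is adaptive tiling: split any tile that holds too many features into four quadrants and recurse, so urban tiles end up small and rural tiles stay large. This is a sketch, not FME-specific; `count_features` is a hypothetical helper that, on WFS 1.1.0, could issue a GetFeature request with resultType="hits" to get only the feature count:

```python
LIMIT = 10000  # the server's download restriction

def count_features(bbox):
    """Hypothetical helper: ask the server how many features fall in bbox
    (e.g. via a GetFeature request with resultType="hits")."""
    raise NotImplementedError

def split(bbox):
    """Split a (xmin, ymin, xmax, ymax) bbox into four quadrants."""
    xmin, ymin, xmax, ymax = bbox
    xm, ym = (xmin + xmax) / 2, (ymin + ymax) / 2
    return [(xmin, ymin, xm, ym), (xm, ymin, xmax, ym),
            (xmin, ym, xm, ymax), (xm, ym, xmax, ymax)]

def tiles_under_limit(bbox, counter=count_features):
    """Recursively subdivide bbox until every tile holds < LIMIT features."""
    if counter(bbox) < LIMIT:
        return [bbox]
    result = []
    for quad in split(bbox):
        result.extend(tiles_under_limit(quad, counter))
    return result
```

The resulting tile list could then drive the per-tile GetFeature requests from the accepted answer, one request per leaf tile.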


redgeographics
Celebrity
That would be my next suggestion (the HttpCaller in a loop), but I don't have any hands-on experience with that, I'm afraid.


