
Welcome,

I need to download data from 300 WFS servers (I have a list of them in any format) for 2000 objects (a 3 km buffer from each point), like in the picture.

Is it possible to give FME a list of server addresses so that it checks the spatial extent of the data on each server against the polygon range and then downloads the data?

Yes, this is possible with FME.

 

FME has a tool called the WorkspaceRunner which lets you run a child workspace and have FME feed the parameters into each job (I guess 300 jobs in your case).

 

You could also do this with a batch file, running FME on the command line.
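As a rough sketch of the command-line route: you can generate one `fme` call per server from your list. The workspace name `harvest_wfs.fmw` and the published parameters `SOURCE_URL` and `LAYER_NAME` below are hypothetical placeholders; adapt them to your own workspace.

```python
# Sketch: build one "fme" command line per WFS server.
# "harvest_wfs.fmw", "SOURCE_URL" and "LAYER_NAME" are invented names;
# substitute your own workspace file and published parameter names.
servers = [
    "https://ikerg.podgikgryfice.pl/gryfice-egib",
    "https://sbl.webewid.pl:8443/iip/ows",
]

commands = [
    ["fme", "harvest_wfs.fmw", "--SOURCE_URL", url, "--LAYER_NAME", "dzialki"]
    for url in servers
]

for cmd in commands:
    # Printed as a dry run; to actually launch the jobs,
    # pass cmd to subprocess.run() instead of printing it.
    print(" ".join(cmd))
```

Writing the commands into a `.bat`/`.sh` file and running that is equivalent to the batch-file approach mentioned above.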

 

There are some potential issues, however. WFS can be a little annoying to work with, and you might find that one configuration of the workspace is able to fetch data from Server A but fails with Server B. You would need to do a bit of testing with different services to get a feel for whether it can be done dynamically or not.

 

Of course, if you know the specific parameters for each server which you want to use to fetch the data, then you can also pass these to the child workspace from your input server list.


virtualcitymatt, thank you for your reply; it gives me hope that this can work.

 

I will try to provide more detail for this:

  • a polygon layer containing cadastral parcels is downloaded from each server
  • the SQL expression for each WFS server looks the same: SELECT * FROM dzialki
  • nevertheless, the layers are named slightly differently, but each name contains the string dzialki

 

Here are some addresses I would like to use:

https://ikerg.podgikgryfice.pl/gryfice-egib
https://sbl.webewid.pl:8443/iip/ows
https://mapy.geoportal.gov.pl/wss/ext/PowiatoweBazyEwidencjiGruntow/0403
https://geoportal.wms.um.gorzow.pl/map/geoportal/wfs.php
https://zninski-wms.webewid.pl/iip/ows
https://bielski-wms.webewid.pl/us/wfs/sip
https://wms2.um.warszawa.pl/geoserver/wfs/wfs
https://geodezja.powiatopolski.pl/ggp
https://e-odgik.chorzow.eu/arcgis/services/chorzow_egib/serwer

I can't hide the fact that I haven't dealt with FME before, but when I started looking for a solution to my problem I came across this programme. Will creating such a tool in FME be very complicated and require learning a lot of new things?

 

 


Or, more simply: is it possible to download the data named dzialki from all these servers without checking the spatial relationship with the other layer which I mentioned in the first post?
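Downloading dzialki from every server without any spatial check amounts to one GetFeature request per server. As a sketch of what such a request looks like outside of FME (assuming the servers speak WFS 2.0.0; individual servers may differ):

```python
from urllib.parse import urlencode

def getfeature_url(base_url: str, type_name: str) -> str:
    """Compose a plain WFS 2.0.0 GetFeature request URL for one feature type."""
    params = {
        "service": "WFS",
        "version": "2.0.0",
        "request": "GetFeature",
        "typeNames": type_name,
    }
    return base_url + "?" + urlencode(params)

# One of the server URLs from the list above; "dzialki" is the
# parcel layer name mentioned in the earlier posts.
url = getfeature_url("https://sbl.webewid.pl:8443/iip/ows", "dzialki")
print(url)
```

Since the layer names only *contain* the string dzialki, the exact `typeNames` value would first have to be looked up per server (e.g. from GetCapabilities).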

 


So I checked the first 3 services, all of which have that layer name. Below you can see the kind of options which can be set in a WFS reader. The "Feature Types" field is where the layer is defined.

 

[image]

If I read from the service with these settings, it will try to fetch all the features in that feature type. In this case I got 117,072 polygons.

[image]

Some WFS services limit the amount of data which can be requested, so you might run into issues, but it's worth having a go.
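One common workaround for such server-side limits is WFS 2.0 paging with the `count` and `startIndex` request parameters (only where the server supports it). A sketch that generates the URLs for successive pages:

```python
from urllib.parse import urlencode

def paged_urls(base_url: str, type_name: str, total: int, page_size: int):
    """Yield GetFeature URLs that fetch `total` features in pages of `page_size`."""
    for start in range(0, total, page_size):
        params = {
            "service": "WFS",
            "version": "2.0.0",
            "request": "GetFeature",
            "typeNames": type_name,
            "count": page_size,
            "startIndex": start,
        }
        yield base_url + "?" + urlencode(params)

# Hypothetical example: the 117,072 parcels mentioned above,
# fetched in pages of 50,000 features.
urls = list(paged_urls("https://example.com/wfs", "dzialki",
                       total=117072, page_size=50000))
```

With these numbers you get three pages (startIndex 0, 50000, 100000); the total can be discovered first with a `resultType=hits` request.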

 

What you're wanting to do is probably an intermediate to advanced task in FME, and you would need to be somewhat familiar with certain FME tools/transformers/concepts in order to get it working properly.

 

Your first step is to build a workspace where you can input any one of these URLs as a parameter (and maybe a layer name parameter) and then get the desired output. You should be able to do it for any URL in your list.

 

Once you have that, you can run it in batch using either a batch file or the WorkspaceRunner.

 

Here are a few articles which you should find helpful:

 

Building dynamic workflows (important when the data have different schemas): https://community.safe.com/s/article/dynamic-workflow-tutorial-introduction

 

Using the workspace runner to do batch processing: https://community.safe.com/s/article/batch-processing-using-the-workspacerunner-1

 

FME Academy for general training: https://safe.my.trailhead.com/today

 

FME Basic training - a long video, but worth it if you have the time to go through it; it will show you what kinds of things FME is capable of: https://engage.safe.com/training/recorded/fme-desktop-basic-2022-1/

 

FME Advanced training - if you get through this then you will really have a lot of tools to work with: https://engage.safe.com/training/recorded/fme-desktop-advanced-2022-1/

 


virtualcitymatt, let me ask you straight: if I provided you with a list of WFS servers, would you be able to create a workspace with these parameters?

I don't know how much time this would take. Would I then be able to open this in my FME and run it?

 

These servers have no limits on the amount of data downloaded. 


ha! Sadly I don't have the time. If you want to pay someone to do the job you can check out the partner locator and get in touch with a local service provider: https://engage.safe.com/partners/partner-locator/


Of course I understand. Thank you for your help


Thanks for the link. I will try it.


Hello, did you find any solution for this?

 

I'm thinking about the same idea.


There is an alternative approach you may wish to try other than the WorkspaceRunner. One challenge with the WorkspaceRunner approach is that the WFS reader has a lot of interactive parameters that are hard to autopopulate when you run it in an automated way.

 

If you are somewhat familiar with the OGC WFS standard, then you can use HTTPCallers to issue the GetCapabilities, DescribeFeatureType and GetFeature requests and then parse the responses yourself. If you have a list of WFS server URLs, you should be able to have an automated process that composes GetCapabilities requests for each of these. The response XML can then be parsed and used to compose GetFeature requests to download the data from each. Of course, you have to be careful how fast you generate these requests and how much data you ask for.
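As a rough illustration of the parsing step, here is a sketch that pulls feature type names out of a WFS 2.0 GetCapabilities response and keeps the ones containing "dzialki". The XML below is a made-up minimal fragment, not a real server response; real responses vary between servers and WFS versions.

```python
import xml.etree.ElementTree as ET

# Made-up minimal GetCapabilities fragment for illustration only;
# in practice this would be the body fetched by an HTTPCaller.
capabilities_xml = """<?xml version="1.0"?>
<wfs:WFS_Capabilities xmlns:wfs="http://www.opengis.net/wfs/2.0">
  <wfs:FeatureTypeList>
    <wfs:FeatureType><wfs:Name>ewns:dzialki</wfs:Name></wfs:FeatureType>
    <wfs:FeatureType><wfs:Name>ewns:budynki</wfs:Name></wfs:FeatureType>
  </wfs:FeatureTypeList>
</wfs:WFS_Capabilities>"""

ns = {"wfs": "http://www.opengis.net/wfs/2.0"}
root = ET.fromstring(capabilities_xml)
names = [n.text for n in root.findall(".//wfs:FeatureType/wfs:Name", ns)]

# Layer names differ per server but each contains "dzialki" (see above),
# so a substring match picks out the parcel layer to request.
parcel_layers = [n for n in names if "dzialki" in n]
print(parcel_layers)
```

Inside FME the same filtering could be done with an XMLXQueryExtractor or similar after the GetCapabilities HTTPCaller.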

 

As a hint, when you use the WFS reader to add a WFS source to your workspace and then run it, you can look at the FME log for all the related GetCapabilities, DescribeFeatureType and GetFeature request GET urls. You can then use these as templates for your WFS harvest workspace HTTPCallers.

 

I created a WFS harvester example for this type of task some time ago. I tracked it down and had to update the HTTPCallers to get it to run in the current FME release. Feel free to try it on your own WFS server list.

 

See also: https://community.safe.com/s/question/0D54Q000080haUqSAI/download-all-tiles-from-a-wms-server

and

https://community.esri.com/t5/arcgis-data-interoperability-blog/harvesting-wfs-data-in-bulk/ba-p/883755


Thank you deanatsafe for the tips!

 

Yes, exactly, that's the path I tried to follow by myself; your .fmw file completed the parts I was missing to finish it.

 

If we're thinking about many WFS servers, I guess we have to react each time and add new "User Parameters". Most WFS servers have a similar schema model, but they could have different limits on "GET" requests, for example rate limits per IP or DoS protection, etc.

 

What do you think: is it possible to extend the workflow and transform the data into a simpler format like GeoJSON or GPKG and save it with the attribute types, or does that not make sense, and is it better to do that in another workspace using the WorkspaceRunner?
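On the GeoJSON side, writing the output yourself is simple enough that it could live in the same harvesting workflow. A minimal sketch assembling a FeatureCollection from already-parsed features (the attribute names and coordinates below are invented for illustration):

```python
import json

# Hypothetical parsed features: (attributes, polygon exterior ring) pairs,
# as they might come out of a parsed GetFeature response.
parsed = [
    ({"id_dzialki": "0403.1/1"},
     [[15.0, 53.0], [15.1, 53.0], [15.1, 53.1], [15.0, 53.0]]),
]

collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "properties": attrs,
            "geometry": {"type": "Polygon", "coordinates": [ring]},
        }
        for attrs, ring in parsed
    ],
}

with open("dzialki.geojson", "w", encoding="utf-8") as f:
    json.dump(collection, f, ensure_ascii=False)
```

Note that GeoJSON only has JSON value types for attributes; if preserving declared attribute types (widths, integer vs. real, etc.) matters, GeoPackage is the better target, since it stores a typed schema per table.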

 

I'll be back if I find some good results.

 

Thanks for extending the knowledge :)

