Solved

What is the best way to download Esri Feature Services from multiple sources into a local file geodatabase?

  • December 12, 2023

Hi community! I've been trying to formulate an efficient workspace that would:

  1. Extract feature service URLs from an Excel sheet (many have the item ID number at the end)
  2. Download the GIS data from each feature service
  3. Transform the data with typical clean-up steps
  4. Write out to a file GDB on a local server drive

 

For steps 1 and 2, I've been experimenting with HTTPCaller and FeatureReader but haven't yielded a successful extraction yet. I think 3 and 4 will be straightforward, but any suggestions (specifically on getting from 1 to 2) are welcome, and I appreciate any feedback in advance!
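Before wiring up HTTPCaller, it can help to verify the item-ID extraction outside FME. A minimal Python sketch of pulling the ID off the end of each URL, assuming the IDs follow the usual ArcGIS Online convention of 32-character hex strings (the URLs below are made-up examples):

```python
import re

# ArcGIS Online item IDs are 32-character hexadecimal strings; many
# item and service URLs embed one at or near the end.
ITEM_ID_RE = re.compile(r"[0-9a-f]{32}", re.IGNORECASE)

def extract_item_id(url):
    """Return the first 32-char hex token found in a URL, else None."""
    match = ITEM_ID_RE.search(url)
    return match.group(0) if match else None

# Illustrative URL only, not a real item:
url = "https://www.arcgis.com/home/item.html?id=0123456789abcdef0123456789abcdef"
print(extract_item_id(url))  # 0123456789abcdef0123456789abcdef
```

The same pattern can run over every URL read from the Excel sheet, flagging rows where no ID is found before any HTTP calls are made.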

Best answer by hkingsbury

This is a good example of a dynamic workflow, as you're dealing with multiple input schemas and requiring them to go through the same process. Additionally, you don't know the schema before the process starts, so you can't manually expose attributes and set the output schemas.

 

I've attached an example process (built in 2023.1) that uses two of Esri's sample layers. The process broadly follows:

  1. Get the details of the service using an HTTPCaller to retrieve the JSON
  2. Extract the names of the layers
  3. Strip the layer id from the end of the URL
  4. Use the URL and name in a FeatureReader to read the data. Note that the features come out of the Generic port, because we don't know what the feature classes will be before the process runs.
  5. Use a SchemaScanner (with Group By on the feature type) to build a schema and remove 'object id', as it's a reserved field in a GDB and will cause issues
  6. With a GDB writer set to dynamic, write out the features with the correct schema(s)
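Steps 1–3 above can be sketched in plain Python to show what the HTTPCaller and JSON handling are doing. The sample JSON below is an illustrative fragment of what an ArcGIS REST feature service returns from its `?f=json` endpoint (real responses carry many more fields), and the URL is made up:

```python
import json
import re

# Trimmed sample of a service's ?f=json response.
service_json = """
{
  "layers": [
    {"id": 0, "name": "Cities"},
    {"id": 1, "name": "Highways"}
  ]
}
"""

def layer_names(raw):
    """Step 2: pull each layer's name out of the service JSON."""
    return [layer["name"] for layer in json.loads(raw)["layers"]]

def strip_layer_id(url):
    """Step 3: drop a trailing /<digits> layer id from a service URL."""
    return re.sub(r"/\d+/?$", "", url)

print(layer_names(service_json))  # ['Cities', 'Highways']
print(strip_layer_id("https://example.com/arcgis/rest/services/Demo/FeatureServer/1"))
```

Inside FME the equivalent work is done by HTTPCaller plus JSON handling and string functions, but testing the parsing logic on a saved response first makes the workspace easier to debug.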

 


6 replies

hkingsbury
Celebrity
  • Best Answer
  • December 12, 2023


 


  • Author
  • December 13, 2023

This is so very helpful @hkingsbury, and thank you for explaining the process! I have quite a lot of services that are pretty large, so it's been running for the last few hours. I'll report back once it's complete, but so far so good!


  • Author
  • December 14, 2023

I can confirm that this is exactly what I needed, and it downloaded as expected! However, one issue: because some of the feature services are quite large, I'm letting it run for several hours (8-13 hours) and FME eventually freezes. @hkingsbury (or anyone else), any advice on the best way to make each feature service URL run completely through before moving on to the next service?

 

Much appreciated!
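One common pattern for this kind of freeze is to run the workspace once per service from outside FME, so each service gets its own process and one huge download can't stall the rest. FME workspaces can be run from the command line with published parameters passed as `--NAME value`. A hedged sketch, where the workspace file name and the `SERVICE_URL` parameter are hypothetical and would need to match the actual workspace:

```python
import subprocess

def build_fme_command(workspace, service_url):
    """Build the command line to run one workspace for one service.

    Assumes a published parameter named SERVICE_URL; adjust the
    parameter name to match the actual workspace.
    """
    return ["fme", workspace, "--SERVICE_URL", service_url]

def run_per_service(workspace, urls):
    # Each service runs in its own FME process; check=True stops the
    # batch if a run fails, so partial output is easy to spot.
    for url in urls:
        subprocess.run(build_fme_command(workspace, url), check=True)

print(build_fme_command("download.fmw", "https://example.com/FeatureServer"))
```

The URL list could come straight from the Excel sheet in step 1. FME's own WorkspaceRunner transformer is another way to get the same one-run-per-service behaviour from a parent workspace.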


changeofmotion
Contributor

Hi @hkingsbury - This looks super helpful, and I'm working to do something very similar. You mention you attached an example process to the above post. Where would I find this so I can take a look? Thanks so much again.


hkingsbury
Celebrity
  • November 21, 2024
It looks like the attachment was lost when the community was migrated earlier this year.

Hopefully these links will help:

https://support.safe.com/hc/en-us/articles/25407480725645-Tutorial-Dynamic-Workflows
https://fme.safe.com/blog/2021/02/fme-tips-tricks-creating-dynamic-workflows/


changeofmotion
Contributor

Many thanks @hkingsbury - I've taken a look. 95% there now, I think.



