Question

Batch analysis of FileGDB layers

  • September 27, 2017
  • 2 replies
  • 20 views


Hi,

I have a situation where I have two file geodatabases. They contain subsets of data from the same spatial areas and have a structured coding (see example image below).

As the layers can be large, I want to process them in a workspace one pair at a time. But there are many pairs, so I would like to batch this. I know of a method to do this using a PATH reader and a WorkspaceRunner when the inputs are shapefiles, but with the PATH reader you can't map into FileGDB layers. I do not want to convert to shapefiles, as I would lose the complex attribute names.

Any suggestions on how I could send one pair at a time to a WorkspaceRunner in batch? Attached is the logic for how it would work with shapefiles.
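For illustration, here is a rough sketch of how I imagine a batch driver could work as a standalone Python script run with FME's Python interpreter: list the feature classes in both geodatabases, pair them up, and launch the processing workspace once per pair through fmeobjects. The paths, the published parameter names (SOURCE_A, SOURCE_B, FEATURE_TYPE_A, FEATURE_TYPE_B) and the pairing rule are just placeholders for our setup, and I'm assuming GDAL's OpenFileGDB driver is available to list the layers:

```python
import fmeobjects
from osgeo import ogr

GDB_A = r"C:\data\first.gdb"            # placeholder paths
GDB_B = r"C:\data\second.gdb"
WORKSPACE = r"C:\fme\process_pair.fmw"  # placeholder workspace

def list_layers(gdb_path):
    # Assumes GDAL's OpenFileGDB (or FileGDB) driver can open the geodatabase.
    ds = ogr.Open(gdb_path)
    return [ds.GetLayer(i).GetName() for i in range(ds.GetLayerCount())]

# Pair the feature classes that share a name; in reality this would be
# adapted to the structured coding (e.g. match on a common prefix/suffix).
pairs = sorted(set(list_layers(GDB_A)) & set(list_layers(GDB_B)))

runner = fmeobjects.FMEWorkspaceRunner()
for name in pairs:
    # Each pair becomes one run of the workspace, with the geodatabases and
    # feature class names passed in as published parameters.
    runner.runWithParameters(WORKSPACE, {
        "SOURCE_A": GDB_A,
        "SOURCE_B": GDB_B,
        "FEATURE_TYPE_A": name,
        "FEATURE_TYPE_B": name,
    })
```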

Thanks for any ideas!


2 replies

bruceharold
  • Supporter
  • September 27, 2017

How about having two GeodatabaseFile readers with a * merge filter (i.e. a single merged feature type) and a TestFilter that tests fme_feature_type, with an output port for each 'ends with' case?


  • Author
  • September 28, 2017

How about having two GeodatabaseFile readers with a * merge filter (i.e. a single merged feature type) and a TestFilter that tests fme_feature_type, with an output port for each 'ends with' case?

Thanks for the suggestion. I was thinking of something similar, but it's not a sustainable solution, as the approach needs to be generic. The 'ends with' part can change over time, and I would need to recreate the TestFilter every time.
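One way I could see to keep the routing generic would be to derive the code from fme_feature_type in a PythonCaller instead of maintaining a TestFilter port per suffix. A rough sketch is below; the pair_code attribute name and the assumption that feature class names end with "_<code>" are just placeholders for our coding scheme:

```python
import re

class FeatureProcessor(object):
    # PythonCaller class: tag each feature with the trailing code from its
    # feature class name, so downstream routing/grouping can stay generic.
    def input(self, feature):
        ftype = feature.getAttribute('fme_feature_type') or ''
        match = re.search(r'_([A-Za-z0-9]+)$', ftype)   # assumed "_<code>" suffix
        feature.setAttribute('pair_code', match.group(1) if match else '')
        self.pyoutput(feature)

    def close(self):
        pass
```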