Question

FeatureReader Transformer – Schema Records grouping by fme_feature_type, missing duplicate files in different folders

  • 6 March 2020
  • 4 replies
  • 14 views

I was looking to do a data transformation task using the following high-level workflow:

 

  1. Run a data audit to compile a database table of file paths of MapInfo TAB files.
  2. Compile data about the files discovered, including a field indicating whether to transform each particular dataset (field “TransformRequired” with Yes or No values).
  3. After the data audit is complete, use the database table as an input list for the transformation, making use of the FeatureReader transformer.
  4. Use a MapInfo TAB writer with a dynamic schema, using the “Schema from Schema feature” option as the Schema Source.
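Steps 1 and 2 of the audit could be sketched outside FME as a simple directory walk. This is an illustrative Python sketch, not part of the original workflow; the `TransformRequired` default and the field names are assumptions for the example.

```python
import os

def audit_tab_files(root):
    """Walk a folder tree and build one audit row per MapInfo TAB file.

    The "TransformRequired" default of "Yes" is a placeholder; in a real
    audit it would be set per dataset by whatever review process applies.
    """
    rows = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            if name.lower().endswith(".tab"):
                rows.append({
                    "filepath": os.path.join(dirpath, name),
                    "filename": name,
                    "TransformRequired": "Yes",
                })
    return rows
```

Note that the audit keys each row on the full path, so two files both named PARCEL_VIEW.tab in different folders produce two distinct rows.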

 

The problem I have is that the input table list includes MapInfo tables which have the same file name but are in different file system folders. The paths I am trying to read include:

 

\\Input\\GDA2020\\MapInfoV16\\PARCEL_VIEW.tab

\\Input\\GDA2020\\MapInfoV17\\PARCEL_VIEW.tab

\\Input\\GDA94\\PARCEL_VIEW_GDA94.TAB

\\Input\\GDA94\\PLAN_ZONE.tab

 

When reading the list into the FeatureReader transformer and attempting to use the Schema and Generic ports, I find that the unique schema list is keyed on fme_feature_type, which means that I get 3 schema records instead of my expected 4.

Also on the output, I find that only 3 tables are written: the duplicated table name receives the records from both of the "PARCEL_VIEW" MapInfo tables.
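The collapse can be illustrated outside FME with a minimal Python sketch (this is not FME itself, just the grouping logic): deriving a key from the file name, as fme_feature_type effectively does, merges datasets that share a name, while keying on the full path keeps all four distinct.

```python
import ntpath  # handles Windows-style backslash paths on any OS

# The four datasets from the audit (illustrative copies of the paths above)
paths = [
    r"\Input\GDA2020\MapInfoV16\PARCEL_VIEW.tab",
    r"\Input\GDA2020\MapInfoV17\PARCEL_VIEW.tab",
    r"\Input\GDA94\PARCEL_VIEW_GDA94.TAB",
    r"\Input\GDA94\PLAN_ZONE.tab",
]

def feature_type(path):
    # fme_feature_type is essentially the file name without its extension
    name = ntpath.basename(path)
    return ntpath.splitext(name)[0].upper()

by_type = {feature_type(p) for p in paths}  # the two PARCEL_VIEW files collapse
by_path = {p.lower() for p in paths}        # every dataset stays distinct
```

Here `by_type` has 3 members but `by_path` has 4, which is exactly the 3-schemas-instead-of-4 behaviour described above.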

 

In the image attached, showing the schema records from the FeatureReader, there are actually two MapInfo tables named “PARCEL_VIEW”, but in different file system folders.

 

 

I was wondering if there is a way to have the FeatureReader return 4 records (one for each MapInfo table) and still use the “Schema from Schema feature” option in the Dynamic Writer.

 


4 replies

Userlevel 1
Badge +10

It should still be possible to do what you want, but I agree that all 4 schemas should be returned from the FeatureReader.

To get all 4 schemas, you can use another FeatureReader with a schema reader to get the right number of schemas, and then use some attribute manipulation and sorting to match the schemas with the correct data from your other FeatureReader for the dynamic writer.
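The "attribute manipulation and sorting" step amounts to joining data features to schema features by dataset path rather than by feature type. A hypothetical Python sketch of that join (the dict layout and the `name` field are assumptions for the example; `fme_dataset` is the FME attribute that carries the source dataset path):

```python
def match_schemas(schema_features, data_features):
    """Join each data feature to its schema by full dataset path.

    Both inputs are lists of dicts carrying an "fme_dataset" key (the
    source path FME records on every feature), so the two PARCEL_VIEW
    datasets keep separate schemas instead of collapsing by name.
    """
    schemas_by_dataset = {s["fme_dataset"].lower(): s for s in schema_features}
    matched = []
    for f in data_features:
        schema = schemas_by_dataset[f["fme_dataset"].lower()]
        # carry the matched schema's name on the data feature so the
        # dynamic writer can fan the output datasets apart
        matched.append({**f, "schema_name": schema["name"]})
    return matched
```

In the workspace the same join would be done with transformers (e.g. an attribute join on the dataset path) rather than code, but the keying logic is the point.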


Thanks very much for your reply. I was just wondering if you may be able to provide a bit more detail on the schema reader part you mentioned? I think I understand the FeatureReader transformer OK, but the schema reader wasn’t clear to me. Happy to read up on any links you suggest, or some brief pointers on its use.

Thanks again


So as well as using the FeatureReader to read your MapInfo files, you also use a second FeatureReader with a format of Schema (Any Format) and read the schemas as feature types; that way you get all 4 schemas.

It's probably also worth highlighting this issue to Safe, as in my opinion the FeatureReader should produce one schema per dataset in this scenario.


Thanks @ebygomm

I found that this FME Community post gave me the detail I needed for the Schema (Any Format) reader to return all the required schemas:

https://knowledge.safe.com/questions/72144/featurereader-fails-to-output-multiple-schemas.html

Thanks for pointing me in the right direction initially.
