
This is meant as a quality-of-life improvement when working with unreliable data providers or when you have an unreliable internet connection. I hope this functionality already exists in FME, but I could not find it.

I am trying to make a custom node that saves the data stream it intercepts locally and reads that data back instead if the original source doesn't come through. There are two problems that I have been unable to solve. One is that I have to run the script with feature caching to get it to save the files, and there is an error upon saving; it does not seem to understand the grouping for some reason:

localize_data_2_FeatureWriter (WriterFactory): Group 2 / 2: MULTI_WRITER: No dataset was specified for MULTI_WRITER_DATASET or localize_data_2_FeatureWriter_0_DATASET or FFS_DATASET
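As a rough illustration of the fallback idea (read the local copy when the original source doesn't come through), a PythonCaller-style sketch could look like the following. This is only an assumed sketch, not the uploaded transformer: CACHE_DIR, instance_name, use_cache and cache_path are placeholder names, and the actual routing downstream would be done with a Tester on use_cache feeding either the live source or a FeatureReader pointed at the cached file.

```python
# Minimal PythonCaller-style sketch (placeholder names, not the uploaded script):
# decide per run whether to replay from the local cache instead of the live source.
import os
import time

CACHE_DIR = r"C:\temp\fme_cache"   # assumed cache location
MAX_AGE_SECONDS = 24 * 3600        # treat older caches as stale

class FeatureProcessor(object):
    def input(self, feature):
        instance_name = feature.getAttribute("instance_name") or "default"
        cache_path = os.path.join(CACHE_DIR, instance_name + ".ffs")

        cache_usable = (
            os.path.isfile(cache_path)
            and (time.time() - os.path.getmtime(cache_path)) < MAX_AGE_SECONDS
        )

        # A Tester on use_cache can then route features to the live source
        # or to a FeatureReader pointed at cache_path.
        feature.setAttribute("use_cache", "yes" if cache_usable else "no")
        feature.setAttribute("cache_path", cache_path)
        self.pyoutput(feature)

    def close(self):
        pass
```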
 

So, as the image suggests, the custom transformer saves the data stream at three points. It dynamically creates local files, named after the custom transformer instance name and with a dynamic schema.

The second problem is dynamically reading the files back and restoring the exposed attributes, for which I have not found any solution yet. I just want to apply the same attributes again; maybe I could use a saved schema file somehow to connect it to the reader? I uploaded the script, in case anyone has any ideas. (Or does anyone know whether this function already exists in FME?)
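Roughly what I had in mind for the saved schema file, as a minimal sketch only: the caching side dumps the attribute names it sees to a sidecar JSON file next to the data, and the replay side could read that file to know which attributes to re-apply. SCHEMA_PATH is a placeholder path, not something in the uploaded script.

```python
# Rough sketch of the "saved schema file" idea (SCHEMA_PATH is a placeholder):
# collect attribute names while caching and dump them to a sidecar JSON file.
import json
import os

SCHEMA_PATH = r"C:\temp\fme_cache\instance_schema.json"

class FeatureProcessor(object):
    def __init__(self):
        self.attribute_names = set()

    def input(self, feature):
        # Collect user attributes, skipping FME's internal fme_* attributes.
        for name in feature.getAllAttributeNames():
            if not name.startswith("fme_"):
                self.attribute_names.add(name)
        self.pyoutput(feature)

    def close(self):
        # Write the collected names once all features have passed through.
        os.makedirs(os.path.dirname(SCHEMA_PATH), exist_ok=True)
        with open(SCHEMA_PATH, "w") as f:
            json.dump(sorted(self.attribute_names), f, indent=2)
```

The replay side could read the same JSON, but actually exposing those attributes in Workbench is the part I have not solved.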

 

Sincerely,

Robin

I might add that this is still useful in its current form, since it at least saves the attributes dynamically and creates a name and location dynamically. I run each of these transformers one by one using feature caching, and then manually drop in the files. Very clunky compared to what it could be, and all of the “smart replay” functionality goes unused, but it still saves a lot of time because I don't have to manually save each file.

 


The “Recorder” and “Player” transformers are made for this, but it should also work the way you set it up, using FeatureReaders and FeatureWriters. The NoFeaturesTester from the Hub can also help.

I often see that what you describe (a workflow that only works step by step with Feature Caching) is caused by an incorrect workflow order. Connections between transformers are processed one by one, in series, not in parallel.

Dynamically exposing attributes is not a thing; for dynamic writing this is solved with schemas (SchemaScanner). If attributes are not known before runtime, you are not able to set them in the transformers.


Thank you for replying!

I had missed the Recorder and Player, nice to know. They are slightly more convenient than file-out nodes, I suppose, although not as smooth as what I was hoping for. They should ideally have been a single node. As it stands, my custom transformer is still more convenient, even in its current broken form.

 

I used the NoFeaturesTester before, until I realised I could just use a FeatureMerger.

 

Good to know that dynamically exposing attributes is not a thing. If it had been a native record-replay node, I suppose you could have gotten around that. I might add that as a suggestion.

 

I'm not sure I completely understand the implications of incorrect workflow order, but I can't see that blocking the possibility of running multiple instances of the custom transformer at different parts of the workbench? Unless FeatureReader or FeatureWriter are blocking transformers…

 

