
Hi,

Is it possible to call the S3Uploader transformer from a Python shutdown script? My workflow currently uses a Python shutdown script to convert the FME output data into a number of proprietary formats, and now I would like to upload these proprietary files to AWS S3. I would prefer to do it all in a single workspace without installing extra Python modules (e.g. boto3).
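For context, the shutdown script is shaped roughly like this (a minimal sketch only; the conversion helper and the parameter name are placeholders, not my actual code):

```python
# FME Python shutdown script (set in the Navigator under
# Workspace Parameters > Scripting > Shutdown Python Script).
# FME injects the `fme` module here: `fme.macroValues` is a dict of
# published parameter values and `fme.status` reports translation success.
import fme

# Hypothetical helper standing in for the real proprietary-format conversion.
from my_converters import convert_to_proprietary  # placeholder module

if fme.status:  # only convert if the translation succeeded
    # 'DestDataset' is an assumed published parameter holding the output folder.
    output_dir = fme.macroValues.get('DestDataset')
    convert_to_proprietary(output_dir)

# There is no way to invoke a transformer such as S3Uploader from this
# script; uploading from here would need boto3 or another client, which
# is what I'm trying to avoid.
```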

Thanks

Hi @aaronthorn, I don't think it's possible. There are two workarounds:

[a] Create another workspace containing a WorkspaceRunner (Wait for Job to Complete: Yes) to run the current workspace, followed by an S3Uploader to upload the resulting dataset(s) to S3. If the workspace will be run on FME Server, use the FMEServerJobSubmitter instead of the WorkspaceRunner.

or

[b] Replace all the writer(s) in the current workspace with FeatureWriter transformer(s) and add S3Uploader(s) to upload the resulting dataset(s) to S3 after writing has completed.


[Addition] In the [b] approach, the process that creates the datasets in the proprietary formats should be moved from the shutdown script into the workspace body.
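For example, the conversion could run inside a PythonCaller placed after the FeatureWriter, so the converted files exist before the S3Uploader runs. This is only a sketch under assumptions: the conversion helper is a placeholder, and '_dataset' is the attribute the FeatureWriter's Summary port typically exposes with the written dataset path.

```python
import fmeobjects

# Hypothetical helper standing in for the proprietary-format conversion.
from my_converters import convert_to_proprietary  # placeholder module

class FeatureProcessor(object):
    """PythonCaller class: converts each written dataset as its
    summary feature passes through, then forwards the feature."""

    def input(self, feature):
        # '_dataset' is assumed to carry the path of the dataset
        # produced by the upstream FeatureWriter.
        path = feature.getAttribute('_dataset')
        if path:
            convert_to_proprietary(path)
        self.pyoutput(feature)  # pass the feature on towards the S3Uploader

    def close(self):
        pass
```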
