Solved

Calling the s3uploader transformer using a shutdown python script

  • February 1, 2018
  • 2 replies
  • 34 views

Hi,

Is it possible to call the S3Uploader transformer from a Python shutdown script? My workflow currently uses a Python shutdown script to convert the FME output data into a number of proprietary formats, and I would now like to upload these proprietary files to AWS S3. I would prefer to do it all in a single workbench without installing extra Python modules (e.g. boto3).

Thanks

Best answer by takashi



2 replies

takashi
Celebrity
  • Best Answer
  • February 1, 2018

Hi @aaronthorn, I don't think that's possible. There are two workarounds:

[A] Create another workspace that contains a WorkspaceRunner (Wait for Job to Complete: Yes) to run the current workspace, followed by an S3Uploader to upload the resulting dataset(s) to S3. If the workspace will be run on FME Server, use the FMEServerJobSubmitter instead of the WorkspaceRunner.

or

[B] Replace all the writer(s) in the current workspace with FeatureWriter transformer(s) and add S3Uploader(s) that upload the resulting dataset(s) to S3 after writing has completed.
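Since the S3Uploader is a transformer, it can't be invoked directly from a shutdown script, so any upload done there needs some other mechanism. As a rough sketch outside what the thread covers: if the AWS CLI happens to be installed and configured on the machine, the shutdown script could shell out to `aws s3 cp` instead of importing boto3. The bucket name, prefix, and directory below are made-up placeholders, not anything from the original workflow.

```python
import subprocess
from pathlib import Path


def build_upload_commands(output_dir, bucket, prefix=""):
    """Build one `aws s3 cp` command per file in output_dir.

    Returns a list of argument lists suitable for subprocess.run().
    The AWS CLI and its credentials are assumed to be set up already.
    """
    commands = []
    for path in sorted(Path(output_dir).iterdir()):
        if path.is_file():
            key = f"{prefix}{path.name}" if prefix else path.name
            commands.append(["aws", "s3", "cp", str(path), f"s3://{bucket}/{key}"])
    return commands


def upload_all(output_dir, bucket, prefix=""):
    """Run each upload command, raising CalledProcessError if a copy fails."""
    for cmd in build_upload_commands(output_dir, bucket, prefix):
        subprocess.run(cmd, check=True)
```

In an FME shutdown script this might be called with the folder the proprietary converters wrote to, for example `upload_all(fme.macroValues['DestDataset'], 'my-bucket')` (`fme.macroValues` is the parameter dictionary FME exposes to startup/shutdown scripts; the parameter name here is hypothetical).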


takashi
Celebrity
  • February 1, 2018


[Addition] In the [B] approach, the process that creates the datasets in the proprietary formats has to be moved from the shutdown script into the workspace body, so that the FeatureWriter(s) and S3Uploader(s) can act on those files.