There are multiple possible solutions, but I often use the ZipArchiver transformer to zip my output before emailing from the server.
Should I place the ZipArchiver transformer before my file gdb writer? I'm trying to avoid having an additional workspace just to zip the output.
No, use a FeatureWriter and use its Summary output port to initiate the ZipArchiver. Then use the Emailer to send the mail.
Also, don't forget to test if your zip is smaller than the max attachment size for your mail client.
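To make that concrete, here is a rough Python sketch (outside FME) of what the ZipArchiver step plus the attachment-size check amount to. The paths and the 25 MB limit are placeholders for illustration, not values from this thread.

```python
import os
import zipfile

GDB_FOLDER = r"C:\temp\output.gdb"       # folder produced by the file gdb writer (assumed path)
ZIP_PATH = r"C:\temp\output.zip"
MAX_ATTACHMENT_BYTES = 25 * 1024 * 1024  # typical mail limit; check your own mail server

# Zip the whole .gdb folder, keeping paths relative so it unzips cleanly.
with zipfile.ZipFile(ZIP_PATH, "w", zipfile.ZIP_DEFLATED) as zf:
    for root, _dirs, files in os.walk(GDB_FOLDER):
        for name in files:
            full_path = os.path.join(root, name)
            zf.write(full_path, os.path.relpath(full_path, os.path.dirname(GDB_FOLDER)))

# The check recommended above: will this actually fit in an email?
if os.path.getsize(ZIP_PATH) > MAX_ATTACHMENT_BYTES:
    print("Zip exceeds the attachment limit; send a download link instead.")
```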
I ended up using the FeatureWriter to create the file gdb in an FME Server resource folder, then took the output dataset and fed it into a ZipArchiver. I used published parameters to allow the locations of the gdb and zip to be set at run time. I used an FME Server automation to send the email, though.
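For reference, whether the mail goes out from an Emailer in the workspace or from an automation's email action, the sending step boils down to something like the plain-Python sketch below. This is not how FME implements it internally; the SMTP server, addresses, and file path are placeholders.

```python
import smtplib
from email.message import EmailMessage

# All values below are placeholders, not settings from this thread.
msg = EmailMessage()
msg["Subject"] = "FME report output"
msg["From"] = "fme@example.com"
msg["To"] = "gis-team@example.com"
msg.set_content("The requested file geodatabase is attached as a zip.")

with open(r"C:\temp\output.zip", "rb") as f:
    msg.add_attachment(f.read(), maintype="application", subtype="zip",
                       filename="output.zip")

with smtplib.SMTP("smtp.example.com", 25) as server:
    server.send_message(msg)
```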
Can you explain why you use an automation to send the email? I'm trying to learn and not sure why or when to use automations.
Absolutely. The old notification system, listed as "classic" in the FME Server interface, works by clients sending information via a publication to a topic. For example, a workspace (the client) runs and sends its results via a publication to a topic, which is then disseminated by subscriptions to the receiving clients. The link above is a good visual representation.
This was confusing and required a large amount of network config to work correctly. The new automations system still uses topics to handle messaging but does not use the publication or subscription system. Here is an example.
In this example, the automation is triggered by posts to the ASYNC_DATADOWNLOAD_JOB_SUCCESS topic. The log action logs the entire job event as JSON so I can filter on the job information. I then filter by searching for a specific repository, because I'm only interested in receiving emails from our reporting workspaces. Then the automation sends an email containing the relevant attachments or information.
This way I can write any report in FME Desktop, then publish it to the GIS REPORTS repository without having to configure the email part of it. I just do reader > transform > writer, then publish, and the notifications about success and failure are already handled.
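Conceptually, the filter action is doing something like the sketch below. The JSON keys ("repository", "workspace") and the sample payload are assumptions for illustration; check the logged event JSON for the real field names.

```python
import json

# Illustrative event; the real payload comes from the automation's trigger.
raw_event_json = '{"repository": "GIS REPORTS", "workspace": "monthly_report.fmw"}'
event = json.loads(raw_event_json)

# Keep only events from the reporting repository, as the filter action does.
if event.get("repository") == "GIS REPORTS":
    print(f"Reporting workspace finished: {event.get('workspace')}")
# Anything else is ignored, so other repositories never trigger an email.
```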
Let's look at another example.
This automation is triggered by posts to the ASYNC_JOB_SUBMITTER_SUCCESS or ASYNC_JOB_SUBMITTER_FAILURE topics. It then filters the events by searching for "BuildingFootprint", which isolates the workspaces I want to focus on. It logs the full JSON of the event for further filtering. It then splits the flow depending on which of the workspaces triggered the automation: if it's the attribute check I built, it takes the top path; if it's the geometry validation workspace, it follows the bottom path.
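The branching can be thought of as the sketch below. Only the topic names come from the example; the workspace names and JSON keys are made up for illustration.

```python
def route(event: dict, topic: str) -> str:
    """Return which branch of the automation an event should follow."""
    workspace = event.get("workspace", "")
    if "BuildingFootprint" not in workspace:
        return "ignore"  # not one of the workspaces this automation cares about
    failed = topic == "ASYNC_JOB_SUBMITTER_FAILURE"
    if "AttributeCheck" in workspace:  # hypothetical name for the attribute-check workspace
        return "attribute-failure" if failed else "attribute-success"
    return "geometry-failure" if failed else "geometry-success"

print(route({"workspace": "BuildingFootprint_AttributeCheck.fmw"},
            "ASYNC_JOB_SUBMITTER_SUCCESS"))  # -> attribute-success
```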
You can email from the workspace without issue, and all of this could be done in the workspace itself. But when I have many related workspaces participating in an App Gallery, changing the notification handling for all of them means modifying one automation instead of modifying and republishing every workspace.
It is entirely dependent on the scale and purpose of what you are doing.