
Hello everyone, I hope you are all doing well. I have the following published workflow in FME Server, in which we have some parameters.

This data download service is activated through a web application; logged-in users can download the data in different formats. We now want to store the entire download history in a database table. I saw that the required information is inside the specific log file. Can we implement an automated process so that when a job has finished, the respective information (the user's email, the datetime, the data title, and the download format) is stored in a table? This should run automatically; we don't want to trigger any process manually.

Detailed information is needed. Please help us with this.

 

With kind regards

Muqit

The easiest option is to simply add, for example, a SQLExecutor inside the current workspace that writes the necessary parameter values to the database.
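For illustration only (the table and column names below are assumptions, not anything from this thread), this is roughly the write such a SQLExecutor would perform. Inside the workspace the SQL statement would reference the published parameters directly, e.g. with FME's $(PARAM) syntax; here the same insert is sketched in Python using the built-in sqlite3 module:

```python
# Hypothetical sketch: log one download to a "download_history" table.
# All names (table, columns, parameters) are illustrative assumptions.
import sqlite3
from datetime import datetime, timezone

def log_download(db_path, user_email, data_title, download_format):
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS download_history (
               user_email      TEXT,
               download_time   TEXT,
               data_title      TEXT,
               download_format TEXT)"""
    )
    # In a SQLExecutor the equivalent statement would look like:
    #   INSERT INTO download_history VALUES ('$(UserEmail)', ...)
    conn.execute(
        "INSERT INTO download_history VALUES (?, ?, ?, ?)",
        (user_email, datetime.now(timezone.utc).isoformat(),
         data_title, download_format),
    )
    conn.commit()
    conn.close()
```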

Another option is to configure the workspace to trigger a topic on completion. You can then create a workspace subscription that listens for that topic notification and starts another workspace that queries FME Server for the job parameters (you can e.g. use the REST API for this) and writes them to the database.
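A minimal sketch of this second option, assuming the v3 REST API (the host and token are placeholders, and you should verify the endpoint and response payload against your FME Server version's API documentation):

```python
# Hedged sketch: fetch a finished job's record from the FME Server REST API.
# FME_HOST and TOKEN are placeholders, not values from this thread.
import requests

FME_HOST = "https://fmeserver.example.com"
TOKEN = "my-fmeserver-token"

def fetch_job(job_id):
    """Return the job record, which should include status, timestamps,
    and the parameters the job was submitted with (check the exact
    payload in your server's API docs)."""
    resp = requests.get(
        f"{FME_HOST}/fmerest/v3/transformations/jobs/id/{job_id}",
        headers={
            "Authorization": f"fmetoken token={TOKEN}",
            "Accept": "application/json",
        },
    )
    resp.raise_for_status()
    return resp.json()
```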


Hi David, thanks for your quick response. Your second idea will work well for us. I am not very experienced with FME, so an example process would be very helpful. I have understood what you described, but how will the second workspace get only the most recent FME Server job IDs? We don't want duplicate IDs.
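One common way to handle this (a sketch under the assumption that the logging workspace receives the job ID, for example from the topic notification payload) is to make the job ID the primary key of the history table, so a job that has already been logged is simply ignored on a second insert:

```python
# Hypothetical dedup sketch: key the history table on job_id so the same
# job can never be stored twice. Table and column names are assumptions.
import sqlite3

def record_job(db_path, job_id, user_email, finished,
               data_title, download_format):
    conn = sqlite3.connect(db_path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS download_history (
               job_id          INTEGER PRIMARY KEY,
               user_email      TEXT,
               finished        TEXT,
               data_title      TEXT,
               download_format TEXT)"""
    )
    # INSERT OR IGNORE skips rows whose job_id already exists, so
    # re-running the logging workspace cannot create duplicates.
    conn.execute(
        "INSERT OR IGNORE INTO download_history VALUES (?, ?, ?, ?, ?)",
        (job_id, user_email, finished, data_title, download_format),
    )
    conn.commit()
    conn.close()
```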


I have implemented a topic and a new subscription for the FME Workspace, and I have also built a new workspace for log file data extraction and published it as a Job Submitter. Everything is running, but I am not getting any result and the job FAILED every time. I think I have done something wrong in my new workflow: how will my workflow know the last job that succeeded or failed, so that it can evaluate and read it?


This is the new subscription I have created for the data download job success.

