
Hello Esteemed Community,

I have been banging my head against this issue and have tried the help documents, this forum, and the Academy training, but I am still stumped. I am struggling to access data that I submit to a topic through the Data Streaming Service inside my automation via the FME Flow Topic Trigger. I did get the FMEFlowNotifier transformer working just to see if I could, but it doesn't fit what I am trying to do, which is to post to topics on success or failure at the end.

This is what the service properties look like:
 

Service Properties

And the JSON writer:
 

JSON Writer

I have tried accessing the properties using both {job_id} and {data.job_id}, but the translation log shows that those values are passed through as-is.

Value Example

One of the key issues is that I need to map the values to custom attributes, because the parameters I am using them for are number type and it won't let me type anything non-numeric.
 

Custom Attributes on FME Flow Topic
Numeric Parameter

I am fairly new to FME, and so I am sure I am either missing something obvious or going about this in the wrong way. Looking forward to hearing back from you veterans. :)

Oh, and this is a proof-of-concept workspace, because I am trying to work around the fact that you apparently cannot use the Data Streaming Service as part of an automation. My use case is that I want an application to call an automation in FME Flow with parameters and receive JSON output once it is done, but I don't think that is possible. So my workaround is to call a workspace using a webhook, have the Data Streaming Service return the data, and execute the rest of the automation based on its success or failure. Am I right or wrong that an automation cannot respond directly back to the application that initiated it through a webhook?
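
In case it helps to make the workaround concrete, here is a rough sketch of what the calling side looks like, assuming a Python caller; the host name and token are placeholders for my environment, while the repository, workspace, and parameter names match my proof of concept:

import requests

# Call the Data Streaming Service directly; the workspace's JSON writer output
# comes back in the HTTP response body. Host and token are placeholders.
url = "https://fmeflow.example.com/fmedatastreaming/Training/stream_test.fmw"
params = {"integration_id": 9999}  # published parameter on the workspace
headers = {"Authorization": "fmetoken token=MY_FME_FLOW_TOKEN"}

resp = requests.get(url, params=params, headers=headers, timeout=300)
resp.raise_for_status()
print(resp.json())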

Greetings,

Ken

Hi @kketola

Yes, the data streaming service isn’t available in Automation workspace actions. 

For your use case, does the triggering application have an API? If so, I would trigger the Automation using a webhook and have the workspace pass the data back to the Automation using an AutomationWriter. Then you can connect an HTTP Request (External Action) to the AutomationWriter output and post the data back to the triggering web application in the request body.
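
To make that concrete, a rough sketch of the application side might look like this (placeholder URLs throughout; the webhook URL would come from your Automation's trigger, and the Automation-side pieces, AutomationWriter plus HTTP Request external action, are configured in FME Flow rather than in code):

import json, requests
from http.server import BaseHTTPRequestHandler, HTTPServer

# 1. Trigger the Automation via its webhook (URL copied from the trigger's details).
requests.post(
    "https://fmeflow.example.com/<automation-webhook-url>",
    json={"integration_id": 9999},
    timeout=30,
)

# 2. Accept the callback that the HTTP Request (External Action) posts back.
class CallbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers.get("Content-Length", 0)))
        print("Automation result:", json.loads(body))
        self.send_response(200)
        self.end_headers()

HTTPServer(("", 8080), CallbackHandler).serve_forever()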


Thanks for responding, @mattmatsafe! In this case, no, there isn't an API to use in the source system. We are trying to prove out using FME for real-time usage, where FME itself is triggered through an API call from database code, reports data back, and the code then continues with logic based on what came back. I almost have something that could work: the data streaming call returns data to Postman and kicks off the automation. I just can't seem to get access to the data from the Notify on Job Completion section in the automation.

Here is an attempt at a visual of the workflow. Normally we just have the workspace embedded in our automation, but needing data streaming is forcing us to get more creative.
 

Process Flow

Ken


Hi @kketola,

This is an interesting use case! I hadn’t tried this before myself. If you monitor the topic you’ve registered with the Streaming service in FME Flow, you should see the data included in the JSON payload. However, the data is not included in the Topic Trigger’s event payload. I believe this is a bug and will file it. I can’t see why this wouldn’t work.

I know you already mentioned that you tried the FMEFlowNotifier. Could you put it right at the end of your workspace and pass the data that way? The FMEFlowNotifier shouldn't trigger if the workspace fails, although if your workspace is complex and splits into multiple parallel flows, it might. In that case you may need to adjust the logic to bring the flows together at the end before the FMEFlowNotifier, so that you don't notify the topic when you don't want to (you could use a Tester or similar to do a logical check).
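
For reference, the FMEFlowNotifier is essentially publishing a message to the topic, so you can also do the equivalent with an HTTPCaller (or any HTTP client) against the REST API if you need more control over when it fires. A hedged sketch, assuming the v3 Notifications publish endpoint and a placeholder host and token; check the REST API docs for your Flow version:

import json, requests

topic = "DATA_STREAMER_TRACKING_SUCCESS"
url = f"https://fmeflow.example.com/fmerest/v3/notifications/topics/{topic}/message"
payload = {"json_featuretype": "data", "job_id": 2615, "integration_id": 9999}

resp = requests.post(
    url,
    data=json.dumps(payload),
    headers={
        "Authorization": "fmetoken token=MY_FME_FLOW_TOKEN",
        "Content-Type": "application/json",
    },
    timeout=30,
)
resp.raise_for_status()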


Hey @mattmatsafe ,

It took some figuring out, since topic monitoring isn't configured on my test system, but I managed to get access to the payloads being sent.

 

So, when using data streaming -> Notify on Job Completion and Post Data from Writer:

If I use {all} on a logger action, I get something like the following. It has a lot of data, but none of mine, although the subscriber_folder attribute does point to the location where the JSON is written, and that file is correct:

{
    "automation.id": "170ff35b-7fd9-4836-b77e-cfa4e32c94de",
    "workspace": "stream_test.fmw",
    "timeRequested": "Fri-27-Sep-2024 02:23:17 PM",
    "NumFeaturesOutput": "2",
    "requestKeyword": "STREAM_DOWNLOAD_SERVICE",
    "source": "topic",
    "repository": "Training",
    "jobsuccess_topic": "DATA_STREAMER_TRACKING_SUCCESS",
    "jobfailure_topic": "DATA_STREAMER_TRACKING_FAILURE",
    "global.integration_id": "-1",
    "StatusNumber": "0",
    "user.ERROR": "0",
    "id": "2605",
    "timeFinished": "Fri-27-Sep-2024 02:23:18 PM",
    "logHome": "//tstfs01/TstFMEdata/FMESoftwareFileShare/resources/logs",
    "logUrl": "http://TSTFMEAPP01:80/fmerest/v3/transformations/jobs/id/2605/log",
    "user.JOB_ID": "{job_id}",
    "timeStarted": "Fri-27-Sep-2024 02:23:17 PM",
    "OutputLocation": "\\\\tstfs01\\TstFMEdata\\FMESoftwareFileShare\\resources\\system\\temp\\engineresults\\FME_24066936_1727472197799_3528",
    "LogFileName": "job_2605.log",
    "StatusMessage": "Translation Successful",
    "user.INTEGRATION_ID": "{integration_id}",
    "subscriber_folder": "\\\\tstfs01\\TstFMEdata\\FMESoftwareFileShare\\resources\\system\\temp\\engineresults\\FME_24066936_1727472197799_3528_nw",
    "time": "2024-09-27T14:23:18-07:00",
    "event.id": "cd9423ce-c907-40a7-b7b9-0510ee155dd8",
    "automation.name": "INT0000 Integration Tracker Template(1)"
}
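
Since subscriber_folder does point at where my JSON writer output was staged and that file is correct, the data could in principle be read off that share by a downstream step. A rough sketch, assuming the process doing the reading can see that path (folder name copied from the payload above, purely illustrative):

import json, pathlib

subscriber_folder = r"\\tstfs01\TstFMEdata\FMESoftwareFileShare\resources\system\temp\engineresults\FME_24066936_1727472197799_3528_nw"

# Read whatever JSON the workspace staged into the subscriber folder.
for json_file in pathlib.Path(subscriber_folder).glob("*.json"):
    with json_file.open() as f:
        print(json_file.name, json.load(f))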

If I use {all} in an email subscriber to the same topic, I get my intended data:

[
    {
        "json_featuretype": "data",
        "job_id": 2615,
        "integration_id": 9999
    }
]

When using FMEFlowNotifier in the workspace:

If I use {all} on a logger action, I get some other data, but mine is included as well:

{"automation.id":"170ff35b-7fd9-4836-b77e-cfa4e32c94de","global.integration_id":"-1","integration_id":"9999","job_id":"2649","source":"topic","time":"2024-09-27T15:00:23-07:00","event.id":"67c6ce62-8ac9-4edb-8c4a-e9b4a58670a0","automation.name":"INT0000 Integration Tracker Template(1)"}

If I use {all} in the email subscriber, I get _undefined_, but my attributes are accessible directly using {job_id} and {integration_id}.

_undefined_ 2659 9999

So it looks like the payload is not available to automations when using the Data Streaming Service's Notify on Job Completion. Is it possible this is unintended? I would have expected the payload I send by notifying a topic to be available to any subscriber, whether defined on the subscriber tab or through an FME Flow trigger.

Ken


Hi @kketola, in my opinion, yes the data should be included in the Automation when triggered as part of the topic notify registration. I’ve filed a bug for our team to investigate, though it may be a while before they are able to. It’s possible this was never implemented for a reason, but at this point I’m not sure. The number for your reference is FMEFLOW-24034.

So, for the time being, yes please use the FMEFlowNotifier. I’m sorry for the frustration this has caused you!


Thanks so much @mattmatsafe for working through this with me! It was nice to have someone to bounce thoughts and ideas off of and get to the point of letting go of trying to make it work in this specific way for now. Have a great week!

Ken

