Question

HTTPCaller Timeout after 300s Regardless of Timeout Values Set


aaron_chaput
Contributor

Hi,

I’m attempting to run a workspace on FME Flow Hosted from within another workspace with an HTTPCaller using a webhook URL.

The child workspace will eventually return a URL link to a zip file as it uses the DataDownload service.

The workspace can take some time to run (> 5mins) as the data can be quite large.

My issue is that I want to run the job synchronously so I’m able to retrieve the URL within the parent script to send it within an email, but the HTTPCaller always times out if the child script takes longer than 300s to run even if I set the timeout values to anything over 300s or even 0 (wait indefinitely). It is successful if the child script runs quickly (<300s).

This is a problem as most instances will take longer than 300s.

I don’t understand why it times out if it goes over 300s even if I tell it to wait longer.

Is this a bug? If so, is there a work around or is it on the path to being fixed?

Thanks

hkingsbury
Celebrity

It could well be the server/proxy that is closing the connection.

Based on what you’re trying to do, a better approach would be to use the FMEFlowJobSubmitter. This gives you the option to wait for a job to complete (or not).

 


aaron_chaput
Contributor

That’s a great idea, but is there a way to submit a job with the DataDownload service?

I need the zipped file URL that the DataDownload service provides.

I was using the HTTPCaller as it allows me to submit a workspace with the DataDownload service like so:

https://xxxxx.fmecloud.com/fmedatadownload/eCommerce/eCommerce_Data_Clip_Script.fmw?encoded_message=helloworld&opt_showresult=false&opt_servicemode=sync&token=123456789
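For reference, that call is roughly equivalent to the sketch below, using Python's requests library (the hostname, workspace path, message, and token are just the placeholders from the URL above). Even with a generous client-side timeout, the connection still drops at ~300s:

```python
# Minimal sketch of what the HTTPCaller request does, assuming Python's
# requests library. Hostname, workspace, message, and token are the
# placeholder values from the URL above, not real values.
import requests

url = (
    "https://xxxxx.fmecloud.com/fmedatadownload/eCommerce/"
    "eCommerce_Data_Clip_Script.fmw"
)
params = {
    "encoded_message": "helloworld",
    "opt_showresult": "false",
    "opt_servicemode": "sync",  # block until the job finishes
    "token": "123456789",
}

# A client-side timeout well above 300s; in practice the connection
# still drops at ~300s, suggesting something between client and engine
# closes it regardless of this value.
response = requests.get(url, params=params, timeout=600)
response.raise_for_status()
print(response.text)  # on success, contains the zip download URL
```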


hkingsbury
Celebrity

Ah, sorry, I missed that you wanted a Data Download; in that case the FMEFlowJobSubmitter won’t work (it will submit it as a standard job).

What you can do instead is set the servicemode in the query string to async. This submits the job and immediately returns a response indicating whether the job was successfully submitted, along with the jobID.

You can then periodically query fmerest/v3/transformations/jobs/id/{jobid}. When the submitted job is complete, it will return the result object with a URL to download the output in the “resultDatasetDownloadUrl” key.
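Outside of FME, the pattern looks roughly like this (a sketch assuming the same placeholder host and token as above; the exact JSON shape of the async response and the status names may differ slightly between Flow versions, so check what your instance actually returns):

```python
# Rough sketch of the submit-and-poll pattern. Host, workspace, message,
# and token are the placeholders from the earlier URL.
import time
import requests

HOST = "https://xxxxx.fmecloud.com"
HEADERS = {"Authorization": "fmetoken token=123456789"}

# 1. Submit the Data Download job asynchronously.
submit = requests.get(
    f"{HOST}/fmedatadownload/eCommerce/eCommerce_Data_Clip_Script.fmw",
    params={
        "encoded_message": "helloworld",
        "opt_servicemode": "async",
        "opt_responseformat": "json",
    },
    headers=HEADERS,
    timeout=60,
)
submit.raise_for_status()
job_id = submit.json()["serviceResponse"]["jobID"]  # assumed response shape

# 2. Poll the REST API until the job reaches a terminal state. Each
# request is short-lived, so no connection stays open long enough for a
# proxy to kill it.
while True:
    job = requests.get(
        f"{HOST}/fmerest/v3/transformations/jobs/id/{job_id}",
        headers=HEADERS,
        timeout=60,
    ).json()
    # Terminal status names assumed; verify against your Flow version.
    if job["status"] in ("SUCCESS", "FME_FAILURE", "JOB_FAILURE", "ABORTED"):
        break
    time.sleep(15)

# 3. On success, the result object carries the download link.
if job["status"] == "SUCCESS":
    print(job["result"]["resultDatasetDownloadUrl"])
```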
 


This will require a looping custom transformer to periodically check the response from the API: https://docs.safe.com/fme/html/FME-Form-Documentation/FME-Form/Workbench/transformers_custom_looping.htm

 

Another option is to use a Decelerator after the job submission, with a sufficiently long wait to allow the job to complete. Best case, the job takes 30s to run and the process waits 60s; worst case, the job takes over 60s and the process then fails because it’s expecting a download URL to be returned.
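In the same terms as the polling sketch above (reusing its HOST, HEADERS, and job_id), the fixed-delay approach collapses to a single sleep and one check, which is exactly where the fragility comes from:

```python
# Fixed-delay alternative, reusing HOST, HEADERS, and job_id from the
# polling sketch above: wait once, then check the job exactly once.
import time
import requests

time.sleep(60)  # analogous to a Decelerator configured for 60 s

job = requests.get(
    f"{HOST}/fmerest/v3/transformations/jobs/id/{job_id}",
    headers=HEADERS,
    timeout=60,
).json()

if job["status"] != "SUCCESS":
    # Worst case: the job outran the fixed wait (or failed), so there is
    # no download URL to hand on.
    raise RuntimeError(f"Job {job_id} not complete after fixed wait")
print(job["result"]["resultDatasetDownloadUrl"])
```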


aaron_chaput
Contributor

Ah ok, I had thought of the looping option, but I was hoping it wasn’t necessary.

Theoretically, the HTTPCaller method I proposed initially should work if it doesn’t incorrectly time out... right?

I’ll explore the looping option, but I might open a ticket with Safe to see why it times out when it shouldn’t.


hkingsbury
Celebrity

Yea, worth speaking with Safe, especially as you’re using Flow Hosted.

Last year I did a presentation called ‘Looping Without Looping’ - https://locusglobal.com/wp-content/uploads/2024/06/Looping-Without-Looping_Hamish_Abley.pdf

You could use the VariableRetriever/VariableSetter approach. It’s a bit of a different way of thinking about how you manipulate data flow in FME, but it gets the job done.


aaron_chaput
Contributor

The timeout issue may be unrelated to FME (IT setting or server architecture).

I was able to get the process working by checking an asynchronous job status within a loop.

Once the job finished, I was able to email the URL to the end user.



