Question

FME TEMPLATE failing to read large number of features on output file.

  • 5 December 2019
  • 6 replies
  • 3 views

My template fails to read when the data is large, roughly above 2600 features. When I run the template manually it reads all features, but when I run the same template from an application it fails with a higher number of input features. I am reading the data from Oracle.


6 replies

Badge +2

@gauripearl Can you give us a few more details? By 'template' do you mean an FME Workspace? If FME is failing, can you attach a log file or at least the error message that is reported? Attaching the problem FME Workspace might also help someone identify the cause of the problem. Thanks


@markatsafe Yes, template means workspace. When I run it manually it works perfectly fine without any errors. But when I run it from another application (a desktop tool runs it through the command prompt) it fails when a large number of rows is returned from the Oracle data tables, probably more than 2600 rows. The template does not generate an output file, and from the log files it appears to stop at query execution with no features returned. Are there any settings I am missing or have configured wrongly?

Badge +2

@gauripearl FME shouldn't behave any differently when run on the command line. You can check your command line parameters by running FME Workbench. At the top of the log, the command line parameters are displayed. If the problem persists, I would contact your FME Reseller for additional support.
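To illustrate the suggestion above, here is a minimal sketch of how a desktop application might assemble the same command line that FME echoes at the top of its log. All paths and parameter names below are hypothetical placeholders, not values from this thread; the exact form for a given workspace is visible in the Workbench log.

```python
# Sketch: assembling an fme.exe command line as a launching application
# might. The executable path, workspace path, and the parameter name
# "SourceDataset_ORACLE8I" are illustrative assumptions only.
def build_fme_command(fme_exe, workspace, params):
    """Build an argument list: published parameters are passed as
    --NAME value pairs after the workspace path."""
    cmd = [fme_exe, workspace]
    for name, value in params.items():
        cmd += [f"--{name}", value]
    return cmd

cmd = build_fme_command(
    r"C:\Program Files\FME\fme.exe",
    r"C:\workspaces\my_template.fmw",
    {"SourceDataset_ORACLE8I": "myoracle"},
)
print(" ".join(cmd))
```

Comparing the list built this way against the command line printed in the log is a quick way to confirm that the application passes the same parameters as a manual run.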


I tried running my workspace from the command line separately and it works perfectly fine. When it is run from the desktop application it fails for large data only. My log file shows a different path for the "current working folder". When I run my workspace through the application, the path for the current working folder is:

FME Configuration: Current working folder is `C:\Program Files (x86)\GIS Server`

(With this path, it fails for large data but works fine for smaller amounts of data.)

GIS Server is my desktop tool, which executes the FME workspace through cmd.

And when I run the workspace independently, the path is:

FME Configuration: Current working folder is `C:\Users\gisdev`

Does this path affect FME failing for large data?


Badge +2

@gauripearl Do you have a log file that shows the error message of the failed FME Workspace?

"FME Configuration:Current working folder" is the location that you're running the workspace from. However, this might affect the temporary working directory. Also, your cmd application might run under a different (system) user which might give it a different temp directory.

Sometimes in larger organisations, "user" folders are limited in storage size. Try setting the FME_TEMP environment variable to point to another folder, outside of c:/users : https://knowledge.safe.com/articles/176/fme-temp-environment-variable.html
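One way a launching application can apply that advice is to set FME_TEMP in the environment of the child process it spawns. A minimal sketch, assuming a hypothetical folder `D:\fme_temp` with ample space:

```python
import os

# Sketch: copy the current environment and override FME_TEMP so the
# spawned FME process uses a larger temp folder. D:\fme_temp is a
# hypothetical path, not one from this thread.
env = os.environ.copy()
env["FME_TEMP"] = r"D:\fme_temp"

# The env mapping would then be passed when launching FME, e.g.:
# subprocess.run([fme_exe, workspace], env=env)
```

Setting the variable per-process this way avoids changing it system-wide, which matters when the cmd application runs under a different (system) user than the interactive session.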

Perhaps try running your workspace using the Quick Translator instead of Workbench to see if that causes the same failure.

Without showing us the log files, I think we're pretty limited in understanding the cause of the issue.


@markatsafe thanks for looking into this. My issue is resolved. There was a process time limit set on the cmd process in my application, which would terminate the process whenever larger data required more processing time.
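The failure mode described above can be reproduced in miniature: a launcher that enforces a fixed time limit kills any child process that runs longer, regardless of whether the child would have finished. This sketch (using Python's `subprocess` timeout as a stand-in for the application's limit) kills a 5-second child after 1 second:

```python
import subprocess
import sys

# Sketch of the failure mode: a 1-second limit terminates a child
# process that needs 5 seconds, mirroring how a launcher's process
# time limit can kill a long translation before the writer finishes.
try:
    subprocess.run(
        [sys.executable, "-c", "import time; time.sleep(5)"],
        timeout=1,
    )
    timed_out = False
except subprocess.TimeoutExpired:
    timed_out = True

print(timed_out)
```

The fix, as the poster found, is to raise or remove the limit in the launching application so it scales with the size of the input data.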

Reply