
Dear All,

I have made an absolutely simple workbench with a LAS reader going to a ReCap writer. Nothing fancy.

But there seems to be a limit on the number of files that I can read and write. Right now the limit is about 70 files written.

I have searched and asked around at the office, but can't get an answer.

I have 100 GB of available space and plenty of memory, so that is not the issue.

Attached is the script.

Hope there is a solution, since there are about 1,000 files that need to be converted, which will result in about 10,000 files written.

 

Hello @mrmlaursen, there should be no limit on the number of files for reading or writing. I took a look at your workspace, and nothing is obviously alarming there. I decided to see if I could reproduce this and I got the following error: "ReCap Writer: Failed to convert LAS files to RCS scan files. The following error occurred: ...". Would you like to try dividing up your translation in some way?

 

I think one potential solution would be to change the writer to use Dataset Fanout instead of Feature Type Fanout. Right now you are setting Feature Type Fanout to:

@Value(fme_basename)-@Value(_split_value)

If you instead set Dataset Fanout to:

@Value(fme_basename)

and Feature Type Fanout to:

@Value(_split_value)

the translation may work, but it would have a different directory structure than you have now.
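
For reference, here is a rough sketch of how the two setups differ in output layout. The basename, split values, output folder, and the assumption that Dataset Fanout creates one subfolder per source file are all illustrative, not taken from your workspace:

```python
# Illustrative only: how the current and suggested fanout settings differ
# in output layout. The basename "scan_A", split values 1-3, and the
# output folder are made-up examples, not values from the workspace.
import os

basename = "scan_A"
split_values = ["1", "2", "3"]
out_dir = os.path.join("C:\\", "output")  # hypothetical destination folder

# Current: Feature Type Fanout = @Value(fme_basename)-@Value(_split_value)
# -> all scans land as flat files in one folder.
current = [os.path.join(out_dir, f"{basename}-{s}.rcs") for s in split_values]

# Suggested: Dataset Fanout = @Value(fme_basename),
#            Feature Type Fanout = @Value(_split_value)
# -> one dataset (subfolder) per source LAS file, with scans inside it.
suggested = [os.path.join(out_dir, basename, f"{s}.rcs") for s in split_values]

print("\n".join(current))
print("\n".join(suggested))
```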

 

Let me know if this works for you! Here to help, Kailin.


Hi, yes, I started that way with the simplest LAS reader and ReCap writer, but it also hits the limit at about 70-100 files.

See attached. But there are still issues.

 


Can you create a parent workspace and run one file through at a time with a WorkspaceRunner?



@ebygomm I feel like that may work, perhaps with the Wait For Job To Complete parameter set to Yes.
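
For what it's worth, a similar "one file per run" effect can also be had by driving the child workspace from a small script outside FME. This is only a sketch: the workspace path, folders, and the published parameter names (SourceDataset_LAS, DestDataset_RECAP) are assumptions and would need to match your own workspace.

```python
# A rough sketch of the "one file at a time" idea driven from outside FME,
# as an alternative to a parent workspace with a WorkspaceRunner.
# The workspace path, folders, and published parameter names
# (SourceDataset_LAS, DestDataset_RECAP) are assumptions -- adjust them
# to match your own workspace before trying this.
import subprocess
from pathlib import Path

WORKSPACE = r"C:\fme\las_to_recap.fmw"   # hypothetical child workspace
SOURCE_DIR = Path(r"C:\data\las")        # hypothetical LAS input folder
DEST_DIR = r"C:\data\recap"              # hypothetical ReCap output folder

for las_file in sorted(SOURCE_DIR.glob("*.las")):
    # Run fme.exe synchronously so each file finishes before the next one
    # starts, like setting Wait For Job To Complete = Yes in a WorkspaceRunner.
    result = subprocess.run(
        ["fme", WORKSPACE,
         "--SourceDataset_LAS", str(las_file),
         "--DestDataset_RECAP", DEST_DIR],
        capture_output=True,
        text=True,
    )
    status = "OK" if result.returncode == 0 else "FAILED"
    print(f"{status}: {las_file.name}")
```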

