Hi @ewb_fme ,
I'm not sure how your workspace currently works, but you could make a published parameter for the file and then feed the filename in from the directory watch. When two files arrive in the directory, the workspace will run twice, once per file.
See this article for details https://community.safe.com/s/article/run-a-workspace-when-data-arrives-in-a-directory
@siennaatsafe Thank you very much for your input!!! This is exactly what I was looking for! 😀
@siennaatsafe
I have an additional question regarding this topic. What if I have to process these files according to their creation date? For example, when I copy 4 files to this folder, I would like FME Flow to process the oldest file first, then the second oldest, and so on. Right now, it processes the files in random order... Do you have an idea how to solve this?
Hi @ewb_fme
Unfortunately, there is no easy way to do this with the Directory Watch trigger.
However, you could do this, if you wanted to use a schedule.
In a workspace, you'll want to use a directory and filepaths reader (to get the filenames and the times submitted), a datetime converter (to convert to epoch time), a sorter, and an Automations writer. This will sort your files by the time they arrive, and the Automations writer can be used to pass the filenames into the following workspace through the automation.
Now, if you use this method, you'll have to add some logic to not reprocess files, this could either be moving them out of the directory after they've been processed or you could have a database that tracks which files have already been processed.
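For anyone who wants to see the logic outside of Workbench: here is a minimal Python sketch of the same idea (sort by file timestamp, process oldest first, then move the file out so it isn't reprocessed). The folder paths and the `run_workspace` step are hypothetical placeholders, not FME API calls.

```python
from pathlib import Path

WATCH_DIR = Path("/data/incoming")       # hypothetical watched folder
PROCESSED_DIR = Path("/data/processed")  # hypothetical archive folder

def files_oldest_first(directory: Path):
    """Return files sorted by modification time, oldest first --
    the same ordering the Sorter produces on the epoch timestamp."""
    return sorted(
        (p for p in directory.iterdir() if p.is_file()),
        key=lambda p: p.stat().st_mtime,
    )

def process_pending():
    for f in files_oldest_first(WATCH_DIR):
        # ... run the actual workspace on f here, e.g. run_workspace(f) ...
        # Moving the file out afterwards prevents reprocessing,
        # which is the "logic to not reprocess files" mentioned above.
        f.rename(PROCESSED_DIR / f.name)
```

Note that `st_mtime` is the modification time; depending on the platform you may prefer `st_ctime` or `st_birthtime` if "creation date" matters more than arrival time.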
@siennaatsafe thank you very much for your answer!
Another option I tried was:
- to use a "Resource or Network Directory" trigger with CREATE and DELETE
- do the sorting in Workbench as you suggested
- and then instead of using an Automation Writer, just do the data manipulation directly in the same workbench and check the "Skip if Job In Progress" box.
- After the workspace is done, I move the file, which triggers point 1 again (because of the DELETE option).