
Hi All,

 

I have a workspace built in FME Desktop 2019.1 which I am planning to deploy to FME Server 2019.1. The workspace performs incremental updates to records in a SQL Server database based on updates delivered in XML payloads. It consists of an XML reader, various transformers to shape the XML data into the required format, and a SQLExecutor that runs an 'upsert' against SQL Server (we elected not to use a SQL Server writer). XML payloads are delivered to a directory on a file share of some kind (probably FTP, maybe SMB - this is an implementation detail; I can map the directory as a network drive if required). We do not need to handle any geospatial data.

 

What I am trying to do is find a way to trigger the workspace in FME Server when a file is dropped into a directory, and then, once the workspace has run, move the file to an archive directory. I can see that triggering an Automation in FME Server on a change in a directory is not overly complicated, but how to move the file when finished is not clear to me.

 

I am very open to suggestions, as I have not yet found any way to do something like this. I am not locked into this particular read-then-archive workflow either - all I need is to make sure FME Server will not read the same files again after it has successfully processed them once. The workspace is not set in stone either; I can make changes if necessary.

 

Thanks in advance!

Hi @kenant​ ,

I have a couple of workflows set up like this. I use two separate workspaces in an Automation that triggers when a new file is added to my 'Input' folder. My setup uses two folders, 'Input' and 'Processed'. Input is always empty except when a user drops a file in to be processed. My first workspace processes the file and reads the data into the database. When that succeeds, it triggers the second workspace, which does the file handling. I use a Directory and File Pathnames reader to detect that there is a file in the Input folder, run it through a DateTimeStamper transformer, and use the timestamp to create (via a FeatureWriter) a new folder in Processed named with the date. From there I use a File Copy writer with the operation set to Move to write the file into the new folder. There's also an AttributeManager along the way to help manage the File Copy attributes. The second workspace looks like this:

This moves the file out of the Input folder into the new date folder in Processed, leaving the Input folder ready for the next file to be dropped in.
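Outside of FME, the file-handling logic of that second workspace can be sketched in a few lines of Python. This is a minimal sketch of the same idea, not the FME workspace itself - the function name and folder layout are assumptions based on the Input/Processed setup described above:

```python
import shutil
from datetime import date
from pathlib import Path

def archive_input_files(input_dir: str, processed_dir: str) -> list[str]:
    """Move every file in input_dir into a date-named subfolder of processed_dir."""
    dest = Path(processed_dir) / date.today().isoformat()  # e.g. Processed/2019-08-14
    dest.mkdir(parents=True, exist_ok=True)  # create the date folder if it doesn't exist
    moved = []
    for f in sorted(Path(input_dir).iterdir()):
        if f.is_file():
            shutil.move(str(f), str(dest / f.name))  # move, like File Copy set to Move
            moved.append(f.name)
    return moved
```

Like the File Copy writer with the Move operation, this empties the Input folder; note that it moves everything present at the time it runs, which matters if several files can arrive at once.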

 

Hope this helps.

 

Dave


@drc43​ this looks very promising, I will have a play with these ideas!

 

When your second workspace is triggered, do you keep track of the filename from the first workspace in any way? Files may be dropped in at any time, and it is possible for multiple files to arrive at once - I am thinking through possible race conditions where a file is moved before it is processed. Some time ago I briefly looked into whether the XML reader keeps track of the filename; I might need to look into this again, as it would neatly solve the problem by moving only the specific file.

 

Thanks heaps for your help!



@kenant​ 

When I first set this up, it was before Automations were in FME Server, so I used Publications, which somewhat limited transferring the filename from one workspace to the next. This wasn't really a problem, as our workflow didn't have a lot of people submitting files - except when multiple files were all dropped in at once. The first workspace would run fine, once for each file, but the second workspace would then move all of the files in a single run, causing its subsequent runs to fail because there were no more files to move. This wasn't a real issue, since everything was processed and moved as expected, but I would end up seeing a lot of failed jobs on Server. It could certainly be improved upon.

 

I just recently converted these Publications to Automations and didn't really take the time to set them up as well as they could be. I have other Automations set up as you describe, with parameters being passed from one workspace to another, so I know it is possible; I just haven't set it up on these directory-watcher Automations yet.
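For what it's worth, the per-file variant is straightforward once the filename is passed along. A minimal Python sketch of that idea, assuming the triggering job hands the archive step the path of the one file it processed (the function name and folder layout are hypothetical, matching the Input/Processed setup above):

```python
import shutil
from datetime import date
from pathlib import Path

def archive_one(source_path: str, processed_dir: str) -> str:
    """Move only the file the triggering job processed, leaving any other
    files in the Input folder for their own jobs to handle."""
    src = Path(source_path)
    dest_dir = Path(processed_dir) / date.today().isoformat()  # date-named folder
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    shutil.move(str(src), str(dest))
    return str(dest)
```

Because each run touches exactly one file, concurrent drops no longer cause a later run to fail for lack of files to move.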

