I am wondering if anyone has any suggestions for using FME to process live data streams that are funneled into various text files, server log files being one example. Thanks!
Best answer by fmelizard
Hi @bo,
You can use WebSockets to capture these messages/data streams. Here are some resources to get you started:
Hopefully that helps, let us know if you need any more information.
-Liz
If you have access to FME Server, you can create a schedule to run your workspace and grab the appended data. You can set the schedule to run until cancelled, so your output dataset will be continually updated from the log file.
Let me know if I'm not on the right track with what you're asking.
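One way to make each scheduled run pick up only the data appended since the previous run is to remember how far into the log file was read last time. A minimal sketch of that idea in plain Python (outside FME; the file names and the offset-in-a-state-file approach are just illustrations, not an FME feature):

```python
import os

def read_new_lines(log_path, state_path):
    """Return the lines appended to log_path since the last call,
    remembering the previous end-of-file offset in state_path."""
    offset = 0
    if os.path.exists(state_path):
        with open(state_path) as f:
            offset = int(f.read() or 0)
    with open(log_path) as f:
        f.seek(offset)          # skip everything already processed
        new_lines = f.readlines()
        offset = f.tell()       # remember how far we got this run
    with open(state_path, "w") as f:
        f.write(str(offset))
    return new_lines
```

Run frequently on a schedule, each invocation returns only the new log lines; note this sketch assumes the log is appended to, not rotated or truncated.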
The article Directory Watch Publisher with Idle Time Delay (Advanced) (2017) updates your output file by grabbing only the newly appended data. It works on a Modify trigger, and the file can be local or within FME Server. The example writes to Google Fusion Tables, but ultimately the data can be written to any format.
I'm going to pass this question on to one of our experts and see if they have a better solution for you.
Hi,
If you read in the latest file, and kept a copy of the log file in its previous state (from the last time you read it), you could use a DuplicateFilter transformer to work only with the new, unique rows/lines in the log file. You'd then have to schedule the workspace to run very frequently, so it wouldn't quite be live streaming, but fairly close.
Hi,
If you read in the text file on a frequent schedule, but also kept a record of its previous state (reading both in with one feature per line), you could use a DuplicateFilter to process only the newly added lines. You'd have to run the job via a schedule very frequently; I don't know of a way to truly live-stream a text file.
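The DuplicateFilter idea above can be sketched outside FME as a set comparison between the previous and current copies of the file (function and variable names are illustrative, not FME API):

```python
def new_unique_lines(current_lines, previous_lines):
    """Mimic a DuplicateFilter fed the previous copy first:
    keep only lines that were not present in the previous copy."""
    seen = set(previous_lines)
    return [line for line in current_lines if line not in seen]
```

One caveat of this approach, in FME or in the sketch: a genuinely new log line that is byte-identical to an earlier one is treated as a duplicate and dropped, which is why offset-based approaches are more robust for logs with repeating messages.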
Hi @bo,
The Directory Watch has a minimum polling interval of 1 minute, whereas a schedule can run every second if needed. Another option is to use a looping custom transformer to re-read the file over and over in an endless cycle (it would require an engine, and a workspace, to run constantly, though). I put one together which you can test out.
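The looping approach, re-reading the file continuously, is essentially what `tail -f` does. A rough Python equivalent of that endless cycle (the polling interval is illustrative, and `max_polls` is added here only so the sketch can terminate; a real follower would loop forever):

```python
import time

def follow(log_path, start_offset=0, poll_seconds=1.0, max_polls=None):
    """Yield lines appended to log_path, polling in a loop.
    With max_polls=None this runs endlessly, like tail -f."""
    polls = 0
    with open(log_path) as f:
        f.seek(start_offset)
        while max_polls is None or polls < max_polls:
            line = f.readline()
            if line:
                yield line       # new data: hand it downstream
            else:
                polls += 1       # nothing new yet: wait and retry
                time.sleep(poll_seconds)
```

Inside FME the same pattern would sit in a looping custom transformer or a PythonCaller, at the cost of permanently occupying an engine.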