
Hi,

I'm trying to ingest log files into Azure Monitor so I can send out organisation-wide notifications, but I've run into an issue.

 

When I look at the machine that FME Flow is installed on, I can see that it stores logs in folders in batches of 1,000, creating a new folder each time it hits a multiple of 1,000. I can process files into Azure, but this doesn't seem to work with a wildcard in the /current/ folder; it needs a folder that already exists.

 

Is there a way to write these logs all to the same folder?

The file pattern/query I'm using:

 C:\\ProgramData......logs\\engine\\current\\jobs\\*\\job_*.txt

Many thanks,

 

Sam

If you're using FME to iterate over the log files, you can use the following pattern to look for files in all sub-directories:

 C:\ProgramData......logs\engine\current\jobs\**\job_*.txt

Note the double asterisk to match any directory.
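To illustrate the difference between the two patterns, here is a small Python sketch (using a temporary directory in place of the real, truncated log path) showing that a single `*` matches exactly one path segment, while `**` matches any depth of sub-directories, so newly created batch folders are picked up without changing the pattern:

```python
import tempfile
from pathlib import Path

# Hypothetical layout standing in for ...\logs\engine\current\jobs
with tempfile.TemporaryDirectory() as tmp:
    base = Path(tmp) / "jobs"
    (base / "1000").mkdir(parents=True)
    (base / "2000" / "archive").mkdir(parents=True)
    (base / "1000" / "job_1.txt").write_text("log")
    (base / "2000" / "archive" / "job_2.txt").write_text("log")

    # "*" matches exactly one directory level, so it misses files
    # nested more than one folder deep:
    single = sorted(p.name for p in base.glob("*/job_*.txt"))

    # "**" matches any number of nested directories:
    recursive = sorted(p.name for p in base.glob("**/job_*.txt"))

print(single)     # ['job_1.txt']
print(recursive)  # ['job_1.txt', 'job_2.txt']
```

Note that glob syntax can vary slightly between tools; check how the specific ingestion software interprets `**` before relying on it.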



Thank you for the response, David. Unfortunately the cloud team has dictated that this all must go through Azure. I'll try the double asterisk and see if that helps at all.



If you're using software supplied by Azure to ingest the log files, you may have better luck asking on an Azure-specific forum. I don't think it's possible, or even desirable, to have FME keep all the (potentially tens of thousands of) job logs in a single directory.

