Hello FMERS,

We are planning to move away from using batch files to error-check the daily FME Workbench log files for success or failure.

Ideally we would like to write workbench parameters to a PostgreSQL database, which could then be interrogated by another script.

Does anyone have an example of a Python Shutdown Script which uses a PostgreSQL writer to add parameters such as FME_Status and the date/time into a PostgreSQL table?

Regards

Andrew

Hi,

I haven't got an example, but I believe you could use the FMEUniversalWriter object of the fmeobjects API to write this data to PostGIS.

You'll find the documentation under <FME>/fmeobjects/python/apidoc/index.html
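
Something along these lines might work as a starting point in the Shutdown Script. This is a rough, untested sketch: check the apidoc above for the exact FMEUniversalWriter constructor and open() arguments, and note that the writer keyword, database name and table name below are only placeholders (the real connection settings would also have to be supplied as writer parameters).

-----
import fmeobjects, time

# FME exposes FME_Status (True on success, False on failure) to shutdown scripts.
status = 'SUCCESS' if FME_Status else 'FAILURE'

# Placeholder writer keyword and dataset; the actual connection parameters
# (host, port, user, password, ...) have to be supplied to the writer as well.
writer = fmeobjects.FMEUniversalWriter('POSTGRES')
writer.open('mydatabase')

# The feature type is the target table; the attributes become its columns.
feature = fmeobjects.FMEFeature()
feature.setFeatureType('translation_log')
feature.setAttribute('fme_status', status)
feature.setAttribute('tstamp', time.strftime('%Y-%m-%d %H:%M:%S'))

writer.write(feature)
writer.close()
-----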


David
Hi Andrew,


Another approach: create a second workspace containing an SQLCreator transformer that writes some data into the database, and link the "SQL Statement" of the SQLCreator to a published parameter (named "SQL_STATEMENT", for example). You can then build an SQL statement in the Shutdown Script of the main workspace, pass the statement to the second workspace, and run it via the fmeobjects.FMEWorkspaceRunner object. Example (error handling omitted):

-----
import fmeobjects, time

# Build the INSERT statement from the translation status and the current time.
sql = "insert into myTable (status, tstamp) values ('%s', '%s')" \
    % (FME_Status, time.strftime('%Y-%m-%d %H:%M:%S'))

# Path to the second workspace, assumed to be in the same directory as this one.
ws = FME_MacroValues['FME_MF_DIR'] + 'second_workspace.fmw'
params = {'SQL_STATEMENT': sql}

# Run the second workspace, passing the SQL statement as a published parameter.
runner = fmeobjects.FMEWorkspaceRunner()
runner.runWithParameters(ws, params)
runner = None
-----


Takashi
