I have a workflow that writes data to Esri SDE (Enterprise Geodatabase) via the Geodatabase Writer.
The workflow is dynamic and reads its schema definition from a CSV file.
The target schema already exists in the geodatabase at FME write time; it is created beforehand in ArcCatalog by importing an XML workspace document.
With identically configured writers, this workflow produces consistent results for Personal Geodatabase (MDB), File Geodatabase and SDE.
Up to now I have used SDE based on SQL Server or Oracle. So far, so good.
My problems start when I run the same workflow against an SDE based on PostgreSQL.
The symptoms I notice are:
1) In ArcCatalog: after importing the XML workspace document, all table names are as expected (mixed case), but all column names are converted to lower case, unlike what happens on Oracle or in the other environments.
2) Writing the data produces no warning, no error, and no hint that anything went wrong, yet all columns end up empty.
After studying the Esri SDE PostgreSQL documentation this is not really a surprise :-( but what is the solution to this dilemma?
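From what I understand, PostgreSQL folds unquoted identifiers to lower case, so the mixed-case attribute names coming from my dynamic schema no longer match the lower-cased columns and the values are silently dropped. The only workaround I have come up with so far is to force all attribute names to lower case right before the writer, for example with a PythonCaller roughly like this (an untested sketch; a BulkAttributeRenamer set to change the case to lower should do the same):

```python
# Rough, untested sketch for a PythonCaller: rename every attribute to
# lower case so the names match the lower-cased columns of the
# PostgreSQL-based geodatabase. Assumes the standard fmeobjects API.
import fmeobjects

class LowercaseAttributeNames(object):
    def input(self, feature):
        for name in feature.getAllAttributeNames():
            lower = name.lower()
            if lower == name:
                continue  # already lower case (e.g. fme_* format attributes)
            value = feature.getAttribute(name)
            if value is not None:
                feature.setAttribute(lower, value)  # copy value to lower-case name
            feature.removeAttribute(name)           # drop the mixed-case original
        self.pyoutput(feature)

    def close(self):
        pass
```

I suspect the attribute names in the CSV schema definition would have to be lower-cased as well for the dynamic writer to match them up, and then the FileGDB and Oracle targets would also get lower-case columns, which is exactly what I want to avoid.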
Is there any way I can achieve a solution that is compatible across all flavours of SDE?
Thanks for any hints
Michael