
I am reading from an Esri SDE geodatabase (ArcObjects) and use an Archive Where Clause with GDB_FROM_DATE/GDB_TO_DATE to read archived data.

... after processing the data, I write it to another SDE with mostly the same schema. This is my first migration step.

 

After a while I want to do the same workflow for the data that has changed since the first migration. So I use GDB_FROM_DATE/GDB_TO_DATE to find the changed data in the source SDE.

 

At the end of the second processing I need to update some features from the first migration (those that have been removed since the first round). I try to use fme_db_operation = UPDATE together with an fme_db_operation_where clause attribute, with something like this:

fme_db_operation_where=UUID='DEBE0707Hb000020' AND BEG='20170619110522' AND ENDE IS NULL

 

BEG is a column of type timestamp (in the underlying Oracle DB) and needs to be compared to the value read from GDB_FROM_DATE, which is of type date.
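FME date/time attributes such as the value read from GDB_FROM_DATE use the compact YYYYMMDDHHMMSS form seen in the WHERE clause above, while most SQL dialects expect an ISO-style 'YYYY-MM-DD HH:MM:SS' string. A minimal conversion sketch (the function name is mine, not an FME API):

```python
from datetime import datetime

def fme_to_iso(fme_datetime):
    """Convert an FME-style datetime string (YYYYMMDDHHMMSS),
    e.g. the value read from GDB_FROM_DATE, to the ISO-like
    'YYYY-MM-DD HH:MM:SS' form most SQL dialects accept."""
    return (datetime.strptime(fme_datetime, "%Y%m%d%H%M%S")
                    .strftime("%Y-%m-%d %H:%M:%S"))

print(fme_to_iso("20170619110522"))  # 2017-06-19 11:05:22
```

This could run inside a PythonCaller before the WHERE clause attribute is assembled.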

 

I get errors like the following and suspect it has something to do with the interpretation of date vs. timestamp, but I am not sure about this.

 

ERROR message:

901936 Error while retrieving an ArcObjects update cursor for the table/feature class 'AX14003'. The error number from ArcObjects is: '-2147220985'. The error message from ArcObjects is: {An invalid SQL statement was used.}

929507 Geodatabase Writer: WHERE clause is evaluated as 'UUID='DEBE0707Hb000020' AND BEG='20170619110522' AND ENDE IS NULL'

 

UUID is of type text; BEG and ENDE are of type timestamp in Oracle, but displayed as type date in ArcCatalog.

 

Any idea what might be the problem?

 

Thanks

Michael

Is your UUID not unique for each row regardless of BEG and ENDE? If it is unique for each row, then you don't need to include BEG and ENDE in the writer.


UUID is NOT unique, that is why I need the rest of the WHERE clause ...


@mhab​ I think the format of the timestamp has to match the format of the underlying database timestamp, not the FME date/time format. So if your ArcSDE is on SQL Server it would be something like YYYY-MM-DD HH:MI:SS as per the SQL Server user docs.



Sounds convincing ...

... but puts me in a very undesirable situation :-((

I don't know what database is used underneath the SDE. In my case it might be Oracle, Postgres or SQL Server.

Long live SQL Standard, grrrr



By definition a UUID should be unique for each row.

 

Following on from what @Mark Stoakes​ has said, you should be able to find the database type by opening your connection file in ArcGIS and looking at the properties.



UUID is unique for existing features, but not for those in the Esri archive. If a feature was updated several times, there exist many older states in the archive.

But rest assured: UUID is not my problem here.



... and about looking into the connection file properties ...

How to do that from within an FME Workspace, where the fme_db_operation_where attribute is generated at runtime?

All I know at that point is that I am going to write to SDE, and that the timestamp attributes would have to be formatted depending on the DB below the SDE ...

I don't know how to retrieve this info from within FME ...



So finally I found an Esri Support page which gives the details: https://resources.arcgis.com/en/help/main/10.2/index.html#//00s500000033000000

 

It means for SDE on

SQL Server: BEG = '2003-01-08 14:35:00'

Oracle: BEG = TO_DATE('2003-01-08 14:35:00','YYYY-MM-DD HH24:MI:SS')

PostgreSQL: BEG = TIMESTAMP '2003-01-08 14:35:00'

 

and for

FileGDB: BEG = date '2003-01-08 14:35:00'

 

This is diversity I would be happy to do without :-(
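The per-database variants above can be folded into one small helper, e.g. in a PythonCaller. This is only a sketch: the function name, the dictionary keys, and the idea of supplying the database type yourself (say, via a published parameter) are my assumptions, not anything FME provides out of the box.

```python
def timestamp_literal(iso_ts, db_type):
    """Wrap an ISO-style timestamp ('YYYY-MM-DD HH:MM:SS') in the
    literal syntax each backing database expects, per the Esri help
    page linked above. db_type is a hypothetical label you would
    have to supply yourself (e.g. via a published parameter)."""
    formats = {
        "sqlserver":  "'{0}'",
        "oracle":     "TO_DATE('{0}','YYYY-MM-DD HH24:MI:SS')",
        "postgresql": "TIMESTAMP '{0}'",
        "filegdb":    "date '{0}'",
    }
    return formats[db_type].format(iso_ts)

# Building the WHERE clause from the question for an Oracle-backed SDE:
where = ("UUID='DEBE0707Hb000020' AND BEG=" +
         timestamp_literal("2017-06-19 11:05:22", "oracle") +
         " AND ENDE IS NULL")
```

The result could then be written into the fme_db_operation_where attribute before the features reach the Geodatabase writer.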

 

