
I am using Data Interoperability (Build 21302) from Pro 2.8.4 to frequently refresh tables in an ArcGIS Enterprise geodatabase. The tables are registered and published to Portal. The client uses the data in web maps, apps, and dashboards in ArcGIS Enterprise Portal, and requires that specific web layers have little or, preferably, no time during which they are empty of data.

 

Instead of using Truncate and Append to Existing Table in the ArcSDE writer, I am using the ChangeDetector to flag all incoming features as both "Inserted" (containing the latest data/timestamp) and "Deleted" (the data from 5 minutes ago), then matching by unique ID in the writer. From testing, this enables a "seamless" refresh with minimal time during which the table appears empty in Portal.
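For anyone reading along who isn't familiar with the ChangeDetector pattern, here is a rough, hypothetical sketch in plain Python of the kind of diff it performs: two snapshots of rows are matched on a unique ID (not the ObjectID), and only the differences become inserts, updates, and deletes, so the table is never emptied wholesale. The field names here are made up for illustration; this is not the actual FME implementation.

```python
def diff_by_unique_id(old_rows, new_rows, key="unique_id"):
    """Diff two snapshots of rows (lists of dicts) on a unique ID field.

    Returns (inserts, updates, deletes):
      - inserts: rows whose key appears only in the new snapshot
      - deletes: rows whose key appears only in the old snapshot
      - updates: rows present in both snapshots whose attributes changed
    """
    old = {r[key]: r for r in old_rows}
    new = {r[key]: r for r in new_rows}

    inserts = [new[k] for k in new.keys() - old.keys()]
    deletes = [old[k] for k in old.keys() - new.keys()]
    updates = [new[k] for k in new.keys() & old.keys() if new[k] != old[k]]
    return inserts, updates, deletes
```

Because unchanged rows fall into none of the three buckets, a refresh where nothing changed touches nothing at all, which is what keeps the layer populated between runs.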

 

Is this method at all stupid or precarious? If so, what is a better solution? Also, is it risky to read from and write to the same table in the same workspace?

 

Background Notes:

One frequently refreshed table can have duplicate polygons with different data.

The data can change as frequently as every 5-10 minutes, and then sometimes not for days.

The Enterprise geodatabase uses PostgreSQL.

 

I have attached the image of the workspace for reference. Please let me know if I am missing anything. Any suggestions would be greatly appreciated.

 

Hello @chazbacca​, thanks for posting to the FME Community! Your current method looks fine. Given the frequency, this may be the preferred approach, as it sounds like you could be running every 5-10 minutes.

Truncate is a better option when dealing with very large quantities of data (e.g., why delete 100,000 rows when you could just truncate?). Also, I think you're fine to read from and write to the same table in the same workspace. Hope this helps! Kailin


Looks/sounds fine to me. One thing to be aware of (I can't tell from your screenshots; apologies if this doesn't apply) is to make sure the unique ID you're using is not the ObjectID. ObjectIDs can and will change, potentially breaking your data. Your best bet is to use a GUID in a separate field.
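To illustrate the GUID suggestion, here is a minimal, hypothetical Python sketch of assigning a stable GUID to each feature in a separate field. The `guid` field name and the dict-based feature are assumptions for the example; the point is just that the key is set once and never regenerated, so it survives refreshes even if the ObjectID does not.

```python
import uuid

def assign_guid(feature):
    """Give the feature a GUID in a dedicated field, but only if it
    doesn't already have one, so the key stays stable across refreshes."""
    feature.setdefault("guid", str(uuid.uuid4()))
    return feature
```

Running this on a feature that already carries a GUID leaves it untouched, which is exactly the stability the ObjectID cannot guarantee.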


Awesome, thank you! I'm relieved, as I really wasn't sure about it. Luckily the data "should" never be more than about 500 records.

Also, the unique ID is not the ObjectID, but thank you for checking :-)


@chazbacca​ can you share your workbench? I have a similar use case and have not been able to make headway.

