Hey!
I was wondering what the best practices are for updating an SDE table that has archiving enabled.
I have a few tables in an SDE database that I need to update regularly. These tables have archiving enabled and contain between 100,000 and 900,000 records.
I have noticed that when I use “Truncate Existing” in my SDE writer, it takes a very long time (I’ve never actually had the patience to let it run to completion… I usually cancel it after a few hours).
If I use “SQL To Run Before Write” to truncate the table before writing to it, all the archived data gets lost.
If I use the arcpy “Truncate Table” tool in a PythonCaller right before I write the data to the table, it works really well: it doesn’t take much time and archiving keeps working too.
I guess one other option would be to detect all the changes and only update those rows, but for various reasons that makes the workflow a bit more complicated. Still, maybe something worth looking into more?
Thanks for any help and inspiration :)
/Vera