Solved

Truncate SDE table with archiving enabled


veschwab
Contributor

Hey!

I was wondering what the best practices are for updating an SDE table that has archiving enabled.

 

I have a few tables in an SDE database that I need to update regularly. These tables have archiving enabled and contain between 100,000 and 900,000 records.

I have noticed that when I use “Truncate Existing” in my SDE writer, it takes a long time (I’ve never actually had the patience to let it run to completion; I usually cancelled it after a few hours).

If I use “SQL To Run Before Write” to truncate the table before writing to it, all the archived data gets lost.

If I use the arcpy function TruncateTable in a PythonCaller right before I write the data to the table, it works really well: it doesn’t take much time, and archiving keeps working.

 

I guess one other option would be to detect all the changes and only update those rows, but for various reasons that makes the workflow a bit more complicated. Maybe it’s worth looking into more, though?
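The change-detection idea can be sketched in plain Python: hash each row, compare against the destination, and classify rows as inserts, updates, or deletes so only those get written. The `id` key and column names here are made up for illustration:

```python
import hashlib

def row_key(row):
    # Assumes a stable unique identifier column named "id" (hypothetical).
    return row["id"]

def row_hash(row):
    """Stable digest of a row's attribute values, independent of key order."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def diff(source_rows, dest_rows):
    """Classify source rows as inserts, updates, or deletes
    relative to what is already in the destination table."""
    src = {row_key(r): row_hash(r) for r in source_rows}
    dst = {row_key(r): row_hash(r) for r in dest_rows}
    inserts = [k for k in src if k not in dst]
    deletes = [k for k in dst if k not in src]
    updates = [k for k in src if k in dst and src[k] != dst[k]]
    return inserts, updates, deletes

dest = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}]
source = [{"id": 1, "name": "a"}, {"id": 2, "name": "B"}, {"id": 3, "name": "c"}]
ins, upd, dele = diff(source, dest)
# ins == [3], upd == [2], dele == []
```

FME also ships a ChangeDetector transformer that does essentially this comparison for you, which may be easier than hand-rolling it in a PythonCaller.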

 

Thanks for any help and inspiration :) 

 

/Vera


4 replies

hkingsbury
Celebrity
  • Best Answer
  • July 16, 2025

Archiving does slow down the processing time. Waiting a couple of hours is what I would expect, especially given the volume of data you’re dealing with.

I would suggest it’s worth your time going through a change detection process to only update what needs updating. Do you have last-edited dates on your source data? That will make it much faster to figure out which source changes need to be applied to the destination.

If you need to do a full load and don’t need the archive history kept (though it sounds like you do), then disabling archiving, truncating, loading, and re-enabling archiving is going to be the most efficient approach.
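That disable/truncate/load/enable sequence can be sketched with arcpy’s geoprocessing tools. The table and source paths are placeholders; note that DisableArchiving discards the archive history unless you explicitly preserve it, so this only fits when losing that history is acceptable:

```python
def full_reload(table_path, source_path):
    """Full reload of an archived SDE table (hypothetical paths):
    turn archiving off, truncate, append fresh rows, turn archiving
    back on. Archive history is NOT kept across this sequence unless
    DisableArchiving is told to preserve it."""
    import arcpy  # deferred: needs an ArcGIS-enabled Python
    arcpy.management.DisableArchiving(table_path)
    arcpy.management.TruncateTable(table_path)
    # NO_TEST assumes the source schema already matches the target.
    arcpy.management.Append(source_path, table_path, "NO_TEST")
    arcpy.management.EnableArchiving(table_path)
```

Because the arcpy import is deferred, the function can be defined anywhere and only requires ArcGIS when actually called.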


veschwab
Contributor
  • Author
  • July 17, 2025

@hkingsbury Thank you! That’s what I already suspected. Unfortunately I do need to keep archiving on; otherwise I agree that would’ve been a good approach.

I’ll try to make it work with a change detection process. Unfortunately there are no last-edited dates, but there might be other ways to make it work. Thanks so much for the help and ideas!


ebygomm
Influencer
  • July 17, 2025

I’d suggest a change-only detection process; repeated truncate-and-loads with archiving turned on will generate a huge amount of data with no real value, on top of the processing time.


veschwab
Contributor
  • Author
  • July 17, 2025

@ebygomm Yeah I also had some concerns about this… Thanks for the tip!

