Question

Truncate apparently not working

  • October 5, 2018
  • 9 replies
  • 455 views

lorinda
Contributor

I'm running an automated script that gets data from another source to map. I have a table in SDE (10.3.1) with the writer set to Insert with Truncate Existing, but the table is not truncating. It appears to work with Drop and Create; however, we want to keep the permissions and such intact, which is why we are using Truncate. This feed started today with about 15 rows, changed to 8 rows, then to 0 rows, then back to 8 rows. This data is being used in a Dashboard on Esri Portal for real-time reporting.

FME 2018.1.0.2 64-bit Build 18547, ArcGIS Desktop 10.3.1

Thanks,

This post is closed to further activity.

9 replies

lorinda
Contributor
  • Author
  • October 5, 2018

I need to edit this to mention that when there is NO data to be written, the table is not truncating. It would seem to me that the table should truncate whether or not there is new data to be written.

Thanks


fmelizard
Safer
  • October 5, 2018

Hi @lorinda

As mentioned in the Geodatabase writer documentation, both 'Truncate Table' and 'Drop and Create' parameters are only valid if a feature reaches the writer.

 


lorinda
Contributor
  • Author
  • October 5, 2018


That seems to be the issue. I got around it by creating a model in ArcMap to 'Delete Features' from the feature class first, then exported it to Python and now run it as a script before the rest of the process starts. That seems to have solved the issue - until we get hit with very large files for this process. I'm hoping that at that point, when running on the server, the speed between the servers will compensate for the increased processing time for the number of features. I have another FME script where this same process will be used to see if it helps with processing time; it was taking over 4 hours (and was sometimes killed after 36 hours due to not processing at all) to truncate the feature class before inserting the new rows (approximately 125K features).

 

 


runneals
Supporter
  • October 5, 2018

To do what you want, I suggest putting the following in the writer's "SQL To Run Before Write" parameter:

TRUNCATE TABLE [SCHEMANAME.TABLENAME]

debbiatsafe
Safer
  • October 5, 2018

If you're using arcpy.DeleteFeatures_management() in your Python script, I would suggest using arcpy.TruncateTable_management() instead.

 

 

As noted in the Esri documentation for Delete Features, deleting all features can be slow and it is recommended to use Truncate Table instead if you intend to delete all rows.
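Since arcpy only runs where ArcGIS is installed, here is a minimal stand-in sketch using Python's built-in sqlite3 module to illustrate the same trade-off debbiatsafe describes: deleting rows one statement at a time (roughly what Delete Features does) versus clearing the table with a single bulk statement (the truncate-style path). The table and column names here are made up for illustration, not taken from the original workflow.

```python
import sqlite3

# Stand-in sketch (not arcpy): contrast per-row deletes with a single
# bulk delete, the trade-off behind DeleteFeatures vs TruncateTable.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE features (id INTEGER PRIMARY KEY, name TEXT)")
cur.executemany("INSERT INTO features (name) VALUES (?)",
                [("f%d" % i,) for i in range(1000)])
conn.commit()

# Per-row delete: one statement per feature (slow on large tables)
for (row_id,) in cur.execute("SELECT id FROM features").fetchall():
    cur.execute("DELETE FROM features WHERE id = ?", (row_id,))
conn.commit()

# Repopulate, then bulk delete: one statement clears the whole table
cur.executemany("INSERT INTO features (name) VALUES (?)",
                [("f%d" % i,) for i in range(1000)])
conn.commit()
cur.execute("DELETE FROM features")  # SQLite's truncate-optimized path
conn.commit()

count = cur.execute("SELECT COUNT(*) FROM features").fetchone()[0]
print(count)  # 0
```

The principle carries over: one bulk operation avoids per-row overhead, which is why Esri recommends Truncate Table when all rows are being removed.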

 


lorinda
Contributor
  • Author
  • October 8, 2018

This would be fine for a regular SQL table. Esri has many dependencies on the base table (which would be schemaname.tablename). Therefore, using a model translated to Python works in the Esri world to clean out all of the tables associated with that one.

 

 


runneals
Supporter
  • October 9, 2018

I do this for our SDE tables when they are massive (hundred-thousand-plus record tables) because it is fast. It doesn't allow you to revert the way the truncate option does, but it works just fine and hasn't caused any issues.

 


runneals
Supporter
  • October 9, 2018
Also note that ArcPy (in ArcGIS Desktop) runs on Python 2.7, so that should be considered until it is upgraded to Python 3.4, since future versions won't support 2.7.
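One small way to prepare such scripts for the 2.7-to-3.x transition is to write them so they run under both interpreters and report which one they are on. This is a minimal sketch under that assumption; the `python_major` helper is hypothetical, not part of ArcPy.

```python
# __future__ import keeps print() valid under Python 2.7 as well
from __future__ import print_function

import sys

def python_major():
    """Return the major version of the running interpreter (2 or 3)."""
    return sys.version_info[0]

if python_major() < 3:
    print("Running on Python 2.x - plan to migrate this script to 3.x")
else:
    print("Running on Python 3.x")
```

Writing to the common subset of both versions means the same pre-write script keeps working when the ArcGIS environment is eventually upgraded.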

mike_gresh
Contributor
  • March 10, 2019

Hi @lorinda,

 

For the SDE table to be truncated, it must be within an edit session, so you will need to set your Transaction Type to "Versioned edit session" or "Non-versioned edit session", depending on your data. You can either do this when you add the writer or change it afterwards in the writer parameters under "Transaction Type".

Let me know if it works for you.