Solved

Can FeatureWriter's UPDATE Feature Operation Be Sped Up?



Hello.

Currently I'm using the FeatureWriter to update my file geodatabase. It's running at a rate of 18 features per minute, comparing 13,998 update features against 1.2 million in the dataset being updated.

I'm running this test on a group of features that is representative of a typical update in terms of feature quantity.

Is there any way to speed this up?

Thanks.


6 replies

kimo
Contributor
  • Contributor
  • Best Answer
  • October 10, 2018

Does the EntityID have an index? This is critical for performance. You can build one with the ArcGIS indexing tool (Add Attribute Index). It also helps if the ID has the same data type in both datasets, so that no conversion between integer and string is done during matching.

As a comparison, my last run applied 120,780 updates to a table containing 23 million records in 10 minutes (about 200/sec).

Turning off transactions can also improve speed. In my case I see that I left the transaction interval at the default of 1000.

My dataset is local, not on a network.
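Not FME-specific, but a minimal Python sketch of why the index matters here: without an index, each of the ~14,000 update features forces a scan of the large table, while an index turns every lookup into a single hash probe. The table sizes below are hypothetical, scaled down so the demo runs quickly.

```python
import random
import time

# Stand-ins for the thread's data: a "table" of 200,000 rows
# (scaled down from 1.2M) and 2,000 update keys.
table = [{"EntityID": i, "value": i * 2} for i in range(200_000)]
updates = random.sample(range(200_000), 2_000)

# Without an index: every update scans the table (O(n) per lookup).
t0 = time.perf_counter()
for key in updates[:50]:  # only 50 scans, or this takes far too long
    row = next(r for r in table if r["EntityID"] == key)
scan_time = time.perf_counter() - t0

# With an "index": build a hash map once, then each lookup is O(1).
t0 = time.perf_counter()
index = {r["EntityID"]: r for r in table}  # one-time build cost
for key in updates:                        # all 2,000 lookups
    row = index[key]
indexed_time = time.perf_counter() - t0

print(f"50 unindexed lookups:  {scan_time:.3f}s")
print(f"2000 indexed lookups:  {indexed_time:.3f}s (incl. index build)")
```

The same asymmetry is what an attribute index buys inside the geodatabase: the per-feature cost stops depending on the size of the target table.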


kimo wrote:

Does the EntityID have an index? […]

I have not indexed the dataset, no, and so far I don't have any experience with indexing, but I can definitely try it. I would need a method to do it via FME.


So Transaction Type would be "None"? No problem trying that out, just verifying. I'm not really too familiar yet with how transactions work, to be honest.


The EntityID field in the dataset to be updated (file GDB) is written as DOUBLE, but I'm not sure about the type in the updating features. The translations I work on write to a network drive.
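On the data-type point: if the key is DOUBLE on one side and text on the other, every comparison pays a conversion, and a naive exact-match lookup misses entirely. A tiny illustrative sketch in plain Python, with hypothetical values:

```python
# An index keyed by float, like a DOUBLE EntityID in the file geodatabase.
index = {1.0: "road", 2.0: "river", 3.0: "parcel"}

# If the updating features carry the ID as a string, an exact lookup misses:
print(index.get("2"))           # None — "2" is not equal to 2.0

# Matching only works after an explicit conversion on every single lookup,
# which is the per-feature overhead of mismatched key types:
print(index.get(float("2")))    # "river"
```

Keeping the key the same type on both sides removes that per-feature conversion entirely.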


Thanks!



kimo wrote:

Does the EntityID have an index? […]

Also, would I need to index the dataset to be updated or the updating dataset?


Thanks again.




Thanks @kimo. This is pretty delayed, but I've been using your answer, and indexing the feature class works extremely well. According to my last test run, I went from four hours of processing to five minutes. That's just crazy.


kimo
Contributor
  • Contributor
  • January 11, 2019
ronaldmcoker wrote:

Thanks @kimo. […] indexing the feature class works extremely well. […]

I have just replaced my writer with the 2018.1 version and used the fme_db_operation attribute to merge the delete/update/add operations into one writer feature type, and the time has been reduced from 20 minutes to 1 minute, a jump to about 1700/sec. Maybe the CSV2 reader has made a difference too, but that is a big improvement. It seems that the changes are cached and then applied in one transaction instead of serially. I turned off transactions.

Title_Memorial                                                 133920
2019-01-07 14:50:05|  74.3|  0.0|STATS |changeset              133920
2019-01-07 14:50:05|  74.3|  0.0|STATS |Total Features Written 267840
2019-01-07 14:50:05|  74.3|  0.0|INFORM|Translation was SUCCESSFUL with 909 warning(s) (267840 feature(s) output)
2019-01-07 14:50:05|  74.3|  0.0|INFORM|FME Session Duration: 1 minute 17.5 seconds. (CPU: 60.1s user, 14.1s system)
2019-01-07 14:50:05|  74.3|  0.0|INFORM|END - ProcessID: 16104, peak process memory usage: 375528 kB, current process memory usage: 117120 kB
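The one-big-transaction effect kimo describes is easy to reproduce outside FME. A sketch using Python's sqlite3 (standing in for the file geodatabase; table and row counts are made up), comparing a commit after every row against caching all changes and applying them in a single transaction:

```python
import sqlite3
import time

def make_db():
    """Create an in-memory table of 20,000 rows, all with val='old'."""
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
    con.executemany("INSERT INTO t VALUES (?, ?)",
                    [(i, "old") for i in range(20_000)])
    con.commit()
    return con

updates = [("new", i) for i in range(0, 20_000, 2)]  # 10,000 updates

# Serial: commit after every single update (transaction interval of 1).
con = make_db()
t0 = time.perf_counter()
for val, i in updates:
    con.execute("UPDATE t SET val = ? WHERE id = ?", (val, i))
    con.commit()
serial = time.perf_counter() - t0

# Batched: apply all cached changes in one transaction, one commit.
con2 = make_db()
t0 = time.perf_counter()
con2.executemany("UPDATE t SET val = ? WHERE id = ?", updates)
con2.commit()
batched = time.perf_counter() - t0

print(f"per-row commits: {serial:.3f}s, one transaction: {batched:.3f}s")
```

The per-commit overhead is exactly what disappears when the writer caches the changeset and flushes it once.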

kimo wrote:

[…] the time has been reduced to 1 minute from 20 min, a jump to 1700/sec. […]

Wow, even better. Thanks so much for the info on this, @kimo.

