I have a workbench that reads from an already-populated file geodatabase (FGDB) containing 140 feature classes. I am trying to add an extra column to the majority of those feature classes and then populate it for each row with a URL and an ID matching that row. I am finding that it takes about 6 minutes to write back 60,000 rows, and with over 10 million rows to update (roughly 17 hours at that rate), that is an unacceptable timescale.
The workflow is: a PythonCaller creates the new attribute and a non-spatial index, a FeatureReader fetches the ID of each row, an AttributeCreator and an AttributeManager build the URL and merge it with the ID, and an AttributeFilter feeds the FeatureWriter. In the FeatureWriter I am doing an Update and using primary key columns for the row selection. I would appreciate any thoughts on how I might speed up the process.
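In case it helps to see it, here is a minimal sketch of the schema step as I have it, assuming the PythonCaller runs arcpy against the FGDB from within FME; the path, field name, and index name below are placeholders rather than my real values.

```python
# Minimal PythonCaller sketch: add the new text column plus a non-spatial
# (attribute) index to every feature class in the FGDB. Assumes arcpy is
# importable from FME's Python interpreter; paths and names are placeholders.
import arcpy


class FeatureProcessor(object):
    def __init__(self):
        gdb_path = r"C:\data\source.gdb"   # placeholder FGDB path
        url_field = "ASSET_URL"            # placeholder column name

        arcpy.env.workspace = gdb_path
        for fc in arcpy.ListFeatureClasses():
            existing = [f.name for f in arcpy.ListFields(fc)]
            if url_field not in existing:
                # New text column that will hold the URL + ID value.
                arcpy.management.AddField(fc, url_field, "TEXT", field_length=255)
                # Non-spatial index on the new column.
                arcpy.management.AddIndex(fc, [url_field], "idx_" + url_field)

    def input(self, feature):
        # Features just pass through; the schema work happens once above.
        self.pyoutput(feature)
```

The URL value itself is simple concatenation done in the AttributeCreator, along the lines of https://example.com/records/@Value(ID) (example base URL only), so the slow part appears to be the write-back to the FGDB in Update mode rather than building the attribute.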