Hi all,
We've just upgraded from 2017.1.x to 2018.0.0.2, and a workbench that used to run in about 5 minutes is now crawling along at 1 record per second, with 62,000 records to process. The bottleneck appears to be an AttributeValueMapper with 140 lookup values: the FileGDB reader and the AttributeRemover get through all 62,000 records in next to no time, but as soon as features hit the AVM everything slows right down.
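In case it helps to picture it, the AVM is only doing a simple static lookup on one attribute; in a PythonCaller it would amount to something like the sketch below (attribute names and values are placeholders, and the dictionary is trimmed to a few of the 140 entries):

import fme
import fmeobjects

# Placeholder lookup table - the real one has ~140 entries.
LOOKUP = {
    'A1': 'Residential',
    'B2': 'Commercial',
    'C3': 'Industrial',
}

class FeatureProcessor(object):
    def input(self, feature):
        # 'SOURCE_CODE' and 'MAPPED_VALUE' are placeholder attribute names.
        code = feature.getAttribute('SOURCE_CODE')
        # Fall back to the original value if the code isn't in the table.
        feature.setAttribute('MAPPED_VALUE', LOOKUP.get(code, code))
        self.pyoutput(feature)

    def close(self):
        pass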
We've also tried using a FeatureWriter (to CSV, via a DuplicateRemover) to produce a file of unique values, adding the new mapped values to that file, and then bringing them back in with a DatabaseJoiner, but that is still slow at around 2 features per second.
Any suggestions on how to speed things up?
For background, the workbench takes a FileGDB table and uses DatabaseJoiners to join two more FileGDB tables (each sharing a common key field) to build a single output table in PostgreSQL containing selected fields from each table.
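Logically the whole workbench boils down to something like the sketch below (pandas, with made-up table and field names, just to illustrate the join structure and the PostgreSQL load):

import pandas as pd
from sqlalchemy import create_engine

# Placeholder stand-ins for the three FileGDB tables, joined on a shared key.
main = pd.DataFrame({'KEY': [1, 2], 'CODE': ['A1', 'B2']})
lookup_a = pd.DataFrame({'KEY': [1, 2], 'OWNER': ['Smith', 'Jones']})
lookup_b = pd.DataFrame({'KEY': [1, 2], 'ZONE': ['R1', 'C2']})

# Join the two extra tables onto the main table via the common key,
# then keep only the selected fields for the output table.
out = (main.merge(lookup_a, on='KEY', how='left')
           .merge(lookup_b, on='KEY', how='left')
           [['KEY', 'CODE', 'OWNER', 'ZONE']])

# Placeholder connection string - the real target is our PostgreSQL database.
engine = create_engine('postgresql://user:password@host/dbname')
out.to_sql('output_table', engine, if_exists='replace', index=False)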
Thanks
Ross