I made a simple workflow that compares two datasets using the ChangeDetector and writes the changes back to one of those datasets with a writer. The process works fine, but it takes roughly a second per updated record. Is that a normal speed? It's alright for small numbers of changes, but this dataset can have thousands of changes from day to day, which would make the workflow run for over an hour, and that's only one dataset when we need to update dozens of them every night.
I've looked at the performance tuning article, but it either doesn't seem applicable or I don't know enough about the software to know how to check the things it suggests. My data isn't versioned, so I'm using a Transactions writer; I saw another question with this problem, but the answer there was simply not to use the versioning writer, which I'm already doing. The writer's feature operation is set to fme_db_operation (so that the Updated, Inserted, and Deleted ports of the ChangeDetector will work), with Use Existing as the table handling. If there's other information that would help, let me know and I can provide it.
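To be concrete about the fme_db_operation part, here's a minimal PythonCaller-style sketch of the idea (I'm not actually using a PythonCaller; in the workspace the attribute comes from the features on each ChangeDetector port, and the class name and the hardcoded "UPDATE" below are just placeholders):

import fmeobjects  # FME Objects API; commonly imported in PythonCaller scripts

class TagDatabaseOperation(object):
    # Sketch only: tag each incoming feature with the operation the writer
    # should perform. "UPDATE" corresponds to the Updated port; features
    # from the Inserted and Deleted ports would get INSERT and DELETE.
    def input(self, feature):
        feature.setAttribute("fme_db_operation", "UPDATE")
        self.pyoutput(feature)

    def close(self):
        pass

The writer then applies one update, insert, or delete per feature according to that attribute against the existing table, and that per-feature writing is the part taking about a second per record.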
Any ideas on how to make it faster, or is a second per updated record a normal speed?