Hi, we are coming up against an issue that is now a real blocker to our workflow. We are running a workbench on FME Cloud which picks up content in PostGIS and writes to FGDB. The problem is that, at random (honestly), it will fail with the error:

Geodatabase Error (-2147220987): The user does not have permission to execute the operation.

If you run the task again, specifically on the feature it failed on, it will run with no problem.

I know there are other cases of this being reported; however, when we run locally there is no issue.
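Since a rerun on the failed feature always succeeds, one stopgap (outside FME itself) is to wrap the flaky step in a small retry loop. This is a minimal sketch in Python; the operation, attempt count, and delay are all hypothetical placeholders, not anything FME provides:

```python
import time

def retry(op, attempts=3, delay=5.0):
    """Call op(); on a transient failure, wait and try again.

    Re-raises the last exception if every attempt fails.
    """
    for attempt in range(1, attempts + 1):
        try:
            return op()
        except RuntimeError:
            if attempt == attempts:
                raise
            time.sleep(delay)
```

For example, `op` could be a function that submits the FGDB export job and raises `RuntimeError` when the permission error appears in the job result. This only papers over the underlying lock issue, of course.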

https://knowledge.safe.com/questions/45200/writing-esri-file-geodatabase-issue.html

 

https://knowledge.safe.com/articles/362/workspace-fails-on-fme-server-when-using-unc-paths.html

 

@brianatsafe @david_r @mark2atsafe have all commented on this error before; reading through, I can't see any issues that relate, as we are running as superuser and have also tested locally.

The two logs attached outline the error. The larger log is from running a very big task exporting thousands of tables/feature classes; the other is just for the one feature which it (by chance) failed on, which then ran successfully.

 

From the larger log, here is where the error is reported: it was not able to export the feature GDE_K_150L_GEOCHRON.

2019-05-31 10:52:16| 332.0| 0.1|INFORM|FileGDB Writer: Created table 'GDE_K_170L_GEOCHRON'
2019-05-31 10:52:16| 332.1| 0.1|INFORM|FileGDB Writer: Created table 'GDE_K_160M_GEOCHRON'
2019-05-31 10:52:16| 332.1| 0.1|INFORM|FileGDB Writer: Created table 'GDE_K_160L_GEOCHRON'
2019-05-31 10:52:16| 332.2| 0.1|INFORM|FileGDB Writer: Created table 'GDE_K_150M_GEOCHRON'
2019-05-31 10:52:17| 332.2| 0.1|INFORM|FileGDB Writer: Created table 'GDE_K_150L_GEOCHRON'
2019-05-31 10:52:17| 332.3| 0.0|ERROR |Geodatabase Error (-2147220987): The user does not have permission to execute the operation.
2019-05-31 10:52:17| 332.3| 0.0|ERROR |FileGDB Writer: A feature could not be written
2019-05-31 10:52:17| 332.3| 0.0|STATS |Storing feature(s) to FME feature store file `/data/fmeserver/resources/logs/engine/current/jobs/142000/job_142382_log.ffs'
2019-05-31 10:52:17| 332.3| 0.0|ERROR |+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
2019-05-31 10:52:17| 332.3| 0.0|ERROR |Feature Type: `GDE_K_150L_GEOCHRON'
2019-05-31 10:52:17| 332.3| 0.0|ERROR |Attribute(encoded: UTF-8) : `age' has value `93.60000000'
2019-05-31 10:52:17| 332.3| 0.0|ERROR |Attribute(string) : `attribute_0__fme_data_type' has value `fme_int32'
2019-05-31 10:52:17| 332.3| 0.0|ERROR |Attribute(string) : `attribute_0__name' has value `objectid'
2019-05-31 10:52:17| 332.3| 0.0|ERROR |Attribute(string) : `attribute_0__native_data_type' has value `int4'
2019-05-31 10:52:17| 332.3| 0.0|ERROR |Attribute(string) : `attribute_10__fme_data_type' has value `fme_datetime'

Yet the second log, where I ran just GDE_K_150L_GEOCHRON, shows it completed successfully. Full logs here:

 

Initially we ran this task sequentially, writing to the FGDB in a separate workspace with a runner for each feature being passed; now we have tried to do it within a single workspace. Both ways we get the same issue.

The same workflow works fine for shapefile, SQLite, etc. Can someone please suggest where we might be going wrong?

Thank you

Oliver

I have had issues like this before using the File Geodatabase Open API and reading/writing to the same file geodatabase, specifically when trying to drop the same table that you are writing to. It appears to be the .lock file that gets created in the .gdb directory; the permission error happens when trying to delete the lock file, or when trying to drop a table while the .lock file is there.
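A quick way to check whether a stale lock is the culprit is to look inside the .gdb directory for lock files between runs. A minimal sketch in Python, assuming a filesystem path to the geodatabase (the `*.lock` glob is an assumption; actual Esri lock file naming may differ by version):

```python
from pathlib import Path

def list_gdb_locks(gdb_path):
    """Return any .lock files currently present inside a .gdb directory."""
    return sorted(Path(gdb_path).glob("*.lock"))
```

If this still reports lock files after the writer has supposedly finished, the connection is not being released, which would explain the intermittent permission error on the next write.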

Since you are getting the error with two workspaces run sequentially, I am assuming that FME is not releasing the connection between the feature writers.

One thing I noticed: you are writing all features and then running a delete where changedetect_uuid = '-9999'. Is there a way you can filter out this value before writing, eliminating the need for the second pass, or is this value being written into the FGDB?
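To illustrate the suggestion, the sentinel rows could be dropped before the write rather than deleted afterwards. In FME this would typically be a Tester transformer upstream of the writer; as a plain-Python sketch over hypothetical feature dictionaries (the key and sentinel value are taken from the post, everything else is illustrative):

```python
def drop_sentinel(features, key="changedetect_uuid", sentinel="-9999"):
    """Keep only features whose change-detection key is not the sentinel."""
    return [f for f in features if f.get(key) != sentinel]
```

Filtering upstream avoids opening the geodatabase a second time for the delete, which also sidesteps any lock left behind by the first writer.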


@dellerbeck thank you for the reply and for taking the time to review the workbench. I think you are absolutely correct that at some point a lock file is getting created when the feature class is being made and then, for some reason, not released when FME tries to write to it.

With regard to the second writer, which performs the delete: the error/failure happens before FME gets to that point, and with it totally removed from the workbench the job still errors out.

As a not-so-ideal workaround, we have created in Desktop an empty file geodatabase containing every possible feature class, and are testing with FME Cloud writing to this and inserting data rather than having to create and insert.

Early signs are promising, but this isn't a long-term solution, as we end up giving the client thousands of feature classes for any delivery, irrespective of how much they have actually asked for. It also abandons our idea of essentially working dynamically with dynamic schemas: any change to the schema means the file geodatabase has to be regenerated.

Thanks again for your help.

 

