
Hi,

I appear to have an issue with a FeatureWriter failing when writing a certain amount of features.

I have a list of features for which I want to write some dummy data via a FeatureWriter. For each entry in the list I fetch one object (using a Sampler) and the relevant schema from PostGIS tables, then use a FeatureJoiner to match the list feature to the object and schema features. After that I remove the geometry and attribute values, so each feature in the list ends up as a single empty object: a lot of features, but very little actual data.
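To make the "strip geometry and values, keep the schema" step concrete outside FME, here is a minimal Python sketch. The field names (`feature_class`, `geometry`, `attributes`) are hypothetical placeholders, not anything FME exposes:

```python
# Sketch of the dummy-feature preparation step described above.
# A "feature" here is just a dict: schema name, geometry, attribute values.

def make_dummy_feature(joined_feature):
    """Keep the schema (attribute names), drop geometry and attribute values."""
    return {
        "feature_class": joined_feature["feature_class"],
        "geometry": None,  # geometry removed
        # every attribute name is kept, but its value is blanked
        "attributes": {name: None for name in joined_feature["attributes"]},
    }

sample = {
    "feature_class": "J_080M_GDE_MINEX",
    "geometry": "POINT(1 2)",
    "attributes": {"NAME": "foo", "CODE": 42},
}
dummy = make_dummy_feature(sample)
print(dummy["geometry"])            # None
print(sorted(dummy["attributes"]))  # ['CODE', 'NAME']
```

The result carries the full attribute schema but no data, which is what lets the layer files resolve without broken links.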


I then write everything to the FeatureWriter. This works absolutely fine when the list is only about 500 items long, so there doesn't appear to be anything wrong with the process itself, but when I test lists of about 4000 features the writer seems to time out and fail on a random feature.

The error messages I'm getting are along the lines of:

Geodatabase Error (-2147220987): The user does not have permission to execute the operation.

FileGDB Writer: A feature could not be written

Feature Type: `J_080M_GDE_MINEX'

*Then it lists all of the attributes*

FeatureJoiner(FeatureJoinerFactory): A fatal error has occurred. Check the logfile above for details

Is this a timeout issue, and can it be resolved?

Regards

Are you writing the FGDB to a local disk? If so, I don't think a timeout is the issue. Can you check that you're not running out of disk space during the translation?

You may also want to try replacing the FeatureJoiner with a FeatureMerger and see if that makes a difference.


I would agree with @david_r - it doesn't look like a timeout problem; none of your error messages indicate a timeout. It might help the community to find an answer if you expand a little on what you're trying to accomplish here. Why are you writing dummy features to a File Geodatabase? And is it the API or the ArcObjects version of the Geodatabase writer?

If the feature classes already exist in the Geodb then you may need to use the Edit Session option, or drop the feature classes before writing. Also, try switching to File Geodb (ArcObjects).


So, I am using the Open API version of the GDB writer, writing to a brand-new GDB on FME Server (so Edit Session shouldn't be needed). There are no duplicate feature class names to cause a problem, and disk space is definitely not an issue. And I know the process works for a smaller number of features, so the FeatureJoiner isn't causing it.

We have many layer files in which we don't want broken links, or to have to remove elements. That is why we write out dummy features first: to make sure that every possible feature has a dummy set of data. A user requests data from a geographic area but, depending on what they request, not all features will contain data.

After this dummy-features process completes, an FMEServerJobSubmitter then processes the full data and overwrites only the dummy feature classes it needs to, using Drop and Create. And voila! Full layer files without any broken links! Again, this runs fine with less data, and so far there haven't been any issues with the FMEServerJobSubmitter part.

If it helps, I can mock-up a very simplified version of my workbench.


Have you tried using a template file on the File Geodatabase writer? If you want to write a File Geodb with all the feature classes created (even if some are not written to), then using a template file geodatabase is the way to go. For the File Geodatabase (API) writer, you can point the writer to an empty File Geodb that has all the feature classes (and feature datasets, subtypes, domains, etc.) and then write into that.

This doesn't identify the cause of your Geodatabase Error, but I think it would simplify your workflow considerably (unless there is a reason a File Geodb template doesn't work for you).



In other scenarios, Mark, this could be a good workaround, but the PostGIS tables we read from contain whatever the current schema is. If any of the schemas change, which is likely, the template geodatabase will be out of date. Reading from the PostGIS tables means the schema is truly dynamic.



I have created a simplified workbench attached here to show the process. featurewriter.fmwt

The bit in the green bookmark is just the process to get the list of around 4000 features. It's the bits in the grey and orange bookmarks we are concerned with.

The list gets passed to the DuplicateFilter because, in the list of 4000, there are only 21 unique schemas. One random feature per schema is then read in from the PostGIS table, and each one is joined back to its corresponding schema name from the full list. The geometry of the random feature is then removed, and the features are written out. Each feature class written out contains one empty feature: no geometry, but the correct schema.
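The DuplicateFilter step amounts to keeping the first feature per schema name. A minimal Python sketch of that logic (the list contents and the `schema` key are hypothetical):

```python
def unique_schemas(features):
    """Keep one entry per schema name, like FME's DuplicateFilter keyed on schema."""
    seen = set()
    unique = []
    for feat in features:
        if feat["schema"] not in seen:
            seen.add(feat["schema"])
            unique.append(feat)
    return unique

# 4000-ish entries collapse to one per unique schema; here 6 entries -> 3.
full_list = [{"schema": s} for s in ["A", "B", "A", "C", "B", "A"]]
print([f["schema"] for f in unique_schemas(full_list)])  # ['A', 'B', 'C']
```

Whatever causes the failure, it happens downstream of this step, since the same deduplicated set works at the smaller feature count.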

As I said, this process works fine with about 500 features but fails at 4000 (I believe it fails at around 850-900). The process then runs on to the FeatureHolder, where other data comes in, before we run another workbench using the FMEServerJobSubmitter, but the job will already have failed at the FeatureWriter when I run the list of 4000.

I appreciate you won't be able to test this past the ListBuilder, but you can at least see the process.

Incidentally, I have tried using the SchemaMapper, but it always stalls at the Reading Dataset part.

