I am trying to see if I can update a feature layer in our ESRI Portal from an FGDB. We know we can insert, update, and delete features and have had success on those fronts. However, we would also like to handle a dynamic schema, since we may change the data in our FGDB by removing/adding fields. I have tried several times to get a dynamic writer working to account for a schema change, but without luck. I have also tried deleting the data in the feature layer and then inserting, but that produced errors.
Does anyone have advice or has done this?
The FGDB is just a testing stage; eventually we would want to update directly from SDE.
Dynamic Writing is usually used where you are either:
- Creating a new table with an undefined set of fields from scratch; or
- Updating multiple tables at once that have different Schemas from each other, but where each table's Schema is fixed.
If you are only updating a single, fixed-Schema Feature Layer with Insert, Update or Delete, then don't confuse having dynamic features with the need to do dynamic writing. In these write modes you cannot add/remove/alter fields in the source (FGDB, SDE, etc.) and expect the Writer to add/remove/alter those fields in the destination. These modes do not alter the Destination Schema unless a particular Writer has a “Drop Table” option.
Further, if you try to write data to field names that don't exist in the destination layer/table, which can happen in a dynamic update/insert write, in most cases the Writer will encounter an error from the destination system essentially replying that there is no field named “xyz”, and it will often fail to write any data for that feature at all.
So, in the case of updating a single existing Feature Layer, I would consider using a non-dynamic Writer in either Automatic or Manual mode. You can still have dynamic features in the workspace; to handle them, simply add an AttributeExposer just before the Writer that lists the field names in the destination layer.
Alternatively, a harder approach for this particular use case is to stay in dynamic writer mode. For the Insert and Update writer modes, a separate Schema Feature needs to be the first feature sent to the Writer (a Sorter can often ensure the Schema Feature(s) sort above the Data Features), and it must be the Schema of the destination Feature Layer being updated, not the Schema of the source. To succeed, the field names and data types in each dynamic feature need to match this Schema for the destination fields you intend to update/insert. One way to generate this Schema Feature is a separate FeatureReader in Schema Only mode, feeding the destination schema into the Writer. This can be harder to design and test, which is why, for just a single fixed Feature Layer schema, a non-dynamic Automatic/Manual write could be quicker to implement in the workspace.
If the use case is instead to add/remove/alter fields in the destination, then the workflow needs to drop/delete the Feature Layer first, or use the administration tools on the layer outside of FME to alter the Feature Layer to reflect the source fields before doing an FME write.
Additionally, note that Update and Delete modes will almost always need a Primary Key/Unique Identifier field exposed to the workspace, either on the Reader or downstream of the Reader with an AttributeExposer, to be used as the key lookup value for the feature to update in the destination.
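For example, the same requirement shows up if the update is done outside FME against the feature layer's applyEdits REST operation: every update must carry the key the service uses to find the row. A minimal sketch in Python, where the layer URL, token, and field names are placeholders only:

```python
# Minimal sketch of an update via a feature layer's applyEdits REST operation.
# The layer URL, token, and field names are placeholders for illustration.
import json
import requests

LAYER_URL = "https://myportal.example.com/arcgis/rest/services/SWQF_COS/FeatureServer/0"

payload = {
    "f": "json",
    "token": "<your-token>",
    # Each update must include the key the service uses to locate the row,
    # typically OBJECTID (or GlobalID) -- the same key FME needs exposed.
    "updates": json.dumps([
        {"attributes": {"OBJECTID": 101, "STATUS": "Reviewed"}}
    ]),
}

response = requests.post(f"{LAYER_URL}/applyEdits", data=payload)
print(response.json())
```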
I would like to add/remove fields in the destination (a Feature Layer in AGOL and in Portal) based on an FGDB. We have cases in which we add fields in our source dataset (SDE) and want that change reflected in our online feature layers.
We can do this in Python by deleting the feature service and then rewriting it with the new data, but we are moving some of our scripts over to FME and ran into the problem that, occasionally, we add new fields in our SDE database and the respective feature layers need to reflect that update.
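Roughly, that Python approach looks something like the following sketch with the ArcGIS API for Python; the Portal URL, credentials, item title, and FGDB path are placeholders only:

```python
# Rough sketch of the "delete the feature service, then republish it" approach
# using the ArcGIS API for Python. Portal URL, credentials, item title, and the
# zipped FGDB path are placeholders.
from arcgis.gis import GIS

gis = GIS("https://myportal.example.com/portal", "username", "password")

# Remove the existing hosted feature layer item
for item in gis.content.search("title:SWQF_COS", item_type="Feature Layer"):
    item.delete()

# Upload the zipped FGDB and publish it again, picking up the new schema
fgdb_item = gis.content.add(
    {"title": "SWQF_COS", "type": "File Geodatabase"},
    data=r"C:\data\SWQF_COS.gdb.zip",
)
feature_layer_item = fgdb_item.publish()
print(feature_layer_item.url)
```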
I’m using an FGDB to test these changes, but ultimately SDE is our source dataset.
So is it possible to add a field to an already published Feature Layer in AGOL/Portal if it was not part of the original publish and therefore does not currently exist?
The ArcGIS Portal Feature Service Reader/Writer documentation suggests that the Schema can only be modified by overwriting an existing Feature Service (and its Layers), deleting it and replacing it during the Write:
Layers can only be created as part of creating a Feature Service. Creating new layers in an existing Feature Service is not supported at this time
This does sound somewhat similar to your existing Python script, however. The parallel in the FME Writer is to set Feature Service Handling to Overwrite Existing and to set each Feature Type to Insert mode. Delete and Update feature write operations, in contrast, will generally only work when writing to an existing schema that is not being overwritten.
Creating the layers only requires a single feature per feature type: the Writer uses the first feature per feature type to overwrite the existing Feature Service and create new layers with a Schema equal to that of the first feature per feature type. The Schema replacement plus the Insert of all features can then be done in one operation, or the Portal Schema generation can be done separately using just single features.
So it is “possible” to do this, but by overwriting and replacing the existing Feature Service. Note that the documentation referenced above also describes how to define Field Aliases at the same time.
Alternatively, if you want to continue handling this in Python because there are particular customisations to keep, the script can be transferred into FME via a PythonCaller, or scripted as the equivalent REST commands and called via HTTPCaller. The differences between the source and destination schemas can be detected through two FeatureReaders, reading source and destination respectively, and analysed with a ChangeDetector.
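As a rough equivalent of that comparison outside FME, the source-versus-destination check could be sketched in Python; the FGDB path, layer URL, and token are placeholders only:

```python
# Sketch of comparing the source (FGDB) fields against the published layer's
# fields -- mirroring what two FeatureReaders plus a ChangeDetector would do
# inside FME. The FGDB path, layer URL, and token are placeholders.
import arcpy
import requests

fgdb_fields = {f.name.lower() for f in arcpy.ListFields(r"C:\data\SWQF_COS.gdb\SWQF")}

layer_url = "https://myportal.example.com/arcgis/rest/services/SWQF_COS/FeatureServer/0"
layer_json = requests.get(layer_url, params={"f": "json", "token": "<your-token>"}).json()
service_fields = {f["name"].lower() for f in layer_json["fields"]}

print("Fields missing from the service:", fgdb_fields - service_fields)
print("Fields no longer in the source: ", service_fields - fgdb_fields)
```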
I do this in SDE using Arcpy.
If I did it in Portal/AGOL, then I would be making use of API calls (but that is because I actually quite like the Esri REST API). addToDefinition would be your friend: https://developers.arcgis.com/rest/services-reference/online/add-to-definition-feature-layer-.htm
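For illustration, a minimal sketch of calling addToDefinition against a hosted feature layer's admin endpoint; the admin URL, token, and the new field definition are placeholders, not taken from any particular service:

```python
# Minimal sketch of adding a field to an existing hosted feature layer via the
# addToDefinition admin operation linked above. The admin URL, token, and the
# new field definition are placeholders.
import json
import requests

ADMIN_LAYER_URL = (
    "https://myportal.example.com/arcgis/rest/admin/services/SWQF_COS/FeatureServer/0"
)

payload = {
    "f": "json",
    "token": "<your-token>",
    "addToDefinition": json.dumps({
        "fields": [
            {
                "name": "inspection_date",
                "type": "esriFieldTypeDate",
                "alias": "Inspection Date",
                "nullable": True,
            }
        ]
    }),
}

response = requests.post(f"{ADMIN_LAYER_URL}/addToDefinition", data=payload)
print(response.json())
```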
But as others have alluded to, it needs to be completed in two parts.
One part involves the add/delete/alter field operations. I build and run that in FME.
I use a “Reader as a Resource” to get the schema from SDE (annoying... while the schema is dynamic, Reader as a Resource requires that all relevant SDE tables are selected; no wildcard). Of course there are other options, such as schema features, etc.
But the key part of the story is that this is certainly achievable.
Hello!
I believe there is a 'native' FME way of writing (updating/inserting) dynamic data (changing attributes/fields) into the Portal Feature Service. Since you are pushing FGDB or SDE into the Portal Feature Service, I'd recommend using the FeatureWriter transformer. In the Parameters... for the FeatureWriter, select "Overwrite Existing" for Feature Service Handling (the function of this option is essentially a "drop and create" operation) and specify the new/updated schema for the Dynamic Schema Definition.
Also, "Overwrite Existing" for Feature Service Handling was added in FME 2023.0, so you need to be on this or a newer build.
Using HttpCaller or PythonCaller may give you finer control, but I think it is achievable using the method I described.
Hope this helps!
I have not had success with the “native” FME way. We might move to the PythonCaller since it seems FME cannot achieve the schema changes using the built-in transformers.
I received this error after trying the native way:
ArcGIS Portal Feature Service Writer: New feature service 'SWQF_COS' creation failed: '(layer SWQF) ArcGIS Portal Feature Service Writer: Encountered an unexpected error. The error code from the server was '500' and the message was: 'ERROR: Inconsistency in the Field Definitions found.'. Details: '''
On my side I had problems with two things (see the small sketch after this list):
1- I had to put all my field names in lowercase so Portal would accept them
2- I had to remove shape_length / shape_area from my attributes for all my layers. Portal will create those attributes itself.
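For what it's worth, a small sketch of both clean-ups as they might run in a PythonCaller or a pre-processing step; the attribute names shown are examples only:

```python
# Small sketch of the two clean-ups above: lowercase every field name and drop
# the geometry-derived fields that Portal recreates on its own. The example
# attributes are illustrative only.
def clean_attributes(attributes):
    dropped = {"shape_length", "shape_area"}
    return {
        name.lower(): value
        for name, value in attributes.items()
        if name.lower() not in dropped
    }

print(clean_attributes({"Shape_Length": 12.5, "Shape_Area": 3.1, "STATUS": "OK"}))
# -> {'status': 'OK'}
```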