@dan_mbc are you able to include a small sample of the data and your workspace? Also, did you use FME to create the Feature Class on your Portal, the first time you loaded the data? (if you can't make the data public you could send it to me at mark @ safe.com)
Thanks for the response @Mark Stoakes . Sorry I didn't see this over the weekend!
I have attached a sample of the data, which has been anonymised. This is the CSV that is then used to create the feature class in Portal. So yes, FME was used to create the feature class in Portal the first time.
I don't think I can share the workspace as that has the live data in it which is sensitive. If you need me to send anything else please let me know.
Thanks again.
Hi @dan_mbc Try checking the values in your CSV file. Maybe the first time you wrote the data the values were small; on subsequent runs you are truncating the table, but the column definitions are still set to the sizes from the old FME run. You could create the table every time instead of truncating it.
Thank you for the suggestion.
The values aren't changing at this stage, as I was just checking that the workbench works. It works when creating the feature class, but not when trying to update/overwrite it, so I don't think it is a value-size problem.
I could try creating it every time, but won't that mean a new version of the data each time, and that I will have to delete the previous version? The reason I was trying to overwrite is that I have the feature layer in maps and set up with symbology etc., so I am not sure whether recreating it will break that link.
Hi @dan_mbc
I think the issue is tied to the JSON being sent to the Feature Service when the Feature Service already exists. If you cast the integers using an AttributeManager as follows, I think it should work.
For example:
northings = @Evaluate(@int64(@Value(northings)))
eastings = @Evaluate(@int64(@Value(eastings)))
age = @Evaluate(@int8(@Value(age)))
Let me know if that works!
Hi @trentatsafe , thank you for the suggestion.
I have just tried this, and either it isn't working or I have configured the AttributeManager incorrectly (probably the latter!). I have tried the attached settings.
But it seems to rename the actual attributes, as shown in the snippet above, which in turn makes the next step, which creates the vertex, fail. Have I just got the values wrong? I used the AttributeManager after the AttributeRenamer and before the VertexCreator. It runs through the AttributeManager fine, but then fails in the VertexCreator because the fields now look like easting = 447758 etc.
Thanks
I have just updated the AttributeManager and think I now have it configured correctly, screenshot attached.
The error when running a second time to overwrite is:
ArcGIS Portal Feature Service Writer: Encountered an unexpected error. The error code from the server was '500' and the message was: 'JSONObject["globalId"] not found.'. Details: ''
ARCGISPORTALFEATURES writer: A fatal error has occurred. Check the logfile above for details
... Last line repeated 2 times ...
Thanks
Dan
@dan_mbc It looks like you're entering the functions as literal strings. In the AttributeManager, use the Arithmetic Editor to build the expression from the Math Functions:
Hi @dan_mbc and others coming across this error,
In my case it turned out to be a string value exceeding the width of the esriFieldTypeString(width) attribute in the Feature Type definition. The first run, writing to a new feature service, gives no problem; the second run, writing the same data to the existing feature service, throws this error.
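For anyone hitting the same truncation issue, one way to catch it before the second run is to compare the longest value in each CSV column against the string-field widths defined on the existing feature service. A minimal Python sketch; the column names and widths in FIELD_WIDTHS are placeholders, so substitute the actual values from your feature service's field definitions in Portal:

```python
import csv

# Placeholder widths -- read the real esriFieldTypeString lengths
# from the feature service definition in Portal.
FIELD_WIDTHS = {"name": 50, "address": 100}

def oversized_values(csv_path, field_widths):
    """Return (column, value) pairs whose string length exceeds
    the width defined for that field on the feature service."""
    problems = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            for col, width in field_widths.items():
                value = row.get(col, "")
                if len(value) > width:
                    problems.append((col, value))
    return problems
```

If this returns anything, the offending values would be truncated (or rejected) on the second write, which matches the behaviour described above.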