Question

AGOL writer-is there a limit?

  • 11 April 2024
  • 6 replies
  • 56 views

Badge +3

Hi, 

I have one Form workspace that writes to a hosted feature service on AGOL without issue. The writer is set to insert and truncate. It writes a little more than 9,000 features, and I run it every week without problems.

However, I now have a different hosted feature service on AGOL with more than 23,000 features. When I try the same process to update it, it drops all the existing data but only ends up writing out 15,000 records at most, and the number of records written differs between runs. I have tried adjusting the timeout and Features Per Request, but have not been able to achieve a full write-out. Any suggestions would be appreciated.

6 replies

Userlevel 1
Badge +12

Hi, no, there is no limit. I have written several hundred thousand features to ArcGIS Online. It is more likely that there is an issue with the data you are trying to load into the ArcGIS Online environment.

It might just be a process of elimination, unfortunately. For instance, if the error occurs somewhere after feature 15,000, skip writing the first 15,000 features and reduce your writer's batch size to 100. Then, when it errors again, refine further until you isolate the problem feature.
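A process-of-elimination pass like this can also be scripted against the service directly, outside FME. The sketch below assumes the hosted layer's REST `addFeatures` endpoint and features in Esri JSON; the layer URL and token are placeholders, not values from this thread.

```python
import json
from urllib import parse, request

def chunks(features, size):
    """Split a feature list into (start_index, batch) pairs for upload."""
    for i in range(0, len(features), size):
        yield i, features[i:i + size]

def add_features(layer_url, token, batch):
    """POST one batch to the layer's addFeatures endpoint (ArcGIS REST API)."""
    body = parse.urlencode({
        "f": "json",
        "token": token,
        "features": json.dumps(batch),
    }).encode()
    with request.urlopen(layer_url + "/addFeatures", data=body) as resp:
        return json.loads(resp.read())

# Usage sketch (placeholders, do not run as-is):
#   for start, batch in chunks(all_features, 100):
#       result = add_features("https://services.arcgis.com/<org>/arcgis/rest/"
#                             "services/<name>/FeatureServer/0", "<TOKEN>", batch)
#       if "error" in result or not all(r.get("success")
#                                       for r in result.get("addResults", [])):
#           print("first failure in batch starting at feature", start)
#           break
```

Logging the first failing batch narrows the bad feature down to a window you can then split again with a smaller batch size.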

I have not tested the new ArcGIS Feature Service writer in FME 2023+, but it might be worth a try to see whether it gives you a clearer error message about the problem than the ArcGIS Online writer does.

Badge +3

Thank you. It is good to hear confirmation that it's not the number of records. I will do as you suggested and change the log level to debug to see what the cause could be.


Badge

Yeah, most likely the cause is an invalid value for the field type, such as text in an integer field and the like.


Badge +4

Just an idea, in case the data isn't the issue.

I have had issues in the past with writing large individual records to AGOL. 

The AGOL API has a fixed maximum size for a single upload. For clarity, a "single upload" is every feature in the group uploaded at one time (this is controlled by Features Per Request in the writer parameters, which defaults to 1000).
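One way to check whether a batch is bumping into that per-request size ceiling is to measure the JSON payload a batch would produce. This is a rough sketch, assuming the default ArcGIS Server cap of about 10 MB per POST; the exact limit on your server may differ.

```python
import json

MAX_POST_BYTES = 10 * 1024 * 1024  # assumed default ArcGIS Server POST cap (10 MB)

def batch_payload_bytes(features):
    """Approximate size of the JSON body one addFeatures request would carry."""
    return len(json.dumps(features).encode("utf-8"))

def max_safe_batch(features, limit=MAX_POST_BYTES):
    """Largest leading run of `features` whose serialized size stays under `limit`."""
    size = 0
    for i, feature in enumerate(features):
        size += len(json.dumps(feature).encode("utf-8")) + 1  # +1 for the comma
        if size > limit:
            return i
    return len(features)
```

If `max_safe_batch` comes back smaller than your Features Per Request setting for some chunks of the data, that points at oversized requests rather than bad attribute values.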


We ran into this when uploading polygons with very complex geometries (thousands of vertices), but it could also be hit with large blob fields or simply datasets with a lot of fields.

We got around this by using a dynamic value for the Features Per Request parameter, but it might be worth first testing whether the write succeeds if you reduce the value to a fixed 100.


Badge +3

Thank you @ponting13, it does seem to be something with Features Per Request. I put the workspace on debug, but it would run to success without failing (although, even when reported as Successful, not all the features would be on AGOL), so there was no indication of the data being the issue. I tried lowering Features Per Request all the way down to 10 (which took a couple of hours to write). That wrote the most features so far, but still not all of them. I will continue to work with it.

Badge +4

I found my old workspace, and as a very rough guide I limited the writer to 200,000 vertices per request (by using grouping). I'm not sure what request size that translates to, but that was the number that seemed to work.
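The grouping idea above can be sketched as a greedy pass that starts a new request group whenever the running vertex count would exceed the cap. The 200,000 figure and the Esri JSON polygon shape (`rings` of `[x, y]` pairs) are assumptions for illustration, not something prescribed by the writer.

```python
VERTEX_CAP = 200_000  # rough per-request cap that worked in one case; tune for yours

def vertex_count(feature):
    """Count the vertices in an Esri JSON polygon feature (rings of [x, y] pairs)."""
    rings = feature.get("geometry", {}).get("rings", [])
    return sum(len(ring) for ring in rings)

def group_by_vertices(features, cap=VERTEX_CAP):
    """Greedily pack features into groups whose total vertex count stays <= cap."""
    groups, current, total = [], [], 0
    for feature in features:
        n = vertex_count(feature)
        if current and total + n > cap:
            groups.append(current)
            current, total = [], 0
        current.append(feature)
        total += n
    if current:
        groups.append(current)
    return groups
```

Each resulting group then becomes one write request, so a handful of very complex polygons end up in small groups while simple ones still travel in large batches.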

Again, this should be a last resort; follow the advice above to make sure the data is all in the right formats and field types first 😊

Hope this helps!

Edit:
This might help: developers.arcgis.com/rest/
I think the wall we hit was the “maxHttpPostSizeInBytes” 
“Introduced at 10.9.1. This property allows users to change the maximum size in bytes for POST requests sent to ArcGIS Server. The default maxPostSize for ArcGIS Server is 10485760 bytes (10 MB).”
