
Hi, I am compressing LAS to zLAS with ArcGIS Data Interoperability.

One of the files is huge: 2,700 million points and 89,331,057 KB (about 89 GB).

The software stops writing the file to the folder, without any message, once the written file reaches 25 GB ("Error running translation"). There are no problems with disk space or memory! FME_TEMP is configured properly on a 4 TB SSD drive. The computer has 64 GB of RAM.

I have run this translation twice, but the result is the same and the file size is the same.

Is this an Esri limitation on zLAS size, or something else?

Best regards,

Arvid

Not sure re: size limits. But just wondering if you have tried Esri's own tool on the same file, to see if it compresses.

https://www.arcgis.com/home/item.html?id=787794cdbd384261bc9bf99a860a374f

 

The other thing to do is turn on debug logging in Interop, and see if it shows any further information in your translation log file.
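
If you want to script that comparison rather than run it interactively, something like the sketch below should exercise Esri's converter directly. This is a minimal sketch assuming a recent ArcGIS Pro arcpy with the Convert LAS tool in the Conversion toolbox; the paths are placeholders for your own data.

```python
# Minimal sketch: compress LAS to zLAS via arcpy's Convert LAS tool,
# bypassing Interop entirely. Paths below are placeholders.
import arcpy

arcpy.conversion.ConvertLas(
    in_las=r"D:\lidar\huge_tile.las",    # placeholder: the 89 GB source file
    target_folder=r"D:\lidar\zlas_out",  # placeholder: output folder
    compression="zLAS",                  # write compressed .zlas output
)
```

If this dies on the same file too, the problem is more likely in the file itself or the zLAS library than in the Interop translation.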


Awesome

 


Thanks Danilo,
As an FME fan, I usually solve all problems with Workbench. It didn't occur to me to check by saving the data with ArcGIS Pro. I'll try it.
Another option is to try opening the data with the Data Inspector. I'll have to wait a couple of hours for the huge cloud of points to be read. Perhaps the Data Inspector will show something? A data error?
The last backup plan for me is to divide the LAS into smaller pieces, but in that case the 600 x 600 m tile division is lost.
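
In case it comes to that, here is a minimal sketch of the cutting approach, assuming laspy 2.x with its chunked reader is available; the file paths and chunk size are placeholders. It splits by point count, so, as noted above, the 600 x 600 m tiling is lost.

```python
# Minimal sketch: split one huge LAS into smaller parts by point count
# using laspy 2.x's streaming API, so the file is never fully in RAM.
import copy
import laspy

SRC = r"D:\lidar\huge_tile.las"  # placeholder input path
CHUNK = 100_000_000              # points per output part (assumed value)

with laspy.open(SRC) as reader:
    for i, points in enumerate(reader.chunk_iterator(CHUNK)):
        part = rf"D:\lidar\parts\part_{i:03d}.las"
        # copy the header so each part starts from the source metadata
        with laspy.open(part, mode="w",
                        header=copy.deepcopy(reader.header)) as writer:
            writer.write_points(points)
```

The chunk iterator keeps memory use bounded by the chunk size, which matters with a 2.7-billion-point file.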


Dear Danilo,

Your suggestion to save the LAS as zLAS does not work. ArcGIS Pro died at the end of the process.

I managed to open the 89 GB LAS with the FME Data Inspector. I tried saving the file as zLAS with Save As. The result was exactly the same as in my first post.

 

The log file doesn't say anything; it seems the process doesn't want to complain, it just quietly dies!!!

...and it also leaves debris behind on the disk (see T14).

To make my life more fun, one of the next files turned out to be even bigger: a 118 GB LAS.

Its compression went exactly the same way as with the previous file. The process died while reporting that it had finished 100%.

As a result, a 32 GB zLAS file, T20, was created, but it is corrupted. This last case has one difference: Batch Deploy does not work.

If no one has any idea what this is, I'll have to cut up the LAS files.

