
Hi,

I'm trying to read in various CARIS CSARPC files and some fail with the following errors (I had to sanitise the paths):

I've moved the FME temporary file location to a network share that has plenty of space as I thought it was running out on the C: drive, and it is writing the temp files in the correct location.

It's also not consistent: I can run other CSARPC files through the reader and they work fine, some a lot bigger than others. Not sure what is going on?

 

Output

"FME Configuration: Source coordinate system for reader CSARPC_11CSARPC] set to `_FME_0' as read from input data

Coordinate System `_FME_0' parameters: CS_NAME=`_FME_0' DT_NAME=`WGS84' PROJ=`MRCATK' QUAD=`1' SCL_RED=`1.0000000000' UNIT=`METER'

Storing feature(s) to FME feature store file `\\\\XXXXX-share4\\XXX_GIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.ffsupdating'

FFS writer: Writing point cloud with 10028592 point(s) to file '\\\\XXXXX-share4\\XXX_GIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.fps'

FFS writer: 840000 of 10028592 points written

FFS writer: 2040000 of 10028592 points written

CSARPC reader: Failed to read data for component 'XYZ' in dataset 'C:\\temp\\MattC\\Surface\\Deconflicted_Port_kembla_20191016_PCBrendan.csar'. The error was ''

FFS writer: Failed to get block of 10000 points starting at index 2530000

Stored 1 feature(s) to FME feature store file `\\\\XXXXX-share4\\XXX_GIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.ffsupdating'

Saving spatial index into file '\\\\XXXXX-share4\\XXX_GIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.fsi'

Finished saving spatial index into file '\\\\XXXXX-share4\\XXX_GIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.fsi'

CSARPC Feature Recorder -1 2147549186(RecorderFactory): Failed to write feature data to `\\\\XXXXX-share4\\XXX_GIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.ffsupdating'

Saving spatial index into file '\\\\XXXXX-share4\\XXX_GIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.fsi'

Finished saving spatial index into file '\\\\XXXXX-share4\\XXX_GIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.fsi'

Failed to write feature data to `\\\\XXXXX-share4\\XXX_GIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.ffsupdating'

Translation FAILED with 4 error(s) and 0 warning(s) (0 feature(s) output)

FME Session Duration: 11.7 seconds. (CPU: 0.8s user, 2.3s system)

END - ProcessID: 6488, peak process memory usage: 70544 kB, current process memory usage: 54888 kB

Failed to write feature data to `\\\\XXXXX-share4\\XXXGIS_Warehouse_Scratch\\temp\\wb-cache--zD5304\\Main_CSARPC -1 2 -1 0 7d1dc4d5b02aac6f53ce8974ab35efb9a70c671b.ffsupdating'

Program Terminating

Translation FAILED."

 

Any help would be greatly appreciated.

Brendan

Turn off Feature Caching in FME. When working with big data you will quickly fill up your temp space and slow the whole process to a halt. Feature Caching stores a copy of your point cloud at every node, which quickly blows up.

 

In FME 2019 Feature Caching is on by default.

 

 

This is probably unrelated to why the reader is failing; however, it is likely causing the other FFS issues you see in the log.

 

 

 


Dear Brendan Neal,

 

Did you find a solution to this problem? "FFS writer: Failed to get block of 10000 points starting at index 2530000"

If so, I'm interested in your feedback because I'm facing the same error (with CSAR point clouds). If not, I will let you know if I find a solution.


Hi everyone,

 

I have exactly the same problem with multiple .csar files.

 

"FFS writer: Failed to get block of 10000 points starting at index 2530000"

 

When I run the workspace without feature caching, it still doesn't work. Does anyone know what the problem could be?

 

Thanks for your help,

Margot


Hello, 

I think (not sure) the problem is due to the indexing of the input CSAR file. You can try reindexing the CSAR using CARIS software; maybe that will solve the problem.

 

For my part, I didn't try to solve this problem. If you use CSAR datasets, maybe you use CARIS software? If so, you can run a CSAR to LGZ (YXZ) CARIS batch export, then read the results in FME using the "Point Cloud XYZ" reader. I advise against using the FME "CSV" reader (about 10 times slower). This method is quite fast and works every time.

 

PS: The CARIS batch syntax looks like this:

""C:\Program Files\CARIS\BASE Editor\4.4\bin\carisbatch" --run ExportCoverageToASCII  --include-band Depth 3 --coordinate-format LLDG_DD --coordinate-precision 7 --z-axis-convention DOWN "D:\TETHYS\EXPORT_BDBS\$(num_dalle)\SONDES\CSAR\@Value(objnam).csar" "D:\TETHYS\EXPORT_BDBS\$(num_dalle)\SONDES\LGZ\ALL\@Value(objnam).csv""

 



Sorry, technical problem when posting:

I advise reading the XYZ file using the "Point Cloud XYZ" reader rather than the "CSV" reader in FME; it is much faster.



 

Thank you for the answer! 

We received the data from a client (and we do not use CARIS software). I will ask the client for another format that we can read in FME without problems.

Thanks for the help and have a nice day!

Margot


OK.

If it is just a one-off and is wasting your time, you can send me the data and I will convert it to ASCII and send it back tomorrow morning. Up to you ;)


Hi Margot,

We were not able to resolve the issue of loading the CSARPC format. As Ronan mentioned, we changed the export out of the CARIS software to ASCII and then processed it through the Point Cloud XYZ reader, which worked fine.

 

Since then, though, we have changed the process to load the CSAR Raster format instead and have had no issues with it to date; FME loads it fine every time. That said, we have also upgraded to FME 2021.2 since the original post (I think we were on FME 2019.x), so I'm not sure whether that has made a difference. I haven't gone back and checked whether CSARPC works in the new version, as our users are working with the raster and prefer to load that straight in.

 

I would think the issue probably hasn't been fixed in the newer version, as it is not a very commonly used format. Not much help to you, but I would suggest changing the format as well.

Brendan

