
I have around 1300 LAS (lidar) files. For these LAS files I have a few SHP files with breaklines and polygons where I should remove the ground points (houses/lakes/rivers).

I need to export contour lines for each LAS file using the breaklines, and also remove the ground inside those polygons.

Each block should have a buffer around it to fix the problem at the edges.

So I need to load each block with the buffer around it, remove the ground, use the breaklines, and generate the contours.

But here is another problem: I need to drop the lines first and filter them based on the z difference between their vertices, keeping only the original lines that fit the z difference.

Here is what I know how to do so far (on a single block):

I managed to filter the lines.

Is there a transformer to simplify this? Right now I compare each line's start z and end z against each other (see the sketch after this list).

I know how to generate contours with breaklines.

I know how to remove points inside the polygons.

I know how to create a buffer around a tile.
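
Roughly, the check I do now looks like this in plain Python with Shapely, just to pin it down (not FME; the 0.5 m tolerance and the sample lines are made-up example values):

```python
# Keep a line only if the elevation difference between its first and last
# vertex is within a chosen tolerance.
from shapely.geometry import LineString

MAX_DZ = 0.5  # assumed tolerance in metres

def keep_line(line: LineString) -> bool:
    """True if the start/end z difference is within tolerance."""
    _, _, z_start = line.coords[0]
    _, _, z_end = line.coords[-1]
    return abs(z_end - z_start) <= MAX_DZ

lines = [
    LineString([(0, 0, 10.0), (5, 5, 10.3)]),  # dz = 0.3 -> kept
    LineString([(0, 0, 10.0), (5, 5, 12.0)]),  # dz = 2.0 -> dropped
]
filtered = [ln for ln in lines if keep_line(ln)]
```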

What I don't really know is how to do all of the above for each block, including its neighbours inside that buffer, and save the contours to a separate file for each block. I'm guessing I need a loop, but I don't know how it works or how I should define it.

@loganj

ContourGenerator would be a good transformer to start with: feed the LAS files to the Points/Lines input port and the breaklines you mentioned to the Breaklines input port.

I am not sure what you mean by a block; is it an individual LAS file?

One way to generate the buffer for a block is to first generate a bounding box for the block, followed by a Bufferer transformer to buffer that bounding box so that you get some extra clearance for each block.
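
Purely to illustrate that step, a minimal sketch with Shapely (outside FME); the 50 m clearance and the sample points are assumed example values:

```python
# Bounding box of one block's points, grown by a fixed clearance.
from shapely.geometry import MultiPoint, box

points = MultiPoint([(0, 0), (1000, 0), (1000, 1000), (0, 1000)])  # tile points (x, y)
minx, miny, maxx, maxy = points.bounds
tile_bbox = box(minx, miny, maxx, maxy)              # bounding box of the block
buffered_bbox = tile_bbox.buffer(50, join_style=2)   # extra clearance, mitred corners
```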

If you need to clip the output contours to this block before writing them out, you can use the Clipper transformer: pass the contours to the Clippee port and the buffered bounding box to the Clipper port.
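
Again just as an illustration of the clipping (Shapely, outside FME); clip_boundary stands in for whatever polygon you feed to the Clipper port, and the coordinates are made up:

```python
# Cut each contour line back to the clip boundary; lines entirely outside
# the boundary are dropped.
from shapely.geometry import LineString, box

clip_boundary = box(0, 0, 1000, 1000)
contours = [
    LineString([(-100, 500), (1100, 500)]),   # crosses the boundary -> cut to it
    LineString([(-200, -200), (-50, -50)]),   # entirely outside -> dropped
]
clipped = [c.intersection(clip_boundary) for c in contours if c.intersects(clip_boundary)]
```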

You can use the fanout option (either at the feature type level or the dataset level; you have not mentioned the output format) to generate the output for each block (assuming the block name, number, or id is an attribute in the LAS dataset).

Hope this helps to some extent to kick-start the job!

Happy FME :-)

SRG



All of these I already know how to do :)

 

The export format will probably be DGN/DWG or SHP files. For now I use the DGN format.

 

1 block = 1 LAS file

 


At a quick glance I'd suggest using a File and Directory Path reader in a new workspace, to read the list of lidar files. Then pass each in turn to the main processing workspace using a WorkspaceRunner transformer.

That way each gets processed separately. Look in the Knowledge Center articles for some examples.
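
Stripped down to plain Python, the pattern looks roughly like this (only to show the shape of it, not the FME reader or WorkspaceRunner themselves; the folders and the process_tile function are placeholders):

```python
# List the LAS files, then hand each path to a per-tile processing step.
from pathlib import Path

LAS_DIR = Path(r"D:\project\las")        # assumed input folder
OUT_DIR = Path(r"D:\project\contours")   # assumed output folder

def process_tile(las_path: Path, out_dir: Path) -> None:
    """Placeholder for the per-tile workspace: buffer, read neighbours,
    generate contours, clip, write one output file per tile."""
    print(f"processing {las_path.name} -> {out_dir}")

for las_path in sorted(LAS_DIR.glob("*.las")):
    process_tile(las_path, OUT_DIR)
```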

The only thing I am unsure about is the buffer part. Do you mean you are creating a buffer around each file and bringing in a swath of data from the surrounding files? If so, that's a little more tricky. But I won't try to answer that unless it's confirmed that's what you need.

Hope this helps.



Yes, I need to create a buffer around each LAS file and read all the LAS files that touch that buffer to create the contour lines. After that I need to cut those contour lines at the bounding box of that LAS file.

 

The buffer part is needed to fix the issue at the edges of each LAS file.

 



 

OK. File/Directory Path reader to read a list of files. Send the file to the processing workspace using a WorkspaceRunner. The processing workspace reads the lidar file whose name is passed to it, extracts the bounding box of the data (BoundingBoxReplacer), creates a buffer around it, and sends the buffer into a FeatureReader transformer which reads all of the lidar files using the buffer as a spatial filter. I don't know if those surrounding files will be clipped or read in their entirety, so you might need to apply a Clipper (using the same buffer as the Clipper feature) after the FeatureReader.
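
If it helps to see the logic outside FME, here is a rough sketch of that buffered-neighbour read with laspy and Shapely (the file paths, the 50 m clearance and laspy 2.x are all assumptions):

```python
# Read only the points from surrounding tiles that fall inside the buffered
# extent of the current tile.
from pathlib import Path
import numpy as np
import laspy
from shapely.geometry import box

current = Path(r"D:\project\las\tile_0001.las")      # the tile passed to this run
all_tiles = sorted(current.parent.glob("*.las"))

las = laspy.read(current)
x, y = np.asarray(las.x), np.asarray(las.y)
tile_bbox = box(x.min(), y.min(), x.max(), y.max())   # extent of this block
bminx, bminy, bmaxx, bmaxy = tile_bbox.buffer(50, join_style=2).bounds

parts = []
for path in all_tiles:
    other = laspy.read(path)
    ox, oy, oz = np.asarray(other.x), np.asarray(other.y), np.asarray(other.z)
    # keep only points inside the buffered extent; this also clips the
    # neighbouring tiles, like the optional Clipper after the FeatureReader
    mask = (ox >= bminx) & (ox <= bmaxx) & (oy >= bminy) & (oy <= bmaxy)
    if mask.any():
        parts.append(np.column_stack((ox[mask], oy[mask], oz[mask])))

points = np.vstack(parts)  # the point set to build the surface/contours from
```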

 

 

That, I think, should do what you need.

 



 

Yep, I think it should work. I've done something similar, but all in one workspace.

 

I create a bounding box for each LAS file -> buffer it -> send the buffer to a FeatureReader -> create contours. Now my problem is that it is taking way too long. I have to create a surface two times with these points: once to drape the breaklines that need elevations, and once to create the contour lines. From this morning until now (13 hours) I still have no output file.

 

I guess working with 3-5 million points per LAS file is way too much for this software. I'll have to use other software. For my project, which has around 700-800 LAS files, this will take a few years.

 



 

Actually, I think your way is better because the workspace will read only the buffer around the input file. I'll have to try it.

 

A loop would probably fix my workspace too... probably. But I have no clue how to create a loop; the ways of the loop custom transformer are still a mystery to me.

 



 

I don't know if a loop is the best way there. What are you trying to send around a loop? You just need each file processed separately, not multiple times. Group-by parameters would be a better solution if you want to keep everything in one workspace: if you can give a name to each section of data, use a group-by in the transformers you process the data with. But you mentioned performance, and I think that method would slow things down. Reading all 800 LAS files with 5 million points each into one process would be seriously slow. But if you can split each off to a separate process - using the WorkspaceRunner - then performance should be drastically improved. You just need to tweak the number of jobs running so as not to use all the system resources at once. Using 64-bit FME instead of 32-bit could also be a huge help.
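
For comparison, the same "several jobs at once, but capped" idea in plain Python; in FME the WorkspaceRunner handles this for you, and MAX_JOBS is just an example value:

```python
# Process tiles in parallel, but never more than MAX_JOBS at a time.
from concurrent.futures import ProcessPoolExecutor
from pathlib import Path

MAX_JOBS = 4  # tune this so you don't use all the system resources at once

def run_tile(las_path: Path) -> str:
    """Placeholder for one per-tile job (a WorkspaceRunner job in this thread)."""
    return f"processed {las_path.name}"

if __name__ == "__main__":
    tiles = sorted(Path(r"D:\project\las").glob("*.las"))
    with ProcessPoolExecutor(max_workers=MAX_JOBS) as pool:
        for message in pool.map(run_tile, tiles):
            print(message)
```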

 

 



 

 

OK, I'm trying to use the File/Directory Path reader, which, by the way, I have no clue where to find.

 

But I've tried to add a normal reader, set the format to LAS/LAZ, set it to read an entire directory, and connected this reader to a WorkspaceRunner with Wait for Job to Complete = Yes.

 

Now in this WorkspaceRunner I have 3 fields:

 

1 for the SHP files that are used in workspace 2

 

1 for the DGN folder where the result will be saved (workspace 2)

 

1 for the LAS files that I read for my project (inside workspace 2)

 

 

I've tried to run it, but it failed.

 

It seems that it can't load the SHP files that are defined in workspace 2.

 

 

I've also added a path object from an example on this site (copy/paste, because I have no clue where to find it) and changed the extension and path to my files.

 

And the result was the same: no SHP file found.

 

 

I've tried to set the SHP files manually in the WorkspaceRunner, but I got the same result.

 

I'm guessing that even if this succeeds, I might get another error when it tries to save the files.

 

 

So yeah... I have no clue what I'm doing.

 


 


 

No problem. I'll see if I can put together a short movie demo.

 

 


 


 

I created a quick movie here: http://screencast.com/t/t9h9oCaLFF

 

 

It uses Shape contour tiles instead of point clouds, but the effect is much the same. I think performance will be OK because you are running one dataset at a time. One of the key parts is to set "Single Merged Feature Type" in the reader, because otherwise your data gets filtered out by something called the Unexpected Input Remover. You'll also need to write the data out using a fanout of some sort, so you don't write over the same dataset name again and again (if that makes sense).
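
To show the fanout idea outside FME, a small GeoPandas sketch that names each output after its block so nothing gets overwritten (the paths, CRS and sample contours are assumptions, and your real output would be DGN/DWG rather than Shapefile):

```python
# Write one contour dataset per block, named after the block.
from pathlib import Path
import geopandas as gpd
from shapely.geometry import LineString

OUT_DIR = Path(r"D:\project\contours")
OUT_DIR.mkdir(parents=True, exist_ok=True)

tile_name = "tile_0001"  # e.g. taken from the LAS file name
contours = gpd.GeoDataFrame(
    {"elevation": [100.0, 102.0]},
    geometry=[LineString([(0, 0), (10, 0)]), LineString([(0, 5), (10, 5)])],
    crs="EPSG:25833",  # assumed CRS
)
contours.to_file(OUT_DIR / f"{tile_name}_contours.shp")
```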

 

 

Hope you find it useful.

 

