Are you able to provide the raw data for the two areas above?
Pretty strange for sure. I'd also definitely recommend uploading the input data. If you really need to move past this, you can see if the SpikeRemover helps: you can specify a maximum angle to remove, and it should get rid of (some of) the artifacts.
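For anyone curious what max-angle spike removal does conceptually, here is a minimal plain-Python sketch (not FME's actual SpikeRemover implementation, and the function names are my own): a vertex is treated as a spike when the angle between its two adjacent segments is sharper than the threshold, and such vertices are dropped.

```python
import math

def vertex_angle(p_prev, p, p_next):
    """Interior angle (degrees) at vertex p, formed by its two adjacent segments."""
    a1 = math.atan2(p_prev[1] - p[1], p_prev[0] - p[0])
    a2 = math.atan2(p_next[1] - p[1], p_next[0] - p[0])
    ang = abs(math.degrees(a1 - a2)) % 360
    return min(ang, 360 - ang)

def remove_spikes(coords, max_angle=10.0):
    """Drop interior vertices whose angle is sharper than max_angle degrees."""
    if len(coords) < 3:
        return list(coords)
    cleaned = [coords[0]]
    for prev, cur, nxt in zip(coords, coords[1:], coords[2:]):
        if vertex_angle(prev, cur, nxt) > max_angle:
            cleaned.append(cur)  # angle is wide enough: not a spike, keep it
    cleaned.append(coords[-1])
    return cleaned

# (1.5, 50) forms a needle-thin spike on an otherwise flat line
line = [(0, 0), (1, 0), (1.5, 50), (2, 0), (3, 0)]
print(remove_spikes(line, max_angle=10.0))  # the spike vertex is removed
```

Note that this only removes the offending vertices after the fact; as said above, the better fix is to not create them in the first place.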
Ideally though we'd want to help you not create them in the first place....
The sample is attached. I might add that I ran the same data through the QGIS contour tool without any issues.
Wow - I see what you mean - but it looks a whole lot better if you don't set the NoData value. My guess is that for some reason the NoData is interfering with the data.
If you look at the data values in that zoomed-in shot of the spike, you can see that it's an edge pixel with a value of 3 - greater than 2.5, which was for me the lowest contour line after removing everything at or below 0 elevation. Ideally, I guess, you'd want to have a '0' contour line.
One thing I played around with was to first set NoData to -0.15 (as it is in the data), then use the RasterExtentsCoercer (set to Data Extents) to extract polygons, buffer them by 5 meters, dissolve to remove overlaps, and then clip the input raster into 70 or so parts. Use a Counter to add an id to each raster part and then run it through the ContourGenerator (grouping by the id). The whole process is still considerably faster than running the whole image through the ContourGenerator (15 seconds vs 50 seconds) and the result is pretty similar. The benefit of this approach is that it lets you keep the 0 meter contour.
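The workflow above can't be reproduced outside Workbench, but the first step - extracting the "Data Extents" as separate pieces - can be sketched in plain Python. This is a hypothetical `data_regions` helper of my own, not the RasterExtentsCoercer's actual logic: it groups contiguous non-NoData cells into regions (4-connected flood fill), which is roughly what lets you clip the raster into independent parts before contouring each one.

```python
from collections import deque

NODATA = -0.15  # the NoData value as it is in the sample data

def data_regions(grid, nodata=NODATA):
    """Group contiguous data cells (4-connected) into regions.
    Each region could then be buffered and used to clip the raster."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == nodata or seen[r][c]:
                continue
            # breadth-first flood fill from this unvisited data cell
            region, queue = [], deque([(r, c)])
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                region.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx] and grid[ny][nx] != nodata):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            regions.append(region)
    return regions

# A tiny grid with three separate data "islands" surrounded by NoData
grid = [
    [1.0, 2.0, NODATA, NODATA],
    [NODATA, NODATA, NODATA, 3.0],
    [NODATA, 4.0, NODATA, 3.5],
]
print(len(data_regions(grid)))  # prints 3
```

Grouping by a per-region id (the Counter step) then lets each part go through contour generation independently, which is where the speedup comes from.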
Thanks for looking into this. I generated the contour lines with a script instead. There were also a few artefacts there, but they were much easier to spot and remove. I will blame the whole thing on the dataset for now.
Garbage in = Garbage out
Slightly off topic, but I do find that FME is sometimes too "exact" and will highlight data "issues" like this more than other software, which may apply some level of generalization or cleaning before performing operations.