Friday 20 July 2018

Editing LiDAR point cloud to remove noise/outliers present below and above ground?



I have "dirty" LiDAR data containing first and last returns and, inevitably, errors below and above the surface level (screenshot).




I have SAGA, QGIS, ESRI and FME at hand, but no real method. What would be a good workflow to clean this data? Is there a fully automated method, or would I have to delete points manually somehow?



Answer



You seem to have outliers:



  • i) below the ground surface;

  • ii) above the ground surface, intermixed vertically with real above-ground features;

  • iii) above-ground points higher than any object of interest, for example returns caused by clouds or birds (this is not shown in the picture, but I assume it may also be the case).



For 'i', one option is to use a ground-filter algorithm that can take 'negative blunders' into account to obtain a clean LiDAR ground point cloud. See the Multiscale Curvature Classification (MCC) algorithm from Evans and Hudak (2007), which says on page 4:



Negative blunders are a common occurrence in LiDAR data, which may be caused by the scattering of the photons in a returned laser pulse. Scattering lengthens the time for an emitted laser pulse to return to the aircraft sensor, inflating the calculation of distance traveled, hence causing a measurement error where the surface elevation is erroneously recorded as being below the surrounding measurements. It should be noted that curvature classification approaches can potentially remove valid returns surrounding negative blunders, which can expand the edge artifact around a negative blunder to create a distinct “bomb crater” effect. To address negative blunders, Haugerud and Harding suggested setting the curvature tolerance parameter to four times the interpolated cell size and selecting returns exceeding this negative curvature threshold. However, it should be noted that under certain circumstances, returns that appear to be negative blunders can be in fact valid returns (e.g., sinkholes). Therefore, the preceding suggestion to remove potential negative blunders can be implemented as an optional last model loop to employ at the discretion of the user if needed.
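The negative-blunder rule quoted above (flag returns that fall below the surrounding surface by more than four times the interpolated cell size) can be illustrated with a toy sketch. This is not the MCC implementation — MCC interpolates a thin-plate-spline surface across scale domains — it only demonstrates the thresholding idea, using the local median as a stand-in surface:

```python
import numpy as np

def flag_negative_blunders(z, cell_size, k=4.0):
    """Flag points far below the local surface (toy illustration).

    Points whose elevation drops below the neighbourhood surface by
    more than k * cell_size are treated as negative blunders. `z` is a
    1-D array of elevations for points in one neighbourhood; the local
    median stands in for the interpolated surface a real ground filter
    would compute.
    """
    tolerance = k * cell_size          # e.g. 4 x interpolated cell size
    local_surface = np.median(z)       # stand-in for the spline surface
    return (local_surface - z) > tolerance

# One point sits ~7 m below its neighbours -> flagged as a blunder.
z = np.array([102.1, 101.9, 102.0, 95.0, 102.2])
print(flag_negative_blunders(z, cell_size=1.0))
```

As the quote warns, a rule like this can also flag valid returns in sinkholes, which is why it is best applied as an optional final pass.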



Below is a post with an example of using MCC-LIDAR:



Once you have an accurate LiDAR ground point cloud from which to make an accurate DEM, it is possible to normalize the point cloud and exclude points that lie beneath the DEM surface (those with negative normalized heights). The same approach also addresses point 'iii': remove points above some fixed height threshold. See, for example:
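The normalization step above can be sketched as follows. The arrays and the 60 m ceiling are illustrative assumptions; in practice you would sample the DEM at each point's (x, y) with your GIS of choice:

```python
import numpy as np

def filter_normalized(z, dem_z, max_height=60.0):
    """Keep points whose height above the DEM is plausible.

    z      : point elevations
    dem_z  : DEM elevation sampled at each point's (x, y)
    Heights above ground are z - dem_z. Points below the DEM
    (negative heights, case 'i') and points above a fixed ceiling
    (case 'iii', e.g. clouds or birds) are dropped.
    """
    height = z - dem_z
    return (height >= 0.0) & (height <= max_height)

z     = np.array([100.0, 93.0, 112.0, 250.0])   # raw elevations
dem_z = np.array([ 98.0, 98.0,  98.0,  98.0])   # DEM under each point
keep = filter_normalized(z, dem_z)              # keeps points 0 and 2 only
print(z[keep])
```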



That leaves us with 'ii', which is addressed by AlecZ's answer recommending lasnoise from LAStools. It will also handle 'iii', and perhaps part of 'i' as well (though LAStools requires a license). Other tools specifically created for checking/removing outliers were also cited here: PDAL's filters.outlier, in Charlie Parr's answer, which gives a detailed explanation of how the tool works — with the advantage that PDAL is free software.
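A minimal PDAL pipeline for this step can be built as a JSON document. The sketch below applies filters.outlier (statistical method), which classifies detected noise as class 7, then drops that class with filters.range; file names and the mean_k/multiplier values are placeholders to tune for your data. Run the resulting file with `pdal pipeline pipeline.json` once PDAL is installed:

```python
import json

# Sketch of a PDAL cleaning pipeline: statistical outlier detection,
# then removal of the points it classified as noise (class 7).
pipeline = {
    "pipeline": [
        "input.las",
        {
            "type": "filters.outlier",
            "method": "statistical",
            "mean_k": 8,         # neighbours considered per point
            "multiplier": 2.5    # stddev multiplier for the cutoff
        },
        {
            "type": "filters.range",
            "limits": "Classification![7:7]"  # drop class 7 (noise)
        },
        "cleaned.las"
    ]
}

with open("pipeline.json", "w") as f:
    json.dump(pipeline, f, indent=2)
```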


Finally, any outliers left over from the automated process can be removed manually. For example:






Evans, Jeffrey S.; Hudak, Andrew T. 2007. A multiscale curvature algorithm for classifying discrete return LiDAR in forested environments. IEEE Transactions on Geoscience and Remote Sensing. 45(4): 1029-1038.

