I am using ArcGIS 10.1 to generate a DEM (bare earth) from a LAS dataset. To accomplish this, I use the LAS Dataset to Raster tool as described on the Esri website.
I have read in various places on the web that the cell size of the output raster should be set to a value corresponding to the average LAS point spacing; elsewhere, a value of three times the average point spacing is suggested.
Now, since I wish to produce a bare-earth DEM (i.e., with only ground returns), I am wondering whether the cell size should be set to the average point spacing of the whole LAS dataset (i.e., including non-ground, ground, etc.) or of the ground points exclusively.
Using ArcGIS's Point File Information tool, I managed to get statistics for each LiDAR class. The average spacing for the whole dataset is 1.5 m, while for ground points only it is 0.67 m.
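For reference, here is a minimal arcpy sketch of how those per-class statistics could be gathered with the Point File Information tool (3D Analyst). The paths, the summarize_by_class_code keyword, and the Class / Pt_Spacing field names are assumptions on my part, so please verify them against the tool's documentation for your ArcGIS version:

    # A sketch only: the summarize_by_class_code keyword and the Class /
    # Pt_Spacing field names are assumptions -- verify them in the tool help.
    import arcpy

    arcpy.CheckOutExtension("3D")

    las_folder = r"C:\lidar"                     # hypothetical folder of .las files
    out_fc = r"C:\lidar\stats.gdb\las_info"      # hypothetical output feature class

    # Summarizing by class code gives one record per LAS class per file,
    # so ground (class 2) statistics can be read separately from the rest.
    arcpy.PointFileInformation_3d(las_folder, out_fc, "LAS", "las",
                                  summarize_by_class_code="SUMMARIZE")

    # Report the average point spacing per class
    with arcpy.da.SearchCursor(out_fc, ["Class", "Pt_Spacing"]) as rows:
        for class_code, spacing in rows:
            print("class {}: {} m".format(class_code, spacing))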
Any insight into the issue?
Answer
For comparison, I produced two bare-earth DEMs: in one case I used the average point spacing of the whole LiDAR dataset as the cell size; in the other, the average spacing of the ground points only, rounded up (it was 0.668 m, so I entered 1, obtaining a DEM with a resolution of 1 m).
The processing was done with ArcGIS's LAS Dataset to Raster tool.
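For anyone wanting to script this, the following is a rough arcpy sketch of the same workflow: filter the LAS dataset to ground returns with Make LAS Dataset Layer, then rasterize it at a 1 m cell size with LAS Dataset to Raster. The paths, layer name, and output name are hypothetical:

    # A sketch only: paths and the layer name are hypothetical.
    import arcpy

    lasd = r"C:\lidar\project.lasd"              # hypothetical LAS dataset

    # Keep only ground-classified returns (ASPRS class code 2)
    arcpy.MakeLasDatasetLayer_management(lasd, "ground_lyr", class_code=2)

    # Rasterize the ground layer: binning with average cell assignment,
    # linear void filling, floating-point output, 1 m cell size
    arcpy.LasDatasetToRaster_conversion("ground_lyr",
                                        r"C:\lidar\dem_ground_1m.tif",
                                        "ELEVATION",
                                        "BINNING AVERAGE LINEAR",
                                        "FLOAT",
                                        "CELLSIZE",
                                        1)

The "BINNING AVERAGE LINEAR" interpolation shown here is just one reasonable choice; triangulation or natural-neighbor void filling may suit other terrains better.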
I see a clear difference in favor of the second DEM. Focusing on the same portion of the landscape, I note an improvement in the detail and quality of the terrain representation, so I am very happy with the second DEM.