Monday 25 January 2016

software recommendations - What raster smoothing/generalization tools are available?



I have a DEM that I would like to smooth or generalize to remove topographic extremes (chop off peaks and fill in valleys). Ideally, I would also like to have control over the radius or level of "blurriness". In the end, I will need a set of rasters that range from slightly blurry to really blurry. (Theoretically, the blurriest would be a constant raster of the arithmetic mean of all values.)


Are there any tools or methods I can use (based on Esri, GDAL, GRASS)?


Do I need to home bake my own Gaussian blur routine?


Could I use a low-pass filter (e.g. ArcGIS's filter), and if so, would I need to run it a bunch of times to get the effect of a large radius?



Answer



Gaussian blur is just a weighted focal mean. You can recreate it to high accuracy with a sequence of short-distance circular neighborhood (unweighted) means: this is an application of the Central Limit Theorem.
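
To see the Central Limit Theorem at work concretely, here is a minimal sketch (mine, not part of the original answer) using NumPy and SciPy; a square uniform filter stands in for the circular focal mean:

    # A few passes of an unweighted mean converge on a Gaussian blur.
    import numpy as np
    from scipy import ndimage

    dem = np.random.default_rng(0).random((200, 200))  # stand-in for a real DEM

    # Four passes of a 5 x 5 unweighted mean...
    smoothed = dem.copy()
    for _ in range(4):
        smoothed = ndimage.uniform_filter(smoothed, size=5)

    # ...closely match one Gaussian blur of the equivalent sigma.
    # One 5 x 5 uniform pass has variance (5**2 - 1) / 12 per axis,
    # and variances add across passes.
    sigma = np.sqrt(4 * (5**2 - 1) / 12)
    gaussian = ndimage.gaussian_filter(dem, sigma=sigma)

    print(np.abs(smoothed - gaussian).max())  # small after just four passes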


You have a lot of choices. "Filter" is too limited (it handles only 3 x 3 neighborhoods), so don't bother with it. The best option for large DEMs is to take the calculation outside of ArcGIS into an environment that uses Fast Fourier Transforms: they do the same focal calculations, but (in comparison) they do it blazingly fast. (GRASS has an FFT module. It's intended for image processing, but you might be able to press it into service for your DEM if you can rescale it with reasonable precision into the 0..255 range.) Barring that, at least two solutions are worth considering:




  1. Create a set of neighborhood weights to approximate a Gaussian blur for a sizable neighborhood. Use successive passes of this blur to create your sequence of ever smoother DEMs. (A code sketch of both options follows this list.)



    (The weights are computed as exp(-d^2/(2r^2)), where d is the distance (in cells if you like) and r is the effective radius (also in cells), which plays the role of the standard deviation. They have to be computed within a circle extending out to at least 3r. After doing so, divide each weight by the sum of them all so that they sum to 1.)




  2. Alternatively, forget the weighting; just run a circular focal mean repeatedly. I have done exactly this for studying how derived grids (like slope and aspect) change with the resolution of a DEM.
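
Here is a hedged sketch of both options in NumPy/SciPy; the helper names (gaussian_kernel, circular_mean_kernel) are mine, not from any GIS package:

    # Option 1 builds the Gaussian weight matrix described above;
    # option 2 builds an unweighted circular kernel applied repeatedly.
    import numpy as np
    from scipy import ndimage

    def gaussian_kernel(r):
        """Weights exp(-d^2/(2r^2)) within a circle of radius 3r, normalized to sum to 1."""
        n = int(np.ceil(3 * r))
        y, x = np.mgrid[-n:n + 1, -n:n + 1]
        d2 = x**2 + y**2
        w = np.exp(-d2 / (2.0 * r**2))
        w[d2 > (3 * r)**2] = 0.0  # restrict to the circular support
        return w / w.sum()

    def circular_mean_kernel(r):
        """Equal weights over a circular neighborhood of radius r cells."""
        n = int(np.ceil(r))
        y, x = np.mgrid[-n:n + 1, -n:n + 1]
        w = ((x**2 + y**2) <= r**2).astype(float)
        return w / w.sum()

    dem = np.random.default_rng(0).random((200, 200))  # stand-in for a real DEM

    # Option 1: a single pass with approximately Gaussian weights.
    blurred = ndimage.convolve(dem, gaussian_kernel(4), mode="nearest")

    # Option 2: repeated passes of an unweighted circular focal mean.
    smoothed = dem.copy()
    for _ in range(5):
        smoothed = ndimage.convolve(smoothed, circular_mean_kernel(4), mode="nearest")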




Both methods will work well, and after the first few passes there will be little to choose between the two, but there are diminishing returns: the effective radius of n successive focal means (all using the same neighborhood size) is only (approximately) the square root of n times the radius of a single focal mean. For example, five passes of a radius-10 mean behave like one pass of radius sqrt(5) * 10, or about 22 cells. Thus, for huge amounts of blurring, you will want to begin over again with a large-radius neighborhood. If you use an unweighted focal mean, run 5-6 passes over the DEM. If you use weights that are approximately Gaussian, you need only one pass: but you have to create the weight matrix.
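
To see the square-root rule in action, here is a quick impulse-response test (my own illustration, not from the answer); the spread of a blurred spike after n passes grows as roughly sqrt(n) times the spread after one pass:

    # Blur a single spike n times and measure the spread of the result.
    # A 9 x 9 square mean stands in for a circular focal mean here.
    import numpy as np
    from scipy import ndimage

    def spread(img):
        """Standard deviation of the response along one axis, in cells."""
        y, x = np.mgrid[:img.shape[0], :img.shape[1]]
        cx = (img * x).sum() / img.sum()
        return np.sqrt((img * (x - cx)**2).sum() / img.sum())

    out = np.zeros((201, 201))
    out[100, 100] = 1.0

    one_pass = None
    for n in range(1, 7):
        out = ndimage.uniform_filter(out, size=9)  # one unweighted mean pass
        if n == 1:
            one_pass = spread(out)
        print(n, round(spread(out) / one_pass, 2), round(np.sqrt(n), 2))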


This approach indeed has the arithmetic mean of the DEM as a limiting value.

