In R, it's relatively trivial to perform per pixel calculations based on a raster stack (e.g., get std.dev for each pixel on a 12 layer GeoTIFF). Unfortunately, the speed is less than desirable when working with large rasters.
Recently, I've been trying to use GDAL for this purpose, but I can't seem to get the expression down. I've tried the following, but it doesn't seem to work as advertised (or how I think it should work):
gdal_calc.py -A tmean_monthly_1980.tif --outfile=tstdev_1980.tif --calc="std(A)" --allBands=A --NoDataValue=-9999
In the above example, tmean_monthly_1980.tif is a raster composed of 12 layers (Jan-Dec) of mean monthly temperatures in 1980 (stacked using gdal_merge). Although the GDAL command produces an output, the results are definitely not the per-pixel standard deviation.
Another option might be to use single-band files, but gdal_calc.py appears limited to "only" 26 inputs (A-Z; i.e., "-A file -B file -C file etc."), and the resulting command line/script would be fairly cumbersome when working with large, multi-layered rasters (e.g., time series).
Is there a specific syntax for performing per-pixel raster calculations on a multi-band image using GDAL? If only individual files/bands can be specified using gdal_calc.py, please provide a scripted example using Bash, Python, etc.
Specifically, my goal is to calculate the standard deviation and coefficient of variation (CV) for each pixel in a multi-layer raster stack, or using the individual files that make up the same raster stack.
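For clarity, this is the computation I'm after, sketched in Python with NumPy. A synthetic array stands in for the 12-band raster here; in practice the same (bands, rows, cols) array could come from GDAL's Python bindings via gdal.Open("tmean_monthly_1980.tif").ReadAsArray(). The key point is that the statistics must reduce along the band axis (axis=0), leaving one value per pixel:

```python
import numpy as np

# Stand-in for a 12-band raster read into a (bands, rows, cols) array.
# With GDAL: stack = gdal.Open("tmean_monthly_1980.tif").ReadAsArray()
rng = np.random.default_rng(0)
stack = rng.normal(loc=10.0, scale=3.0, size=(12, 4, 5))

# Per-pixel statistics: collapse the band axis, keep the spatial axes.
mean = stack.mean(axis=0)    # (4, 5) per-pixel mean across the 12 layers
stdev = stack.std(axis=0)    # (4, 5) per-pixel standard deviation
cv = stdev / mean            # (4, 5) per-pixel coefficient of variation

print(stdev.shape)  # (4, 5), i.e., one value per pixel, not per band
```

Note that np.std defaults to the population standard deviation (ddof=0); pass ddof=1 for the sample standard deviation, which is what R's sd() returns.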
Note: I'm holding out for a GDAL-specific answer, but other FOSS solutions are welcome and will be upvoted, especially if they are relatively compact and adaptable to a large number of inputs.