I'm using the Sieve command in QGIS (Raster > Analysis > Sieve), but no matter which parameters I run it with, it completely changes the histogram of my raster data: values that ranged from -0.140351 to 0.780933 become a binary output of 0 and 1.
This example was run with a threshold of 2 and a pixel connection of 4, on an NDVI image calculated from Landsat 7 imagery.
Answer
I understand from the official documentation that gdal_sieve removes raster polygons below a given pixel size, replacing them with the value of their largest neighbouring polygon, so it is really aimed at classified rasters. Run on a continuous image raster it appears to work, but only with a binary output.
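For reference, the same filter is exposed in the GDAL Python bindings as gdal.SieveFilter; below is a minimal sketch of running it on an already-classified raster (the file names are placeholders, not the files from this question):

# Minimal sketch of the sieve filter via the GDAL Python bindings.
# File names are placeholders; adjust paths, threshold and connectedness.
from osgeo import gdal

src_ds = gdal.Open("classified.tif")          # classified (integer) input
src_band = src_ds.GetRasterBand(1)

# Write into a copy of the input so size, type and georeferencing match.
drv = gdal.GetDriverByName("GTiff")
dst_ds = drv.CreateCopy("classified_sieved.tif", src_ds)
dst_band = dst_ds.GetRasterBand(1)

# Remove raster polygons smaller than 20 pixels (4-connected), merging
# them into their largest neighbouring polygon.
gdal.SieveFilter(src_band, None, dst_band, 20, 4)

dst_band.FlushCache()
dst_ds = None  # closing the datasets flushes them to disk
src_ds = None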
I performed a 5-class cluster analysis classification in QGIS 2.4 on Windows on the same NDVI image (Processing Toolbox > SAGA > Imagery - Classification > Cluster analysis for grids), then ran gdal_sieve on it to remove polygons smaller than 20 px:
gdal_sieve.bat -st 20 -of GTiff "C:\Users\Admin\Downloads\LS7_20020326_lat52lon42_r24p204_toa_Cardiff_NDVI (1).tif" D:/Alexandra/HelpingPeople/Tests/LS7_20020326_lat52lon42_r24p204_toa_Cardiff_NDVI_sieve_clusters_20.tif
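To confirm that sieving a classified raster preserves the class values instead of collapsing them to 0 and 1, you can list the unique pixel values of the result; a small sketch, with a placeholder file name standing in for the output above:

# Sketch: verify the sieved classification kept its class labels.
import numpy as np
from osgeo import gdal

# Placeholder name standing in for the sieved output produced above.
ds = gdal.Open("NDVI_sieve_clusters_20.tif")
band = ds.GetRasterBand(1)
print("unique class values:", np.unique(band.ReadAsArray()))
ds = None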