Non-geographer learning GIS here. I'm using ArcGIS Desktop v. 10.2.2.
I have a 30 arc-second Terrain Ruggedness Index (TRI) raster, and I want to measure the average ruggedness within each polygon of a shapefile I have (not public). The TRI data are available here.
The authors of these data point out that "it is important to take into account that the sea-level surface that corresponds to a 30 by 30 arcsecond cell varies in proportion to the cosine of its latitude." They provide a data file measuring the surface area of each cell, in square metres, which must be used to weight the TRI when averaging across polygons.
Questions:
- Instead, can I re-project the raster data to an equal-area projection and then calculate an unweighted average raster value for each polygon? In other words, does an equal-area projection account for this variation of cell surface area with the cosine of latitude? (A sketch of this route appears after the list.)
- If the weighted-average route is a must, is the appropriate method to multiply the TRI raster by the cell-area raster with the Times tool, and then calculate the average raster value of each polygon with Zonal Statistics? (A sketch of this route also appears after the list.)
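
For concreteness, here is a minimal arcpy sketch of the reprojection route. The paths, the zone field, and the choice of World Cylindrical Equal Area (WKID 54034) are placeholders I picked for illustration, not part of the actual workflow:

```python
# Sketch: re-project the TRI raster to an equal-area projection, then take
# an unweighted zonal mean. Paths and the CRS choice are placeholders.
import arcpy
from arcpy.sa import ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")

tri = r"C:\data\tri_30s.tif"        # placeholder path to the TRI raster
zones = r"C:\data\polygons.shp"     # placeholder path to the polygon shapefile
tri_ea = r"C:\data\tri_ea.tif"      # output: reprojected TRI

# Re-project to World Cylindrical Equal Area (placeholder equal-area CRS);
# bilinear resampling since TRI is a continuous surface.
sr = arcpy.SpatialReference(54034)
arcpy.ProjectRaster_management(tri, tri_ea, sr, "BILINEAR")

# Unweighted mean TRI per polygon on the reprojected raster.
ZonalStatisticsAsTable(zones, "FID", tri_ea,
                       r"C:\data\tri_mean.dbf", "DATA", "MEAN")
```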
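
And a sketch of the weighted route. One point worth making explicit: the area-weighted mean is Σ(TRI × area) / Σ(area), so taking a zonal MEAN of the Times output would divide by the cell count rather than by the total area. The sketch below therefore divides zonal SUMs instead; again, paths and names are placeholders:

```python
# Sketch: area-weighted mean TRI per polygon as SUM(TRI * area) / SUM(area).
# Paths and field names are placeholders.
import arcpy
from arcpy.sa import Raster, Times, ZonalStatisticsAsTable

arcpy.CheckOutExtension("Spatial")

tri = Raster(r"C:\data\tri_30s.tif")       # TRI raster
area = Raster(r"C:\data\cellarea.tif")     # cell surface area in square metres
zones = r"C:\data\polygons.shp"            # polygon shapefile

# Numerator raster: TRI weighted by cell area.
tri_x_area = Times(tri, area)

# Zonal SUM of the weighted raster and of the area raster.
ZonalStatisticsAsTable(zones, "FID", tri_x_area,
                       r"C:\data\num.dbf", "DATA", "SUM")
ZonalStatisticsAsTable(zones, "FID", area,
                       r"C:\data\den.dbf", "DATA", "SUM")

# Join the two tables on FID and compute num.SUM / den.SUM per polygon,
# e.g. in a new field with the Field Calculator.
```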