I am converting vector polygons into raster datasets.
Is there a standard calculation for converting the scale of the vector into an appropriate pixel size (e.g. 25 metres)?
My vector has been derived from Aerial Photo Interpretation (API) at a scale of 1:50000.
Answer
From a cartographic point of view, it is commonly assumed that the limit of human perception of a line's position is around 0.3 mm. For map scales of 1:20,000 or smaller, the USGS National Map Accuracy Standards (NMAS) require that 90% of the points tested fall within 1/50 of an inch (about 0.5 mm), measured on the map, of their known positions on the ground (see here).
This can be used to relate pixel size to map scale: you want a pixel size that does not degrade the precision of your map. The conversion algorithm usually assigns the value of a cell to the center of the pixel, so the distance between the pixel center and any of its corners, which is sqrt(2)/2 × pixel size, must stay within your tolerance of 0.3 mm to 0.5 mm. That gives a pixel size of roughly 0.4 mm to 0.7 mm, or about 0.5 mm, "on the paper". Note that you can apply the same kind of reasoning to the resolution of your screen.
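As a quick illustration of that half-diagonal reasoning, a minimal sketch (plain Python, with the 0.3 mm and 0.5 mm tolerances mentioned above taken as the inputs):

```python
import math

# Half-diagonal of a square pixel = sqrt(2)/2 * pixel_size.
# Requiring the half-diagonal to stay within the tolerance gives
# pixel_size = tolerance * sqrt(2).
for tolerance_mm in (0.3, 0.5):  # perception limit and NMAS tolerance, in mm on the map
    pixel_size_mm = tolerance_mm * math.sqrt(2)
    print(f"tolerance {tolerance_mm} mm -> paper pixel size {pixel_size_mm:.2f} mm")

# tolerance 0.3 mm -> paper pixel size 0.42 mm
# tolerance 0.5 mm -> paper pixel size 0.71 mm
```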
Once you know the pixel size needed to keep most of your detail, you convert it to a "real world" pixel size using the scale factor. In your case, 5e-4 m × 50,000 = 25 m. There is no need to take smaller pixels, and with the 0.5 mm NMAS tolerance (about a 0.7 mm pixel on the paper) it would still be acceptable up to roughly 35 m.
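To make the scale conversion explicit, a short sketch (Python; the 1:50,000 scale comes from the question, the paper pixel sizes from the reasoning above):

```python
scale_denominator = 50_000  # map scale 1:50,000 from the question

# Candidate pixel sizes "on the paper", in mm, from the tolerance reasoning above.
for paper_pixel_mm in (0.42, 0.5, 0.7):
    ground_m = paper_pixel_mm / 1000 * scale_denominator  # mm -> m, then apply scale
    print(f"{paper_pixel_mm} mm on paper -> {ground_m:.0f} m on the ground")

# 0.42 mm -> 21 m, 0.5 mm -> 25 m, 0.7 mm -> 35 m
```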