I have an ArcGIS service configured as shown in this pastebin. Specifically, the first LOD looks like this:
{
"level": 0,
"resolution": 222.2222222222222,
"scale": 256000
}
I determined how to convert the scale of 256000 into the reported resolution of 222.222:
var dotsPerInch = 96.0;                         // DPI assumed by ArcGIS Server
var inchesPerFoot = 12.0;                       // the map's linear unit is feet
var dotsPerUnit = dotsPerInch * inchesPerFoot;  // pixels per map unit
var scale = 256000;
var resolution = scale / dotsPerUnit;           // 222.222... map units per pixel
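As a quick check, appending a console.log to that snippet prints the same value the LOD reports:

console.log(resolution); // ≈ 222.2222222222222, matching the "resolution" for level 0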
But the associated WMTS service reports a "ScaleDenominator" value of 241904.7619047619:
<TileMatrix>
  <ows:Identifier>0</ows:Identifier>
  <ScaleDenominator>241904.7619047619</ScaleDenominator>
  <TopLeftCorner>-1.77905E7 4.643889999999999E7</TopLeftCorner>
  <TileWidth>256</TileWidth>
  <TileHeight>256</TileHeight>
  <MatrixWidth>3</MatrixWidth>
  <MatrixHeight>3</MatrixHeight>
</TileMatrix>
What is the math behind this value? That is, given a scale of 256000, how did ESRI arrive at a ScaleDenominator of 241904.7619?
Answer
WMTS assumes a DPI of roughly 90.71 instead of 96, as documented in the WMTSCapabilities document, which states:
"The tile matrix set that has scale values calculated based on the dpi defined by OGC specification (dpi assumes 0.28mm as the physical distance of a pixel)."
0.28 mm per pixel works out to 0.0110236 inches per pixel, or 90.71446714322 pixels per inch (the stray trailing digits come from rounding the inch figure).
If you replace 96 in the equation above with 90.71428571429 you get the ScaleDenominator value exactly, so ESRI used a slightly different, unrounded conversion constant. After a little research I learned that
1 in = 2.54 cm (I thought this was an approximation, but it is exact by definition).
Since there are 25.4 mm in one inch, 25.4 / 0.28 = 90.71428571429, which is exactly the DPI we are after. Here is a site which confirms this calculation.
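Putting the pieces together, here is a minimal sketch of the whole conversion in the same style as the snippet above. The wmtsDotsPerInch and scaleDenominator names are mine, and it assumes the map's linear unit is feet, as the inchesPerFoot factor in the question implies:

var dotsPerInch = 96.0;                                   // DPI Esri assumed for the cached service
var inchesPerFoot = 12.0;                                 // assumes map units are feet
var scale = 256000;

// Resolution in map units (feet) per pixel, exactly as computed in the question
var resolution = scale / (dotsPerInch * inchesPerFoot);   // ≈ 222.2222

// WMTS defines a standardized rendering pixel of 0.28 mm,
// which works out to 25.4 / 0.28 dots per inch
var wmtsDotsPerInch = 25.4 / 0.28;                        // ≈ 90.71428571429

// Re-express the same resolution as a scale denominator using the WMTS pixel size
var scaleDenominator = resolution * wmtsDotsPerInch * inchesPerFoot;
console.log(scaleDenominator);                            // ≈ 241904.7619047619

// Equivalent shortcut: rescale the Esri scale by the DPI ratio
console.log(scale * wmtsDotsPerInch / dotsPerInch);       // same value

In other words, the WMTS ScaleDenominator is the same ground resolution re-expressed with a 0.28 mm rendering pixel instead of a 1/96-inch pixel; only the assumed pixel size changes.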