Friday 31 March 2017

sequential number field in ArcGIS?


How can I get a sequence of numbers (1, 2, 3, 4, ...) as in the model below? Is there any code I can use in ArcGIS to do this in the Field Calculator?


ObjectID  POINT_X      POINT_Y      N_Ordem
179705    277537,5975  167114,8149  1
179705    277546,9257  167111,2111  2
179705    277541,5203  167097,2191  3
179705    277522,8641  167104,4261  4
179705    277528,2697  167118,4183  5
179705    277537,5975  167114,8149  6
179708    277458,3343  167031,5693  1
179708    277452,8895  167017,5923  2
179708    277443,5715  167021,2221  3
179708    277449,0163  167035,1991  4
179708    277458,3343  167031,5693  5
179712    277482,4411  167038,2495  1
179712    277476,9701  167024,3099  2
179712    277467,6521  167027,9397  3
179712    277473,1321  167041,9029  4
179712    277482,4411  167038,2495  5
179717    277543,0029  167128,8071  1
179717    277537,5975  167114,8149  2
179717    277528,2697  167118,4183  3
179717    277533,6747  167132,4109  4
179717    277543,0029  167128,8071  5
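No answer is recorded for this question, but the desired behaviour (a counter that restarts at 1 for each ObjectID) can be sketched in plain Python; in the ArcGIS Field Calculator the same idea would live in a code-block function. The function and variable names here are illustrative, not ArcGIS API names.

```python
def number_by_group(rows, key_index=0):
    """Append a 1-based sequence number to each row, restarting the
    count whenever a new grouping value (ObjectID here) is seen."""
    counters = {}  # group key -> last sequence number handed out
    out = []
    for row in rows:
        key = row[key_index]
        counters[key] = counters.get(key, 0) + 1
        out.append(row + (counters[key],))
    return out

# A few rows from the model above (ObjectID, POINT_X, POINT_Y):
rows = [
    (179705, 277537.5975, 167114.8149),
    (179705, 277546.9257, 167111.2111),
    (179708, 277458.3343, 167031.5693),
]
for r in number_by_group(rows):
    print(r)
```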


qgis - How to convert data from a .gdb into a shapefile without ArcMap?


I have a .gdb folder with LOCK files, FREELIST files, ATX files, GDBINDEXES files, GDBTABLE files, GDBTABLEX files, and SPX files.


I want to convert the parcel data within this folder into a shapefile so that I can load it into ArcGIS Online. If I still had my ArcGIS Desktop version this would be a simple task; however, I no longer have access to this version of the software.


Any ideas on how to get this parcel data into a shapefile? I haven't tried downloading QGIS yet, so I'm not sure if that software has the capability that ArcGIS 10.1 did.
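One common route is GDAL's ogr2ogr command-line tool, whose built-in OpenFileGDB driver reads .gdb folders without any ESRI software (QGIS also ships with GDAL and can usually open the .gdb directly). A minimal sketch of driving it from Python follows; the layer name "Parcels" and the file paths are hypothetical, and it assumes ogr2ogr is on the PATH.

```python
import os
import shutil
import subprocess

def build_ogr2ogr_cmd(gdb_path, out_shp, layer):
    """Build an ogr2ogr command that exports one layer of a File
    Geodatabase to a shapefile via GDAL's OpenFileGDB driver."""
    return ["ogr2ogr", "-f", "ESRI Shapefile", out_shp, gdb_path, layer]

cmd = build_ogr2ogr_cmd("parcels.gdb", "parcels.shp", "Parcels")
print(" ".join(cmd))

# Only attempt the conversion when ogr2ogr and the data are present.
if shutil.which("ogr2ogr") and os.path.exists("parcels.gdb"):
    subprocess.run(cmd, check=True)
```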




qgis - WMS-Layers with Orthophotos with a resolution of 10cm are not shown in Qfield


I have a project with different layers in QGIS 3.6. Some of the layers are WMS layers with orthophotos. The orthophotos have different resolutions, from 25 down to 10 cm. This all works very well.


But when using this project in QField (Android 4.4, QField Lucendro 0.11.90), the WMS layers with orthophotos at 12 or 10 cm resolution are not shown. The indicator that shows the layer is loading appears normally, but after a few seconds the orthophoto cannot be seen. Other layers with a resolution of 20 or 25 cm work correctly. I tried different options in QGIS, but without success.





qgis - Merge all shapefiles in a directory - PyQGIS


I'm trying to merge all shapefiles in a directory using the following Python script from the console in QGIS 3.4.


The script was sourced from the following question:



Merging all vector files within directory with QGIS?


gis.stackexchange.com/questions/327398/merging-all-vector-files-within-directory-with-qgis


import glob, os, processing
path_to_shp = "P:/GIS/Basemapping/Topographic Area Schema 1/"
os.chdir(path_to_shp)
filelist = []
for fname in glob.glob("*.shp"):
    uri = path_to_shp + fname
    filelist.append(uri)

fileliststring = ';'.join(filelist)
processing.run("qgis:mergevectorlayers", fileliststring, "P:/GIS/Basemapping/Topographic Area Schema 1/merged.shp")

Running this results in:


TypeError: QgsProcessingAlgorithm.checkParameterValues(): argument 1 has unexpected type 'str'

The first time I ran into this I checked the input parameters for the algorithm using:


processing.algorithmHelp("qgis:mergevectorlayers")


Based on the algorithmHelp I assumed I should be passing a list of filepaths as input:


list[str]: list of layer sources


Running the following modification of the original script:


import glob, os, processing
path_to_shp = "P:/GIS/Basemapping/Topographic Area Schema 1/"
os.chdir(path_to_shp)
filelist = []
for fname in glob.glob("*.shp"):
    uri = path_to_shp + fname
    filelist.append(uri)

processing.run("qgis:mergevectorlayers", filelist, "P:/GIS/Basemapping/Topographic Area Schema 1/merged.shp")

results in the following error:


TypeError: QgsProcessingAlgorithm.checkParameterValues(): argument 1 has unexpected type 'list'

I feel I must be missing something obvious here. I've tried passing a list and a string of filepaths but neither seems to work. Can anyone help?



Answer



That's not how you call processing.run. You should check out the official QGIS documentation.


Basically, processing.run expects the algorithm identifier and a dictionary containing the processing parameters. This should work:


processing.run(
    "qgis:mergevectorlayers",
    {
        "LAYERS": filelist,
        "OUTPUT": "P:/GIS/Basemapping/Topographic Area Schema 1/merged.shp",
    },
)
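Putting the answer together with the directory scan from the question, the whole parameter dictionary can be assembled in one small helper. The paths are illustrative; only the final `processing.run` call (shown as a comment) needs the QGIS environment.

```python
import glob
import os

def build_merge_params(directory, out_path):
    """Collect every .shp in `directory` into the parameter dictionary
    expected by qgis:mergevectorlayers in QGIS 3.x."""
    layers = sorted(glob.glob(os.path.join(directory, "*.shp")))
    return {"LAYERS": layers, "OUTPUT": out_path}

# In the QGIS Python console this would then be:
# params = build_merge_params("P:/GIS/Basemapping/Topographic Area Schema 1",
#                             "P:/GIS/Basemapping/Topographic Area Schema 1/merged.shp")
# processing.run("qgis:mergevectorlayers", params)
```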

mapinfo - Batch Script to merge tab files in a directory


Very similar to this question: Merging multiple .tab files? Can someone please help me with a batch script for this?


Based on the answer in the above I have


#!/bin/bash
for i in *.tab;
do
    ogr2ogr -update -append merge.tab i -f “MapInfo File” -nln merge
done


but when I run it it fails with


FAILURE:
Unable to open datasource `i' with the following drivers.

and then lists the drivers



Answer



For completeness, and based on the above comments, this should work:


ogr2ogr -update -append merge.tab "$i" -f "MapInfo File" -nln merge

arcgis 10.0 - How to attach GP script tools, standard tools, and Python script to a toolbar?


I need to build a custom toolbar that can launch the following items:



  • A Python script tool - shouldn't be a problem, as you can already attach these to a toolbar

  • Out of the box functions such as the Editor Toolbar maybe or a standard GP tool - again, should be do-able as you can already attach these to a toolbar

  • A Python script that launches a PyQt/wxPython/Tk GUI. This is the tricky one, I think. This GUI does not need to interact with ArcMap; my other custom tools will do all of that. The GUI is for the user to set up parameters that will get written to an XML config file. It will also make some arcpy calls in the background, slurp some data and values out of a geodatabase, and push them to the same XML config file.





So the question is: how do I do this as simply as possible? It appears I cannot attach a Python script (that is not encased in a tool) to a toolbar in ArcMap. I'm thinking I'm going to have to go the Add-In route and create a simple toolbar with buttons to kick off each of the items listed above. VBA is not an option. Thoughts?



Answer



@ChadCooper, perhaps I'm overlooking something, but with regard to your 3rd case:



A Python script that launches a PyQt/wxPython/Tk GUI.



Is there a reason why you couldn't use the .Net Process API (see System.Diagnostics) to execute your custom Python script over Standard In/Out? Of course, this sort of architecture will assume a proper version of Python is installed and equipped with any additional libs you require. I think you'd just want to extend an ICommand object and use the Process API in its Click() event. This example (which was an experiment only) called a .py script that used dbfpy to read the columns in a DBF file and return them as a list to a .Net application.


private static List<string> _readDbfOutput;

public static List<string> ReadDbfTable(string dbfPath, int limit)
{
    // start a fresh output object..
    _readDbfOutput = new List<string>();
    _readDbfOutput.Clear();

    // ready..
    string executable = @"C:\Python27\python.exe";
    string fullUtilPath = @"C:\some_app\utils\read_dbf_table.py";

    // The path to the python file and its arguments become a single input
    // submitted to the python runtime. In this case, I've got something like:
    // C:\>C:/Python27/python.exe C:\some_app\utils\read_dbf_table.py -f C:/data/some_shapefiles.dbf -l 100

    // dbfPath and limit were method arguments..
    string exeArguments = fullUtilPath + " -f " + dbfPath + " -l " + limit.ToString();

    ProcessStartInfo startInfo = new ProcessStartInfo(executable, exeArguments);
    startInfo.UseShellExecute = false;

    startInfo.CreateNoWindow = true;
    startInfo.RedirectStandardOutput = true;  // necessary if you need to listen..
    startInfo.RedirectStandardInput = false;  // necessary if you send more inputs..
    startInfo.StandardOutputEncoding = System.Text.Encoding.ASCII;

    Process proc = new Process();

    // ReadingDbfTable handles my redirected standard output..
    proc.OutputDataReceived += new DataReceivedEventHandler(ReadingDbfTable);
    proc.StartInfo = startInfo;

    proc.Start();
    proc.BeginOutputReadLine();

    // WaitForExit() may need to be wrapped in a Try or Using block.
    // Will it run indefinitely if the script fails?
    proc.WaitForExit();
    proc.Close();

    return _readDbfOutput;
}

/// Handle redirected STDOUT.
private static void ReadingDbfTable(object sendingProc, DataReceivedEventArgs stdOutput)
{
    if (!String.IsNullOrEmpty(stdOutput.Data))
    {
        _readDbfOutput.Add(stdOutput.Data);
    }
}
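The Python side of that experiment used dbfpy; for illustration, a dependency-free sketch that reads the column names straight from the DBF header and prints them one per line (which is what the .NET handler above consumes over stdout) could look like the following. The parsing follows the standard dBASE III header layout; everything else is hypothetical.

```python
import sys

def read_dbf_columns(dbf_path):
    """Return the field (column) names from a dBASE III .dbf file.

    The file starts with a 32-byte header, followed by one 32-byte
    descriptor per field; the descriptor area ends with byte 0x0D.
    Each field name occupies the first 11 bytes, null-padded."""
    names = []
    with open(dbf_path, "rb") as f:
        f.read(32)  # skip the fixed file header
        while True:
            chunk = f.read(32)
            if not chunk or chunk[:1] == b"\x0d":
                break
            names.append(chunk[:11].split(b"\x00")[0].decode("ascii"))
    return names

if __name__ == "__main__" and len(sys.argv) > 1:
    # Print one column per line for the calling process to read.
    for name in read_dbf_columns(sys.argv[1]):
        print(name)
```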

javascript - Adding feature property values to filter drop down option in Leaflet, instead of keys?



How would I convert this function to store and utilize the feature.properties.id JSON values in the dropdown menu within a Leaflet map, instead of the keys?



The corresponding JavaScript:


L.CountrySelect = {};

L.CountrySelect.countries = {"Afghanistan":{"type":"Feature","id":"AFG","properties":{"name":"Afghanistan"},"geometry":{"type":"Polygon","coordinates":[[[61.210817,35.650072],[62.230651,35.270664],[62.984662,35.404041],[63.193538,35.857166],[63.982896,36.007957],[64.546479,36.312073],[64.746105,37.111818],[65.588948,37.305217],[65.745631,37.661164],[66.217385,37.39379],[66.518607,37.362784],[67.075782,37.356144],[67.83,37.144994],[68.135562,37.023115],[68.859446,37.344336],[69.196273,37.151144],[69.518785,37.608997],[70.116578,37.588223],[70.270574,37.735165],[70.376304,38.138396],[70.806821,38.486282],[71.348131,38.258905],[71.239404,37.953265],[71.541918,37.905774],[71.448693,37.065645],[71.844638,36.738171],[72.193041,36.948288],[72.63689,37.047558],[73.260056,37.495257],[73.948696,37.421566],[74.980002,37.41999],[75.158028,37.133031],[74.575893,37.020841],[74.067552,36.836176],[72.920025,36.720007],[71.846292,36.509942],[71.262348,36.074388],[71.498768,35.650563],[71.613076,35.153203],[71.115019,34.733126],[71.156773,34.348911],[70.881803,33.988856],[69.930543,34.02012],[70.323594,33.358533],[69.687147,33.105499],[69.262522,32.501944],[69.317764,31.901412],[68.926677,31.620189],[68.556932,31.71331],[67.792689,31.58293],[67.683394,31.303154],[66.938891,31.304911],[66.381458,30.738899],[66.346473,29.887943],[65.046862,29.472181],[64.350419,29.560031],[64.148002,29.340819],[63.550261,29.468331],[62.549857,29.318572],[60.874248,29.829239],[61.781222,30.73585],[61.699314,31.379506],[60.941945,31.548075],[60.863655,32.18292],[60.536078,32.981269],[60.9637,33.528832],[60.52843,33.676446],[60.803193,34.404102],[61.210817,35.650072]]]}},"Angola":{"type":"Feature","id":"AGO","properties":{"name":"Angola"},"geometry":{"type":"MultiPolygon","coordinates":[[[[16.326528,-5.87747],[16.57318,-6.622645],[16.860191,-7.222298],[17.089996,-7.545689],[17.47297,-8.068551],[18.134222,-7.987678],[18.464176,-7.847014],[19.016752,-7.988246],[19.166613,-7.738184],[19.417502,-7.155429],[20
.037723,-7.116361],[20.091622,-6.94309],[20.601823,-6.939318],[20.514748,-7.299606],[21.728111,-7.290872],[21.746456,-7.920085],[21.949131,-8.305901],[21.801801,-8.908707],[21.875182,-9.523708],[22.208753,-9.894796],[22.155268,-11.084801],[22.402798,-10.993075],[22.837345,-11.017622],[23.456791,-10.867863],[23.912215,-10.926826],[24.017894,-11.237298],[23.904154,-11.722282],[24.079905,-12.191297],[23.930922,-12.565848],[24.016137,-12.911046],[21.933886,-12.898437],[21.887843,-16.08031],[22.562478,-16.898451],[23.215048,-17.523116],[21.377176,-17.930636],[18.956187,-17.789095],[18.263309,-17.309951],[14.209707,-17.353101],[14.058501,-17.423381],[13.462362,-16.971212],[12.814081,-16.941343],[12.215461,-17.111668],[11.734199,-17.301889],[11.640096,-16.673142],[11.778537,-15.793816],[12.123581,-14.878316],[12.175619,-14.449144],[12.500095,-13.5477],[12.738479,-13.137906],[13.312914,-12.48363],[13.633721,-12.038645],[13.738728,-11.297863],[13.686379,-10.731076],[13.387328,-10.373578],[13.120988,-9.766897],[12.87537,-9.166934],[12.929061,-8.959091],[13.236433,-8.562629],[12.93304,-7.596539],[12.728298,-6.927122],[12.227347,-6.294448],[12.322432,-6.100092],[12.735171,-5.965682],[13.024869,-5.984389],[13.375597,-5.864241],[16.326528,-5.87747]]],[[[12.436688,-5.684304],[12.182337,-5.789931],[11.914963,-5.037987],[12.318608,-4.60623],[12.62076,-4.438023],[12.995517,-4.781103],[12.631612,-4.991271],[12.468004,-5.248362],[12.436688,-5.684304]]]]}},"Albania":{"type":"Feature","id":"ALB","properties":{"name":"Albania"},"geometry":{"type":"Polygon","coordinates":[[[20.590247,41.855404],[20.463175,41.515089],[20.605182,41.086226],[21.02004,40.842727],[20.99999,40.580004],[20.674997,40.435],[20.615,40.110007],[20.150016,39.624998],[19.98,39.694993],[19.960002,39.915006],[19.406082,40.250773],[19.319059,40.72723],[19.40355,41.409566],[19.540027,41.719986],[19.371769,41.877548],[19.304486,42.195745],[19.738051,42.688247],[19.801613,42.500093],[20.0707,42.58863],[20.283755,42.32026],[2
0.52295,42.21787],[20.590247,41.855404]]]}}

L.CountrySelect = L.Control.extend({
    options: {
        position: 'topright',
        title: 'Country',
        exclude: [],
        include: [],
        countries: L.CountrySelect.countries,
    },

    onAdd: function(map) {
        this.div = L.DomUtil.create('div','leaflet-countryselect-container');
        this.select = L.DomUtil.create('select','leaflet-countryselect',this.div);
        var content = '';

        if (this.options.title.length > 0){
            content += '<option>' + this.options.title + '</option>';
        }

        var countries = (Array.isArray(this.options.include) && this.options.include.length > 0) ? this.options.include : this.options.countries;

        var countryKeys = Object.keys(countries).sort();
        for (i in countryKeys){
            if (this.options.exclude.indexOf(countryKeys[i]) == -1){
                content += '<option>' + countryKeys[i] + '</option>';
            }
        }

        this.select.innerHTML = content;

        this.select.onmousedown = L.DomEvent.stopPropagation;

        return this.div;
    },
    on: function(type,handler){
        if (type == 'change'){
            this.onChange = handler;
            L.DomEvent.addListener(this.select,'change',this._onChange,this);
        } else if (type == 'click'){ // don't need this here probably, but for convenience?
            this.onClick = handler;
            L.DomEvent.addListener(this.select,'click',this.onClick,this);
        } else {
            console.log('CountrySelect - cannot handle '+type+' events.');
        }
    },
    _onChange: function(e) {
        var selectedCountry = this.select.options[this.select.selectedIndex].value;
        e.feature = this.options.countries[selectedCountry];
        this.onChange(e);
    }
});

L.countrySelect = function(id,options){
    return new L.CountrySelect(id,options);
};

Mainly this function is pulling the top-level key (the country name), but this doesn't work well with general JSON/GeoJSON files (please correct me if I'm wrong). I'd like to add feature.properties.id values to the dropdown list and have it filter/zoom to the associated country based on that value instead of the top-level key.


Here's a demo: http://ahalota.github.io/Leaflet.CountrySelect/demo.html


I've been scratching my head for a couple hours on this and I'm pretty new to JavaScript.


More information on this function can be found here -> https://github.com/ahalota/Leaflet.CountrySelect




Answer



First, it has to be noted that the solution below is a modification of the Leaflet.CountrySelect plugin, written by Anika S. Halota.


The requirement that the control work with any JSON file is too broad, since the file must have a defined structure the plugin can work with. Since the file has to include geometries, the obvious candidate is a GeoJSON feature collection, where each feature has a geometry that can be displayed and some id by which it can be identified.


The following changes were made to control:



  1. The control was renamed to L.featureSelect.

  2. An input parameter features was added to the control. This is the input GeoJSON object.

  3. A new option lookupProperty was added. It specifies the feature property name to be used for lookup. The default value is 'id'.

  4. A new option lookupInFeatureProperties was added. If set to true, the property specified with lookupProperty will be looked up inside the feature's properties member.

  5. The options exclude, include and countries were removed, since they are not needed any more.

  6. Lookup of the selected feature is done by looping through the features in the feature collection until the feature with the desired lookup value in the specified property is found.


So here is what the new control looks like:


L.FeatureSelect = L.Control.extend({
    options: {
        position: 'topright',
        title: 'Feature ID',
        lookupProperty: 'id',
        lookupInFeatureProperties: false
    },

    initialize: function (features, options) {
        this.featureCollection = features;
        L.Util.setOptions(this, options);
    },
    onAdd: function(map) {
        this.div = L.DomUtil.create('div', 'leaflet-featureselect-container');
        this.select = L.DomUtil.create('select', 'leaflet-featureselect', this.div);
        var content = '';
        if (this.options.title.length > 0) {
            content += '<option>' + this.options.title + '</option>';
        }
        if (this.options.lookupInFeatureProperties) {
            for (var i = 0; i < this.featureCollection.features.length; i++) {
                content += '<option>' + this.featureCollection.features[i].properties[this.options.lookupProperty] + '</option>';
            }
        }
        else {
            for (var i = 0; i < this.featureCollection.features.length; i++) {
                content += '<option>' + this.featureCollection.features[i][this.options.lookupProperty] + '</option>';
            }
        }
        this.select.innerHTML = content;
        this.select.onmousedown = L.DomEvent.stopPropagation;
        return this.div;
    },
    on: function(type, handler) {
        if (type == 'change') {
            this.onChange = handler;
            L.DomEvent.addListener(this.select, 'change', this._onChange, this);
        } else {
            console.log('FeatureSelect - cannot handle ' + type + ' events.');
        }
    },
    _onChange: function(e) {
        var selectedItemKey = this.select.options[this.select.selectedIndex].value;
        if (this.options.lookupInFeatureProperties) {
            for (var i = 0; i < this.featureCollection.features.length; i++) {
                if (this.featureCollection.features[i].properties[this.options.lookupProperty] == selectedItemKey) {
                    e.feature = this.featureCollection.features[i];
                    break;
                }
            }
        }
        else {
            for (var i = 0; i < this.featureCollection.features.length; i++) {
                if (this.featureCollection.features[i][this.options.lookupProperty] == selectedItemKey) {
                    e.feature = this.featureCollection.features[i];
                    break;
                }
            }
        }
        this.onChange(e);
    }
});

L.featureSelect = function(features, options) {
    return new L.FeatureSelect(features, options);
};

Below is an example of using the new control with the original countries file world.geo.json, available at https://github.com/johan/world.geo.json (jQuery is used to load the GeoJSON file from the local server):



var baseLayer = L.tileLayer('http://{s}.basemaps.cartocdn.com/light_all/{z}/{x}/{y}.png', {attribution: 'Tiles © CartoDB'});
var map = L.map("map", {layers: [baseLayer], center: [-23.88, -62.75], zoom: 4});

$.getJSON("world.geo.json")
    .done(function(data) {
        var select = L.featureSelect(data);
        select.addTo(map);
        select.on('change', function(e) {
            if (e.feature === undefined) {
                return;
            }
            var feature = L.geoJson(e.feature);
            if (this.previousFeature != null) {
                map.removeLayer(this.previousFeature);
            }
            this.previousFeature = feature;
            map.addLayer(feature);
            map.fitBounds(feature.getBounds());
        });
    });


In the above example the feature property id would be used to select countries by id. Since each feature (country) also has the country name stored in its properties member under name, that could equally be used to select countries by name, by defining the control like this:


var select = L.featureSelect(data, {
    lookupProperty: 'name',
    lookupInFeatureProperties: true,
    title: 'Country'
});

spatial database - STDistance Unit in SQL Server 2008


I've got a SPROC in SQL Server 2008 where I'm trying to calculate the distance between a point and a polygon, and I'm getting unexpected results. I am passing a parameter (@Radius) in miles to the SPROC, but the distances don't seem to be in km; I can't figure out what they are in. My SPROC is below:


DECLARE  @Point geometry;
SET @Point = geometry::STGeomFromText('POINT(' + CAST(@Longitude AS nvarchar(32)) + ' ' + CAST(@Latitude AS nvarchar(32)) +')', 4326);

SELECT [FS_Geometries].[Description], ([Geometry].MakeValid()).STDistance(@Point)

FROM [FS_Geometries]
WHERE (([Geometry].MakeValid()).STDistance(@Point) <= (@Radius * 1609.344))
AND [ParentId] = @CustomerId
AND [ParentType] = 80
ORDER BY (([Geometry].MakeValid()).STDistance(@Point)) ASC

In all my other queries, I need to convert my radius from miles to meters. When I do that here, the results are not correct. I have a result set of about 750 polygons that are separated by about 25 miles of geography, so when I pass "1.0", I'd expect to get less than my 750 polygons. Unfortunately, I get the full 750 in the results.


Is the STDistance() method in SQL Server 2008 based on something other than meters? Also, is the STDistance() method in this case measuring from the edge of the polygon or the centroid of the polygon?


Thanks!


UPDATE ---



My updated conversion function...


public static double ConvertMilesToDegrees(double miles, bool isLatitude)
{
    return isLatitude
        ? miles * (((double)1) / 69)  // 1 deg. latitude = 69 miles...
        : miles * (((double)1) / 61); // avg of 69 and 53...

    // NOTES: A degree of longitude is widest at the equator at 69.172 miles (111.321 km)
    // and gradually shrinks to zero at the poles. At 40° north or south the distance
    // between a degree of longitude is 53 miles (85 km).
}
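For reference, the same rough conversions can be written in Python for sanity-checking; note that the 69- and 61-mile figures are the questioner's approximations, not authoritative geodesy, while 1609.344 m per mile (used in the WHERE clause) is exact.

```python
METERS_PER_MILE = 1609.344

def miles_to_meters(miles):
    """Exact miles -> meters conversion, as in the SQL WHERE clause."""
    return miles * METERS_PER_MILE

def miles_to_degrees(miles, is_latitude):
    """Rough miles -> degrees, mirroring the C# helper: one degree of
    latitude is ~69 miles; 61 is an average figure for longitude."""
    return miles / 69.0 if is_latitude else miles / 61.0

print(miles_to_meters(1.0))                    # 1609.344
print(round(miles_to_degrees(69.0, True), 6))  # 1.0
```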

Answer



I guess it works the same as PostGIS. If you are working with the geometry type and geometry functions, the function simply calculates in whatever unit the data is in. In your case that seems to be lon/lat degrees, so the distance will not make much sense, because degrees of latitude and longitude have different lengths everywhere except at the equator.


So what you have to do is transform your data to some meter-based projection, or use the geography type with geodetic functions.


HTH


Nicklas


Thursday 30 March 2017

raster - Python equivalent of gdalbuildvrt


Is there a way to perform the same task as the gdalbuildvrt utility using GDAL Python bindings? So far I have not found any way to do this other than creating a vrt of a single dataset and manually editing the xml. I would like to create a vrt from multiple rasters (essentially performing a mosaic). Is this possible using pure Python? My other option is to use subprocess to simply call gdalbuildvrt.



Answer



Honestly it's easier to do this by using gdalbuildvrt in a subprocess or os.system.



Should you wish to do this through Python it can be done. Using the standard dataset creation methods within GDAL Python we can easily create the base dataset VRT.


from osgeo import gdal

drv = gdal.GetDriverByName("VRT")
vrt = drv.Create("test.vrt", x_size, y_size, 0)

Note that we are creating the dataset with no bands initially. The documentation on VRTs notes that VRT datasets are one of the few dataset types that can accept AddBand arguments.


vrt.AddBand(gdal.GDT_Float32)
band = vrt.GetRasterBand(1)


Now for each band we have to set the metadata items manually:


simple_source = '<SourceFilename relativeToVRT="1">%s</SourceFilename>' % source_path + \
    '<SourceBand>%i</SourceBand>' % source_band + \
    '<SourceProperties RasterXSize="%i" RasterYSize="%i" DataType="Real" BlockXSize="%i" BlockYSize="%i"/>' % (x_size, y_size, x_block, y_block) + \
    '<SrcRect xOff="%i" yOff="%i" xSize="%i" ySize="%i"/>' % (x_offset, y_offset, x_source_size, y_source_size) + \
    '<DstRect xOff="%i" yOff="%i" xSize="%i" ySize="%i"/>' % (dest_x_offset, dest_y_offset, x_dest_size, y_dest_size)
band.SetMetadataItem("SimpleSource", simple_source)
band.SetMetadataItem("NoDataValue", -9999)

SetMetadataItem takes two arguments: the first a string naming the metadata item, the second the item itself. This means that you can't subset a metadata item, so for data sources you have to set the entire contents as a string.



Note that we can use this method to create complex sources (ComplexSource) that contain look-up-tables of values, Kernel filter sources (KernelFilteredSource) of arbitrary sizes and shapes, and Mask Bands (MaskBand).
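To make the string-building concrete, here is a self-contained sketch that assembles one SimpleSource fragment as well-formed XML. The tag names follow the GDAL VRT format; the helper name, the one-to-one Src/Dst windowing, and the values are assumptions for illustration only.

```python
def simple_source(path, band, x_size, y_size):
    """Build a SimpleSource XML fragment for a VRT band whose source
    covers the destination one-to-one (no sub-windowing)."""
    rect = 'xOff="0" yOff="0" xSize="%i" ySize="%i"' % (x_size, y_size)
    parts = [
        '<SimpleSource>',
        '<SourceFilename relativeToVRT="0">%s</SourceFilename>' % path,
        '<SourceBand>%i</SourceBand>' % band,
        '<SrcRect %s/>' % rect,
        '<DstRect %s/>' % rect,
        '</SimpleSource>',
    ]
    return ''.join(parts)

print(simple_source("input.tif", 1, 512, 256))
```

Because it is plain string formatting, the fragment can be checked with any XML parser before being handed to SetMetadataItem.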


qgis - Can I use a polygon to show a WMS-layer inside while maintaining transparency outside?


I am making a map where I want a polygon filled with data from a WMS-service. What is outside the polygon from the WMS should be transparent. I am using QGIS 2.18.4.



It is a bit similar to a question on How do I clip OSM basemap with a polygon? here on GIS Stackexchange, but the solution there (inverting a polygon manually) is not sufficient. Neither would the more up-to-date visualization tool "Inverted polygons" be of any immediate help.


Example 1) In this example I have two WMS layers: one in color and one in gray. I want to use the polygon to separate the gray and color WMS layers so that the focus area is in color while the area around it is gray. It can be accomplished in part by setting the color to gray and then playing around with transparency and/or the layer blending mode. You get the idea, but the other (gray) WMS layer does not come into play.




Example 2) This is the clearest illustration. Here I have a marine management areas layer where I want to highlight fisheries activities. The fisheries activities are from a WMS-layer with the Norwegian Coastal Administration. I do not want to show the fisheries activities in the area outside the marine management areas.


Using the first example, the image below shows where I am stuck: I can mask out the area around the focus area, but I want the blue area to mask out only the color WMS and show the gray one.




I cannot find any way of connecting the masking layer to just one of the layers underneath. Is there a way to do this?




Converting Entity to GeoJSON with Cesium


I know there is a way to load entities from a GeoJSON using GeoJsonDataSource but is there any way to do the opposite: get any entity as a GeoJSON representation?


My goal is to use Turf.JS with Cesium.JS geometries.



Answer




Here you have a solution for exporting to JSON, and one for exporting to KML.


Everything is obtained from this official Cesium thread.


Update:


They are links that discuss the question; there is no native method in Cesium to do this for GeoJSON, as there is with exportKml for KML. To implement an exportJson (for example), it would have to be done using pure JS or Turf, as you did.


You can use the exportKml code to help create the result you are looking for.


Another way you can try is to export to KML and convert the resulting XML to JSON, for example using togeojson:



Cesium.exportKml({
    entities: entityCollection
})
.then(function(result) {
    // The XML string is in result.kml
    // Convert KML to GeoJSON (it's just an idea)
    json = toGeoJSON.kml(result);
});

Note: if this is not considered a valid response, you can simply delete it or not accept it.



arcpy - Binding Objects to List for ArcGIS Python Script Tool Parameter?




I am working on a Python-based script/tool. I would like to make use of drop-down lists for several of the parameters. This is very easy to do if the lists are just plain old strings.


Does anyone know if it is possible to bind objects to this list and display a certain property? Objects will be very simple; just key-value pairs.


So, I would like to display the key and be able to access the value somewhere in the ToolValidator, or better yet access the ToolValidator class from the tool script itself?



Answer



Since you know that you can provide a potential list of acceptable values to a tool parameter, you could try using the selected value from the domain as the key to a dictionary residing in your main script.
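As a sketch of that pattern (the dictionary contents and function names are made up): the ToolValidator fills the parameter's filter list with the dictionary's keys, and the main script resolves the selected key back to its value.

```python
# Hypothetical key -> value domain shared by the ToolValidator and
# the tool's main script (e.g. kept in a module both can import).
SCALE_CHOICES = {
    "Small (1:10000)": 10000,
    "Medium (1:50000)": 50000,
    "Large (1:250000)": 250000,
}

def filter_list():
    """What the ToolValidator would assign to the parameter's filter.list."""
    return sorted(SCALE_CHOICES)

def resolve(selected_label):
    """What the main script does with the text returned for the parameter."""
    return SCALE_CHOICES[selected_label]

print(filter_list())
print(resolve("Medium (1:50000)"))  # 50000
```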


Wednesday 29 March 2017

arcpy - Get extent of Georeferenced Rasters in Python and output to polygon shapefile


I would like to create a shapefile containing the extents of each of the rasters in a directory. Is it possible to capture the extent of a raster using Python?


I have tried





extent1=arcgisscripting.Raster.extent('stg1_05.jpg') Runtime error : 'getset_descriptor' object is not callable






and I can't seem to find any help on the module.


I also tried





arcpy.RasterToPolygon_conversion(inRaster, outPolygons, "SIMPLIFY", field)




Runtime error : ERROR 010247: The size of the output shapefile exceeds the 2 GB limit. ERROR 010067: Error in executing grid expression. Failed to execute (RasterToPolygon).




Anyway, this will be a polygon of the whole raster file and not just the extent. Even if this is generated, I guess we could then run a merge/dissolve, but the files created are too big.


Another option I was thinking of was to convert the raster to a layer:


import arcpy, glob, os, sys
from arcpy import env, mapping
path = os.getcwd()
env.workspace = path
RasterType = 'TIF'
FileList = glob.glob(path + "\*." + RasterType)

print 'Reading files from ' + path

os.chdir(path)
#print FileList

x=0
z=1005
File=FileList[x]
LayerWorking=arcpy.mapping.Layer(File)
print File
LayerExtent=LayerWorking.getExtent()
XMAX = LayerExtent.XMax

XMIN = LayerExtent.XMin
YMAX = LayerExtent.YMax
YMIN = LayerExtent.YMin
pnt1 = arcpy.Point(XMIN, YMIN)
pnt2 = arcpy.Point(XMIN, YMAX)
pnt3 = arcpy.Point(XMAX, YMAX)
pnt4 = arcpy.Point(XMAX, YMIN)
array = arcpy.Array()
array.add(pnt1)
array.add(pnt2)

array.add(pnt3)
array.add(pnt4)
array.add(pnt1)
polygon = arcpy.Polygon(array)
ShapeFile = path + "\Polygon_Extent" + "_" + str(z) + ".shp"
print ShapeFile
print arcpy.GetMessages()
arcpy.CreateFeatureclass_management(path, ShapeFile, "POLYGON")
print arcpy.GetMessages()
arcpy.CopyFeatures_management(polygon, ShapeFile)


Gives following output



Reading files from C:\Python26\script_tests\rectify C:\Python26\script_tests\rectify\Stage1_01a.TIF C:\Python26\script_tests\rectify\Polygon_Extent_1005.shp Executing: CopyFeatures in_memory\fB4DC2172_7D02_44B9_B55C_9E71427AE96E C:\Python26\script_tests\rectify\Polygon_Extent_1004.shp # 0 0 0 Start Time: Thu Jul 14 16:11:38 2011 Failed to execute. Parameters are not valid. ERROR 000725: Output Feature Class: Dataset C:\Python26\script_tests\rectify\Polygon_Extent_1004.shp already exists. Failed to execute (CopyFeatures). Failed at Thu Jul 14 16:11:38 2011 (Elapsed Time: 0.00 seconds) Traceback (most recent call last): File "C:\Python26\script_tests\rectify\temp.py", line 36, in arcpy.CreateFeatureclass_management(path, ShapeFile, "POLYGON") File "C:\Program Files\ArcGIS\Desktop10.0\arcpy\arcpy\management.py", line 1539, in CreateFeatureclass raise e ExecuteError: ERROR 999999: Error executing function. Failed to execute (CreateFeatureclass).



using IDLE... which is weird, as when you run it from the Python window in ArcMap it works fine when the Create/Copy feature commands are run individually.



ShapeFile = "Polygon_Extent" + "_" + str(z) + ".shp" arcpy.CreateFeatureclass_management(path, ShapeFile, "POLYGON")



Result 'C:\Python26\script_tests\rectify\Polygon_Extent_1005.shp'>




arcpy.CopyFeatures_management(polygon, ShapeFile)



Result 'C:\Python26\script_tests\rectify\Polygon_Extent_1005.shp'>




this seems to be a very convoluted process...


UPDATE: I am trying to add the filename to the table, and for some reason it inserts a new row into the table and doesn't accept updateRow(row). What am I doing wrong? Also, the files don't seem to retain the projection I assign them.



Reading files from C:\Python26\script_tests\rectify C:\Python26\script_tests\rectify\Stage1_01a.TIF Created: Polygon_Extent_1.shp Filled in C:\Python26\script_tests\rectify\Stage1_01a.TIF Traceback (most recent call last): File "C:\Python26\script_tests\rectify\ResterExtent_toSHP.py", line 45, in rows.updateRow(row) File "C:\Program Files\ArcGIS\Desktop10.0\arcpy\arcpy\arcobjects\arcobjects.py", line 102, in updateRow return convertArcObjectToPythonObject(self._arc_object.UpdateRow(*gp_fixargs(args))) RuntimeError: ERROR 999999: Error executing function.




SCRIPT all working now.


import arcpy, glob, os, sys, arcgisscripting
from arcpy import env, mapping
path = os.getcwd()
env.workspace = path
env.overwriteOutput = True
RasterType = 'TIF'
FileList = glob.glob(path + "\*." + RasterType)
print 'Reading files from ' + path

os.chdir(path)
#print FileList
geometry_type = "POLYGON"
template = "C:\\Python26\\GDA_1994_MGA_Zone_55.shp"
has_m = "DISABLED"
has_z = "DISABLED"
# Creating a spatial reference object
spatial_reference = arcpy.SpatialReference("C:\\Python26\\GDA_1994_MGA_Zone_55.prj")

x = 0
z = x + 1
for File in FileList:
    #File = FileList[x]
    RasterFile = arcgisscripting.Raster(File)
    RasterExtent = RasterFile.extent
    print File
    XMAX = RasterExtent.XMax
    XMIN = RasterExtent.XMin
    YMAX = RasterExtent.YMax
    YMIN = RasterExtent.YMin

    pnt1 = arcpy.Point(XMIN, YMIN)
    pnt2 = arcpy.Point(XMIN, YMAX)
    pnt3 = arcpy.Point(XMAX, YMAX)
    pnt4 = arcpy.Point(XMAX, YMIN)
    array = arcpy.Array()
    array.add(pnt1)
    array.add(pnt2)
    array.add(pnt3)
    array.add(pnt4)
    array.add(pnt1)

    polygon = arcpy.Polygon(array)
    arcpy.CreateFeatureclass_management(path, "Temp_Polygon_Extent" + "_" + str(z), geometry_type, template, has_m, has_z, spatial_reference)
    arcpy.CopyFeatures_management(polygon, "Temp_Polygon_Extent" + "_" + str(z))
    ShapeFile = "Temp_Polygon_Extent" + "_" + str(z) + ".shp"
    print "Created: " + ShapeFile
    arcpy.AddField_management(ShapeFile, 'FileName', 'TEXT')
    desc = arcpy.Describe(ShapeFile)
    print desc, ShapeFile
    #rows = arcpy.InsertCursor(ShapeFile, desc)
    rows = arcpy.UpdateCursor(ShapeFile)

    #row = rows.newRow()
    #row.FileName = str(File)
    #row.FileName = File
    print 'Filled in: ', str(File)
    #rows.insertRow(row)
    for row in rows:
        row.FileName = str(ShapeFile)
        rows.updateRow(row)
    x = x + 1
    z = z + 1


# cleaning up
arcpy.CreateFeatureclass_management(path, "Extent.shp", geometry_type, template, has_m, has_z, spatial_reference)
list = []
lstFCs = arcpy.ListFeatureClasses("Temp_Polygon_Extent*")
print 'Merging Polygon_Extents* to Extent.shp'

for fc in lstFCs:
    print fc
    list.append(fc)

arcpy.Merge_management(list, "Extent.shp")
#print 'Deleting identical entries and temp files'
#arcpy.DeleteIdentical_management("Extent.shp", ["SHAPE"])
print 'Created Extent.shp and exiting'

for item in list:
    arcpy.Delete_management(item)
del row, rows


GDAL Ver.


import os, gdal
gdaltindex Extent.shp *.tif


File "<stdin>", line 1
SyntaxError: invalid syntax



BTW the FTTools Python window is a pain, as you can't copy/paste code or errors from it either.



Answer



You almost had it right with your first attempt. What you are doing is trying to call the extent property with a filename as its parameter, when you need to construct a Raster object with that parameter.



In practical terms, that means:


extent1 = arcgisscripting.Raster('stg1_05.jpg').extent

Although it is usually better practice to break it down into two steps:


raster1 = arcgisscripting.Raster('stg1_05.jpg')
extent1 = raster1.extent

Note the lack of parentheses on extent; this is because it is a property rather than a method.
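The property/method distinction is easy to reproduce in plain Python; this toy class (my illustration, not part of arcpy) mimics how a Raster object exposes extent without parentheses:

```python
class ToyRaster:
    """Minimal stand-in for an arcpy Raster, for illustration only."""
    def __init__(self, filename):
        self.filename = filename
        self._extent = (0.0, 0.0, 100.0, 50.0)  # xmin, ymin, xmax, ymax

    @property
    def extent(self):
        # A property: accessed as raster.extent, with no parentheses.
        return self._extent

raster = ToyRaster('stg1_05.jpg')
print(raster.extent)  # -> (0.0, 0.0, 100.0, 50.0)
# raster.extent() would raise TypeError: 'tuple' object is not callable
```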


qgis - Copying attributes from one polygon layer to another?


I have a problem that I can't seem to get my head around. I have two polygon layers:



  • Polygon A is a subset of Polygon B, with the same fields and geometries identical to those in Polygon B

  • Polygon B has the attribute data I want copied into Polygon A


How can this be done? I tried the QGIS tool "Join Attributes by Location" but as some of the polygons are within others, it tends to link to the first intersect it finds (the outer polygon).




coordinate system - Algorithm for offsetting a latitude/longitude by some amount of meters


I'm looking for an algorithm which when given a latitude and longitude pair and a vector translation in meters in Cartesian coordinates (x,y) would give me a new coordinate. Sort of like a reverse Haversine. I could also work with a distance and a heading transformation, but this would probably be slower and not as accurate. Ideally, the algorithm should be fast as I'm working on an embedded system. Accuracy is not critical, within 10 meters would be good.




Answer



If your displacements aren't too great (less than a few kilometers) and you're not right at the poles, use the quick and dirty estimate that 111,111 meters (111.111 km) in the y direction is 1 degree (of latitude) and 111,111 * cos(latitude) meters in the x direction is 1 degree (of longitude).
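A direct transcription of that rule into Python (the function name and argument order are mine):

```python
import math

def offset_latlon(lat, lon, dx_m, dy_m):
    """Offset a (lat, lon) in degrees by dx_m meters east and dy_m meters north.

    Quick and dirty: 111,111 m per degree of latitude, scaled by cos(lat)
    for longitude. Good to roughly 10 m for displacements under a few km,
    away from the poles.
    """
    new_lat = lat + dy_m / 111111.0
    new_lon = lon + dx_m / (111111.0 * math.cos(math.radians(lat)))
    return new_lat, new_lon

# 1000 m north and 1000 m east of (45 N, 6 E):
print(offset_latlon(45.0, 6.0, 1000.0, 1000.0))
```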


postgresql - How can I optimize pgrouting for speed?


I am using pgrouting on a postgis database created through osm2pgrouting. It performs very well on a limited dataset (3.5k ways; all shortest path A* searches < 20 ms).


However, since importing a bigger bounding box (122k ways) from europe.osm, performance has dropped a lot (a shortest path now costs around 900 ms).


I would think that using A* most of those edges will never be visited as they are out of the way.


What I have done so far in an attempt to improve the speed:



  • Put an index on the geometry column (no noticeable effect)

  • Increased my memory from 8GB to 16GB


  • Change the postgresql memory settings (shared_buffers, effective_cache_size) from (128MB, 128MB) to (1GB, 2GB) (no noticeable effect)


I have a feeling that most of the work is being done in the C Boost library where the graph is built, so optimizing postgresql will not give me much better results. As I make minor changes to the set of rows I select for A* for every search, I am a bit afraid that the Boost library cannot cache my graph and has to rebuild all 122k edges every time (even though it will only use a very limited subset in every query). And I have no idea how much time is spent doing that compared to the actual shortest-path search.


Does any of you use pgrouting on a 122k or greater OSM dataset? What performance should I expect? What settings affect the performance most?



Answer



When faced with tasks like this your primary objective is to be rational. Don't change params based on 'gut feeling'. While the gut seems to work for Hollywood, it does not for us who live in the real world. Well, at least not my gut ;-).


You should:




  1. establish a usable and repeatable metric (like the time required by a pgrouting query)





  2. save metric results in a spreadsheet and average them (discard best and worst). This will tell you if the changes you are making are going in the right direction




  3. monitor your server using top and vmstat (assuming you're on *nix) while queries are running and look for significant patterns: lots of I/O, high CPU, swapping, etc. If the CPU is waiting for I/O, try to improve disk performance (this should be easy, see below). If the CPU is instead at 100% without any significant disk activity, you have to find a way to improve the query (this is probably going to be harder).
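Steps 1 and 2 (a repeatable metric, averaged with the best and worst runs discarded) can be sketched with a small harness; `run_query` below is a placeholder for whatever executes your pgrouting query:

```python
import time
import statistics

def benchmark(run_query, runs=7):
    """Time run_query() several times; drop the best and worst, average the rest."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        run_query()
        samples.append(time.perf_counter() - start)
    samples.sort()
    trimmed = samples[1:-1]  # discard best and worst
    return statistics.mean(trimmed)

# Example with a dummy workload standing in for the routing query:
print('%.4f s' % benchmark(lambda: sum(range(10000))))
```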




For the sake of simplicity I assume network is not playing any significant role here.


Improving database performance



Upgrade to the latest Postgres version. Version 9 is so much better than previous versions. It is free, so you have no reason not to.


Read the book I recommended already here.


You really should read it. I believe the relevant chapters for this case are 5,6,10,11


Improving disk performance




  1. Get an SSD drive and put the whole database on it. Read performance will most likely quadruple, and write performance should also radically improve




  2. assign more memory to postgres. Ideally you should be able to assign enough memory so that the whole database (or at least its hottest part) can be cached in memory, but not so much that swapping occurs. Swapping is very bad. This is covered in the book cited in the previous paragraph




  3. disable atime on all the disks (add the noatime option to fstab)




Improving query performance


Use the tools described in the book cited above to trace your query/ies and find spots that are worth optimizing.


Update


After the comments I have looked at the source code for the stored procedure


https://github.com/pgRouting/pgrouting/blob/master/core/src/astar.c



and it seems that once the query has been tuned there is not much more room for improvement as the algorithm runs completely in memory (and, unfortunately on only one cpu). I'm afraid your only solution is to find a better/faster algorithm or one that can run multithreaded and then integrate it with postgres either by creating a library like pgrouting or using some middleware to retrieve the data (and cache it, maybe) and feed it to the algorithm.


HTH


arcobjects - Algorithm to find out points of inflection for a polyline


I am trying to find out the points of inflection, i.e. points where the curves in a line start and end. If you look at the image, the green line may be a road or a stream, and the black points are the points where the curves start and end.


What would be the high level steps to automate the generation of these points? I have ArcGIS desktop and am quite handy with ArcObjects.



Answer



When the curve is composed of line segments, then all interior points of those segments are inflection points, which is not interesting. Instead, the curve should be thought of as being approximated by the vertices of those segments. By splining a piecewise twice-differentiable curve through those segments, we can then compute the curvature. An inflection point, strictly speaking, is then a place where the curvature is zero.


In the example there are longish stretches where the curvature is nearly zero. This suggests that the indicated points ought to approximate the ends of such stretches of low-curvature regions.



An effective algorithm will therefore spline the vertices, compute the curvature along a dense set of intermediate points, identify ranges of near-zero curvature (using some reasonable estimate of what it means to be "near"), and mark the endpoints of those ranges.


Here is working R code to illustrate these ideas. Let's begin with a line string expressed as a sequence of coordinates:


xy <- matrix(c(5,20, 3,18, 2,19, 1.5,16, 5.5,9, 4.5,8, 3.5,12, 2.5,11, 3.5,3, 
2,3, 2,6, 0,6, 2.5,-4, 4,-5, 6.5,-2, 7.5,-2.5, 7.7,-3.5, 6.5,-8), ncol=2, byrow=TRUE)

Spline the x and y coordinates separately to achieve a parametrization of the curve. (The parameter will be called time.)


n <- dim(xy)[1]
fx <- splinefun(1:n, xy[,1], method="natural")
fy <- splinefun(1:n, xy[,2], method="natural")


Interpolate the splines for plotting and computation:


time <- seq(1,n,length.out=511)
uv <- sapply(time, function(t) c(fx(t), fy(t)))

We need a function to compute the curvature of a parameterized curve. It needs to estimate the first and second derivatives of the spline. With many splines (such as cubic splines) this is an easy algebraic calculation. R provides the first three derivatives automatically. (In other environments, one might want to compute the derivatives numerically.)


curvature <- function(t, fx, fy) {
  # t is an argument to spline functions fx and fy.
  xp <- fx(t, 1); yp <- fy(t, 1)    # First derivatives
  xpp <- fx(t, 2); ypp <- fy(t, 2)  # Second derivatives
  v <- sqrt(xp^2 + yp^2)            # Speed
  (xp*ypp - yp*xpp) / v^3           # (Signed) curvature
  # (Left turns have positive curvature; right turns, negative.)
}

kappa <- abs(curvature(time, fx, fy)) # Absolute curvature of the data

I propose to estimate a threshold for zero curvature in terms of the extent of the curve. This at least is a good starting point; it ought to be adjusted according to the tortuosity of the curve (that is, increased for longer curves). This will later be used for coloring the plots according to curvature.


curvature.zero <- 2*pi / max(range(xy[,1]), range(xy[,2])) # A small threshold
i.col <- 1 + floor(127 * curvature.zero/(curvature.zero + kappa))
palette(terrain.colors(max(i.col))) # Colors


Now that the vertices have been splined and the curvature computed, it remains only to find the inflection points. To show them we may plot the vertices, plot the spline, and mark the inflection points on it.


plot(xy, asp=1, xlab="x",ylab="y", type="n")
tmp <- sapply(2:length(kappa), function(i) lines(rbind(uv[,i-1],uv[,i]), lwd=2, col=i.col[i]))
points(t(sapply(time[diff(kappa < curvature.zero/2) != 0],
function(t) c(fx(t), fy(t)))), pch=19, col="Black")
points(xy)

Plot


The open points are the original vertices in xy and the black points are the inflection points automatically identified with this algorithm. Because curvature cannot reliably be computed at the endpoints of the curve, those points are not specially marked.
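The signed-curvature formula at the heart of the R code is easy to sanity-check in any language: for a parametric curve (x(t), y(t)) it is kappa = (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2). A quick Python sketch of the same formula:

```python
import math

def signed_curvature(xp, yp, xpp, ypp):
    """Signed curvature from the first and second derivatives of a parametric curve."""
    speed = math.hypot(xp, yp)
    return (xp * ypp - yp * xpp) / speed ** 3

# Sanity check on a circle of radius r parameterized by t:
# x = r*cos(t), y = r*sin(t)  ->  curvature should be 1/r everywhere.
r, t = 2.0, 0.7
k = signed_curvature(-r * math.sin(t), r * math.cos(t),
                     -r * math.cos(t), -r * math.sin(t))
print(k)  # ~0.5, i.e. 1/r
```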



Converting/transforming coordinates from ETRS89 to WGS84 UTM 36N in QGIS


I am a student and just getting to know QGIS. At the moment I have to combine data which had been measured in ETRS89 with data in WGS84 UTM 36N. I created several vector layers for the data in ETRS89. Then I tried to transform the data into WGS84 by using "on the fly transformation" but it seemed to me that there was no change visible. The coordinates for one of our fixpoints are in WGS84 279000.000/4207000.000/1616.875 but QGIS shows me the coordinates 542000.00/4206000.00 (the height I can't even see).


How can I transform the coordinates?


I don't mean a change in projection I want to have a proper translation/conversion of the coordinates.


The more western points are the correct ones in OpenStreetMap. I think we still have not solved the problem. Maybe I should explain it a little better... I have got some data from Turkish students. It contains measurement points which had been acquired with a total station. We also measured with a total station in the same area, but we measured in WGS84 UTM 36N, relying on fixpoints which had been set with GPS.


If you now look at the data, there has to be a difference because the coordinates of our Turkish colleagues begin with 542.../420... and ours 279../420... and also the height is divergent. But I'm not quite sure whether the information of the Turkish student was correct that it is ETRS89.


I really don't know how I can bring this data together.




Tuesday 28 March 2017

How to place all unique values into a lookup array in JavaScript for a Leaflet map?


I've been working through a problem with others on 1) grabbing id's from a GeoJSON, 2) placing those id's into a dropdown menu on a Leaflet map, 3) and enabling functionality to allow the user to choose an id, after which the map will filter and zoom to the chosen location.


We've got the control,


L.FeatureSelect = L.Control.extend({
  options: {
    position: 'topright',
    title: 'Name, Clear',
    lookupProperty: 'name',
    lookupInFeatureProperties: true
  },
  initialize: function (features, options) {
    this.featureCollection = features;
    L.Util.setOptions(this, options);
  },

  onAdd: function (map) {
    this.div = L.DomUtil.create('div', 'leaflet-featureselect-container');
    this.select = L.DomUtil.create('select', 'leaflet-featureselect', this.div);
    var content = '';
    if (this.options.title.length > 0) {
      content += '<option>' + this.options.title + '</option>';
    }
    if (this.options.lookupInFeatureProperties) {
      for (var i = 0; i < this.featureCollection.features.length; i++) {
        content += '<option>' + this.featureCollection.features[i].properties[this.options.lookupProperty] + '</option>';
      }
    } else {
      for (var i = 0; i < this.featureCollection.features.length; i++) {
        content += '<option>' + this.featureCollection.features[i][this.options.lookupProperty] + '</option>';
      }
    }
    this.select.innerHTML = content;
    this.select.onmousedown = L.DomEvent.stopPropagation;
    return this.div;
  },
  on: function (type, handler) {
    if (type == 'change') {
      this.onChange = handler;
      L.DomEvent.addListener(this.select, 'change', this._onChange, this);
    } else {
      console.log('FeatureSelect - cannot handle ' + type + ' events.');
    }
  },
  _onChange: function (e) {
    var selectedItemKey = this.select.options[this.select.selectedIndex].value;
    if (this.options.lookupInFeatureProperties) {
      for (var i = 0; i < this.featureCollection.features.length; i++) {
        if (this.featureCollection.features[i].properties[this.options.lookupProperty] == selectedItemKey) {
          e.feature = this.featureCollection.features[i];
          break;
        }
      }
    } else {
      for (var i = 0; i < this.featureCollection.features.length; i++) {
        if (this.featureCollection.features[i][this.options.lookupProperty] == selectedItemKey) {
          e.feature = this.featureCollection.features[i];
          break;
        }
      }
    }
    this.onChange(e);
  }
});


L.featureSelect = function (features, options) {
  return new L.FeatureSelect(features, options);
};

And the map, which clears content when choosing the title (Name, Clear) using data from this location.


var baseLayer = L.tileLayer('http://{s}.basemaps.cartocdn.com/light_all/{z}/{x}/{y}.png', {attribution: 'Tiles © CartoDB'});
var map = L.map("map", {layers: [baseLayer], center: [-23.88, -62.75], zoom: 4});

$.getJSON("world.geo.json")
  .done(function (data) {
    var select = L.featureSelect(data);
    select.addTo(map);
    select.on('change', function (e) {
      if (e.feature === undefined) {
        return;
      }
      var feature = L.geoJson(e.feature);
      if (this.previousFeature != null) {
        map.removeLayer(this.previousFeature);
      }
      this.previousFeature = feature;
      map.addLayer(feature);
      map.fitBounds(feature.getBounds());
    });
  });

The problem is that I'd like to feed the script a GeoJSON with many non-unique lookupProperty values, e.g. a polygon layer with many Brazil or many Canada values. Currently, the script returns all values under the lookupProperty, meaning many duplicates end up in the dropdown menu.


How can I collapse the non-unique ids into an array of unique values for the dropdown menu?


From my understanding, if the dropdown menu contains each id only once, then when a user selects an id, all features with that property value will be filtered and displayed on the map, which is what we want.




Answer



First it has to be noted that the control mentioned in the question is a modification of the Leaflet.CountrySelect plugin, written by Anika S. Halota.


To achieve unique lookup keys in menu, the following changes were made to control:



  1. At load time (the onAdd method), all lookup keys are stored in an internal array, where they are sorted and made unique.

  2. An internal method _sortUnique was added to get sorted and unique lookup keys.

  3. When features are selected to be displayed (the _onChange method), a loop goes through all the features to collect the possibly multiple features that correspond to the selected lookup value.


L.FeatureSelect = L.Control.extend({
  options: {
    position: 'topright',
    title: 'Name, Clear',
    lookupProperty: 'id',
    lookupInFeatureProperties: false
  },
  initialize: function (features, options) {
    this.featureCollection = features;
    L.Util.setOptions(this, options);
  },
  onAdd: function (map) {
    this.div = L.DomUtil.create('div', 'leaflet-featureselect-container');
    this.select = L.DomUtil.create('select', 'leaflet-featureselect', this.div);
    var content = '';
    this.lookupArray = [];
    if (this.options.title.length > 0) {
      content += '<option>' + this.options.title + '</option>';
    }
    if (this.options.lookupInFeatureProperties) {
      for (var i = 0; i < this.featureCollection.features.length; i++) {
        this.lookupArray.push(this.featureCollection.features[i].properties[this.options.lookupProperty]);
      }
    } else {
      for (var i = 0; i < this.featureCollection.features.length; i++) {
        this.lookupArray.push(this.featureCollection.features[i][this.options.lookupProperty]);
      }
    }
    this.lookupArray = this._sortUnique(this.lookupArray);
    for (var i = 0; i < this.lookupArray.length; i++) {
      content += '<option>' + this.lookupArray[i] + '</option>';
    }
    this.select.innerHTML = content;
    this.select.onmousedown = L.DomEvent.stopPropagation;
    return this.div;
  },
  on: function (type, handler) {
    if (type == 'change') {
      this.onChange = handler;
      L.DomEvent.addListener(this.select, 'change', this._onChange, this);
    } else {
      console.log('FeatureSelect - cannot handle ' + type + ' events.');
    }
  },
  _onChange: function (e) {
    e.features = [];
    var selectedItemKey = this.select.options[this.select.selectedIndex].value;
    if (this.options.lookupInFeatureProperties) {
      for (var i = 0; i < this.featureCollection.features.length; i++) {
        if (this.featureCollection.features[i].properties[this.options.lookupProperty] == selectedItemKey) {
          e.features.push(this.featureCollection.features[i]);
        }
      }
    } else {
      for (var i = 0; i < this.featureCollection.features.length; i++) {
        if (this.featureCollection.features[i][this.options.lookupProperty] == selectedItemKey) {
          e.features.push(this.featureCollection.features[i]);
        }
      }
    }
    this.onChange(e);
  },
  _sortUnique: function (array) {
    array.sort();
    var last_i;
    for (var i = 0; i < array.length; i++) {
      if ((last_i = array.lastIndexOf(array[i])) !== i) {
        array.splice(i + 1, last_i - i);
      }
    }
    return array;
  }
});

L.featureSelect = function (features, options) {
  return new L.FeatureSelect(features, options);
};
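The _sortUnique helper is doing, in JavaScript, what dedupe-and-sort does in one line of Python; a sketch of the same idea, for comparison:

```python
def sort_unique(values):
    """Return the distinct values, sorted - the same job as _sortUnique above."""
    return sorted(set(values))

countries = ['Canada', 'Brazil', 'Canada', 'Argentina', 'Brazil']
print(sort_unique(countries))  # -> ['Argentina', 'Brazil', 'Canada']
```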

Since multiple selected features are now possible, adding and removing is done through L.featureGroup:


$.getJSON("world.geo.json")
  .done(function (data) {
    var select = L.featureSelect(data);
    select.addTo(map);
    select.on('change', function (e) {
      if (this.previousFeatures != null) {
        map.removeLayer(this.previousFeatures);
      }
      if (e.features === undefined) {
        return;
      }
      this.previousFeatures = L.featureGroup();
      for (var i = 0; i < e.features.length; i++) {
        this.previousFeatures.addLayer(L.geoJson(e.features[i]));
      }
      map.addLayer(this.previousFeatures);
      map.fitBounds(this.previousFeatures.getBounds());
    });
  });

Add PostGIS layer to QGIS via Python Console


How can I add a PostGIS layer to QGIS via the Python Console?



With some server setups, especially with a large number of tables, it can be very slow to load the list of layers via the Browser or Data Source Manager.



Answer



Here are code snippets to add a layer.


I put table name in a variable since it's used twice and the geometry column because that's what I most often change.


QGIS 3.x


tablename = "thetable"
geometrycol = "geom"

from qgis.core import QgsVectorLayer, QgsDataSourceUri, QgsProject

uri = QgsDataSourceUri()
uri.setConnection("host", "5432", "db", "user", "pass")
uri.setDataSource("schema", tablename, geometrycol)
vlayer = QgsVectorLayer(uri.uri(False), tablename, "postgres")
QgsProject.instance().addMapLayer(vlayer)

QGIS 2.x


tablename = "tmo_njcov2_60mbin_clean_goodbad_201801"
geometrycol = "geom"

from qgis.core import QgsVectorLayer, QgsDataSourceURI, QgsMapLayerRegistry

uri = QgsDataSourceURI()
uri.setConnection("host", "5432", "db", "user", "pass")
uri.setDataSource("schema", tablename, geometrycol)
vlayer = QgsVectorLayer(uri.uri(False), tablename, "postgres")
QgsMapLayerRegistry.instance().addMapLayer(vlayer)

Leaflet: popup showing information on the polygon



I have a Leaflet map. To the map I load two JavaScript files (A.js & B.js) which both consist of only polygons. Now my aim is, that when I click inside Leaflet on one of the polygons, that I get a popup showing the area of that polygon. The area is saved in the JavaScript file under Shape_Area.


"type": "Feature", 
"properties": { "OBJECTID": 1, "Id": 1, "gridcode": 1, "Shape_Leng": 1805.0000000099999, "Shape_Area": 8781.24999991, "LandUse": "Urban" }


My code until now is:


function onEachFeature(feature, layer) {
if (feature.properties && feature.properties.NAME) {
layer.bindPopup(feature.properties.NAME);
}
}
var LayerA = L.geoJSON(UF, {style: style, onEachFeature: function (feature, layer) {layer.bindPopup(feature.properties.type);}}).addTo(mymap);
LayerA.eachLayer(function (layer) {layer.bindPopup(UF.Shape_Area);});


I already get my popup, but I do not know how exactly to get the Shape_Area from my JavaScript file.


Anybody has a hint for that please?



Answer



I believe you are just asking how to add more attributes to the popup? My example file has a Team field, a League field, and a Website field, so I created a variable called popupContent to display all the attributes, with some HTML to help with the formatting. You could do something like this: feature.properties.Shape_Area + " Sq Meters"


function forEachFeature(feature, layer) {

    var popupContent = "The " +
        feature.properties.Team + " play here,<br>" +
        "They are in the " +
        feature.properties.League + "<br>" +
        '<a href="' + feature.properties.Website + '">Website</a>';

    if (feature.properties && feature.properties.popupContent) {
        popupContent += feature.properties.popupContent;
    }
    layer.bindPopup(popupContent);
};

Alright, the feature.properties comes from your GeoJSON file. Here is a working example. Right click on this link and save it as usa.json (it's a polygon GeoJSON file): http://www.gistechsolutions.com/leaflet/DEMO/Search/usa.json. Put it in a web folder. Next, copy/paste the following code; it will read the usa.json file and create a web map. It has popups and even highlighting on click. It's very basic, and the code should help you learn it.






Example















arcpy - What would cause a geoprocessing service to say "Cannot Open 'FeatureClass' Failed to Execute"?



I have a really basic geoprocessing script that's published to ArcGIS Server 10.2.2.


It basically just reads in a json object that's passed in as a parameter. The json just has attribute names and values, as well as a geometry in wkid 4326.


All it does is open an InsertCursor to create features in one specific feature class.


Never mind asking me, "All you want to do is create a bunch of features, why don't you do this in the REST endpoint?" Long story short, I tried several times using the JSAPI, and even building my own geometries. In both cases, I got "d is not applied."


Anyway. This geoprocessing script does work. But sometimes I check the logs and see "SEVERE: 'Cannot open SCHEMA.FEATURECLASS'. Failed to execute (Name of GP service). Failed to execute (Name of GP service)."


What would cause this? I noticed that the times when it worked were very early in the day (5 AM), and then someone tried to use it around 10 AM and the script failed. From my DBMS knowledge, these are the only reasons I can think of why someone couldn't open a feature class for inserting:



  • Someone has a schema lock (I find this less likely, since you usually can't get a schema lock on anything while ArcGIS Server is running)

  • A database transaction is in progress (I don't see any evidence of this, since the only other system that loads data into this database shows no record of any loading happening around the time of the error log)

  • Does arcpy.da.InsertCursor fail to open if there's another InsertCursor open simultaneously? (I hope not...)



Other info:



  • This is not a versioned feature class

  • The arcpy.env.workspace is set to an sde file that connects as the geodatabase admin. This connection is set to use the default version


  • Here are the properties on the service:


    # CODE SNIPPET - This is 90% of the business logic here
    loadCursor = arcpy.da.InsertCursor(Feature_Name, fields)
    oids = []

    for feature in FeaturesToAdd:
        geom = feature["geometry"]
        point = (geom["x"], geom["y"])
        attributes = feature["attributes"].values()
        attributes.append(point)
        row = tuple(attributes)
        print len(row)
        oid = loadCursor.insertRow(row)
        oids.append(oid)

    del loadCursor

    print oids


What else would cause arcpy not to be able to open a feature class for inserting?


Here's the basic code snippet for doing that:


loadCursor = arcpy.da.InsertCursor('SCHEMA.FEATURECLASS', ['field1', 'field2', 'field3', 'shape@XY'])


There's nothing else that opens any cursors, lists fields, queries the db, etc. This is the only line (not including loadCursor.insertRow()).



Answer




does arcpy.da.insertcursor fail to open if there's another insertcursor open simultaneously? (I hope not...)





Yes it will fail, as this guarantees database integrity. From the ESRI guide on Cursors and Locking:





Insert and update cursors honor table locks set by ArcGIS applications. Locks prevent multiple processes from changing the same table at the same time.


Update and insert cursors cannot be created for a table or feature class if an exclusive lock exists for that dataset. The UpdateCursor or InsertCursor functions fail because of an exclusive lock on the dataset. If these functions successfully create a cursor, they apply an exclusive lock on the dataset so that two scripts cannot create an update or insert cursor on the same dataset.



A cursor can be released by one of the following:




  • Including the cursor inside a with statement, which will guarantee the release of locks regardless of whether or not the cursor is successfully completed




  • Calling reset() on the cursor




  • The completion of the cursor

  • Explicitly deleting the cursor using Python's del statement
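The with-statement guarantee is plain Python context management; this toy lock (my illustration, not arcpy) shows why the exclusive lock is released even when an error interrupts the loading loop:

```python
class ToyDatasetLock:
    """Illustrative stand-in for the exclusive lock a cursor takes."""
    def __init__(self):
        self.locked = False

    def __enter__(self):
        self.locked = True   # acquire the "exclusive lock"
        return self

    def __exit__(self, exc_type, exc, tb):
        self.locked = False  # released on success *and* on exception
        return False         # don't swallow the exception

lock = ToyDatasetLock()
try:
    with lock:
        raise RuntimeError('row failed to load')
except RuntimeError:
    pass

print(lock.locked)  # -> False: the lock was released despite the error
```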



Monday 27 March 2017

geoserver - Dynamic Symbolizers w/ CQL not URL encoded


I am trying to use Dynamic Symbolizers in my SLD:





<sld:NamedLayer>
  <sld:Name>cite:test</sld:Name>
  <sld:UserStyle>
    <sld:Title>Test</sld:Title>
    <sld:FeatureTypeStyle>
      <sld:Rule>
        <sld:PointSymbolizer>
          <sld:Graphic>
            <sld:ExternalGraphic>
              <sld:OnlineResource xlink:type="simple" xlink:href="http://localhost/icons/${some_type}"/>
              <sld:Format>image/png</sld:Format>
            </sld:ExternalGraphic>
            <sld:Size>35</sld:Size>
          </sld:Graphic>
        </sld:PointSymbolizer>
      </sld:Rule>
    </sld:FeatureTypeStyle>
  </sld:UserStyle>
</sld:NamedLayer>









Now, if the value of 'some_type' has characters in it that need to be URL encoded, the URL produced after parsing the CQL is invalid.


An easy example is if the value has a space in it i.e. 'Some Value'. Which produces the URL http://localhost/icons/Some Value, and it should probably produce: http://localhost/icons/Some%20Value.


Am I using CQL correctly in this instance? I looked through the provided functions here:



https://github.com/geotools/geotools/tree/master/modules/library/main/src/main/java/org/geotools/filter/function


and did not find one specific to URL encoding. That being said, it's not a huge deal to write a filter function to do this. However, if you can use CQL in the href of an OnlineResource, the core CQL parsing logic should probably handle this.


Is there anything else I might be missing? Crossing my fingers ;)




QGis Making Column Value of Attribute table unique


For a shapefile, I have a field named id with the following contents in it:


1
1

2
2

I want to make these values unique such as:


1
2
3
4

Is there any way to achieve this in Quantum GIS (using any plugin or other means) ?




Answer



You can use the Field Calculator in the Attribute table to assign the features unique values by checking "Update existing field", selecting the appropriate field, and entering $rownum in the Expression box. This will assign each feature its row number as its ID.


How do I find vector line bearing in QGIS or GRASS?


How can one extract the bearings from a set of lines in a vector file?


I'm trying to find the bearings of several straight vector line segments (simplified contour lines, actually) and can't figure out how to do it. The QGIS measure tool can tell me individual line bearing if I use a N/S grid overlay to provide a 0º reference, but my mouse precision is probably questionable, and doing that for every line in our sample is going to take longer than I can devote to this project.


Even if there is just a way to automatically pair line end-point nodes, get the XY data, and then let excel calculate the bearings for me, I would love to know.


I really like using QGIS, but am also OK with GRASS (though shaky on text interface), SAGA, and ArcGIS (but only if I really have no other choice), so if any of the programs have a clever way to do this, let me know.


-Nick



Answer




  1. With PyQGIS in the Python console, see How to add Direction and Distance to attribute table? for the azimuths of the segments of a line (with the azimuth functions of Points: point1.azimuth(point2))

  2. but you can use many other Python modules like Shapely and Fiona without using QGIS; see Python: traitement des couches vectorielles dans une perspective géologique, lecture et enregistrement des couches sous forme de dictionnaires avec le module Fiona (in French; roughly, "Python: processing vector layers from a geological perspective, reading and saving layers as dictionaries with the Fiona module")



from shapely.geometry import Point, LineString, mapping
import fiona
import math

def azimuth(point1, point2):
    '''azimuth between 2 shapely points (interval 0 - 180°, for my work)'''
    angle = math.atan2(point2.x - point1.x, point2.y - point1.y)
    return math.degrees(angle) if angle > 0 else math.degrees(angle) + 180


def pair(list):
    '''Iterate over pairs in a list'''
    for i in range(1, len(list)):
        yield list[i-1], list[i]

with fiona.collection('testline.shp', 'r') as input:
    # copy of the input schema
    schema = input.schema.copy()
    # creation of a new field for storing azimuth
    schema['properties']['azimuth'] = 'int'

    # copy of the original shapefile with the field azimuth to a new shapefile
    with fiona.collection('testline_azim.shp', 'w', 'ESRI Shapefile', schema) as output:
        for line in input:
            # use of pair function to extract the line segments
            for seg_start, seg_end in pair(line['geometry']['coordinates']):
                line_start = Point(seg_start)
                line_end = Point(seg_end)
                segment = LineString([line_start.coords[0], line_end.coords[0]])
                elem = {}
                elem['properties'] = line['properties']
                elem['properties']['azimuth'] = azimuth(line_start, line_end)
                elem['geometry'] = mapping(segment)
                output.write(elem)
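The azimuth convention above (x difference first in atan2, so angles are measured clockwise from north) extends naturally to the full 0-360° range; a small standalone sketch:

```python
import math

def azimuth_deg(x1, y1, x2, y2):
    """Azimuth from (x1, y1) to (x2, y2), clockwise from north, in [0, 360)."""
    angle = math.degrees(math.atan2(x2 - x1, y2 - y1))
    return angle % 360

print(azimuth_deg(0, 0, 0, 1))   # -> 0.0   (due north)
print(azimuth_deg(0, 0, 1, 0))   # -> 90.0  (due east)
print(azimuth_deg(0, 0, -1, 0))  # -> 270.0 (due west)
```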

Result (image from Portail SIG)


With GRASS:



  1. split the polyline into individual segments with v.split vertices=1

  2. upload the azimuth of each segment into the attribute table with v.to.db option=azimuth



see Angle between segments of a polyline or azimuth of lines with v.to.db?


Update


In the Python console of QGIS, with only PyQGIS:


from PyQt4.QtCore import *
import math

def select_all(layer):
    layer.select([])
    layer.setSelectedFeatures([obj.id() for obj in layer])

# theoretical azimuth function, but you can use `point1.azimuth(point2)`, see below
def azimut(point1, point2):
    '''azimuth between 2 QGIS points -> must be adapted to 0-360°'''
    angle = math.atan2(point2.x() - point1.x(), point2.y() - point1.y())
    return math.degrees(angle)

def pair(list):
    '''Iterate over pairs in a list'''
    for i in range(1, len(list)):
        yield list[i-1], list[i]

mylayer = qgis.utils.iface.activeLayer()
# select all the elements of the layer
select_all(mylayer)

# memory layer for the resulting segments
v_layer = QgsVectorLayer("LineString", "azimuth_lines", "memory")
pr = v_layer.dataProvider()
pr.addAttributes([QgsField("azimuth", QVariant.Int), QgsField("az_pt1pt2", QVariant.Int), QgsField("az_pt2-1", QVariant.Int)])

for elem in mylayer.selectedFeatures():
    line = elem.geometry().asPolyline()
    for seg_start, seg_end in pair(line):
        line_start = QgsPoint(seg_start)
        line_end = QgsPoint(seg_end)
        seg = QgsFeature()
        seg.addAttribute(0, QVariant(int(azimut(line_start, line_end))))
        # with the built-in functions of PyQGIS
        seg.addAttribute(1, QVariant(int(line_start.azimuth(line_end))))
        seg.addAttribute(2, QVariant(int(line_end.azimuth(line_start))))
        seg.setGeometry(QgsGeometry.fromPolyline([line_start, line_end]))
        pr.addFeatures([seg])

v_layer.updateExtents()
v_layer.updateFieldMap()

QgsMapLayerRegistry.instance().addMapLayers([v_layer])

Result:


and 360 - 159 = 201, 201 - 180 = 21
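The relationship between the forward and back azimuth of a segment (they differ by 180°, modulo 360) can be checked in plain Python without PyQGIS; the function name and sample points below are mine:

```python
import math

def azimuth_360(p1, p2):
    """Azimuth between two (x, y) tuples, normalised to the 0-360 degree range."""
    return math.degrees(math.atan2(p2[0] - p1[0], p2[1] - p1[1])) % 360

p1, p2 = (0.0, 0.0), (1.0, -1.0)   # a segment pointing south-east
forward = azimuth_360(p1, p2)       # ~135
back = azimuth_360(p2, p1)          # ~315
# the back azimuth is the forward azimuth shifted by 180 degrees (mod 360)
```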


python - Using for loop to execute query in ArcPy?


I'm trying to provide a breakdown of fire detections for an area, sorted by confidence level. The dataset has 12 files (one for each month of a year) with fire records, which all have a column called "confidence" with a value between 0 and 100. I have to use a for loop to query the files by 3 different confidence levels:

Group 1: >85% confidence
Group 2: 60-85% confidence
Group 3: <60% confidence


I HAVE TO USE a for loop to execute this. Any ideas? I started by appending all the files into one list. I'm told that I could use lists to set up lower and upper ranges with SelectLayerByAttribute_management, and use that in a loop, but I don't know how to do that.


This is my final code that works. I decided to merge the files together into one big file.


import arcpy

arcpy.env.workspace = "***********/data/"
newpath = arcpy.env.workspace

list1 = []

for month in range(1, 13):
    inf = "2007" + str(month).zfill(2) + "_rfe"
    arcpy.DefineProjection_management(newpath + inf + ".shp", 4326)
    list1.append(newpath + inf + ".shp")

out1 = "big2007" + "_rfe" + ".shp"
layer1 = "layer1" + ".shp"
arcpy.Merge_management(list1, out1)
arcpy.MakeFeatureLayer_management(out1, layer1)

lower = [0, 60, 86]
upper = [60, 86, 101]

for i in range(len(lower)):
    arcpy.SelectLayerByAttribute_management(layer1, "NEW_SELECTION", """ "CONF" >= {} AND "CONF" < {}""".format(lower[i], upper[i]))
    # GetCount returns a string, so convert before using it as a number
    result = int(arcpy.GetCount_management(layer1).getOutput(0))
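The binning logic in that final loop (lower bound inclusive, upper bound exclusive) can be verified without ArcPy; this sketch applies the same lower/upper lists to some made-up confidence values:

```python
lower = [0, 60, 86]
upper = [60, 86, 101]

confidences = [10, 55, 60, 85, 86, 99, 100]   # made-up sample values, not real data

counts = []
for lo, hi in zip(lower, upper):
    # lower bound inclusive, upper bound exclusive, matching the where clause
    counts.append(sum(1 for c in confidences if lo <= c < hi))
# counts -> [2, 2, 3]: two records below 60, two in 60-85, three at 86 or above
```

Because the upper bound of one bin equals the lower bound of the next, every value lands in exactly one bin.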

Answer



I assume that your files are in the same directory. I don't know why you have to use a for loop, as it adds unnecessary complexity, but here is an example where you create two lists (one with the lower bounds and one with the upper bounds) and insert them into the where clause when creating each layer. You could also create a single layer without a where clause and use the same where clauses to select by attribute within that layer.


import arcpy

minvals = [85, 60, -1]
maxvals = [100, 85, 60]

fcs = arcpy.ListFeatureClasses()
for fc in fcs:
    for i in range(len(minvals)):
        arcpy.MakeFeatureLayer_management(fc, fc[:-4] + str(i + 1),
            """ "confidence_field_name" > {} AND "confidence_field_name" <= {}""".format(minvals[i], maxvals[i]))
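The where-clause construction is the only non-ArcPy part and is easy to test on its own. This sketch builds the three clauses from the same lists ("confidence_field_name" is a placeholder, as in the example above):

```python
minvals = [85, 60, -1]
maxvals = [100, 85, 60]
field = "confidence_field_name"   # placeholder field name, as in the answer

# one clause per (min, max) pair: strictly above min, at or below max
clauses = ['"{0}" > {1} AND "{0}" <= {2}'.format(field, mn, mx)
           for mn, mx in zip(minvals, maxvals)]
# clauses[0] == '"confidence_field_name" > 85 AND "confidence_field_name" <= 100'
```

Using -1 as the last lower bound makes the final clause catch every value from 0 up to 60 inclusive.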

python - Cause of ArcGIS Raster Calculator Parsing error : invalid syntax?


This is my first attempt at much code in the raster calculator of ArcMap 10.0, but I think I'm close.


I have nine rasters (A-I below) and want a new raster with value = 1 where any of the rasters have field 1 >= 1 and <= 366 and field 2 not equal to 5.


Here's what I've got:


Con(([A].1|[B].1|[C].1|[D].1|[E].1|[F].1|[G].1|[H].1|[I].1 >= 1  &  <= 366) & ([A].2|[B].2|[C].2|[D].2|[E].2|[F].2|[G].2|[H].2|[I].2 != 5) , 1,0)

I get the little red circle with an X in the top left of the raster calculator window and an error message:



Parsing error : invalid syntax (line 1).




Anyone know what I've got wrong?
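This doesn't resolve the Raster Calculator syntax itself, but the intended cell-wise logic can be prototyped in plain Python to check it. The sketch below uses toy 2x2 data for just two of the nine rasters; note that each comparison names its operand explicitly, whereas the expression above chains `>= 1 & <= 366` without one, which is one likely source of the parsing error:

```python
# toy 2x2 "rasters" as nested lists; field-1 and field-2 values for two of the
# nine rasters (made-up data)
a1 = [[0, 100], [400, 200]]
a2 = [[5, 5], [3, 3]]
b1 = [[50, 0], [0, 0]]
b2 = [[2, 5], [5, 5]]

out = []
for i in range(2):
    row = []
    for j in range(2):
        # each comparison spells out its operand; `>= 1 & <= 366` alone is invalid
        in_range = (1 <= a1[i][j] <= 366) or (1 <= b1[i][j] <= 366)
        not_five = (a2[i][j] != 5) or (b2[i][j] != 5)
        row.append(1 if in_range and not_five else 0)
    out.append(row)
# out == [[1, 0], [0, 1]]
```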




How do I import OS raster tiles into QGIS at the correct scale?


I have some tiles of OS raster mapping and their associated TFW georeferencing files. The georeferencing files are in the same folder as the raster data and have the same names, e.g. TQ21.tif/TQ21.tfw.


In QGIS 1.8, I add a tile as a raster layer and choose the OSGB 1936/British National Grid CRS. When I load the next tile over, it loads in the correct place. Fine so far.


However, the problem is that the scale is completely out: to view a tile I have to set the onscreen scale indicator to around 1:365000000! If I go to Raster > Miscellaneous > Information there is nothing in the Raster Info box, but the fact that the tiles load in the correct place suggests QGIS is reading the georeferencing files.


Am I missing something?



Answer




Make sure the project CRS is set to EPSG:27700 as well; the project CRS defaults to WGS84. You can set it by right-clicking a layer whose CRS is EPSG:27700 and selecting 'Set Project CRS from Layer'. You will probably have to Zoom to Layer after changing these settings. If you work mostly with UK OS data, the best option is to set your default project CRS to 27700 (Settings -> Options -> CRS), which will save you some trouble. I use these data regularly in this way and the scale shows correctly.
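As an aside, the .tfw files that place the tiles are just six numeric lines (pixel sizes, two rotation terms, and the coordinates of the upper-left pixel centre), so they are easy to inspect if a tile loads oddly. A small parser sketch; the dictionary keys and sample values are mine:

```python
def parse_tfw(text):
    """Parse the six numeric lines of a world (.tfw) file."""
    a, d, b, e, c, f = (float(line) for line in text.split())
    return {
        "pixel_size_x": a,   # map units per pixel along x
        "rotation_y": d,     # usually 0
        "rotation_x": b,     # usually 0
        "pixel_size_y": e,   # usually negative: row indices grow downwards
        "origin_x": c,       # x of the centre of the upper-left pixel
        "origin_y": f,       # y of the centre of the upper-left pixel
    }

# made-up sample: 1 m pixels with an arbitrary British National Grid-style origin
sample = "1.0\n0.0\n0.0\n-1.0\n500000.5\n180000.5\n"
tfw = parse_tfw(sample)
```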


cartography - How to obtain overlapping semi-transparent circles effect in ArcGIS for Desktop?


I have been trying to obtain the following markers effect.


How can I do the same in ArcGIS?


This image is from a book.



Answer



My thought is that this was not done with a point layer but with a buffer of a point layer, in order to achieve the transparency. I have been working with a layer to reproduce the effect and have not been successful with any of the other methods listed in these answers.



I now believe they are close, though. Following the instructions to use graduated symbols and transparency never produces the darker outline for me. When I get the transparency set so it looks like your example, the outline doesn't show up as much as in your example.


So perhaps make a copy of the layer placed above its parent, then set its transparency just below the parent's, with no fill and a black outline.


Another method would be to export an AI file to Illustrator, or use Avenza to access the data directly from Illustrator.


How to create a loop for multiple points with cost distance / cost path in ArcGIS 10.2 in python


I would like to know how to simplify the following code so it runs more efficiently than copy-pasting 82 processes (with a loop, I guess?), but I don't know how to do it.


There are 4 main processes: Select analysis, cost distance, cost path, and raster to polylines. The study region is Uganda; there are 82 points and therefore 82 (or 81) routes to be found. In the future, I would like to run this code for other countries as well, with more points. Many thanks in advance for your help!





# Import arcpy module
import arcpy
arcpy.env.workspace = "E:\\HEC\\strategicmap\\strategicmap.gdb"
arcpy.env.overwriteOutput = True

# Check out any necessary licenses
arcpy.CheckOutExtension("spatial")

# Set Geoprocessing environments

arcpy.env.scratchWorkspace = "E:\\HEC\\strategicmap\\scratch.gdb"
arcpy.env.snapRaster = ""
arcpy.env.extent = "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda"
arcpy.env.cellSize = "MAXOF"
arcpy.env.mask = ""

# Local variables:
WorldWGS84WM = "E:\\HEC\\strategicmap\\strategicmap.gdb\\WorldWGS84WM"
PRIOcAfricaWGS84 = "E:\\HEC\\strategicmap\\strategicmap.gdb\\PRIOcAfricaWGS84"
costsurface = "E:\\HEC\\strategicmap\\strategicmap.gdb\\costsurface"

Uganda = "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda"
outputcostdist = "E:\\HEC\\strategicmap\\strategicmap.gdb\\outputcostdist"
outputcostbacklink = "E:\\HEC\\strategicmap\\strategicmap.gdb\\outputcostbacklink"
PRIOUganda = "E:\\HEC\\strategicmap\\strategicmap.gdb\\PRIOUganda"
outputcostpath = "E:\\HEC\\strategicmap\\strategicmap.gdb\\outputcostpath"

#all files for Uganda (82 points)
route1= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route1"
route2= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route2"
route3= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route3"

route4= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route4"
route5= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route5"
route6= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route6"
route7= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route7"
route8= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route8"
route9= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route9"
route10= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route10"
route11= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route11"
route12= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route12"
route13= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route13"

route14= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route14"
route15= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route15"
route16= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route16"
route17= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route17"
route18= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route18"
route19= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route19"
route20= "E:\\HEC\\strategicmap\\strategicmap.gdb\\route20"
##etc...until route82

Uganda1= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda1"

Uganda2= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda2"
Uganda3= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda3"
Uganda4= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda4"
Uganda5= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda5"
Uganda6= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda6"
Uganda7= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda7"
Uganda8= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda8"
Uganda9= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda9"
Uganda10= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda10"
Uganda11= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda11"

Uganda12= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda12"
Uganda13= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda13"
Uganda14= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda14"
Uganda15= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda15"
Uganda16= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda16"
Uganda17= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda17"
Uganda18= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda18"
Uganda19= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda19"
Uganda20= "E:\\HEC\\strategicmap\\strategicmap.gdb\\Uganda20"
##etc... until Uganda82


arcpy.Select_analysis(WorldWGS84WM, Uganda, "\"FIPS10_4\"='UG'")
arcpy.Clip_analysis(PRIOcAfricaWGS84, Uganda, PRIOUganda, "")

arcpy.Select_analysis(PRIOUganda,Uganda1, "\"OBJECTID\" =1")
arcpy.gp.CostPath_sa(PRIOUganda, outputcostdist, outputcostbacklink, outputcostpath, "EACH_CELL", "OBJECTID")
arcpy.RasterToPolyline_conversion(outputcostpath,route1, "ZERO", "0", "SIMPLIFY", "VALUE")
arcpy.gp.CostDistance_sa(Uganda1, costsurface, outputcostdist, "", outputcostbacklink)
arcpy.gp.CostPath_sa(PRIOUganda, outputcostdist, outputcostbacklink, outputcostpath,"EACH_CELL","OBJECTID")
arcpy.RasterToPolyline_conversion(outputcostpath,route1, "ZERO", "0", "SIMPLIFY", "VALUE")


arcpy.Select_analysis(PRIOUganda,Uganda2, "\"OBJECTID\" =2")
arcpy.gp.CostPath_sa(PRIOUganda, outputcostdist, outputcostbacklink, outputcostpath, "EACH_CELL", "OBJECTID")
arcpy.RasterToPolyline_conversion(outputcostpath,route2, "ZERO", "0", "SIMPLIFY", "VALUE")
arcpy.gp.CostDistance_sa(Uganda2, costsurface, outputcostdist, "", outputcostbacklink)
arcpy.gp.CostPath_sa(PRIOUganda, outputcostdist, outputcostbacklink, outputcostpath,"EACH_CELL","OBJECTID")
arcpy.RasterToPolyline_conversion(outputcostpath,route2, "ZERO", "0", "SIMPLIFY", "VALUE")

arcpy.Select_analysis(PRIOUganda,Uganda3, "\"OBJECTID\" =3")
arcpy.gp.CostPath_sa(PRIOUganda, outputcostdist, outputcostbacklink, outputcostpath, "EACH_CELL", "OBJECTID")
arcpy.RasterToPolyline_conversion(outputcostpath,route3, "ZERO", "0", "SIMPLIFY", "VALUE")
arcpy.gp.CostDistance_sa(Uganda3, costsurface, outputcostdist, "", outputcostbacklink)
arcpy.gp.CostPath_sa(PRIOUganda, outputcostdist, outputcostbacklink, outputcostpath,"EACH_CELL","OBJECTID")
arcpy.RasterToPolyline_conversion(outputcostpath,route3, "ZERO", "0", "SIMPLIFY", "VALUE")

##etc... until 82
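The 82 repeated variable definitions and per-point blocks can indeed be generated in a loop over the point IDs. A minimal pure-Python sketch of the naming and query generation; the geoprocessing calls themselves are only indicated in comments, using the same paths and tools as the code above:

```python
gdb = "E:\\HEC\\strategicmap\\strategicmap.gdb"
n_points = 82

jobs = []
for i in range(1, n_points + 1):
    jobs.append({
        "point_fc": "{}\\Uganda{}".format(gdb, i),   # per-point selection output
        "route_fc": "{}\\route{}".format(gdb, i),    # per-point polyline output
        "where": '"OBJECTID" = {}'.format(i),        # selection clause for this point
    })
# each dict would then drive the four calls in order:
# Select_analysis -> CostDistance_sa -> CostPath_sa -> RasterToPolyline_conversion
```

Changing `n_points` (and the geodatabase path) is then all that's needed to reuse the script for another country.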


arcpy - Changing output name when exporting data driven pages to JPG?

Is there a way to save the output JPG, changing the output file name to the page name, instead of page number? I mean changing the script fo...