Thursday, 28 February 2019

Filling polygons with points or polygons using PyQGIS?



I am trying to find a method to fill irregular polygons, as efficiently as possible, with points at a defined spacing, or alternatively with polygons of a defined size (for which I can find the centroid).


I am using QGIS with Python.



Answer



The following PyQGIS code uses a defined spacing of 50 m to generate points inside each feature of a polygon layer.


layer = iface.activeLayer()
feats = [feat for feat in layer.getFeatures()]

points = []

spacing_y = 50
spacing_x = 50

for feat in feats:
    extent = feat.geometry().boundingBox()
    xmin, ymin, xmax, ymax = extent.toRectF().getCoords()

    rows = int((ymax - ymin) / spacing_y)
    cols = int((xmax - xmin) / spacing_x)

    x = xmin
    y = ymax

    geom_feat = feat.geometry()

    for i in range(rows + 1):
        for j in range(cols + 1):
            pt = QgsPoint(x, y)
            tmp_pt = QgsGeometry.fromPoint(pt)
            if tmp_pt.within(geom_feat):
                points.append(tmp_pt.asPoint())
            x += spacing_x
        x = xmin
        y -= spacing_y

epsg = layer.crs().postgisSrid()

# points
uri = "Point?crs=epsg:" + str(epsg) + "&field=id:integer&index=yes"

mem_layer = QgsVectorLayer(uri,
                           'point',
                           'memory')

prov = mem_layer.dataProvider()

feats = [QgsFeature() for i in range(len(points))]

for i, feat in enumerate(feats):
    feat.setAttributes([i])
    feat.setGeometry(QgsGeometry.fromPoint(points[i]))

prov.addFeatures(feats)

QgsMapLayerRegistry.instance().addMapLayer(mem_layer)

After running the code in the Python Console of QGIS with a polygon layer containing two features, I got:


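Stripped of the QGIS API, the grid-walking logic above reduces to a double loop over the bounding box. A plain-Python sketch, with the point-in-polygon test replaced by a hypothetical `contains` callback:

```python
def grid_points(xmin, ymin, xmax, ymax, spacing_x, spacing_y, contains):
    """Walk a regular grid over the bounding box from the top-left corner,
    keeping the points the `contains` callback accepts."""
    points = []
    rows = int((ymax - ymin) / spacing_y)
    cols = int((xmax - xmin) / spacing_x)
    y = ymax
    for _ in range(rows + 1):
        x = xmin
        for _ in range(cols + 1):
            if contains(x, y):
                points.append((x, y))
            x += spacing_x
        y -= spacing_y
    return points

# A 100 x 100 box sampled at 50-unit spacing gives a 3 x 3 grid.
pts = grid_points(0, 0, 100, 100, 50, 50, lambda x, y: True)
```

In the real script the callback is the `QgsGeometry.within()` test, which is what filters the grid down to points inside each polygon.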



google earth engine - "print()" alternative for nodeJS


I need to run this code in my Node.js application:


// Dates of interest
var start = ee.Date("2014-10-01");
var finish = ee.Date("2018-05-01");

///--------------------- Landsat Collection ---------------------------------------///
var landsat = ee.ImageCollection('LANDSAT/LC08/C01/T1_TOA')
    .filterDate(start, finish)
    .filterBounds(roi);

// Year-Month-Day extract function
function ymdList(imgcol){
  var iter_func = function(image, newlist){
    var date = ee.Number.parse(image.date().format("YYYYMMdd"));
    newlist = ee.List(newlist);
    return ee.List(newlist.add(date).sort());
  };
  return imgcol.iterate(iter_func, ee.List([]));
}

var ymd = ymdList(landsat);
print(ee.List(ymd).reduce(ee.Reducer.frequencyHistogram()));

But I do not know which function to use in place of print(); can anyone help me?





arcgis desktop - ERROR 000212: Cannot create XY event source Failed to execute (MakeXYEventLayer)?


I have a CSV file that is 133 MB in size and has over 1.3 million lines of data. Each line of data has its own lat/long, and I was wondering what would be the best way to display those points based on the lat/longs in ArcGIS Desktop 10 and ultimately turn them into a point shapefile?


So far I have tried the "Make XY Event Layer" tool but that keeps failing...


ERROR 000212: Cannot create XY event source Failed to execute (MakeXYEventLayer).


The csv file is properly formatted and the lat/long fields are numeric so I have no idea why it keeps crashing.



Here is what the first 2 lines of my CSV look like, the first line is what should be the header:


"LAT","LONG","CUSTOMER_MASTER_ID","STORE_NBR","TRANSACTION_DT","SKU_DIVISION_ID","SKU_DEPARTMENT_ID","SKU_CLASS_ID","SKU_CATEGORY_ID","SKU_NBR","SALES_AMT"
"32.363544","-110.969778","2000000792627","2940","8/11/2010","2060","3920","5120","84021","5127866","13.99"

Any ideas?



Answer



This may be a bit more complicated, but if my two cents are worth anything (and if you're using MS Office products), I'd recommend importing your .csv into an MS Access .mdb database as a table. (Note: there is a 2 GB size limit for an .mdb database.)


You can then add that Access .mdb table into your ArcMap document and do a right-click > "Display x,y data":


Right-click


By keeping your data in a database and reading from that table to display your x,y data points, you can make changes to your data and those changes will automatically be reflected in the x,y data points the next time you refresh the map display instead of having to create a new shapefile or layer view every time. This also seems like a more robust way to manage such a large amount of data.



Resulting x,y data points


spatial analyst - Adding two rasters using map algebra in ArcPy?


I want to simply add two rasters within a python script using map algebra, which from what I understand is the equivalent to adding two rasters in the raster calculator.


If I do this using the raster calculator and simply type "Raster1 + Raster2", it works perfectly. However, if I run this in a Python script, I get ERROR 000732: Input Raster: Dataset Raster1 does not exist or is not supported.


Here is my code:


import arcpy
from arcpy import env
from arcpy.sa import *
raster1 = ("C:/raster1")

raster2 = ("C:/raster2")
raster3 = Raster("raster1") + Raster("raster2")

I set up the syntax based on the Esri help page for Map Algebra http://desktop.arcgis.com/en/arcmap/latest/extensions/spatial-analyst/map-algebra/an-overview-of-the-rules-for-map-algebra.htm


I'm not sure why my rasters are recognized within the raster calculator but not map algebra.



Answer



What @Joseph says. The Raster class needs a string containing the path to your raster. You've correctly assigned this to a variable, so you then need to pass those variables to instantiate your Rasters. This code should work:


import arcpy
from arcpy import env
from arcpy.sa import *

raster1 = ("C:/raster1")
raster2 = ("C:/raster2")

#Note lack of quotes in following line
raster3 = Raster(raster1) + Raster(raster2)

#Don't forget to save eg:
raster3.save("C:/raster3")

The reason the documentation you link shows the rasters being passed to map algebra statements in quotes is that it assumes you are typing in the Python console in ArcMap. If your raster datasets are open in ArcMap, you can refer to them by their name in the table of contents, using quotes.



In that case the necessary code would be


import arcpy
from arcpy import env
from arcpy.sa import *

#No need to assign paths to variables.
#NB You still need to call `Raster()` in order to tell Arcpy that the '+' is for raster math.
raster3 = Raster("raster1") + Raster("raster2")

#Don't forget to save eg:

raster3.save("C:/raster3")

carto - CartoDB torque chart stop from playing on page load


Is there a way to use a CartoDB torque chart such that the chart does not automatically play upon page load?




Wednesday, 27 February 2019

raster - Enabling georeferenced TIF file to rectify in ArcGIS for Desktop?


I have added a TIF file (size 1,433 KB) in ArcGIS 10.2.2 and defined numerous georeferencing points, with the aim of creating a raster file for conversion to polygons.


However, on clicking the “Rectify” and "Save" buttons, it returns the message “Error – Failed to save raster dataset”. The same occurred when I repeated the procedure with a BMP file.


Can anybody explain what I am doing wrong, or how I might get it to work?




leaflet - Adding one search box for multiple GeoJSON polygon layers?


Multiple GeoJSON marker layers can be combined into one group layer and then searched on a specific attribute, but the same approach doesn't work for GeoJSON polygons.


How can I put multiple GeoJSON markers and polygons in one search box?


var map = L.map('map', {
  zoom: 14,
  center: new L.latLng(35.948782, 8.139912),
  layers: L.tileLayer('http://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png')
});

// GeoJSON points
var piky1 = new L.geoJson(pol00);
var piky2 = new L.geoJson(pol01);
// GeoJSON polygons
var piky3 = new L.geoJson(pick1);
var piky4 = new L.geoJson(pick2);

// group of GeoJSON point markers
var lay = L.featureGroup([piky1, piky2]);

// group of GeoJSON polygons
var pik = L.featureGroup([piky3, piky4]);

L.control.search({
  // If we consider the search layer as the marker group
  //layer: pik,

  // If we consider the search layer as the polygon group
  layer: lay,

  propertyName: 'Name',
}).addTo(map);

Answer



Searching in GeoJSON features on a specific "attribute", especially through a search box, is not part of standard Leaflet API as far as I know.


Do you mean you use a plugin that does not support polygons? If so, please can you kindly detail which plugin you are using?


I know Leaflet Control Search plugin has no problem searching into GeoJSON markers and other geometry types.


Demo with US States polygons: http://labs.easyblog.it/maps/leaflet-search/examples/geojson-layer.html





EDIT: following the sharing of your code


In your code, you build one GeoJSON Layer Group per GeoJSON data object (which contains a Feature Collection, with several points or polygons). Then you gather them into a single Feature Group, so that Leaflet Control Search can receive a single Layer Group in its layer option.


Unfortunately, the current version of Leaflet Control Search handles nested GeoJSON Layer Groups badly: it correctly manages points (markers), but it fails on vector shapes because they do not have a getLatLng method.


So you have 2 possible workarounds:



  1. Instead of building several GeoJSON Layer Groups and nesting them into a single Feature Group, directly build a single GeoJSON Layer Group (pass all your GeoJSON data objects as an array, like L.geoJson([pol00, pol01])). Then you can directly pass this group into the layer option of the L.control.search factory, and the Leaflet Control Search plugin will correctly search into the polygons.


// You can directly build a GeoJSON layer group with an array of GeoJSON objects:
var piky5 = L.geoJson([pol00, pol01]);



  2. Patch the Leaflet Control Search plugin code to correctly handle nested Layer Groups with vector shapes inside. It is quite simple; even though it means many lines of code, most of them are unchanged.


///////////////////////////
// Patch leaflet-search to manage nested GeoJSON shapes.
///////////////////////////

// Copy-paste into your script before instantiating L.control.search.
L.Control.Search.include({
  _recordsFromLayer: function() { // return table: key, value from layer

    var that = this,
        retRecords = {},
        propName = this.options.propertyName,
        loc;

    function searchInLayer(layer) {
      if (layer instanceof L.Control.Search.Marker) return;

      if (layer instanceof L.Marker || layer instanceof L.CircleMarker) {
        if (that._getPath(layer.options, propName)) {

          loc = layer.getLatLng();
          loc.layer = layer;
          retRecords[that._getPath(layer.options, propName)] = loc;

        } else if (that._getPath(layer.feature.properties, propName)) {

          loc = layer.getLatLng();
          loc.layer = layer;
          retRecords[that._getPath(layer.feature.properties, propName)] = loc;

        } else {
          throw new Error("propertyName '" + propName + "' not found in marker");
        }
      } else if (layer.hasOwnProperty('feature')) { // GeoJSON vector shape

        if (layer.feature.properties.hasOwnProperty(propName)) {
          loc = layer.getBounds().getCenter();
          loc.layer = layer;
          retRecords[layer.feature.properties[propName]] = loc;
        } else {
          throw new Error("propertyName '" + propName + "' not found in feature");
        }
      } else if (layer instanceof L.LayerGroup) {
        //TODO: Optimize
        layer.eachLayer(searchInLayer, this);
      }
    }

    this._layer.eachLayer(searchInLayer, this);

    return retRecords;
  }
});

Updated JSFiddle: http://jsfiddle.net/gevzoxs1/15/


EDIT: this is mentioned in repository issue #95 and I submitted PR #101. Hopefully the plugin will be corrected soon, and with the new version it should work for you without having to "patch" it in your script.


algorithm - Calculating maximum distance within polygon in x-direction (east-west direction) in PostGIS?


I am interested in the maximum width of a polygon, e.g. a lake, in the east-west direction. Bounding boxes help only with simple polygons, not with complex concave polygons.




import - Opening and extracting data from Shapefiles?



We are in need of US highway data.


We found it for Florida at the state GIS website.


We downloaded the data for Florida highway intersections from here (this is a zip file).


How do we use these files and extract the geocode values of the intersections (exits) from them?



Answer



You can download and use QGIS, which is simple and effective free GIS software: http://www.qgis.org


google earth engine - Function takes too long to run (over 72 hours)


I have error-free code which works for 3 days of daily data; however, I wish to use it for 6000 days. It has not stopped running (we are on hour 75). There are two if statements inside a function with which I change a value if it is below a minimum or above a maximum, which I thought would be the problem; however, when I removed those, the code still ran for more than a day (and counting). The piece of code I am showing is where I expect the long runtime comes from. It could, however, also be earlier in the code.


var unweighted = function(img){
  var uni = img.reduceRegions({
    reducer: ee.Reducer.sum().unweighted(),
    collection: grid2,
    scale: 500
  }).map(function(feat){
    var id = ee.String(feat.get('system:index')).cat('-').cat(ee.String(img.get('system:index')));
    var date = ee.String(img.get('system:index'));
    var bandNames = img.bandNames().map(function(name){return ee.Number.parse(name)});
    var values = ee.Feature(feat).toDictionary(img.bandNames()).values();
    var array = ee.Array.cat([values, bandNames], 1);
    var min = array.reduce(ee.Reducer.min(2), [0], 1);
    var key = ee.String(min.get([0,1]).toInt());

    var minh = feat.get('min');
    var maxh = feat.get('max');
    if (key > maxh) { key = maxh; }
    else if (minh > key) { key = minh; }
    return ee.Feature(feat.setMulti(ee.Dictionary.fromLists(['Minimum','id'], [key, id])));
  });
  return uni;
};

Preferably I would still have a minimum and maximum check. Even better would be if every band under the minimum and over the maximum were removed for each image separately. However, at the moment I mostly have to tackle the runtime. Can anyone see why it runs for so long, and what can I do to streamline it?



Link to code




Tuesday, 26 February 2019

qgis - Defining Winkel Tripel in proj 4.8.0?


I have QGIS with proj 4.8.0 installed. I'd like to add Winkel Tripel projection which seems to be defined as


+proj=wintri


But that definition string does not work. Is this a bug in my proj version, or what's the correct definition string?



Answer



Summing up the discussion above:


While the Winkel Tripel projection is defined in the proj library and can be called from the command line, it can't be used as a custom CRS in QGIS because there's no inverse transformation in the proj library.


The enhancement request to add this functionality has been closed since it seems that the inverse transformation isn't trivial.
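For context, the forward Winkel Tripel transform is a simple closed form (the arithmetic mean of the equirectangular and Aitoff projections), but it has no analytic inverse, which is exactly what QGIS needs for a custom CRS. A sketch of the forward direction, assuming the standard parallel acos(2/π) that `+proj=wintri` uses by default:

```python
from math import acos, cos, sin, pi

def winkel_tripel(lon, lat):
    """Forward Winkel Tripel: the arithmetic mean of the equirectangular
    and Aitoff projections, standard parallel acos(2/pi).
    Input angles in radians; output in units of Earth radius."""
    lat1 = acos(2 / pi)
    alpha = acos(cos(lat) * cos(lon / 2))
    sinc = sin(alpha) / alpha if alpha else 1.0  # unnormalized sinc(alpha)
    x = 0.5 * (lon * cos(lat1) + 2 * cos(lat) * sin(lon / 2) / sinc)
    y = 0.5 * (lat + sin(lat) / sinc)
    return x, y

x, y = winkel_tripel(0.0, 0.0)  # the origin maps to (0, 0)
```

Because lon and lat appear inside and outside the trigonometric terms at once, inverting this requires iterative root-finding rather than algebra, which is why the inverse was judged non-trivial.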


symbology - qgis place marker in dashed line


I would like to create a marker line in which a font marker is placed within a dashed line space at equal intervals. Something like line-style-with-alternating-dots-and-long-dashes


c - c - c -



I can get this to work somewhat by creating a marker line with a custom dash pattern (Dash=10, Space=3 mm), then setting the font marker with an interval of 13 mm and an offset of 11.5 mm (the midpoint of the space).


The problem is that when the line becomes more complex, there appears to be a misalignment between the dash space and the marker offset.


Example Marker Space Misalignment


Regards




Duplicating layer and applying filter using PyQGIS?


Is there a way, using Python, to duplicate a layer in QGIS as many times as the single occurrences of a variable contained in a field and with a filter applied on the duplicated layer based on the variable itself?


An example: I have layer0 with a field0 that contains var1, var2, var3, etc. I would like the layer0 duplicated to layer1, layer2, layer3 etc. so that layer1 has a filter where field0 = var1, layer2 has a filter where field0 = var2, etc.


The idea is similar to the LayersByField plugin but instead of creating a new layer for each occurrence of a variable I wish to just duplicate the "master" layer (so to avoid unnecessary files on disk)
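A possible sketch of this workflow (untested outline; QGIS 2.x API assumed, field name `field0` hypothetical): collect the unique values of the field, then create one filtered duplicate per value with `setSubsetString`. The filter-string helper is plain Python:

```python
# Hypothetical sketch, QGIS 2.x API assumed: one filtered duplicate of
# the layer per unique value of a field.

def subset_filter(field, value):
    """Build a provider subset string, quoting string values."""
    if isinstance(value, str):
        return '"%s" = \'%s\'' % (field, value.replace("'", "''"))
    return '"%s" = %s' % (field, value)

# In the QGIS Python console, something like the following would use it:
# layer = iface.activeLayer()
# idx = layer.fieldNameIndex('field0')
# for value in layer.dataProvider().uniqueValues(idx):
#     dup = QgsVectorLayer(layer.source(),
#                          '%s_%s' % (layer.name(), value),
#                          layer.providerType())
#     dup.setSubsetString(subset_filter('field0', value))
#     QgsMapLayerRegistry.instance().addMapLayer(dup)
```

Each duplicate points at the same source on disk, so no new files are created; the subset string only filters what each layer displays.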




Problems adding KML layer in OpenLayers


I'm struggling to add a KML layer to a simple test OpenLayers map - no errors are shown, but the KML layer does not display on the map.


I've tried multiple samples including the Sundials on a Spherical Mercator sample and the Layers help page. I can open this example in Firefox and see the KML points on the map without error.


I copied this sample's source code and adjusted the paths to the CSS, JS and KML files to use full paths.


However, when opening this map the KML points are not displayed. Firebug shows GET sundials.kml on the working example from OpenLayers, but OPTIONS sundials.kml on my version.


Any advice on what I'm doing wrong? Thanks


KML Layer Example — kml, popup, feature — Demonstrates loading and displaying a KML file on top of a basemap.

Answer



Firebug says: "XMLHttpRequest cannot load http://openlayers.org/dev/examples/kml/sundials.kml. Origin null is not allowed by Access-Control-Allow-Origin." So you can either move the original KML file to the same host as your OpenLayers application or set up a proxy.


postgresql - Import large CSV file into PostGIS


I am trying to import CSV files into PostGIS. Following this post, I have created tables before. I found other suggestions saying that I can run the copy command.


If I run this command:


COPY table FROM '/Users/macbook/file.csv' DELIMITERS ',' CSV HEADER;

it didn't copy the table at all. It says that "table" is not recognized.


I tried this:


COPY moulding

(Borough,Block,Lot,CD,CT2010,CB2010,SchoolDist,Council,ZipCode,FireComp,PolicePrct,Address,ZoneDist1,ZoneDist2,ZoneDist3,ZoneDist4,Overlay1,Overlay2,SPDist1,SPDist2,LtdHeight,AllZoning1,AllZoning2,SplitZone,BldgClass,LandUse,Easements,OwnerType,OwnerName,LotArea,BldgArea,ComArea,ResArea,OfficeArea,RetailArea,GarageArea,StrgeArea,FactryArea,OtherArea,AreaSource,NumBldgs,NumFloors,UnitsRes,UnitsTotal,LotFront,LotDepth,BldgFront,BldgDepth,Ext,ProxCode,IrrLotCode,LotType,BsmtCode,AssessLand,AssessTot,ExemptLand,ExemptTot,YearBuilt,BuiltCode,YearAlter1,YearAlter2,HistDist,Landmark,BuiltFAR,ResidFAR,CommFAR,FacilFAR,BoroCode,BBL,CondoNo,Tract2010,XCoord,YCoord,ZoneMap,ZMCode,Sanborn,TaxMap,EDesigNum,APPBBL,APPDate,PLUTOMapID,Version)
FROM '/Users/macbook/file.csv'
DELIMITERS ','
CSV HEADER;

but it didn't work either.


An example of such a dataset can be downloaded from this link:


Should I create a model and then execute it?




Answer



You are almost there but I think the problem might be the table you are loading into.


You must already have a table created in PostGIS with the correct column types.


For example


CREATE TABLE nycdata (
    BOROUGH varchar,
    BLOCK varchar,
    DATE date,
    VERSION numeric);


But you need to match each column type to the type of data in the CSV.


You can see all the Data Types here http://www.postgresql.org/docs/9.1/static/datatype.html


Once you have created the table you can then use the original command


COPY nycdata FROM '/Users/macbook/data.csv' DELIMITERS ',' CSV HEADER;

You will then need to create indexes and a geometry column.
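As a sketch of that last step, the statements can be generated in Python (the table and lon/lat column names are assumptions based on the question's data):

```python
def make_geometry_sql(table, lon_col='LONG', lat_col='LAT', srid=4326):
    """Build the statements that add a point geometry column, populate it
    from the lon/lat columns, and index it with GiST."""
    return [
        "ALTER TABLE %s ADD COLUMN geom geometry(Point, %d);" % (table, srid),
        "UPDATE %s SET geom = ST_SetSRID(ST_MakePoint(%s, %s), %d);"
        % (table, lon_col, lat_col, srid),
        "CREATE INDEX %s_geom_idx ON %s USING GIST (geom);" % (table, table),
    ]

stmts = make_geometry_sql('nycdata')
```

Note that ST_MakePoint takes longitude first, then latitude.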


arcgis 10.1 - Replacing non-English characters in attribute tables using ArcPy and Python?


I have a few shapefiles where some of the attributes contain the non-English characters ÅÄÖ. Since some queries don't work with these characters (specifically ChangeDetector), I tried to change them in advance with a simple script and add the new strings to another field.


However, the change in characters works fine, but updating the field with arcpy.UpdateCursor does not.



What is an appropriate way of solving this?


I have also tried to do this via the Field Calculator while posting "code" to the codeblock, with the same error.


Error message:
Runtime error Traceback (most recent call last): File "", line 1, in File "c:/gis/python/teststring.py", line 28, in val = code(str(prow.Typkod)) UnicodeEncodeError: 'ascii' codec can't encode character u'\xc4' in position 3: ordinal not in range(128)


Code:


# -*- coding: cp1252 -*-
def code(infield):
    data = ''
    for i in infield:
        ## print i
        if i == 'Ä':
            data = data + 'AE'
        elif i == 'ä':
            data = data + 'ae'
        elif i == 'Å':
            data = data + 'AA'
        elif i == 'å':
            data = data + 'aa'
        elif i == 'Ö':
            data = data + 'OE'
        elif i == 'ö':
            data = data + 'oe'
        else:
            data = data + i
    return data


shp = r'O:\XXX\250000\DB\ArcView\shape.shp'

prows = arcpy.UpdateCursor(shp)

for prow in prows:
    val = code(unicode(str(prow.Typkod), "utf-8"))
    prow.Typkod_U = val
    print val
    prows.updateRow(prow)

The values of Typkod are of the type: [D, D, S, DDRÄ, TRÄ] etc.


I use ArcMap Basic (10.1) on Windows 7.





New Error message:
Runtime error Traceback (most recent call last): File "", line 1, in File "c:/gis/python/teststring.py", line 29, in val = code(unicode(str(row.Typkod), "utf-8")) UnicodeEncodeError: 'ascii' codec can't encode character u'\xc4' in position 3: ordinal not in range(128)


>>> val
'DDRÄ'
>>> type(val)
<type 'str'>




It appears the output from the function is wrong somehow. When ÅÄÖ are involved it returns data = u'DDR\xc4' and not (as was my intention) data = 'DDRAE'. Any suggestions on what might cause this?



Answer



Turns out iterating over ÅÄÖ wasn't that easy. They are referred to as unicode strings, and when checking in the if-statements the unicode escape has to be used instead of the literal ÅÄÖ. After I figured that out, the rest was a piece of cake :)


Resulting code:


# -*- coding: cp1252 -*-

def code(infield):
    data = ''
    for i in infield:
        ## print i
        if i == u'\xc4':    # Ä
            data = data + 'AE'
        elif i == u'\xe4':  # ä
            data = data + 'ae'
        elif i == u'\xc5':  # Å
            data = data + 'AA'
        elif i == u'\xe5':  # å
            data = data + 'aa'
        elif i == u'\xd6':  # Ö
            data = data + 'OE'
        elif i == u'\xf6':  # ö
            data = data + 'oe'
        else:
            data = data + i
    return data


shp = arcpy.GetParameterAsText(0)
field = arcpy.GetParameterAsText(1)
newfield = field + '_U'
arcpy.AddField_management(shp, newfield, 'TEXT')

prows = arcpy.UpdateCursor(shp)

for row in prows:
    # The field names are held in variables, so use getValue/setValue
    # rather than attribute access (row.newfield would look for a field
    # literally named "newfield").
    row.setValue(newfield, code(row.getValue(field)))
    prows.updateRow(row)
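For reference, the same mapping can be written more compactly with a dict lookup (plain Python, independent of arcpy):

```python
# Mapping from Swedish characters to ASCII transliterations.
SWEDISH_MAP = {u'Ä': 'AE', u'ä': 'ae', u'Å': 'AA',
               u'å': 'aa', u'Ö': 'OE', u'ö': 'oe'}

def transliterate(text):
    """Replace mapped characters, leaving everything else untouched."""
    return ''.join(SWEDISH_MAP.get(ch, ch) for ch in text)
```

Characters not in the map pass through unchanged, so values like "D" or "S" are returned as-is.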

Monday, 25 February 2019

How to add GeoServer wfs layer on OpenLayers?


I can't seem to add a WFS layer like this. It works when I copy all this code locally, but it fails as soon as I change the values to point to my own layer within my own GeoServer.


This is the response viewed in Firebug:



Could not locate {http://www.mydomain.com/geoserver/wfs/DescribeFeatureType?version=1.1.0&typename=catalog:dataSAR}dataSAR in catalog



This is the relevant part of my javascript:


var wfs = new OpenLayers.Layer.Vector("WFS", {
    strategies: [new OpenLayers.Strategy.BBOX()],
    protocol: new OpenLayers.Protocol.WFS({
        url: "http://www.mydomain.com/geoserver/wfs",
        featurePrefix: "catalog",
        featureType: "dataSAR",
        featureNS: "http://www.mydomain.com/geoserver/wfs/DescribeFeatureType?version=1.1.0&typename=catalog:dataSAR",
        geometryName: "bounds"
    }),
    styleMap: new OpenLayers.StyleMap({
        strokeWidth: 3,
        strokeColor: "#333333"
    })
});

map.addLayers([basemap, wfs]);
map.setCenter(new OpenLayers.LonLat(lon, lat), zoom);

And this is what my DescribeFeature (featureNS) looks like:
Answer



You need the actual URI/URL for your workspace. From your DescribeFeature schema it looks like "catalog", which is weird. The other option is to go to the workspace menu in the UI and look there:





qgis - Hillshade shows grid texture artifacts


I tried to make a hillshade and a slope map from a DTM using QGIS, but the result shows some artifacts.


As you can see in the picture there is a grid texture in the hillshade and the slope map.


hillshade and slope


I downloaded the DTM files from:


http://opendata.regione.abruzzo.it/content/modello-digitale-del-terreno-risoluzione-10x10-metri.


The DTMs have the UTM-WGS84 coordinate system and a resolution of 10m. Every DTM is a GeoTIFF floating point 32 bits with a TFW associated.



I used the VRT builder to generate a virtual raster from the DTMs, then I used Warp to reproject the VRT into the same coordinate system as the original DTMs, but using bilinear interpolation as the resampling method. I used the reprojected raster to make the hillshade, but the grid texture was still there. Then I changed the output raster type to Int32 in the Warp tool and used the output to generate another hillshade, but the texture was still clearly visible.


Do you have any idea why this is happening?




Preparing CSV files for use in ArcGIS Desktop?


How do I prepare CSV files for use in ArcGIS Desktop?


I ask because I have some trouble using CSV files: ArcGIS attributes wrong field types to my columns and also misinterprets special characters such as á or ê.


I have read in the Esri forum that there is a so-called schema.ini file that somehow defines the field types, e.g. "Col22=V002 Text"; see here: http://forums.esri.com/Thread.asp?c=93&f=1149&t=64464


That's kind of funny, because I have often seen these .ini files on my disk but never actually wondered what they are good for. It is kind of weird that Excel stores such metadata in an extra file, since other programs like R don't.


I already tried to manipulate this .ini file, with little success, since I didn't find out how to apply, for example, the "string" type. There is some information on MS sites (see http://msdn.microsoft.com/en-us/library/windows/desktop/ms709353%28v=vs.85%29.aspx), but I couldn't find a solution.


Also, I don't really like the idea of working with this .ini file, because it is quite a bunch of work to define and type all the field names when I have, let's say, 50 columns. And these .ini files might get lost, etc.



Answer



My quick fix is to create a first row with all dummy values, and then delete this row/record after bringing the data into ArcGIS.


This first row contains representative values, often wildly different ones (e.g. alphabetic characters even if the column contains numbers that I want treated as text), with the largest number of characters needed for that column (because text fields tend to get truncated).



Date/time values are subject to import errors (especially between Canadian/U.S. default date formats), so my workaround is to split the date/time parts into separate columns (e.g. year, month, day, hour, minute) and then concatenate these in a new field calculation after successfully bringing the data into ArcGIS.


The geographic coordinates tip from Jamie is also necessary: specify negative values for western hemisphere longitude and southern hemisphere latitude. And Unicode takes care of special characters.


Lastly, if a field data type is still misinterpreted after bringing the data into ArcGIS, I add a new field of the correct data type and calculate/convert the values from the original field, but usually the dummy row/record takes care of most, if not all, problems.
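That said, if you do go the schema.ini route mentioned in the question, the tedious part (one ColN line per field) can be generated rather than typed by hand. A rough sketch; the type names follow the MS Text driver conventions, and the column choices are assumptions:

```python
def schema_ini_lines(csv_name, header, text_cols=()):
    """Build schema.ini lines declaring each column of a delimited file;
    columns named in `text_cols` become Text, the rest Double."""
    lines = ['[%s]' % csv_name, 'Format=CSVDelimited', 'ColNameHeader=True']
    for i, name in enumerate(header, 1):
        kind = 'Text' if name in text_cols else 'Double'
        lines.append('Col%d=%s %s' % (i, name, kind))
    return lines

lines = schema_ini_lines('data.csv', ['id', 'name', 'lat'],
                         text_cols=('name',))
```

Writing the returned lines to a schema.ini next to the CSV saves typing 50 column declarations by hand.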


clustering - Obtaining "nClusters" value in Google Earth Engine


After training the k-means model, how can I obtain the number of classes of the model? I've tried get() and getInfo(), and neither works.




https://code.earthengine.google.com/ff0c7bd8102d17b2d267b96bd070342a


The value to extract is in the commented line "//var to_print_2 = clusterer.get('nClusters')".


// STEP 1: Import image we wish to classify
var l8 = ee.ImageCollection('LANDSAT/LC08/C01/T1');

// Remove the clouds
var cloud_free = ee.Algorithms.Landsat.simpleComposite({
  collection: l8.filterDate('2018-01-01', '2018-12-31'),
  asFloat: true
});

// STEP 2: Draw your box which you will sample from!
// Make the training dataset.
var training = cloud_free.sample({
  region: geometry,
  scale: 30,
  numPixels: 5000
});

// STEP 3: Create the clusterer and train it; play around with
// the number of classes
var clusterer = ee.Clusterer.wekaKMeans(3).train(training);

print('clusterer', clusterer instanceof ee.ComputedObject);

var to_print = clusterer.getInfo();

print('clusterer_info', clusterer.toString());
print(typeof clusterer);

var clusterer_list = ee.List(clusterer);
print('clusterer_list', clusterer_list);
print(typeof clusterer_list);

//var to_print_2 = clusterer.get('nClusters');
//print(to_print_2);

//var to_print_3 = clusterer.Filter.filterMetadata('nClusters','equals',3);
//print(to_print_3);

// STEP 4: Classify our image
var classified = cloud_free.cluster(clusterer);
print(classified);
var cluster_1 = classified.eq(0);
print('min', cluster_1);

// Display
Map.setCenter(-122.31040963183364, 37.782065864682096, 11);
Map.addLayer(classified.randomVisualizer(), {}, 'clusters');


geoserver - Creating point symbology in SLD file


I would like to create a style file for a layer in Geoserver. I have point geometries and the SLD cookbook does give nice examples for the simple ways to create your symbology.


However, I would like to combine the attribute dependent way and the scale dependent way. My points need to be colored depending on some attribute value and change size depending on the zoom level.


Does anybody have an example of how to do this, or even some link where more complicated versions of SLD files are available?



Answer




With rule-based SLDs, you need to set up a rule for each case. You cannot nest rules.


In the following example, I want to use attribute-based rendering for scales below 250000. In this case, if the height is deeper than -10, I want a red marker; otherwise I want a green marker.
For map scales above 250000, I want a small blue marker.


Thus there need to be 3 rules.





<NamedLayer>
  <Name>GPS_points</Name>
  <UserStyle>
    <FeatureTypeStyle>
      <Rule>
        <Name>GPS_points</Name>
        <Title>GPS_points</Title>
        <ogc:Filter>
          <ogc:PropertyIsLessThan>
            <ogc:PropertyName>height</ogc:PropertyName>
            <ogc:Literal>-10</ogc:Literal>
          </ogc:PropertyIsLessThan>
        </ogc:Filter>
        <MinScaleDenominator>100</MinScaleDenominator>
        <MaxScaleDenominator>250000</MaxScaleDenominator>
        <PointSymbolizer>
          <Graphic>
            <Mark>
              <WellKnownName>circle</WellKnownName>
              <Fill>
                <CssParameter name="fill">#cc0123</CssParameter>
                <CssParameter name="fill-opacity">1.0</CssParameter>
              </Fill>
            </Mark>
            <Size>3</Size>
            <Rotation>0</Rotation>
          </Graphic>
        </PointSymbolizer>
      </Rule>
      <Rule>
        <Name>GPS_points</Name>
        <Title>GPS_points</Title>
        <ogc:Filter>
          <ogc:PropertyIsGreaterThanOrEqualTo>
            <ogc:PropertyName>height</ogc:PropertyName>
            <ogc:Literal>-10</ogc:Literal>
          </ogc:PropertyIsGreaterThanOrEqualTo>
        </ogc:Filter>
        <MinScaleDenominator>100</MinScaleDenominator>
        <MaxScaleDenominator>250000</MaxScaleDenominator>
        <PointSymbolizer>
          <Graphic>
            <Mark>
              <WellKnownName>circle</WellKnownName>
              <Fill>
                <CssParameter name="fill">#11cc99</CssParameter>
                <CssParameter name="fill-opacity">1.0</CssParameter>
              </Fill>
            </Mark>
            <Size>3</Size>
            <Rotation>0</Rotation>
          </Graphic>
        </PointSymbolizer>
      </Rule>
      <Rule>
        <Name>GPS_points</Name>
        <Title>GPS_points</Title>
        <MinScaleDenominator>250000</MinScaleDenominator>
        <PointSymbolizer>
          <Graphic>
            <Mark>
              <WellKnownName>circle</WellKnownName>
              <Fill>
                <CssParameter name="fill">#110cd3</CssParameter>
                <CssParameter name="fill-opacity">1.0</CssParameter>
              </Fill>
            </Mark>
            <Size>1</Size>
            <Rotation>0</Rotation>
          </Graphic>
        </PointSymbolizer>
      </Rule>
    </FeatureTypeStyle>
  </UserStyle>
</NamedLayer>








Saving project with data source path as relative in QGIS?


Is there an option in QGIS to save the project with the data source being a relative path?



I don't see the option here (QGIS 1.7.3 Wrocław).



Answer



Go to:



  1. Settings

  2. Project Properties...

  3. Now check the General tab


You should see "save paths", and beside it "relative" (I believe it has even been the default since version 1.7.3; before that the default was absolute).


Kurt



PS: just take a look at the screenshot made by manning above ;-)


PPS: your screenshot is from Settings → Options; that's the wrong place.


remote sensing - Radiometric calibration of Sentinel-2 products


How can I calibrate Sentinel-2 products, to extract the reflectance?


I've downloaded few Sentinel-2 products from https://scihub.copernicus.eu/s2/#/home, and opened them with SNAP but its radiometric calibration is dedicated only to SAR products.


On ENVI, the radiometric calibration tool is seemingly only capable of extracting the radiance. What is the difference between radiance and reflectance? Does estimating the radiance count as radiometric calibration of a product?



Answer




Sentinel-2 Level 1C data are expressed in reflectance with a scaling factor, not in radiance. You have to divide by 10000 to get the reflectance. In the preliminary products shown this autumn, you had to divide by 1000.


The scaling factor is given in the xml file at the root of the product directory.


<QUANTIFICATION_VALUE>10000</QUANTIFICATION_VALUE>

There is no offset and the no_data value is 0.
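The conversion described above is just a division by the quantification value; a minimal sketch (treating the no_data value 0 as missing):

```python
def to_reflectance(dn, quantification_value=10000):
    """Convert a Sentinel-2 L1C digital number to TOA reflectance by
    dividing by the quantification value; 0 is the no_data value."""
    return None if dn == 0 else dn / quantification_value

r = to_reflectance(4500)
```

A stored value of 4500 thus corresponds to a top-of-atmosphere reflectance of 0.45.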


esri grid format - Loading *.adf files into QGIS?


Is it possible to load .adf files to QGIS?



Answer



Arcinfo .adf files can be raster or vector.


Try QGIS' Add Vector Layer, select the source type Directory and ArcInfo Binary Coverage, and then select the directory containing the .adf files. There will be a second dialog asking which sub-layers to add; coverages are a composite datatype that can contain any combination of points, lines, polygons, and annotation.


You can also add the .adf file directly, without choosing the directory type, but then you don't get to choose which geometry to load.


For background info see Arcinfo Binary Grid format and Arcinfo Binary Coverage format pages.


Plot shapefile with matplotlib


I am trying to read a shapefile and plot it using matplotlib. Here is the code:


import matplotlib.pyplot as plt
import shapefile  # pyshp

shpFilePath = r"D:\test.shp"  # raw string so "\t" is not read as a tab
listx = []
listy = []
test = shapefile.Reader(shpFilePath)
for sr in test.shapeRecords():
    for xNew, yNew in sr.shape.points:
        listx.append(xNew)
        listy.append(yNew)
plt.plot(listx, listy)
plt.show()


However, I get lines connecting my polygons. How can I draw the polygons so that they appear as they do in the shapefile? Here are screenshots of the plot ("Generated By Code") and of the shapefile opened in ArcGIS ("Actual File").



Answer



I will leave it to you how to collect the shapes, but this is the principle:


from matplotlib import pyplot as p

# two example squares, each a closed ring of x and y coordinates
x1 = [-1, -1, 10, 10, -1]; y1 = [-1, 10, 10, -1, -1]
x2 = [21, 21, 29, 29, 21]; y2 = [21, 29, 29, 21, 21]
shapes = [[x1, y1], [x2, y2]]

# one plot() call per shape, so no line connects one polygon to the next
for shape in shapes:
    x, y = shape
    p.plot(x, y)
p.show()
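Applied to the question's shapefile, the principle means grouping the points per shape (and per ring, since pyshp exposes ring start offsets as shape.parts) instead of accumulating everything in two flat lists. A pure-Python sketch of the splitting step, with invented sample coordinates:

```python
# Split a flat point list into rings using a shapefile-style "parts" array
# (the start offset of each ring), as pyshp's shape.parts provides them.
def split_parts(points, parts):
    """Return one list of points per ring, given ring start offsets."""
    ends = list(parts[1:]) + [len(points)]
    return [points[start:end] for start, end in zip(parts, ends)]

# a shape with two rings: ring 1 starts at index 0, ring 2 at index 5
points = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0),
          (2, 2), (2, 3), (3, 3), (2, 2)]
rings = split_parts(points, [0, 5])
```

Each ring can then be drawn with its own `p.plot(*zip(*ring))` call so separate polygons are never joined.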

Sunday, 24 February 2019

Showing only ONE layer from group layer in Leaflet?



I have the following code to solve my workflow. But it does not work properly.


The problem appears to be within the map.getZoom function at the end of the code. That function should only do its work when the train layer is toggled on, but at the moment getZoom reacts to zooming even when the train layer is not activated.


For clarification: there are two train layers which are actually the same layer, coming from the same resource. The difference is that each of them is linked to a different style. Depending on zoom level, one layer is displayed and the other is not. Both are grouped into a LayerGroup so they appear as one layer in the LayerControl.


I am not a Developer.


Desired performance:




  1. Opening the map with basemap active

  2. Panning/zooming around

  3. Toggle on the train layer in LayerControl

  4. Train layer icons are displayed differently depending on the map's zoom level


Actual performance:



  1. Opening the map with basemap active

  2. Zooming leads to displaying the train layer automatically --> Note: displays without layer being toggled on in LayerControl


  3. If layer is toggled on via LayerControl it shows iconTrain1 + iconTrain2 --> two icons visible

  4. Toggling off the layer in LayerControl removes the icons, but zooming in/out displays them again (again without being toggled on in LayerControl)



Answer



To check whether a layer is toggled you can use the hasLayer() method. You have to modify your getZoom function slightly to get it to work; I added two extra variables (for clarity) and modified your if statement from


if (currentZoom <= 13){ //do something }

to


var hasTrainLayer1 = map.hasLayer(train1);
var hasTrainLayer2 = map.hasLayer(train2);

if (currentZoom <= 13 && hasTrainLayer2){ //do something }

The complete getZoom function with the new part included would look as follows:


// getZoom function to load layer depending on zoom level
map.on('zoomend', function () {
    var currentZoom = map.getZoom();
    var hasTrainLayer1 = map.hasLayer(train1);
    var hasTrainLayer2 = map.hasLayer(train2);

    if (currentZoom <= 13 && hasTrainLayer2) {
        map.removeLayer(train2);
        map.addLayer(train1);
    }
    if (currentZoom >= 14 && hasTrainLayer1) {
        map.removeLayer(train1);
        map.addLayer(train2);
    }
});


To solve the problem of displaying only one of the two train layers depending on the map's zoom level, I added minZoom and maxZoom to the layer definitions.


var train1 = L.esri.featureLayer('http://geoportal1.stadt-koeln.de/ArcGIS/rest/services/WebVerkehr_Data/MapServer/8', {
        useCors: false,
        pointToLayer: function (feature, latlng) {
            return L.marker(latlng, {
                icon: icontrain1
            });
        },
        maxZoom: 13
    }),

    train2 = L.esri.featureLayer('http://geoportal1.stadt-koeln.de/ArcGIS/rest/services/WebVerkehr_Data/MapServer/8', {
        useCors: false,
        pointToLayer: function (feature, latlng) {
            return L.marker(latlng, {
                icon: icontrain2
            });
        },
        minZoom: 14
    });


EDIT: updated my answer based on the updated JSFiddle provided in the comments below.


qgis - How do I eliminate the black border on a georeferenced map layer?


I have georeferenced several scanned maps. These are placed side by side on separate layers in QGIS, overlaying other, older maps of the same areas. Georeferencing gave the maps black borders. How can I make these black borders transparent, or eliminate them, so that the underlying maps' data shows through the gaps between the maps? Here is a screenshot:


enter image description here



Answer



Thanks to all for the advice. I have now found a simple workaround that eliminates the black borders between adjacent maps and restores the detail lost in them. Make georeferenced rasters as usual, but leave the "Transparency" / "No Data Value" box checked; this will leave a lot of dropout in the map detail. Then create a duplicate of each layer, this time with the "No Data Value" box unchecked. Do this for all the adjacent map layers. Group the unchecked layers together below the grouped checked layers. The unchecked layers fill in the missing dropout data from the checked layers, and the checked layers prevent the black borders from showing through. There is no apparent loss of quality in the finished result.



arcgis desktop - An equivalent for field statistics for records



I know how to obtain various statistics for a field, or "column", in an attribute table in ArcGIS. Is there any way to obtain a Mode figure for records, or the "rows" in a table? For instance, an attribute table of plots of land might contain 15 fields each showing the number of a particular type of tree on each plot. I need to find out, for a particular plot, what type of tree is the most common there. What is the easiest way to find out which of the 15 fields has the highest figure for that record? Am I right that the answer somehow involves the Field Calculator?



Answer



You are right that you can use the field calculator, but you could also do this with a cursor. With the field calculator, you can define a function that takes all your fields as input and returns the position of the field with the highest value as output.


argmax(!field1!, !field2!, !field3!, ..., !field15!)


def argmax(*values):
    vals = list(values)
    return vals.index(max(vals))


The same function can be used with a cursor, but then you have the advantage of being able to list the fields automatically (ListFields) and of getting the name of the field instead of its position.
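A pure-Python sketch of the cursor idea: each row is a tuple of tree counts, one per field, and we report the name of the field holding the maximum. Field names and counts are invented for illustration; with arcpy you would get the names from ListFields and the rows from a SearchCursor.

```python
# Invented field names standing in for the 15 tree-type columns
tree_fields = ["oak", "pine", "birch"]

def commonest(row, fields):
    # index of the maximum value; note that ties resolve to the first field
    return fields[row.index(max(row))]

rows = [(12, 40, 7), (3, 3, 9)]  # sample plot records
modes = [commonest(row, tree_fields) for row in rows]
```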


arcpy - Creating feature class in file geodatabase in ArcGIS Desktop with arcgisscripting?


I am using ArcGIS Desktop 9.3.


What I am looking to do is create a file geodatabase and add a feature class directly into that file geodatabase, as opposed to creating a feature class outside the file geodatabase and then importing it into the file geodatabase. The reason I need to be able to create the feature class directly in a file geodatabase is because I need to have attribute table columns that are nullable, whereas if I create a feature class outside the file geodatabase and then import it into the file geodatabase, nulls will not be allowed. Please correct me if I am wrong. Currently, I am able to create the file geodatabase, but I can't add a feature class directly to the file geodatabase. The code I currently have is:


import os, arcgisscripting

gp = arcgisscripting.create(9.3)
gp.Overwriteoutput = 1
cwd = "C:\\Path\\To\\directory"
gp.workspace = cwd
gp.toolbox = "management"
number = '1'
fileGDB = "test_%s.gdb" % number
shpFile = "file_%s.shp" % number
gp.CreateFileGDB(cwd, fileGDB)
gp.CreateFeatureclass(fileGDB, shpFile, "POINT", '#', '#', '#', '#')
gp.addfield(shpFile, fieldName, "FLOAT", "20", "20", "#", "#", "NULLABLE", "#", "#")

...the rest of my code here

Answer



I think your first problem is that you're appending ".shp" to your output geodatabase featureclass. Remove this and your problem may be solved.


...     
....
shpFile = "file_%s" % number #--removed ".shp"
...
...the rest of your code here

arcgis desktop - How to get a list of map scales programmatically in ArcMap?


I need to get all map scales, which are shown in the ArcMap scalebar. I need list of value, something like:


1:1500
1:3000
1:6000
1:12000
1:24000
1:50000
1:100000


Does anyone know how to get it?



Answer



Sorry, maybe my question was not clear enough, but I found the solution, and it is simple enough.


The IDisplayTransformationScales interface provides access to this data.


Using .NET it looks like:


var mxDocument = (IMxDocument)_application.Document;
var activeView = (IActiveView)mxDocument.FocusMap;
var dts = (IDisplayTransformationScales)activeView.ScreenDisplay.DisplayTransformation;
var scales = Enumerable.Range(0, dts.UserScaleCount)
    .Select(x => dts.GetUserScale(x))
    .ToArray();

If you want to use the scale format defined in ArcMap, you should use the ScaleFormat class.


I would appreciate it if someone could tell me a way to get all the scale formats.


Cannot see OpenLayers under Plugin in QGIS?


Although I have installed the OpenLayers plugin in QGIS 2.4, I cannot see it in the list when I click on Plugins.


Does anyone else have this issue?




Saturday, 23 February 2019

arcgis desktop - Execute multiple process in parallel of Arcpy script


I have seen some difficult ways to do parallel processing, but I wonder if it is possible to simply execute multiple processes of the same ArcPy script at the same time.


My script makes some changes to the default geodatabase, so I thought of making a geodatabase copy for each process.


I have updated the script to copy the shared resources between the processes, so it copies the geodatabase and the mxd's related to it.


I have made a test to parallelize, using this script:


import multiprocessing

pool = multiprocessing.Pool(2)
pool.map(test_func, [1, 2], 1)
pool.close()
Watching RAM and CPU use, I noticed that every process consumes about 200 MB. So, if I have 6 GB of RAM and want to use 5 GB of it, I could enlarge the pool size to:


5000 / 200 = 25


So, to exploit the whole power of the machine, I think I should use 25 as pool size.


I need to know if this is the best manner, or how I could measure the efficiency of this parallelization.
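The sizing arithmetic above can be sketched as a small helper (the function name and signature are my own invention, not part of arcpy or multiprocessing). One caveat worth encoding: for CPU-bound geoprocessing there is rarely a gain from running more workers than CPU cores, so the RAM-based bound should be capped by the core count.

```python
import multiprocessing

def pool_size(available_ram_mb, per_process_mb, max_workers=None):
    """Pool size bounded by available RAM and by CPU count (hypothetical helper)."""
    by_ram = available_ram_mb // per_process_mb          # e.g. 5000 // 200 == 25
    by_cpu = max_workers if max_workers else multiprocessing.cpu_count()
    return max(1, min(by_ram, by_cpu))                   # never below one worker
```

Measuring efficiency is then a matter of timing the same workload with different pool sizes and watching where the wall-clock time stops improving.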


This is an example of the code that I'm trying to parallelize. The whole script contains almost 1500 lines of code like this:



def dora_layer_goned():
    arcpy.Select_analysis("layer_goned", "layer_goned22")
    arcpy.MakeFeatureLayer_management("layer_goned", "layer_goned_lyr")
    arcpy.SelectLayerByLocation_management("layer_goned_lyr", "WITHIN", "current_parcel", "", "NEW_SELECTION")
    arcpy.SelectLayerByAttribute_management("layer_goned_lyr", "SWITCH_SELECTION")
    arcpy.Select_analysis("layer_goned_lyr", "layer_goned_2_dora2")
    arcpy.Clip_analysis("layer_goned_lyr", "current_parcel_5m_2", "layer_goned_2_dora")
    arcpy.SelectLayerByAttribute_management("layer_goned_lyr", "CLEAR_SELECTION")
    arcpy.FeatureToPoint_management("layer_goned_2_dora", "layer_goned_2_dora_point", "CENTROID")
    arcpy.MakeFeatureLayer_management("layer_goned_2_dora2", "layer_goned_2_dora_lyr")
    arcpy.SelectLayerByLocation_management("layer_goned_2_dora_lyr", "INTERSECT", "layer_goned_2_dora_point", "", "NEW_SELECTION")
    arcpy.DeleteFeatures_management("layer_goned_2_dora_lyr")
    arcpy.FeatureVerticesToPoints_management("current_parcel", "current_parcel__point", "ALL")
    arcpy.FeatureVerticesToPoints_management("carre_line", "carre_line__point", "ALL")
    arcpy.CalculateField_management("current_parcel__point", "id", "!objectid!", "PYTHON_9.3")
    arcpy.SpatialJoin_analysis("carre_line__point", "current_parcel__point", "carre_line__point_sj", "JOIN_ONE_TO_ONE", "KEEP_COMMON", "", "CLOSEST")
    arcpy.Append_management("current_parcel__point", "carre_line__point_sj", "NO_TEST")
    arcpy.PointsToLine_management("carre_line__point_sj", "carre_line__point_sj_line", "id")
    arcpy.Buffer_analysis("carre_line__point_sj_line", "carre_line__point_sj_line_buf", 0.2)
    arcpy.Erase_analysis("layer_goned_2_dora2", "carre_line__point_sj_line_buf", "layer_goned_2_dora_erz")
    arcpy.MultipartToSinglepart_management("layer_goned_2_dora_erz", "layer_goned_2_dora_erz_mono")
    arcpy.MakeFeatureLayer_management("layer_goned_2_dora_erz_mono", "layer_goned_2_dora_erz_lyr")
    arcpy.SelectLayerByLocation_management("layer_goned_2_dora_erz_lyr", "SHARE_A_LINE_SEGMENT_WITH", "current_parcel", "", "NEW_SELECTION")
    arcpy.SelectLayerByLocation_management("layer_goned_lyr", "CONTAINS", "layer_goned_2_dora_erz_lyr", "", "NEW_SELECTION")
    arcpy.DeleteFeatures_management("layer_goned_lyr")
    arcpy.Append_management("layer_goned_2_dora_erz_lyr", "layer_goned", "NO_TEST")
    arcpy.SelectLayerByAttribute_management("layer_goned_2_dora_erz_lyr", "CLEAR_SELECTION")

Answer



See this blog post; it should cover it:


http://blogs.esri.com/esri/arcgis/2012/09/26/distributed-processing-with-arcgis-part-1/



qgis - gdal_calc - How to calculate where raster A is not equal to raster B?


I have two binary rasters with values NaN and 254. I want to generate a new raster that consists of the pixels that have value NaN in raster A and 254 in raster B and vice versa - those should be 1, the rest should be NaN again.


I want to use gdal_calc.py in a bash script for that. My code snippet looks like this:


gdal_calc.py -A S2_2015_AOI_NDBI_bin.tif -B S2_2015_AOI_NDSI_bin.tif --outfile=S2_2015_AOI_diff_auto.tif --calc="1*(A!=B)" --NoDataValue=0


But all it returns is a new raster with NaN everywhere.


I also tried --calc="1*(not_equal(A,B))", --calc="1*(logical_or(logical_and(A==254, not_equal(B, 254)), logical_and(B==254, not_equal(A, 254))))", and several other combinations I could come up with. Nothing has worked yet.


Strange thing is, it works flawlessly in the QGIS Raster Calculator with the formula ("S2_2015_AOI_NDBI_bin@1" != "S2_2015_AOI_NDSI_bin@1") = 1, but I want this to be automated. Also, this calculator throws 0 instead of NaN, which is not suitable for further processing as I plan it.


The two input rasters were previously generated by:


gdal_calc.py -A S2_2015_AOI_NDBI.tif --outfile=S2_2015_AOI_NDBI_bin.tif --calc="254*(A>-0.15)" --NoDataValue=0

and


gdal_calc.py -A S2_2015_AOI_NDSI.tif --outfile=S2_2015_AOI_NDSI_bin.tif --calc="254*(A>0.4)" --NoDataValue=0

The rasters all have the same CRS. I'm running GDAL v. 1.11.3, which is a bit outdated; might this be the problem?





UPDATE


I solved the problem by skipping the step of creating the "binary" input rasters with 254 and NaN values (see question's second and third code blocks). The code looks like this now:


gdal_calc.py -A S2_2015_AOI_NDBI.tif -B S2_2015_AOI_NDSI.tif --outfile=out.tif --calc="1*(logical_and(A>-0.15, B<0.4))" --NoDataValue=0

This delivers the result I wanted.
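The logic of that final expression can be sanity-checked with plain NumPy on a few sample pixels (values invented): a cell becomes 1 only where NDBI > -0.15 AND NDSI < 0.4, which is exactly what gdal_calc evaluates per pixel.

```python
import numpy as np

ndbi = np.array([-0.5, 0.0, 0.5])  # sample pixels from raster A
ndsi = np.array([0.2, 0.6, 0.2])   # sample pixels from raster B

# 1 where both conditions hold, 0 elsewhere (0 then becomes NoData in the output)
result = 1 * np.logical_and(ndbi > -0.15, ndsi < 0.4)
```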


While this is a suitable workaround for me personally, it still does not answer the original question.




QGIS Line width transition


I have a shapefile layer with lines (streets); each line has a value for the traffic on it, and the value is shown by the width of the line. Now I want to smooth the transition where a segment with e.g. the value 500 runs into a segment with value 100. Any ideas? I want to achieve more or less the same result as in this post, only automatically for a larger dataset: Blend differing line thicknesses in QGIS?


Maybe a little graphic helps: above, what I have; below, what I want. Each line segment has only one value, and I want to smooth the segments into each other. enter image description here




arcpy - arcgis python script to tool issue


I have a script to inspect MXDs and save their layers as .lyr symbology, KMZ and SLD files. It works fine when run from the .py, but does not run when used as a tool from ArcToolbox. I have tried arcpy.AddMessage statements to see what's wrong, but they don't appear.


Please help...


SCRIPT


# Author: George Corea, Atherton Tablelands GIS
# georgec@atgis.com.au; info@atgis.com.au
# Licence: Creative Commons

import arcpy, string, datetime, shutil, os
import arcpy.mapping as MAP

# Read input parameters from script tool
MXDList = string.split(arcpy.GetParameterAsText(0), ";")
ProjectPath = arcpy.GetParameterAsText(1)

#MXDList = r'P:\2012\183_TownPlanning_Symbology\Working\TP_Biodiversity.mxd'
#ProjectPath = r'P:\2012\183_TownPlanning_Symbology\Working'

count = 0
# Loop through each MXD and print
#for MXDPath in MXDList:
#MXDFile = r'P:\2012\189_Townplanning_scheme\Working\TRC_PlanningScheme_LayoutMaster_v10.mxd'

count = count + 1

arcpy.AddMessage('starting...' + str(MXDList))
##try:
for MXDFile in MXDList:
    mxd = arcpy.mapping.MapDocument(MXDFile)
    outPath = ProjectPath + '\\' + mxd.filePath[mxd.filePath.rfind('\\')+1:mxd.filePath.rfind('.')]
    arcpy.AddMessage('working on 1...' + str(mxd) + str(outPath))
    try:
        os.mkdir(outPath)
    except:
        pass
    dfList = arcpy.mapping.ListDataFrames(mxd)
    arcpy.AddMessage('working on 2...' + str(dfList) + str(outPath))
    for df in dfList:
        #msd = outPath + '.msd'
        #arcpy.mapping.ConvertToMSD(mxd, msd, df, "NORMAL", "NORMAL")
        #arcpy.AddMessage(str(count) + '...' + str(outPath) + '...' + '\n')
        lyrList = arcpy.mapping.ListLayers(mxd, "", df)
        outPath = ProjectPath + '\\' + mxd.filePath[mxd.filePath.rfind('\\')+1:mxd.filePath.rfind('.')] + '\\' + str(df.name)
        #os.mkdir(outPath)
        arcpy.AddMessage('working on 3...' + str(lyrList) + str(outPath))
        for lyrFile in lyrList:
            arcpy.AddMessage(str(lyrFile))
            if lyrFile.isFeatureLayer == True:
                if lyrFile.visible == True:
                    arcpy.AddMessage(str(lyrFile) + ' is visible')
                    #print str(lyrFile) + ' is visible...exporting dataset'
                    outFileN = str(arcpy.ValidateTableName(lyrFile.longName[lyrFile.longName.rfind('\\')+1:]))
                    try:
                        arcpy.FeatureClassToFeatureClass_conversion(lyrFile.dataSource, outPath, outFileN)
                        arcpy.SaveToLayerFile_management(lyrFile, outPath + '\\' + outFileN + '.lyr', "ABSOLUTE")
                        arcpy.LayerToKML_conversion(outPath + '\\' + outFileN + '.lyr', outPath + '\\' + outFileN + '.kmz')
                    except:
                        errorm = arcpy.GetMessages()
                        arcpy.AddMessage('Error...' + str(errorm) + ' ...continuing')
                else:
                    arcpy.AddMessage(str(lyrFile) + ' is not visible. Not Processing')
                    #print str(lyrFile) + ' is not visible'
##except:
##    errorM = arcpy.GetMessages()
##    arcpy.AddMessage(str(count) + str(errorM) + '\n continuing...\n')
##    print errorM
##    #MAP.PrintMap(MXD, printer)

#Remove variable reference to file
del mxd

ERROR enter image description here and that's it. The 'working on 2...' text from the arcpy.AddMessage calls never appears.


When run in the Python Interpreter window you get


>>> import os

>>> os.getcwd()
'Q:\\scripts\\py'
>>> MXDFile=r'P:\2011\Job_031_TownPlanning_SeriesProduction\Working\mxd\Nov14_Landslide\18_TownPlanning_B&L_Overlay_Ver4a_Chillagoe.mxd'
>>> mxd=arcpy.mapping.MapDocument(MXDFile)
>>> ProjectPath=r'Q:\scripts\py\junk'
>>> outPath = ProjectPath+'\\'+mxd.filePath[mxd.filePath.rfind('\\')+1:mxd.filePath.rfind('.')]
>>> dfList = arcpy.mapping.ListDataFrames(mxd)
>>> for df in dfList:
... lyrList=arcpy.mapping.ListLayers(mxd, "", df)
... print lyrList

...
[, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ]
[, , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , , ]
[, , , ]
>>>

TOOL PARAMS enter image description here


=== RESPONSE to Answers/Comments in post ===


Messages
Executing: InvestigateMXDs P:\2012\096_CCRC_2_Biodiversity\Working\BiodiversityOverlay_70k_Series3b.mxd D:\junk

Start Time: Mon Aug 06 09:01:22 2012
Running script InvestigateMXDs...
Now processing ...
Map: P:\2012\096_CCRC_2_Biodiversity\Working\BiodiversityOverlay_70k_Series3b.mxd

So basically it hangs on the df list. There is no further error message even if you try to capture it.


Test on simple mxd is below. Note that this works fine when run as a python script directly.


enter image description here


Messages
Executing: InvestigateMXDs P:\2012\183_TownPlanning_Symbology\Working\test.mxd P:\2012\183_TownPlanning_Symbology\Working\junk

Start Time: Tue Aug 07 09:11:35 2012
Running script InvestigateMXDs...
Now processing ...
Map: P:\2012\183_TownPlanning_Symbology\Working\test.mxd

When run as a python we get the following in less than 30s


starting...['test.mxd']
working on 1...P:\2012\183_TownPlanning_Symbology\Working\test

working on 2...[]P:\2012\183_TownPlanning_Symbology\Working\test

working on 3...in: P:\2012\183_TownPlanning_Symbology\Working\test\Layers
TRC_Boundary is visible PROCESSING...
TRC_Boundary is not visible. Not Processing
Map_Extents is visible PROCESSING...
Map_Extents is not visible. Not Processing

and directory structure enter image description here


Issue with modified script from PolyGeo: same problems.


enter image description here === END Response ===



Answer




I tried it again using a new script tool (until now I had just been changing the link to the Python script in the old tool). Now it works fine. For anyone interested, the updated code and tool setup are below.


# Copies, clips and creates symbology for all visible layers in the mxd's in the current directory. Also creates a text file with metadata.
# Author: George Corea, Atherton Tablelands GIS
# georgec@atgis.com.au; info@atgis.com.au

# Licence:Creative Commons

import arcpy, string, datetime, shutil, os, glob
import arcpy.mapping as MAP


arcpy.env.workspace = os.getcwd()
arcpy.env.outputCoordinateSystem=r'L:\Vector_Data\Administrative\Boundaries\Local_Govt\TRC\trc_boundary_Polygon.prj' #to maintain all datasets in the same projection
arcpy.env.overwriteOutput = True

MXDList=string.split(arcpy.GetParameterAsText(0), ";")
ProjectPath=arcpy.GetParameterAsText(1)

rootPath=ProjectPath
#MXDList=glob.glob('*.mxd')
#ProjectPath=r'P:\2012\183_TownPlanning_Symbology\Working' # root output directory

clip_features=r'L:\Vector_Data\Administrative\Boundaries\Local_Govt\TRC\trc_boundary_Polygon.shp' # polygon to clip data to AOI
AOI='_trc' #appended as suffix to dataset
xy_tolerance=1 #clip tolerance

# No edits should be required below this line.

VisibleLyrList = []
count=1



def layer_details(outPath, outFileN, lyrFile, type):  # Generates the metadata
    descLayer = arcpy.Describe(lyrFile.dataSource)
    ReviewLog = outPath + '\\' + type + '_' + outFileN + '_log.txt'
    f = open(ReviewLog, 'a')
    f.write(str(lyrFile.name) + ': name{' + str(lyrFile.datasetName) +
            '}; query{' + str(lyrFile.definitionQuery) +
            '}; source{' + str(lyrFile.dataSource) +
            '}; description{' + str(lyrFile.description) +
            '}; symbology{' + str(lyrFile.symbologyType) +
            '}; original projection{' + str(descLayer.spatialReference.name) +
            '}; extent(x,y){' + str(descLayer.extent.XMax) + ',' + str(descLayer.extent.XMin) + ',' + str(descLayer.extent.YMax) + ',' + str(descLayer.extent.YMin) +
            '}; format{' + str(descLayer.shapeType) +
            '}; size(bytes) ~{' + str(os.path.getsize(lyrFile.dataSource)) +
            '} @{' + str(datetime.datetime.now()) + '}'
            )
    f.close()


print 'starting...' + str(MXDList)

for MXDFile in MXDList:
    mxd = arcpy.mapping.MapDocument(MXDFile)
    outPath = ProjectPath + '\\' + mxd.filePath[mxd.filePath.rfind('\\')+1:mxd.filePath.rfind('.')]
    arcpy.AddMessage('Working on file #' + str(count) + ' ...' + str(mxd.filePath))
    try:
        os.mkdir(outPath)
    except:
        pass
    dfList = arcpy.mapping.ListDataFrames(mxd)
    #print arcpy.GetMessages()
    #print 'working on 2...' + str(dfList) + str(outPath)

    for df in dfList:
        #msd = outPath + '.msd'
        #arcpy.mapping.ConvertToMSD(mxd, msd, df, "NORMAL", "NORMAL")
        #print (str(count) + '...' + str(outPath) + '...' + '\n')
        arcpy.AddMessage('Working on dataframe ... ' + str(df.name))
        lyrList = arcpy.mapping.ListLayers(mxd, "", df)
        outPath = ProjectPath + '\\' + mxd.filePath[mxd.filePath.rfind('\\')+1:mxd.filePath.rfind('.')] + '\\' + str(df.name)
        try:
            os.mkdir(outPath)
        except:
            pass
        print ('working on 3...in: ' + str(outPath))
        for lyrFile in lyrList:
            #print (str(lyrFile))
            if lyrFile.isFeatureLayer == True:
                if lyrFile.visible == True:
                    if lyrFile.name not in VisibleLyrList:
                        VisibleLyrList.append(lyrFile.name)
                        arcpy.AddMessage(str(lyrFile) + ' is visible PROCESSING...')
                        outFileN = str(arcpy.ValidateTableName(lyrFile.longName[lyrFile.longName.rfind('\\')+1:]))
                        try:
                            rows = arcpy.SearchCursor(lyrFile.dataSource)
                            row = rows.next()
                            if row:
                                arcpy.FeatureClassToFeatureClass_conversion(lyrFile.dataSource, outPath, outFileN)
                                #arcpy.Copy_management(lyrFile.dataSource, outPath + '//' + outFileN, "")
                                arcpy.Clip_analysis(outPath + '\\' + outFileN + '.shp', clip_features, outPath + '\\' + outFileN + AOI + '.shp', xy_tolerance)
                                #updateLayer = outPath + '\\' + outFileN + '.lyr'
                                sourceLayer = arcpy.mapping.Layer(outPath + '\\' + outFileN + '.shp')
                                sourceLayer_AOI = arcpy.mapping.Layer(outPath + '\\' + outFileN + AOI + '.shp')
                                #arcpy.mapping.UpdateLayer(df, updateLayer, sourceLayer, True)
                                arcpy.SaveToLayerFile_management(lyrFile, outPath + '\\' + outFileN + '_sym.lyr', "ABSOLUTE")
                                arcpy.SaveToLayerFile_management(sourceLayer_AOI, outPath + '\\' + outFileN + AOI + '.lyr', "ABSOLUTE")
                                arcpy.ApplySymbologyFromLayer_management(outPath + '\\' + outFileN + AOI + '.lyr', outPath + '\\' + outFileN + '_sym.lyr')
                                descLayer = arcpy.Describe(sourceLayer)
                                layer_details(outPath, outFileN, lyrFile, "COMPLETED")
                                arcpy.Delete_management(outPath + '\\' + outFileN + '_sym.lyr')
                                arcpy.Delete_management(outPath + '\\' + outFileN + '.shp')
                                #arcpy.LayerToKML_conversion(outPath + '\\' + outFileN + '.lyr', outPath + '\\' + outFileN + '.kmz')
                            else:
                                arcpy.AddMessage("!!!Datasource Issue!!!...continuing")
                                layer_details(outPath, outFileN, lyrFile, "ISSUE")
                                IssueLog = rootPath + '\\' + 'Issue_log.txt'
                                f = open(IssueLog, 'a')
                                f.write(str(lyrFile.name) + ': name{' + str(lyrFile.datasetName) + '}; source{' + str(lyrFile.dataSource) + '}' + '\n')
                                f.close()
                        except:
                            errorm = arcpy.GetMessages()
                            arcpy.AddMessage('!!! ERROR !!!!...' + str(errorm) + ' ...continuing')
                            layer_details(outPath, outFileN, lyrFile, "ERROR")
                            #break
                            ErrorLog = rootPath + '\\' + 'Error_log.txt'
                            f = open(ErrorLog, 'a')
                            f.write(str(lyrFile.name) + ': name{' + str(lyrFile.datasetName) + '}; source{' + str(lyrFile.dataSource) + '}' + '\n')
                            f.close()
                    else:
                        pass
                else:
                    arcpy.AddMessage(str(lyrFile) + ' Is NOT Visible. Not Processing')
            else:
                arcpy.AddMessage(str(lyrFile) + ' Is NOT Feature Layer. Not Processing')
    count = count + 1
    #print str(lyrFile) + ' is not visible'

#Remove variable reference to file
del mxd, outPath, lyrFile

enter image description here


python - Efficiently reading and reclassifying many rasters in R?


I've been tasked to create a suitability analysis of wave conditions in the Gulf of Mexico. I have 2 thousand or so raster files that are about 8 MB each (2438 columns, 1749 rows, 1km cell size). The parameter in the raster files is wave period and I'd like to reclassify all of the rasters such that if 4<= wave period <=9 then make cell = 1, else cell = 0. Then sum up all the rasters into a final raster and divide by the total number of rasters to give a total percentage of suitable observations and export result into some ESRI-compatible format...maybe something that can support floats if need be. I've not worked much at all with either Python or R, but after searching online it seems to make sense to perform this process in one of those languages. I've come up with some code so far in R, but am confused about how to make this work.



library(rgdal)
raster_data <- list.files(path=getwd()) # prompt user for dir containing raster files
num_files <- length(raster_data)
for (i in raster_data) { # read in rasters
  my_data <- readGDAL(raster_data[i])

At this point I'm confused as to whether I should also reclassify and start summing the data within this loop or not. My guess would be yes since otherwise I think I would possibly run out of memory on my computer, but just not sure. I'm also not sure about how to reclassify the data.


In researching online I think I would use reclass(my_data, c(-Inf,3,0, 4,9,1, 10,Inf,0)), but does that look right?


And for summing would I use sum(stack(my_data)) and somehow output that. Also...if this might be more efficiently performed or written in Python I'd be open to that as well. I truly am a beginner when it comes to programming.
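Since the question is open to Python, here is a sketch of the reclassify-and-accumulate idea with NumPy (file reading via GDAL is omitted; small arrays stand in for the rasters, and reading one raster at a time keeps memory use flat):

```python
import numpy as np

def suitability(rasters):
    """Per-cell fraction of rasters where 4 <= wave period <= 9."""
    total = np.zeros_like(rasters[0], dtype=float)
    for arr in rasters:
        total += ((arr >= 4) & (arr <= 9))  # reclassify to 0/1 and accumulate
    return total / len(rasters)

# two tiny stand-in "rasters" of wave periods
demo = [np.array([[3.0, 5.0]]), np.array([[6.0, 12.0]])]
pct = suitability(demo)
```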




Friday, 22 February 2019

symbology - Are there any icons for GIS mapping?



I am trying to find alternative symbology for GIS maps and web apps I occasionally produce. Esri has done a great job collecting and categorizing thousands of symbols, markers, line and polygon styles. For instance, here one can find all 9.3 symbology available in ArcGIS. They are available as individual categories here, too. I have also seen a great resource page on Esri Mapping Center with lots of additional styles. There are quite a few links on symbology standards here, too.


I am, however, looking for good resources with symbology that is not necessarily related to mapping/physical geography, but rather describes the world around us. The one I really liked is available here, and is part of this library. I fully understand that there are plenty of resources like mapicons. Are there any other websites worth visiting when looking for icons for map symbols?



Answer



There's the Maki Icon Project, its GitHub project page also has ArcGIS .style and .serverstyle files.


Spatial 1 to 1 join by proximity in sf and R?


I'm trying to join a Simple Feature of POINT geometry type to another Simple Feature of POLYGON geometry type using st_join and nngeo::st_nn as the person asking this other question also did. The difference is that I want a 1:1 match from both sides (i.e. one point per polygon).


enter image description here



Above is a picture of what my data looks like when mapped. My code looks like:


matched<-st_join(buildings, addresses, join=nngeo::st_nn, maxdist=50,k=1)

Instead of a 1-to-1 match, I keep getting more than one point merged to the same polygon, or more than one polygon merged to the same point, depending on how I order the arguments to st_join, with repeated points or polygons accordingly.


What I want this to do is to match each polygon to the closest point which also has that polygon as the closest to itself (i.e., to match polygons to points when they are mutually the closest to each other), and to randomise matching when more than one point or polygon meet those criteria.


Is there any way of doing that with sf or sp in R? I have thought about extracting the distance matrices and doing a two-sided matching between the two sets of features (like a Gale-Shapley algorithm), but I want to give it a shot with what is already available before putting many hours into coding that.
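The "mutually closest" criterion itself is simple to express once the distance matrix is extracted; a language-agnostic sketch of that step (plain lists stand in for sf geometries, and the tie-randomisation step is not handled here):

```python
def mutual_nearest(dist):
    """dist[i][j] = distance from polygon i to point j.
    Return (polygon, point) pairs that are each other's nearest neighbour."""
    pairs = []
    for i, row in enumerate(dist):
        j = row.index(min(row))                       # nearest point to polygon i
        col = [dist[k][j] for k in range(len(dist))]  # all polygons' distances to point j
        if col.index(min(col)) == i:                  # polygon i is also nearest to point j
            pairs.append((i, j))
    return pairs

# 3 polygons x 2 points; polygon 2 loses point 0 to the closer polygon 0
d = [[1.0, 9.0],
     [4.0, 2.0],
     [3.0, 8.0]]
matches = mutual_nearest(d)
```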




python - Cannot get arcpy.MosaicToNewRaster to work with list



Below I have a snippet of a much larger code that I cannot get to work and I am not sure why. It seems that I always have trouble using arcpy functions with rasters. Can anyone help me get this code to work? Thanks!


outpath = 'G:\\PROJECTS\\Cedar\\Environmental\\FEMA\\Results\\Processing.gdb'
arcpy.env.workspace = outpath

# List the relative elevation rasters in gdb and mosaic
rasters = arcpy.ListRasters('Rel_elv*')
arcpy.MosaicToNewRaster_management(rasters, outpath, 'Relative')

I have tried all sorts of different ways to do this and the MosaicToNewRaster function does not seem to like my list.



Answer




I figured out what the problem was...I did not read the tool help too closely the first time. It was not working because the number of bands was not an optional parameter. Once I filled it in my original code worked.


This worked:


outpath = 'G:\\PROJECTS\\Cedar\\Environmental\\FEMA\\Results\\Processing.gdb'
arcpy.env.workspace = outpath

# List the relative elevation rasters in gdb and mosaic
rasters = arcpy.ListRasters('Rel_elv*')
arcpy.MosaicToNewRaster_management(rasters, outpath, "Relative",
                                   "", "32_BIT_FLOAT", "", 1, "LAST", "FIRST")

qgis - How to increase label size of OpenLayers plugin basemap layers?


For QGIS 2.2.0-Valmiera, how does one increase the label size of OpenStreetMap layer or any other basemap layer? Presently the labels are way too small to read. For example, I find that for the same zoom level of a map in www.openstreetmap.com vs the same map in QGIS2.2, the map in QGIS2.2 has labels much smaller than the one from www.openstreetmap.com. For illustration, click here: https://drive.google.com/file/d/0B27Gf4KIu-6-NFVhSV9zUFBDZGM/edit?usp=sharing




How to delete empty fields from a Shapefile with QGIS or ogr2ogr?



I have a Shapefile with a lot of attribute columns, many of which are entirely empty. Using QGIS or ogr2ogr, how do I remove all attribute columns where everything is NULL?
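The core of a scripted solution is a single pass over the attribute table to find the fields that are NULL in every feature. The helper below sketches that logic in plain Python (all names are illustrative, not from any particular API); in PyQGIS you would build each row from `feature.attributes()` (mapping QGIS NULL to `None`) and then pass the matching field indexes to `layer.dataProvider().deleteAttributes()`.

```python
def all_null_fields(field_names, rows):
    """Return the set of fields whose value is None (NULL) in every row."""
    candidates = set(field_names)
    for row in rows:
        candidates = {f for f in candidates if row.get(f) is None}
        if not candidates:  # every field has at least one value; stop early
            break
    return candidates


# Toy attribute table: EMPTY1 is NULL everywhere, the other fields are not.
rows = [
    {"NAME": "a", "EMPTY1": None, "EMPTY2": None},
    {"NAME": None, "EMPTY1": None, "EMPTY2": 5},
]
print(all_null_fields(["NAME", "EMPTY1", "EMPTY2"], rows))  # {'EMPTY1'}
```

With ogr2ogr you could achieve the same end by re-exporting the Shapefile with `-select` listing only the columns that are not empty.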




qgis - Spatial join of a single point to multiple polygons



I have a point layer which I want to spatially join to a polygon layer (bring in the attributes).


This works fine unless there are overlapping polygons; in that case the QGIS spatial join tool will only bring back the attributes from the first polygon found (or some average). An average is not suitable for text attributes.


What I would prefer is that if there are multiple polygons a point can be joined to, it is joined to all of them, resulting in a duplication of the point. Essentially the ArcGIS: join_operation = JOIN_ONE_TO_MANY


I am working in QGIS 2.8+.


Happy with a script/PyQGIS solution, as I don't think there are any ready-made tools for it. But I would prefer not to have a SpatiaLite solution, as I want to keep it simple for the end user.


Ultimately if there is a solution to this problem that someone has already implemented I would love to see it. If not I will try to come up with the solution myself.



Answer



You are correct (at least as far as I know) that QGIS' spatial join does not offer the relationship parameter available in ArcGIS' spatial join.


In your case, since you describe a many-to-one rather than a one-to-many relationship (polygons to points), there is a workaround. If you had many points you wanted to join to each polygon it wouldn't work, of course, because you'd have more than one value to fill the same available slot (that would require a relate, or duplicating the polygons). But you've got multiple slots to match and only a few points, so a quick workaround is to duplicate the points, once for each polygon each one touches. The Intersect tool should do this.


Once you've intersected the two layers, you should have a new set of points with at least one point for each intersecting polygon, and the points will have the ID attribute of the polygons they intersect. Points that don't intersect will be dropped of course. You should then be able to do a regular attribute/table join between the two based on the polygon ID field.



Note that if you do have multiple point matches for a single polygon, you'll have issues with this join and be back where you started - the polygon will take the first point found and none of the others. We do have some other questions for that situation.
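The one-to-many duplication the question asks for can be sketched in plain Python; here a ray-casting point-in-polygon test stands in for QGIS's geometry predicates, and all names are illustrative. For each point, one output record is emitted per polygon that contains it:

```python
def point_in_polygon(pt, ring):
    """Ray-casting test: is pt inside the polygon given as a list of (x, y)?"""
    x, y = pt
    inside = False
    n = len(ring)
    for i in range(n):
        x1, y1 = ring[i]
        x2, y2 = ring[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # this edge straddles the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside


def join_one_to_many(points, polygons):
    """Emit one (point_id, polygon_id) record per containing polygon,
    duplicating a point that falls inside several overlapping polygons."""
    return [(pid, poly_id)
            for pid, pt in points
            for poly_id, ring in polygons
            if point_in_polygon(pt, ring)]


# Two overlapping squares; point 1 lies in both, point 2 only in A.
A = [(0, 0), (4, 0), (4, 4), (0, 4)]
B = [(2, 2), (6, 2), (6, 6), (2, 6)]
print(join_one_to_many([(1, (3, 3)), (2, (1, 1))], [("A", A), ("B", B)]))
# [(1, 'A'), (1, 'B'), (2, 'A')]
```

In a PyQGIS script you would replace the containment test with `polygon_geom.contains(point_geom)` and write one duplicated feature per matching polygon to a memory layer.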


arcpy - Changing output name when exporting data driven pages to JPG?

Is there a way to save the output JPG, changing the output file name to the page name, instead of page number? I mean changing the script fo...