Tuesday 31 December 2019

javascript - Kml not overlaying onto Google Map


I created a basic dashboard with MVC 4 in Visual Studio 2012. The dashboard displays a Google map. Here is an image of the dashboard: [screenshot]


My intention is to be able to take this KML file: [screenshot]


And successfully overlay it onto the Google map on the dashboard.


I have heard about 'Geoxml3', the KML parser; below is how I used Geoxml3: [screenshot]



However, it built without a problem and the dashboard opened without a problem, but there was no KML to be seen; the KML had not been overlaid. Is there a mistake in my script? Am I not using the correct method of overlaying KML files onto a Google map? Must I use the XML schema of my KML file instead of the KML file itself? Any advice would be helpful.


Please note: I am using the Google Maps JavaScript API v3 and MVC 4 in Visual Studio 2012, and my KML is stored on a local server.



Answer



I changed my browser from Google Chrome to Firefox and it all works now.


google earth engine - How to get several property values at once so there is only one call to .evaluate()


I have a large FeatureCollection (100,000+ polygons) and want to get several values from it at once to display in a ui.Label, with only one call to .evaluate. I have found the following way to do it, which works, but want to check if this is the best way:


var selectedstr = (ee.Feature(selectedstate.first()).select(['NAME', 'GEOID', 'ALAND', 'AWATER'], null, false)).toDictionary().values(['NAME', 'GEOID', 'ALAND', 'AWATER']);
var selectedarray;
selectedstr.evaluate(function(result) {
  if (result) {
    selectedarray = result.toString().split(",");
    mylabel.setValue(selectedarray[0] + ", " + selectedarray[1] + ", " +
                     selectedarray[2] + ", " + selectedarray[3]);
    // do other stuff
  }
});

Here's a working example on a much smaller layer.



Answer



You've got the right general idea: you should bundle all the results you want into one ComputedValue and evaluate that. To simplify and improve your code so far, you can delete .toString().split(","). The result is already a JavaScript array, so doing this is at best redundant and at worst corrupts the data (e.g. if one of the values is a string containing a comma).


But in the bigger picture: you don't need to convert the properties to a dictionary and then to an array; just evaluate the feature itself. (It is still good to use .select(), as you already are, so you don't compute and download any properties or geometry you don't actually plan to use.)


In order to handle the case when the collection is empty without encountering an error, we do .select() on the features before taking the first feature of the collection. (This will not waste computation because only the features needed for the final result are actually evaluated.)



var selectedFeature = selectedstate
    .map(function(feature) {
      return feature.select(['NAME', 'GEOID', 'ALAND', 'AWATER'], null, false);
    })
    .first();
selectedFeature.evaluate(function(evaluatedFeature) {
  if (evaluatedFeature) {
    var properties = evaluatedFeature.properties;
    mylabel.setValue(
        properties.NAME + ", " +
        properties.GEOID + ", " +
        properties.ALAND + ", " +
        properties.AWATER);
  } else {
    mylabel.setValue("Nothing selected");
  }
});

How to convert QGIS generated tiff images into jpg/jpeg using GDAL command line utilities?


I want to develop a web portal that shows images dynamically produced by QGIS, but the TIFF images QGIS produces are too big to display on a website with ordinary image viewers. That's why I have decided to convert those TIFF images into JPG/JPEG, to reduce the file size and make them browser friendly, so that I can show them on my website easily. But I don't know how to convert TIFF images into JPG/JPEG using GDAL commands.


Please help me in this regard.
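For reference, gdal_translate can do this conversion directly. A minimal sketch (file names are placeholders; the QUALITY value and the optional stretch/resize flags are illustrative choices, not requirements):

```shell
# Convert a (Geo)TIFF to JPEG; QUALITY (1-100) trades size against fidelity.
gdal_translate -of JPEG -co QUALITY=85 input.tif output.jpg

# If the TIFF holds 16-bit values or is oversized for the web, stretch to
# 8-bit and shrink to half resolution in the same step:
gdal_translate -of JPEG -co QUALITY=85 -ot Byte -scale -outsize 50% 50% input.tif output.jpg
```

Note that JPEG supports at most three 8-bit bands, so a 4-band (RGBA) TIFF may need explicit band selection, e.g. `-b 1 -b 2 -b 3`, to drop the alpha band.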




postgis - Open-source GIS implementation of the Huff model


This might be a stretch, but I was wondering if anyone has implemented gravity modelling tools or scripts for retail market analysis (like the Huff model) to analyze spatial data for customer-store relationships?


There are a couple ESRI sources (including Business Analyst extension), but I have not seen any for the open-source world. It seems like a place where PostGIS could really be leveraged for its database qualities.



Reference docs (outdated, but relevant articles explaining the concept):




Answer



As scw says in his comment, the code itself seems to make use of some basic processing and loops, so it could probably be rewritten quite quickly in Python and Shapely.


However, if you are looking for a ready-made script, take a look at the following, written in R... and in German: http://www.reymann.eu/wp-content/uploads/2010/06/GravitationsgesetzHuff.R


Google Translate indicates it provides the "calculation of the purchase probability using Huff's law of gravitation".



Linked to from http://www.reymann.eu/wettbewerbsanalysen/einzugsgebiet


It does have a copyright notice on it, so maybe contact the author for further details. If you take out all the lines that print to the screen, it seems R can implement it very concisely.
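Indeed, the core of the Huff model is only a few lines in any language. A minimal Python sketch (the attractiveness values, distances, and exponents below are purely illustrative): the probability that a consumer patronises store j is its attractiveness raised to alpha divided by its distance raised to beta, normalised over all candidate stores.

```python
def huff_probabilities(attractiveness, distances, alpha=1.0, beta=2.0):
    """Huff model: probability that a consumer at one location visits each store.

    P_j = (A_j**alpha / d_j**beta) / sum of the same quantity over all stores.
    """
    utilities = [a ** alpha / d ** beta for a, d in zip(attractiveness, distances)]
    total = sum(utilities)
    return [u / total for u in utilities]

# Three hypothetical stores: floor areas 1000/2000/1500 m2 at 2/5/3 km.
probs = huff_probabilities([1000, 2000, 1500], [2.0, 5.0, 3.0])
print([round(p, 3) for p in probs])  # -> [0.503, 0.161, 0.336]
```

Applied per consumer zone (e.g. per PostGIS polygon, with distances from ST_Distance), this gives the store-choice surface the Huff model describes.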


qgis - Grass 7 no longer works via the Processing Toolbox on Ubuntu


When I try to execute a Grass 7 algorithm via the Processing Toolbox in QGIS 2.10, 2.8.2 or 2.8.1, I get the following error:



Missing dependency. This algorithm cannot be run :-(



QGIS erroneously claims Grass 7 is not installed, even though it was working fine a few months ago. I'm on Ubuntu 14.04.


I installed Grass 7.0.0-1~exp2~trusty3 on my desktop and laptop using the UbuntuGIS Unstable PPA. I installed QGIS 2.10.0 (desktop) and 2.8.2 (laptop) using the software repositories published on the QGIS website.


In an effort to track down the problem, I downgraded my laptop from QGIS 2.8.2 to 2.8.1 using the UbuntuGIS Unstable PPA, the same PPA that provides Grass 7. However, I still get the same error, so I suspect the problem is with the Grass 7.0.0-1~exp2~trusty3 build that was compiled at the end of March.


For a time a few months ago I had Grass 6 and 7 installed alongside each other, and I was able to access both via Processing in QGIS. Grass 6 is no longer available in the UbuntuGIS Unstable PPA, so downgrading Grass 7 may be impossible.


Does anyone have a fix for this or are you experiencing the same problem?





arcgis desktop - Concatenating numbers and adding in leading zeros?


I have three text fields populated as follows: 1, 9, 1, which I need to concatenate into a field that looks like 01.09.01. Getting to 1.9.1 is easy, but can anyone help with how to formulate it so that each value has two places (i.e. add in those leading zeros)? I used to do this in Excel but can't figure out how to do it in ArcMap 10.
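For what it's worth, the padding itself is a one-liner with str.zfill in Python, which the ArcGIS 10 field calculator supports as a parser. A sketch of the logic (the function and its arguments are illustrative; in the field calculator the three arguments would be your `!FIELD!` references):

```python
def pad_concat(a, b, c):
    # zfill left-pads each text value with zeros to two characters,
    # then the padded parts are joined with dots.
    return ".".join(value.zfill(2) for value in (a, b, c))

print(pad_concat("1", "9", "1"))  # -> 01.09.01
```

Values that are already two characters wide pass through unchanged, e.g. pad_concat("12", "3", "4") gives 12.03.04.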




Monday 30 December 2019

google earth engine - Error -Collection query aborted after accumulating over 5000 elements


I have created a script that is supposed to take Sentinel images and calculate the NDVI for a specific polygon I have. I need to run it for two years, but I have realized that for some reason the dataset is far too big and I get wrong numbers; for example, if I ask for the size of the dataset between 01/01/2018 and 03/01/2018, I get 5099 images, which doesn't make any sense.


This is the code I have so far:


/**
* Function to mask clouds using the Sentinel-2 QA band
* @param {ee.Image} image Sentinel-2 image
* @return {ee.Image} cloud masked Sentinel-2 image
*/
function maskS2clouds(image) {
  var qa = image.select('QA60');

  // Bits 10 and 11 are clouds and cirrus, respectively.
  var cloudBitMask = 1 << 10;
  var cirrusBitMask = 1 << 11;

  // Both flags should be set to zero, indicating clear conditions.
  var mask = qa.bitwiseAnd(cloudBitMask).eq(0)
      .and(qa.bitwiseAnd(cirrusBitMask).eq(0));

  return image.updateMask(mask).divide(10000);
}

// Map the function over one year of data and take the median.
// Load Sentinel-2 TOA reflectance data.
var dataset = ee.ImageCollection('COPERNICUS/S2')
    .filterDate('2018-01-01', '2018-01-03')
    // Pre-filter to get less cloudy granules.
    .filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20))
    .select('B2', 'B3', 'B4', 'B8', 'QA60')
    .map(maskS2clouds);


var rgbVis = {
  min: 0.0,
  max: 0.3,
  bands: ['B4', 'B3', 'B2'],
};

var clippedCol = dataset.map(function(im) {
  return im.clip(geometry);
});

// Test if clipping the image collection worked.
Map.centerObject(geometry);
Map.addLayer(clippedCol.median(), rgbVis, 'RGB');

// Get the number of images.
var count = dataset.size();
print('Count: ', count);
print(clippedCol); // here I get the error message "Collection query aborted after accumulating over 5000 elements"
print(dataset, 'dataset'); // the same error here



//function to calculate NDVI
var addNDVI = function(image) {
  var ndvi = image.normalizedDifference(['B8', 'B4']).rename('NDVI');
  return image.addBands(ndvi);
};

// Add NDVI to the clipped image collection.
var withNDVI = clippedCol.map(addNDVI);


// Test the addNDVI function on a single image.
var ndvi1 = withNDVI.select('NDVI').mean();


var colorizedVis = {
  min: 0.0,
  max: 1.0,
  palette: [
    'FFFFFF', 'CE7E45', 'DF923D', 'F1B555', 'FCD163', '99B718', '74A901',
    '66A000', '529400', '3E8601', '207401', '056201', '004C00', '023B01',
    '012E01', '011D01', '011301'
  ],
};


Map.addLayer(ndvi1,colorizedVis,'test');
Map.centerObject(geometry);



My goal, in the end, is to see the mean NDVI value for every image.



Answer



First, to address your question: the error occurs when you try to print more than 5000 elements to the console, as the message says. Since you have 5099 images in the specified date range, the error arises. Since you are clipping images rather than doing a global analysis, you can actually use


.filterBounds(geometry)

on your image collection along with your other filters, so that you only get images within your region, which should be far fewer. As for viewing mean NDVI, I can see your layer without any issue. However, if you are not seeing the NDVI layer at all, that is probably because the images in that date range in your area have more than 20 percent cloud cover, which is filtered out by


.filter(ee.Filter.lt('CLOUDY_PIXEL_PERCENTAGE', 20))

in your code. You could try increasing the value 20 to something higher, but you might still get empty pixels because of the cloud masking. So it might be a good idea to see the images first without removing clouds, and then play with the dates so that you get a good number of non-cloudy pixels.


python - gdal.VectorTranslate returns an empty file



When testing the following code:


from osgeo import gdal
import json


def read_file(filename):
    f = gdal.VSIFOpenL(filename, "rb")
    if f is None:
        return None
    content = gdal.VSIFReadL(1, 10000, f).decode('UTF-8')
    gdal.VSIFCloseL(f)
    return content


gdal.VectorTranslate('out.json', """{
"type": "FeatureCollection",
"features": [
{ "type": "Feature", "geometry": { "type": "Polygon", "coordinates": "coordinates":[[[-188.56933593749997,61.58549218152362],[-173.9794921875,61.58549218152362],[-173.9794921875,66.00015035652659],[-188.56933593749997,66.00015035652659],[-188.56933593749997,61.58549218152362]]] } },
]
}""", format = 'GeoJSON', layerCreationOptions = ['RFC7946=YES', 'WRITE_BBOX=YES'])


got = read_file('/vsimem/out.json')
gdal.Unlink('/vsimem/out.json')

print json.loads(got)

The code comes from this answer, yet the file out.json is empty. The folder has full write permissions. When testing the gdal.OpenEx method, it is possible to load an object, but gdal.VectorTranslate still returns an empty GeoJSON when this newly loaded object is passed as the srcDS argument. Any hint on solving this issue?



Answer






  1. Your JSON is invalid - you have entered the "coordinates" key twice.




  2. You are trying to read from a different file than you are writing to. You are trying to write to out.json which will be a file on disk in the current working directory. You need to write to '/vsimem/out.json'




To pick up these issues more easily, use gdal.UseExceptions() so GDAL raises an exception instead of returning None when it can't open something.


I've also updated your read_file script to find the length automatically.


from osgeo import gdal
import json


def read_file(filename):
    vsifile = gdal.VSIFOpenL(filename, 'r')
    gdal.VSIFSeekL(vsifile, 0, 2)
    vsileng = gdal.VSIFTellL(vsifile)
    gdal.VSIFSeekL(vsifile, 0, 0)
    return gdal.VSIFReadL(1, vsileng, vsifile)

gdal.UseExceptions() #Fail when can't open!


gjs = '''{
"type": "FeatureCollection",
"features": [
{ "type": "Feature", "geometry": { "type": "Polygon", "coordinates":[[[-188.56933593749997,61.58549218152362],[-173.9794921875,61.58549218152362],[-173.9794921875,66.00015035652659],[-188.56933593749997,66.00015035652659],[-188.56933593749997,61.58549218152362]]] } },
]
}'''

ds = gdal.VectorTranslate('/vsimem/out.json', gjs, format = 'GeoJSON', layerCreationOptions = ['RFC7946=YES', 'WRITE_BBOX=YES'])

print(ds)


got = read_file('/vsimem/out.json')

#Works
print(got)

#Fails as the json module can't read your JSON (nor can https://jsonlint.com)
print(json.loads(got))

And even once you have valid GeoJSON, gdal.VectorTranslate will still return an empty dataset. This is a known Python GDAL "gotcha": the data doesn't get written until the dataset is closed and dereferenced.



This works:


from osgeo import gdal
gdal.UseExceptions()
srcDS = gdal.OpenEx('input.geojson')
ds = gdal.VectorTranslate('output.geojson', srcDS=srcDS, format = 'GeoJSON', layerCreationOptions = ['RFC7946=YES', 'WRITE_BBOX=YES'])

#Dereference and close dataset, then reopen.
del ds
ds = gdal.OpenEx('output.geojson')


print(srcDS.GetLayer(0).GetFeatureCount())
print(ds.GetLayer(0).GetFeatureCount())

arcobjects - Printing MXD at A3 always prints at B5


Context


ArcGIS 9.2 Build 1500



  • ArcObjects Windows Forms application running outside of ArcMap, similar to ArcEngine product.


  • Accessing the MXD to print out the page layout to A3 paper. The MXD is saved with the same settings as the printer; see below:

  • Original code was copied from the Print Active View sample on the ESRI site. The sample does run in the ArcMap environment, but I don't think this should have a bearing on my problem.


[screenshot]


Actual Problem


For some strange reason, when my code prints out the map it always prints to B5, which is obviously wrong as it should be A3. The printer complains that it doesn't have B5, no map gets printed, and misery ensues!
On further inspection of the code, it seems that something strange is happening; the code below is a cut-down version of the sample, just enough to show the problem:


IPrinter docPrinter;
IPaper docPaper;

/* PrintDocument is from the .NET assembly System.Drawing.Printing */
System.Drawing.Printing.PrintDocument sysPrintDocumentDocument;

/* Now we need to get the default printer name. Since this is a generic command,
 * we can't use the printername property of the document. So instead, we use the
 * System.Drawing.Printing objects to find the default printer.
 */
docPrinter = new EmfPrinterClass();
sysPrintDocumentDocument = new System.Drawing.Printing.PrintDocument();
docPaper = new PaperClass();

/* Testing to see if the printer instantiated in sysPrintDocumentDocument is the
 * default printer. It SHOULD be, but this is just a reality check.
 */
bool isDefault = sysPrintDocumentDocument.PrinterSettings.IsDefaultPrinter;

if (isDefault)
{
    // FIRST STEP
    // Set docPaper's printer name to the printer name of the default printer.
    docPaper.PrinterName = sysPrintDocumentDocument.PrinterSettings.PrinterName;

    // Need to have the same paper form as the mxd.
    docPaper.FormID = (short) mapDocument.PageLayout.Page.FormID;
}

/* Now assign docPrinter the paper and with it the printer name. This process is
 * two steps because you cannot change an IPrinter's printer except by passing it
 * as a part of the IPaper. That's why we set up docPaper.PrinterName first.
 */

// SECOND STEP
docPrinter.Paper = docPaper;

When the code sets docPaper.FormID (it defaults to 1, which is esriPageFormLetter), it overwrites this value with the value in the MXD (IMapDocument), mapDocument.PageLayout.Page.FormID; in this case that is 13 (esriPageFormSameAsPrinter), which makes sense because we saved the MXD with "Use Printer Paper Settings" (see image above).


The value 13 that is passed seems to cause a conflict in translation to the IPaper object: because it was passed 13, it thinks the paper is B5, rather than inferring the paper size from the printer settings, which should be A3.


The best I can make of this is that it must have something to do with the Windows paper size constants, e.g. DMPAPER_B5 = 13, B5 (JIS), 182 x 257 mm.


Does anyone have a clue how to get around this problem?


[screenshot]




I cannot install QGIS 2.4 on my Windows 7, 64 bit machine




I am running Windows 7, 64 bit, and cannot install QGIS 2.4 after several days of trying. I have also tried downloading 2.2, to no avail. I keep getting this error:


NSIS error


The text of the error is:



NSIS Error


Installer integrity check has failed. Common causes include incomplete download and damaged media. Contact the installer's author to obtain a new copy.


More information at: http://nsis.sf.net/NSIS_Error





openlayers 2 - Popup is stationary does not move with the location


I am using the GeoExt example and used the same code in my application. This is my code


The problem I am facing is that the popup box in my application at http://128.196.142.94/geo/test/test.html is stationary. Wherever you click, the popup simply does not move. Where am I going wrong?




Answer



anchored: false should be anchored: true:


if (!popup) {
  popup = new GeoExt.Popup({
    title: "Popup",
    width: 200,
    maximizable: true,
    collapsible: true,
    map: mapPanel.map,
    anchored: true,
    listeners: {
      close: function() {
        // closing a popup destroys it, but our reference is truthy
        popup = null;
      }
    }
  });
}

And remove location: loc.xy from popup definition.



loc.xy is undefined, and if you look into the GeoExt Popup.js source you will find that in this case the anchored property is automatically set to false.


UPDATE:


It looks like you are using an old version of the GeoExt library. I've adapted your example to work with this version:


var mapPanel, popup;

Ext.onReady(function() {

  function addToPopup(loc) {

    if (!popup) {
      popup = new GeoExt.Popup({
        title: "Popup",
        width: 200,
        maximizable: true,
        collapsible: true,
        map: mapPanel.map,
        /* Fix #1
           loc - object of OpenLayers.Pixel class
        */
        location: loc,
        anchored: true,
        listeners: {
          close: function() {
            popup = null;
          }
        }
      });
    }

    popup.add({
      xtype: "box",
      autoEl: {
        /* Fix #2
           Convert pixel coordinates to LonLat
        */
        html: "You clicked on (" + mapPanel.map.getLonLatFromViewPortPx(loc).lon.toFixed(2) + ", " + mapPanel.map.getLonLatFromViewPortPx(loc).lat.toFixed(2) + ")"
      }
    });

    /* Fix #3
       It is not necessary because location was defined within constructor
    */
    //popup.location = loc;

    popup.doLayout();
    popup.show();
  }

  var mapPanel = new GeoExt.MapPanel({
    title: "Map",
    renderTo: "container",
    width: 650, height: 356,
    layers: [
      new OpenLayers.Layer.WMS(
        "Global Imagery",
        "http://maps.opengeo.org/geowebcache/service/wms",
        {layers: "bluemarble"}
      )
    ],
    center: [0, 0],
    zoom: 2
  });

  var control = new OpenLayers.Control.Click({
    trigger: function(evt) {
      /* Fix #4
         Get pixel coordinates from click and send to popup constructor
      */
      var loc = evt.xy;
      if (popup) popup.close();
      addToPopup(loc);
    }
  });

  mapPanel.map.addControl(control);
  control.activate();

});

arcgis 10.0 - Permanently replace/update server/service in each ArcSDE Layer in MXD?


Using the mxd.findAndReplaceWorkspacePaths method, if an MXD is updated and the path to the .sde connection file moves again, all of the data sources will break again.


Is there a method to permanently update each SDE layer in an MXD table of contents so that it permanently stores the SDE server and service? I.e., just replace the SERVER and INSTANCE for each SDE layer in an MXD using Python or ArcObjects from Python?



Answer




I have found a solution to breaking the dependence on path to the .sde connection file using ArcPy and the following method.


This way works:


lyr.replaceDataSource(sdeConn, "SDE_WORKSPACE", lyr.datasetName, False)
table.replaceDataSource(sdeConn, "SDE_WORKSPACE", table.datasetName, False)

You can rename or delete the connection file after the MXD has been repaired and saved. It will continue to open without issue after the connection file is no longer there.


These methods DO NOT work:


mxd.findAndReplaceWorkspacePaths
lyr.findAndReplaceWorkspacePath


You CANNOT rename or delete the connection file after the MXD has been repaired; the MXD will always look for the .sde connection file in the same location each time it's opened.


qgis - Join Point to the Nearest Polygon


I'm using QGIS 2.18.3 on Windows 10.


I'm trying to join a point layer with a polygon layer. However, since it is a city grid map, points that do not fall within the polygons do not get joined.


My expectation was that QGIS could join those points to the nearest polygon.


Can somebody help me with that?


Here's a picture of my map.






Join a CSV file to shapefile using gdal/ogr?


I have a shapefile with several attributes, for example YEAR, COUNTY, and AREA. I also have a CSV file with more fields that I want in the shapefile, such as POPULATION. Both the shapefile and the CSV file have a field GISJOIN. I know how to do a join in QGIS. But how can I make a permanent join and write to a shapefile using ogr2ogr or one of the other tools in GDAL/OGR?



Answer



The ogr2ogr utility supports a limited SQL syntax. You can join your CSV to the shapefile using something like the following:


ogr2ogr -sql "select inshape.*, joincsv.* from inshape left join 'joincsv.csv'.joincsv on inshape.GISJOIN = joincsv.GISJOIN" shape_join.shp inshape.shp

Adding new field with expression in pyqgis


I want to use PyQGIS in QGIS 2.18 to add a new field to a vector (line) layer and calculate the bearing for each feature, as I have been able to do easily in the field calculator using the expression:


concat(floor(degrees(azimuth(start_point($geometry), end_point($geometry)))), '° ', 
floor(degrees(azimuth(start_point($geometry), end_point($geometry)))*60 % 60), ''' ',
degrees(azimuth(start_point($geometry), end_point($geometry)))*3600 % 60, '''')

The result of this expression is a string that displays the degrees, minutes, seconds value.


After a little research, I tried to carry out the task with this script, which produced an error. (Actually, I was able to use this script to create and update a field that shows the length of the lines of the layer, using QVariant.Double and $length for the expression.)


from PyQt4.QtCore import QVariant
from qgis.core import QgsField, QgsExpression, QgsFeature

vl = iface.activeLayer()
vl.startEditing()

myField = QgsField('Bearing', QVariant.String)
vl.dataProvider().addAttributes([myField])
vl.updateFields()
idx = vl.fieldNameIndex('Bearing')

# next step
exp = QgsExpression('
concat(floor(degrees(azimuth(start_point($geometry), end_point($geometry)))), '° ',
floor(degrees(azimuth(start_point($geometry), end_point($geometry)))*60 % 60), ''' ',
degrees(azimuth(start_point($geometry), end_point($geometry)))*3600 % 60, '''')')
exp.prepare(vl.pendingFields())

for f in vl.getFeatures():
    f[idx] = exp.evaluate(f)
    vl.updateFeature(f)

vl.commitChanges()

Where am I going wrong?



Answer



You don't need this kind of expression if you want to work directly with PyQGIS. The following code does the same thing and is more legible.


from PyQt4.QtCore import QVariant
import math

layer = iface.activeLayer()

myField = QgsField('Bearing', QVariant.String)
layer.dataProvider().addAttributes([myField])
layer.updateFields()

idx = layer.fieldNameIndex('Bearing')

layer.startEditing()

for feat in layer.getFeatures():
    pt1 = feat.geometry().asPolyline()[0]
    pt2 = feat.geometry().asPolyline()[1]
    az = pt1.azimuth(pt2)
    if az < 0:
        az += 360
    minutes = az % 1.0 * 60
    seconds = minutes % 1.0 * 60
    string = str(int(math.floor(az))) + "\xb0 " + str(int(math.floor(minutes))) + "' " + str(seconds) + "''"

    feat[idx] = string
    layer.updateFeature(feat)

layer.commitChanges()

I tried it out with the shapefile in the following image, where it can be observed, after running the above script, that the 'with_form' field (calculated with your field expression) and the 'Bearing' field are identical.


[screenshot]


Sunday 29 December 2019

algorithm - Dividing polygon into specific sizes using ArcGIS Desktop?


I have several thousand irregularly shaped polygons in a shapefile. I want to be able to split each polygon into three areas and to specify the sizes of those areas (they sum to the original total area). It does not matter what the sub-polygon shapes are, as this is for visualization purposes.


How can I go about doing this? Is there a standard algorithm that exists that I can use?




One approach I considered was to get all the points that make up the polygon. Then I would join two of them at random with a straight line, split the polygon, and check whether the area was within a satisfactory tolerance. If it was too small, I would move the cut to a point in one direction; if it was too large, to a point in the opposite direction.



Answer



This problem has many valid solutions. One of them works a little like your description, but instead of slicing the polygons at "random" locations you can do it purposefully in a way designed to minimize the amount of computation.


Here is the basic algorithm. Its input consists of any plane sweep direction, a polygon P of nonzero area, a target area a between zero and the polygon's area, and a nonnegative threshold t (in units of area). Its purpose is to split P with a line perpendicular to the sweep direction into two parts, one to the right of the line and the other to the left of the line, such that the difference between the righthand area and the target area a is no greater than t.


Let L be any oriented line perpendicular to the sweep direction. Define f(L) to be the area of P found at the right of L, minus a. In these terms the task is to find a zero of f. Because f is unlikely to be differentiable, but is continuous, use either a bisection method, the secant method, or--my favorite--Brent's method. All are simple and guaranteed to converge. Use t for the convergence tolerance for the argument.



That's it. Let's consider what goes into coding this. The root finding is routine--you can use a generic chunk of code for it--so the GIS work comes down to coding f. Doing so requires


1.  Splitting the polygon by a line.
2. Computing the area of the piece(s) to the right of the line.

Both operations are implemented in almost any vector-based GIS. If not, you can replace the line by a very large rectangle representing the half-plane to the right of the line. Step 1 becomes


1'. Clip the polygon to the rectangle.

That is a really basic operation.


To get started with root finding, you need to find an interval in which the zero of f is guaranteed to lie. This is easy: project the polygon's envelope ("bounding box") into the direction of the line sweep. The projection is the interval you want.
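As a concrete illustration of the root-finding idea (not the original ArcView code), here is a self-contained Python sketch using plain bisection with a vertical sweep line. clip_left plays the role of step 1' (clipping to a half-plane), the shoelace formula supplies the area, the bounding-box projection gives the starting interval, and the sweep measures area to the left of the line (the mirror image of the right-hand convention above). It assumes a simple polygon given as a list of (x, y) vertices.

```python
def area(poly):
    """Shoelace formula for a simple polygon."""
    pairs = zip(poly, poly[1:] + poly[:1])
    return abs(sum(x1 * y2 - x2 * y1 for (x1, y1), (x2, y2) in pairs)) / 2.0

def clip_left(poly, x0):
    """Keep the part of the polygon with x <= x0 (clip against a half-plane)."""
    out = []
    for i in range(len(poly)):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % len(poly)]
        if x1 <= x0:
            out.append((x1, y1))
        if (x1 <= x0) != (x2 <= x0):  # this edge crosses the sweep line
            t = (x0 - x1) / (x2 - x1)
            out.append((x0, y1 + t * (y2 - y1)))
    return out

def split_position(poly, target, tol=1e-9):
    """Bisect for the sweep-line position where the left-hand area equals target."""
    lo = min(x for x, _ in poly)  # starting interval: the bounding-box projection
    hi = max(x for x, _ in poly)
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if area(clip_left(poly, mid)) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

# Unit square: one third of the area lies left of x = 1/3.
square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(round(split_position(square, 1.0 / 3.0), 6))  # -> 0.333333
```

Brent's method or the secant method would converge faster, but bisection is unconditionally robust, and each evaluation of f is just one cheap clip plus a shoelace sum.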


This question has a long history. I implemented this algorithm for ArcView 3.x long ago and described it many times in the old ESRI user forums. Google




huber split polygon site:forums.esri.com



for discussions, links to code, enhancements and variations (such as splitting polygons into parts of desired sizes which are as compact as possible) and algorithms for raster data.


Here is what the continental US states look like (in an equal area projection) with the bottom third of each state shaded. Evidently the sweep direction was vertical.


[image]


shapefile - What is the cause of 'tearing' of polygons (artifacts) using R, ggplot and geom_polygon?


Thanks to the answer given in this question I have been able to subset and draw a map of electoral divisions in part of the UK, in this case Pembrokeshire. The resulting data frame is large and contains Ordnance Survey data so it would be difficult to post here, but the data frame looks like this:


> str(bar)

'data.frame': 134609 obs. of 7 variables:
$ long : num 214206 214203 214202 214198 214187 ...
$ lat : num 207320 207333 207339 207347 207357 ...
$ order: int 1 2 3 4 5 6 7 8 9 10 ...
$ hole : logi FALSE FALSE FALSE FALSE FALSE FALSE ...
$ piece: Factor w/ 12 levels "1","2","3","4",..: 1 1 1 1 1 1 1 1 1 1 ...
$ group: Factor w/ 82 levels "Amroth ED.1",..: 1 1 1 1 1 1 1 1 1 1 ...
$ id : chr "Amroth ED" "Amroth ED" "Amroth ED" "Amroth ED" ...

I fed the resulting data frame to ggplot using the following code:



ggplot(bar, aes(x = long, y = lat, group = group)) +
geom_polygon(colour = "black", fill = "grey50")

This generates the following image, which looks nice and clean. map of electoral divisions


Then I combined this with a data frame containing population data, which looks like this:


> str(mydf)
'data.frame': 60 obs. of 22 variables:
$ ward.code : chr "00NSPH" "00NSPJ" "00NSPK" "00NSPL" ...
$ id : chr "Amroth ED" "Burton ED" "Camrose ED" "Carew ED" ...
$ la : chr "Pembrokeshire" "Pembrokeshire" "Pembrokeshire" "Pembrokeshire" ...

$ total : num 1237 1737 2458 1570 1976 ...
$ age.0.4 : num 34 86 81 92 107 76 131 77 90 95 ...
$ age.5.9 : num 45 93 83 80 138 82 111 85 132 75 ...
$ age.10.14 : num 65 116 123 103 111 79 151 80 135 83 ...
$ age.15.19 : num 69 90 161 126 117 93 150 87 139 103 ...
$ age.20.24 : num 42 63 116 58 81 63 120 58 114 79 ...
$ age.25.29 : num 46 63 73 60 86 56 90 51 108 67 ...
$ age.30.34 : num 38 60 87 72 99 54 115 62 76 42 ...
$ age.35.39 : num 53 105 104 82 110 81 91 76 121 82 ...
$ age.40.44 : num 70 142 128 107 116 88 161 89 151 92 ...

$ age.45.49 : num 71 138 172 122 128 109 192 116 190 104 ...
$ age.50.54 : num 93 136 204 108 133 119 168 125 174 99 ...
$ age.55.59 : num 126 129 235 125 149 108 179 137 175 106 ...
$ age.60.64 : num 139 162 248 170 194 129 236 183 199 136 ...
$ age.65.69 : num 110 110 205 95 129 143 172 128 167 130 ...
$ age.70.74 : num 81 85 174 52 100 75 110 88 113 128 ...
$ age.75.79 : num 78 54 130 58 74 70 72 68 119 114 ...
$ age.80.84 : num 38 50 84 33 56 43 63 42 94 62 ...
$ age.85.plus: num 39 55 50 27 48 42 36 55 85 84 ...


...using the following code:


foo <- merge(mydf, bar)

and plotted the result like this:


ggplot(foo, aes(x = long, y = lat, group = group)) + 
geom_polygon(colour = "black", fill = "grey50")

The problem is that the resulting plot has artifacts as shown in the image below:


map with artifacts


So, the original data frame subset from the shapefile is fine, but the merged data file has 'issues'.



Q. What might be the cause of this kind of artifact? I understand that without the full code and data this is guesswork and I apologise in advance for this but the object is very large and there may also be redistribution issues. Any hints, pointers, suggestions as to where to start looking would be appreciated.



Answer



I have belatedly realised that the sort part of the merge call is to blame. If I use:


foo <- merge(mydf, bar, sort = FALSE)

The polygons plot correctly, at least in this particular case. Thanks to everybody for their input.


Making polygon from intersecting lines in QGIS using Polygonize or similar?


In QGIS, I want to make polygons from a set of intersecting lines. Looking at the screenshot here, I want to chop the purple line (it's a closed line, not a polygon) where it crosses the green and blue lines. The result would hopefully be the polygon I have hatched in red. I have quite a few of these to do.


Polygonize problem


I have tried the QGIS tool previously called Polygonizer (see An Introduction to the Polygonizer Plug-in), but now called Polygonize. This is a tool that, given a layer containing a set of crossing lines arranged essentially like a # symbol, should create a polygon from the enclosed space in the middle.


(In QGIS 2.01, Polygonize is in the top menu bar: go to Processing -> Commander, then type "Polygonize" on the command line, select "Processing algorithm: Polygonize" from the list that appears, and press Enter.)


At the moment Polygonize doesn't work properly. Is there another way to do what I want to do?


For example, wouldn't it be great if the QGIS 'Split Feature' command was able to work like MapInfo's 'Polyline Split', where you select the polygons to split, then select an already-drawn polyline to split by, and voila. But Split Feature seems not to work that way.





UPDATE: Detail of problems with running Polygonize.


I have 64-bit QGIS 2.01 installed to Windows 7 from the installer package, QGIS-OSGeo4W-2.0.1-3-Setup-x86_64.exe.


The routine processed an output from my shapefile, but did not create a polygon from the region enclosed by intersections between the blue, green and purple lines. (Note: in the actual input file, all these lines are on the same layer; the screenshot just colours them differently for illustrative purposes). What I got instead was a simple conversion of the purple outline to a polygon, as though the program completely ignored the blue and green lines.


When I node-edited the closed purple line, to open it so that I had a set of open crossing lines like in the # symbol, and then ran Polygonize again, nothing happened. No crash, but no visible polygon in the output file.


When I used Nick Hopton's own Laxton.kml data, I got a crash; see screenshot:


Laxton data crash




cartography - Mapping parliamentary results (categorical data) by share of polygon in QGIS


I'm trying to recreate the fill effect in this map: Slovak Parliamentary Election, where each polygon is filled based on the percentage of the vote each parliamentary party gained. The author wrote "I converted the inner polygons into raster data format and applied a pixel-counting technique to create subdivisions of accurate size." Source


I'm hoping to do this with QGIS, I have polygons with the number of votes each party has gotten in a different election. I can convert them to raster polygons, but I'm not sure how to color them and achieve the "stripe" effect.


Also I believe this data is "categorical" but correct me if there is a different term I should be using.




qgis - qgis2web not taking my rule-based styles


It is the first time I have posted here. Does qgis2web work with rule-based styling from QGIS? I've created a few rules to have a dynamic style: polygons when the scale is under 12,500 and points beyond that scale. However, when exporting to a webmap it is not taking the styles. Am I missing something?


Style in QGIS working as expected


Style in Leaflet - only polygons and it should be dynamic depending on the scale


Please see below how I have styled it. Unfortunately, I don't think creating a point layer is going to work for this project. Just to give you some context... my polygon layer is originally stored in an Oracle database, published in GeoServer and added to QGIS as WFS. We will be updating the polygons so I don't want to be updating the points every time something changes.


How I have styled the layer




arcgis desktop - Python script for if/elif condition in field calculator



enter image description here


I need some help calculating the last column (OD_pairs.Ai_) based on a condition: if travel_cost has a value below 10, OD_pairs.Ai_ takes the value of the first column (Aij_OjFCij..); if travel_cost has a value above 10, then return 0 in OD_pairs.Ai_. So basically the third field to be calculated is either the corresponding value in the first field or zero, depending on the value in the second field.



Answer




My solution is in Python, don't forget to switch to Python Parser in the Field Calculator window.


You asked for:


Pre-Logic Script Code:


def calcColumn(cost, Aji):
    if cost < 10:
        return Aji
    elif cost > 10:
        return 0

OD_pairs.Ai_ =



calcColumn(!Travel_Cost!, !Aij_OjFCij_Income4!)

Bear in mind that if you have cost = 10, it won't work, as it only handles values larger than or smaller than 10. You can also instead use:


Pre-Logic Script Code:


def calcColumn(cost, Aji):
    if cost < 10:
        return Aji
    else:
        return 0
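As a quick sanity check outside the Field Calculator, the else version can be exercised in plain Python (the cost and column values here are invented, not from the OD_pairs table):

```python
def calc_column(cost, aji):
    """Return aji for costs below 10, otherwise 0 (the else version)."""
    if cost < 10:
        return aji
    return 0

# Below the threshold the first column's value passes through;
# at or above it (including exactly 10) we now get zero:
for cost in (5, 10, 15):
    print(cost, calc_column(cost, 42))
```

This is why the else version is safer: the boundary value 10 falls into the zero branch instead of silently returning None.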

arcgis desktop - How to change text legend orientation from right to left reading with arcpy


I am working on an mxd with ArcMap 10.3. The default legend property is Layout - left to right reading (marked with a red ellipse):


enter image description here


When it is unmarked I get this legend: enter image description here


I want to know if there is a way to automatically switch the default box to the right to left reading option so the result will look like the following (very importantly, all the legend items should be in a straight line on the left side):


enter image description here


Does anyone know of a way to do this with arcpy?


UPDATE:


I tried MrBubbles answer and the result was:



enter image description here


while i wanted result like that (i made it manually for the example): enter image description here


MrBubbles' answer didn't help me because it is very important that all the words in the legend are in a right to left configuration.
How can I do it?




How to set a proxy for osmosis?


I'm trying to use osmosis to update my data ( http://wiki.openstreetmap.org/wiki/Minutely_Mapnik ) but I'm facing the following error:


org.openstreetmap.osmosis.core.OsmosisRuntimeException: Unable to read the state from the server.
at org.openstreetmap.osmosis.replication.common.ServerStateReader.getServerState(ServerStateReader.java:116)

at org.openstreetmap.osmosis.replication.common.ServerStateReader.getServerState(ServerStateReader.java:50)
at org.openstreetmap.osmosis.replication.v0_6.BaseReplicationDownloader.runImpl(BaseReplicationDownloader.java:290)
at org.openstreetmap.osmosis.replication.v0_6.BaseReplicationDownloader.run(BaseReplicationDownloader.java:383)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: Server returned HTTP response code: 407 for URL: http://planet.openstreetmap.org/replication/hour/state.txt
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1840)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1441)
at org.openstreetmap.osmosis.replication.common.ServerStateReader.getServerState(ServerStateReader.java:97)
... 4 more


jul 28, 2016 8:26:55 AM org.openstreetmap.osmosis.core.Osmosis main
GRAVE: Execution aborted.
org.openstreetmap.osmosis.core.OsmosisRuntimeException: One or more tasks failed.
at org.openstreetmap.osmosis.core.pipeline.common.Pipeline.waitForCompletion(Pipeline.java:146)
at org.openstreetmap.osmosis.core.Osmosis.run(Osmosis.java:92)
at org.openstreetmap.osmosis.core.Osmosis.main(Osmosis.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)

at org.codehaus.plexus.classworlds.launcher.Launcher.launchStandard(Launcher.java:330)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:238)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:415)
at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:356)
at org.codehaus.classworlds.Launcher.main(Launcher.java:47)

I have already done export JAVACMD_OPTIONS="-Dhttp.proxyHost=MY_PROXY -Dhttp.proxyPort=6060 -Dhttp.proxyUser=MY_USER-Dhttp.proxyPassword=MY_PASSWORD" but with no success. I'm able to access this URL using the browser.


I know... 407 is for authentication. The settings I'm using are definitely correct, or my browser would give the same error.


One more piece of information: as seen in (How to configure Geoserver to use a proxy when accessing external services?) I'm facing the same problem in GeoServer. SOME external WMS services give me the same 407 error, others don't.


I think this is because of proxy auth implementation inside the code (osmosis and geoserver may be using same method): https://stackoverflow.com/questions/17183277/http-error-407-proxy-authentication-required
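One thing worth checking (an observation, not a confirmed answer): in the export quoted in the question, -Dhttp.proxyUser=MY_USER and -Dhttp.proxyPassword are run together without a space, so the JVM may never see the password property at all. A corrected sketch with placeholder host, port and credentials, followed by a typical replication pipeline:

```shell
# Each -D switch must be separated by a space; this is easy to miss.
export JAVACMD_OPTIONS="-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=6060 \
 -Dhttp.proxyUser=myuser -Dhttp.proxyPassword=mypassword"

osmosis --read-replication-interval workingDirectory=replication \
        --simplify-change --write-xml-change changes.osc.gz
```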





Using ogr2ogr to convert GML to shapefile in Python?


I am trying to convert a GML file to an ESRI shapefile using the ogr2ogr utility in a Python script.


I have successfully installed the GDAL/OGR package via osgeo but I am now struggling to find/understand any details on the syntax for using ogr2ogr in python.


After importing ogr, all I have found is ogr2ogr -f "ESRI Shapefile" output.shp input.gml


How do I use this in python? Is it as simple as assigning the input and output files?


Everything I try I just get syntax errors. Any pointers in the right direction to get me started would be great.
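The question went unanswered here, but two common approaches are (a) shelling out to the ogr2ogr executable and (b) calling gdal.VectorTranslate from the osgeo bindings (GDAL 2.1 and later). A minimal subprocess sketch, with hypothetical file names and assuming ogr2ogr is on your PATH:

```python
import subprocess

def gml_to_shp(src_gml, dst_shp):
    """Build the ogr2ogr command line for a GML -> shapefile conversion."""
    cmd = ["ogr2ogr", "-f", "ESRI Shapefile", dst_shp, src_gml]
    return cmd

cmd = gml_to_shp("input.gml", "output.shp")
# subprocess.check_call(cmd)  # uncomment to actually run the conversion
print(" ".join(cmd))
```

Passing the arguments as a list (rather than one string) avoids shell-quoting issues with paths containing spaces.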





Saturday 28 December 2019

geoserver - Can a layer's style be modified dynamically using an SLD in OpenLayers?


I am a complete novice with GeoServer and OpenLayers. I'm trying to ascertain at what point a style is applied to a GeoServer layer which is drawn in OpenLayers, when using an SLD.


The Introduction to SLD topic says:



When adding a layer and a style to GeoServer at the same time, the style should be added first, so that the new layer can be associated with the style immediately



which implies that the style is stored with the layer in GeoServer. Does this mean that I can't change the layer's style dynamically in OpenLayers?


The Attribute-Based Polygon example shows how to symbolize a polygon based on its attributes. Could a user change this symbology dynamically, within an OpenLayers map which is running on their browser?


The OpenLayers Styling page says:




When a feature is added to a map, its style information can come from one of three sources:



Does this mean that it would be necessary to remove a layer, and add it to the map with a new SLD, in order to change its style?


Thanks



Answer



If we are talking about styling in general, it differs for a WMS layer and for a feature (WFS, let's say). If your layer is WMS then you have to create your SLD request (you can do that dynamically) and attach it to your WMS request. The WMS then renders the layer based on your SLD. If your layer is a feature then rendering is the responsibility of OpenLayers. There you can give it a style (also dynamically) by using the information in the link that you've given yourself (OpenLayers Styling).


And about the sentence "When adding a layer and a style to GeoServer at the same time, the style should be added first, so that the new layer can be associated with the style immediately": it is not relevant in this case. GeoServer, and all WMS servers in general, can render WMS layers dynamically based on the SLD request attached to the main request.
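To make the "attach the SLD to the WMS request" idea concrete, here is a hedged Python sketch that builds a GetMap URL carrying an SLD_BODY parameter (the GeoServer URL, layer name and style below are placeholders; GeoServer accepts SLD_BODY as a WMS vendor parameter):

```python
from urllib.parse import urlencode

# A minimal inline SLD; in practice this string would be generated dynamically.
sld_body = """<StyledLayerDescriptor version="1.0.0">
  <NamedLayer><Name>topp:states</Name>
    <UserStyle><FeatureTypeStyle><Rule>
      <PolygonSymbolizer>
        <Fill><CssParameter name="fill">#ff0000</CssParameter></Fill>
      </PolygonSymbolizer>
    </Rule></FeatureTypeStyle></UserStyle>
  </NamedLayer>
</StyledLayerDescriptor>"""

params = {
    "SERVICE": "WMS",
    "VERSION": "1.1.1",
    "REQUEST": "GetMap",
    "LAYERS": "topp:states",
    "SLD_BODY": sld_body,   # the dynamic style travels with the request
    "BBOX": "-125,24,-66,50",
    "WIDTH": "512",
    "HEIGHT": "256",
    "SRS": "EPSG:4326",
    "FORMAT": "image/png",
}
url = "http://localhost:8080/geoserver/wms?" + urlencode(params)
```

In OpenLayers you would put the same SLD_BODY entry in the WMS layer's params and update them in place, so the layer can be restyled without being removed and re-added.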


clip - Crop simple features object in R


Is there a function to crop sf map object, similar to maptools::pruneMap(lines, xlim= c(4, 10), ylim= c(10, 15)) used for SpatialPolygon or SpatialLine?


I am considering st_intersection() but there may be a more appropriate way.



Answer



st_intersection is probably the best way. Find whatever way works best to get an sf object to intersect with your input. Here's a way using the convenience of raster::extent and a mix of old and new. nc is created by example(st_read):


st_intersection(nc, st_set_crs(st_as_sf(as(raster::extent(-82, -80, 35, 36), "SpatialPolygons")), st_crs(nc)))

I don't think you can coax st_intersection into not needing an exactly matching CRS, so either set both of them to NA or make sure they are the same. There are no easy tools for bbox/extent AFAIK, so using raster is a good way to make things easy-ish.


career - What are useful websites for scouting GIS related jobs?



What are useful websites for scouting GIS related jobs?



Answer






GIS Jobs Clearinghouse is a great site for both posting and finding Geospatial related jobs. I have used it to fill my last three vacancies. You don't need to create an account to search the jobs which is a big bonus. They even have an option to display job postings via a map as well as a twitter account announcing new postings.


geoext - Get mouse position in OpenGeo


I'm using GeoExt as part of OpenGeo Suite using code generated by the Open Geo suite SDK.


I would like to display the current mouse coordinates and don't really know how to do it. As far as I know GeoExt has no plugin for this, and I don't know how to add the OpenLayers.Control.MousePosition() to my code. I found a similar question here, but in my case I don't have a GeoExt.MapPanel; instead I have a gxp.Viewer. Here is my code:


/**

* Add all your dependencies here.
*
* @require RowExpander.js
* @require widgets/Viewer.js
* @require plugins/LayerManager.js
* @require plugins/AddLayers.js
* @require plugins/RemoveLayer.js
* @require plugins/FeatureManager.js
* @require plugins/FeatureGrid.js
* @require plugins/OLSource.js

* @require plugins/OSMSource.js
* @require plugins/WMSCSource.js
* @require plugins/WMSGetFeatureInfo.js
* @require plugins/Zoom.js
* @require plugins/ZoomToExtent.js
* @require plugins/ZoomToLayerExtent.js
* @require plugins/Legend.js
* @require plugins/GoogleGeocoder.js
* @require plugins/GoogleSource.js
* @require widgets/CrumbPanel.js

* @require plugins/QueryForm.js
* @require plugins/ZoomToSelectedFeatures.js
* @require plugins/LoadingIndicator.js

*/

var app = new gxp.Viewer({
portalConfig: {
layout: "border",
region: "center",


// by configuring items here, we don't need to configure portalItems
// and save a wrapping container
items: [{
id: "centerpanel",
xtype: "panel",
layout: "fit",
region: "center",
border: false,
items: ["mymap"]

}, {
// container for the layer manager etc.
id: "westpanel",
region: "west",
xtype: "gxp_crumbpanel",
collapsible: true,
//collapseMode: "mini",
//hideCollapseTool: true,
split: true,
border: true,

width: 200
}, {
// container for the FeatureGrid
id: "south",
region: "south",
xtype: "gxp_crumbpanel",
collapsible: true,
collapseMode: "mini",
collapsed: true,
hideCollapseTool: true,

split: true,
border: true,
height:200
}],
bbar: {id: "mybbar"}
},

// configuration of all tool plugins for this application
tools: [{
ptype: "gxp_layermanager",

outputConfig: {
id: "tree",
title: "Capas",
tbar: [] // we will add buttons to "tree.bbar" later
},
outputTarget: "westpanel"
}, {
ptype: "gxp_addlayers",
actionTarget: "tree.tbar"
}, {

ptype: "gxp_removelayer",
actionTarget: ["tree.tbar", "tree.contextMenu"]
}, {
ptype: "gxp_zoomtolayerextent",
actionTarget: ["tree.tbar", "tree.contextMenu"]
}, {
ptype: "gxp_zoomtoextent",
actionTarget: "map.tbar"
}, {
ptype: "gxp_legend",

actionTarget: "tree.tbar"
}, {
ptype: "gxp_zoom",
actionTarget: "map.tbar"
},{
ptype: "gxp_wmsgetfeatureinfo",
outputConfig: {
width: 400
}
}, {

ptype: "gxp_googlegeocoder",
outputTarget: "map.tbar",
outputConfig: {
emptyText: "Buscar(Google)..."
}
}, {
ptype: "gxp_featuremanager",
id: "featuremanager",
paging:false,
autoSetLayer: true,

//autoLoadFeatures:true
}, {
ptype: "gxp_featuregrid",
featureManager: "featuremanager",
outputConfig: {
id: "featuregrid"
},
outputTarget: "south"
}, {
ptype: "gxp_queryform",

featureManager: "featuremanager",
outputConfig: {
title: "Busqueda",
width: 400
},
actionTarget: ["tree.contextMenu", "map.tbar"],//"featuregrid.bbar"
appendActions: false,
autoExpand: "south",
}, {
ptype:"gxp_zoomtoselectedfeatures",

featureManager: "featuremanager",
outputTarget: "featuregrid.bbar"
}, {
ptype: "gxp_loadingindicator",
loadingMapMessage:"Cargando mapa..."
}],

// layer sources
defaultSourceType: "gxp_wmscsource",
sources: {

local: {
ptype: "gxp_wmscsource",
url: "/geoserver/wms",
version: "1.1.1"
},

osm: {
ptype: "gxp_osmsource"
},


google: {
ptype: "gxp_googlesource"
}
},

// map and layers
map: {
id: "mymap", // id needed to reference map in portalConfig above
title: "Mapa",
projection: "EPSG:900913",

center: [-10764594.758211, 4523072.3184791],
zoom: 4,
//controls: [pos],
layers: [{
source: "google",
name: "SATELLITE",
group: "background"
}, {
source: "google",
name: "ROADMAP",

group: "background"
}, {
source: "osm",
name: "mapnik",
group: "background",
selected: true
}, {
source: "local",
name: "CONAGUA:estados",
selected: true

}, {
source: "local",
name: "CONAGUA:Sitios",
selected: true
}],
items: [{
xtype: "gx_zoomslider",
vertical: true,
height: 100
}]

}

});


data - What to consider when hiring aerial LiDAR survey?



My organization is considering hiring a company to capture new LiDAR data to help with stormwater issues that we are currently experiencing. I am the GIS analyst that has been assigned this task.


Our current quote is for roughly 70 square km and the project deliverables will include:




  • 1m resolution orthophoto TIFF imagery.

  • 50cm resolution orthophoto TIFF imagery.

  • Tiled 11cm resolution orthophoto TIFF imagery - each image delivered in ATS sections.

  • LiDAR data collected with approximate 7 points per sq meter.

  • Raster bare earth DEM produced from the LiDAR data. Two files will be delivered:

    • 1 - 50cm grid spacing.

    • 1 - 1m grid spacing.




  • 6 new survey checkpoints.


We have been quoted roughly $30,000 CAD (Canadian dollars).



  1. Is the cost of new LiDAR data reasonable for the deliverables provided?

  2. What else should be considered when hiring aerial LiDAR survey?


Any feedback would be great. I have never had the luxury of being included in data purchases so this is all new to me.




javascript - Clipping TileLayer with vector polygon (clipping mask)


I would like to add a mask to clip a layer (Tile) so that only the part bounded by an ol.Feature (polygon, multipolygon or layer) is visible and the rest is hidden (left transparent). The geometry is generated dynamically so I need a way to do this programmatically.


I presented the question Clipping TileLayer with georeferenced polygon (clipping mask) (see).

Clipping map with vector layer (only layers zIndex == 0):


var style = new ol.style.Style({
fill: new ol.style.Fill({
color: 'black'
})
});

var clipLayer = new ol.layer.Image({
source: new ol.source.ImageVector({
source: new ol.source.Vector({

url: '//openlayers.org/en/v4.6.5/examples/data/geojson/countries.geojson',
format: new ol.format.GeoJSON()
}),
style: style
})
});
clipLayer.on('postcompose', function(e) {
e.context.globalCompositeOperation = 'source-over';
});
clipLayer.on('precompose', function(e) {

e.context.globalCompositeOperation = 'destination-in';
});

Example result: OSM (BaseLayer), TileLayer (Stamen) and vector layer


World map with a clipped overlay on Australia


Two layers were clipped (!) - OSM and TileLayer.


I need to clip the TileLayer with a vector layer (with zIndex support).
Example: OSM (BaseLayer), TileLayer and vector layer


Australia map with a clipped overlay on South Australia


One layer was clipped - TileLayer.



Does anybody have some clues or examples to achieve this?



Answer



Solution 1:


You will find an example to do this using ol.source.Raster, quickly illustrated with an animated GIF.


Filter a particular layer with a vector source


I'm using it to combine an ol.source.ImageVector with an ol.source.Stamen (for this demo, but any raster ol.source.* would work). FYI, the function ol.source.ImageVector is now deprecated, but with the newly recommended way I don't see how to do it, as I can't access a source to mix both raster and vector layer sources within ol.source.Raster (any clue to upgrade is welcome).


Solution 2:


Use an OpenLayers ol-ext plugin; see the demo that does more or less the same as solution 1 (but I didn't look at how it's done).


Friday 27 December 2019

gdal - Are there examples of high-level interfaces to NetCDF


The R raster package and GDAL both have independent wrappers around NetCDF-4* that treat an array variable as a "complete" dataset, i.e. they provide objects or interfaces that seamlessly give access to a variable inclusive with:




  • dimension variables

  • coordinate system metadata

  • value units

  • general other metadata


* I'm also interested in HDF4 and HDF5 extensions, but it seems less likely than for NetCDF-4.


I often see code that uses these at the lowest level, and I'm hoping to find more examples like those in raster and GDAL that have abstracted this away:


## pseudo code
load NetCDF

nc = open("file.nc")

v = getvar(nc, "somevariable")

x = getvar(nc, "somevariable_lon")
y = getvar(nc, "somevariable_lat")

etc.


In R's raster and in GDAL it is much more like an extension of a traditional GIS raster, i.e. all the spatial context and metadata is present, and it extends to 3D and above. (There are problems with different conventions for rectilinear and curvilinear coordinates, but that's not my concern here).


Are there other open-source examples in widely used languages that provide the high-level interfaces like R's raster and GDAL?



Is there capacity in the NetCDF library itself for this higher level access? (I know how to use the lower level to build this, I want existing examples).


I'm also excluding uses of GDAL in Python, Perl, C# etc. - that is all just GDAL as far as I'm concerned. I'm also not interested in GIS interfaces that provide these via GDAL.


raster:
http://cran.r-project.org/web/packages/raster/index.html


GDAL: http://www.gdal.org


edit (2017):


Python's xarray is a good example




arcgis desktop - Calculating how many of certain feature lie within various buffer rings?


I am using ArcMap and I have a series of real estate parcels mapped along with a series of wind turbines. I have created multiple buffer rings around each real estate parcel with distances (0.5, 1, 1.5, and 2) miles.


What I am trying to do is determine how many turbines fall within each buffer ring for EVERY real estate parcel. The near tool is almost what I need, but not quite. I am trying to get an output of something like this.




And so on.


Is there a simple tool that can make this happen?



Answer



You can create your buffers, then use the Spatial Join tool with the JOIN_ONE_TO_ONE option. In addition to the optional field statistics, you will have a count field.



JOIN_ONE_TO_ONE —If multiple join features are found that have the same spatial relationship with a single target feature, the attributes from the multiple join features will be aggregated using a field map merge rule. For example, if a point target feature is found within two separate polygon join features, the attributes from the two polygons will be aggregated before being transferred to the output point feature class. If one polygon has an attribute value of 3 and the other has a value of 7, and a Sum merge rule is specified, the aggregated value in the output feature class will be 10. The JOIN_ONE_TO_ONE option is the default.



Then you join by attribute for each buffer on your real estate parcels.
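If you ever need to verify the counts outside ArcMap, the ring logic itself is easy to sketch in plain Python (planar distances in miles; the parcel and turbine coordinates below are made up):

```python
from math import hypot

def count_in_rings(parcel, turbines, ring_miles=(0.5, 1.0, 1.5, 2.0)):
    """Count turbines whose straight-line distance from the parcel falls
    within each successive buffer ring (inner < d <= outer)."""
    counts = {r: 0 for r in ring_miles}
    for tx, ty in turbines:
        d = hypot(tx - parcel[0], ty - parcel[1])
        inner = 0.0
        for outer in ring_miles:
            if inner < d <= outer:
                counts[outer] += 1
                break
            inner = outer
    return counts

# One parcel at the origin and four turbines at increasing distances:
print(count_in_rings((0, 0), [(0.3, 0), (0.7, 0), (1.2, 0), (3.0, 0)]))
```

Note the rings here are non-overlapping annuli, matching the multiple-ring-buffer output rather than cumulative distances.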


pyqgis - Checking/unchecking rule-visibility of layers at once in QGIS


I've got several rule-based layers in my project (qgis 2.18.13). Is it possible to check/uncheck the visibility of all rules (not the visibility of the layer itself) at once? If yes - is it possible to do this on all layers of the project at once too?



Answer



You could use something like the following to check or uncheck all rule-based style categories from all layers loaded in your project:



from PyQt4.QtCore import Qt

for layer in QgsMapLayerRegistry.instance().mapLayers().values():
    renderer = layer.rendererV2()
    if renderer.type() == 'RuleRenderer':
        ltl = QgsProject.instance().layerTreeRoot().findLayer(layer.id())
        ltm = iface.layerTreeView().model()
        legendNodes = ltm.layerLegendNodes(ltl)
        for ln in legendNodes:
            ln.setData(Qt.Unchecked, Qt.CheckStateRole)
            #ln.setData(Qt.Checked, Qt.CheckStateRole)

style - Blend differing line thicknesses in QGIS?


I'm mapping traffic flow along a series of roads (links), styling the line thickness to reflect traffic flow. Along the same road (adjoining links), traffic flow can differ, as other roads feed traffic in or out. I've currently got something that looks like the top part of the image below, but I want something that looks like the middle or bottom bit (where the two line thicknesses merge). Is this possible in QGIS (2.18.6), and if so, how?



enter image description here



Answer



You can use the "Arrow" option in the Symbol layer type. enter image description here By changing the settings in the toolbar you can control the shape of the line and transform it into the arrow shape you are looking for. enter image description here


dem - QGIS 2.2: GDAL Grid (Interpolation) Error


As far as I understand, Raster -> Analysis -> Raster (Interpolation) can produce a terrain file from a shapefile with a Z coordinate. I select the input and output files and a Z field. It gives me this error: ERROR 6: GDALDriver::Create() ... no create method implemented for this format.


Unable to create target dataset "D:/shp/terrain.dem". GDALDriver::Create() ... no create method implemented for this format.



I tried some other formats (I'm aiming for either .dem, .hgt or .xyz) but got the same error every time. What am I missing here?


The Vector layer contains X coord, Y coord, Z coord, number and name fields.
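There is no accepted answer here, but a plausible cause (unverified) is that the DEM/HGT/XYZ output drivers in GDAL only implement CreateCopy, not Create, while the interpolation tool needs full Create support. Writing to GeoTIFF first and converting afterwards usually sidesteps this; a command-line sketch with hypothetical file and field names:

```shell
# Interpolate to GeoTIFF, a format with full Create support
gdal_grid -zfield Z_COORD -a invdist -of GTiff points.shp terrain.tif

# Convert to the target format; gdal_translate only needs CreateCopy
gdal_translate -of USGSDEM terrain.tif terrain.dem
```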




pyqgis - Adding multiple plugins to custom menu in QGIS


I am new to Python and QGIS and attempting to create some plugins to replicate in-house tools used in other GIS programs. I have added a custom menu using Python and added the plugin to the menu using the following code within the initGui method:


self.menu = QMenu("&Menu Name", self.iface.mainWindow().menuBar())
actions = self.iface.mainWindow().menuBar().actions()
lastAction = actions[-1]

self.iface.mainWindow().menuBar().insertMenu(lastAction,self.menu)
self.action = QAction(QIcon(":/plugins/Trial/icon.png"),"Trial plugin", self.iface.mainWindow())
self.action.triggered.connect(self.run)
self.menu.addAction(self.action)

However, if I try and add another plugin to the same menu I end up with duplicated menu item names with only one plugin in the menu, rather than all the plugins under the same menu. Is there a way of adding multiple plugins to an existing custom menu item?



Answer



There is a way. You need to check whether your menu is already present in the QGIS Menu Bar. If so, you can reuse it, otherwise, you create it.


In the initGui method of each of your plugins, add the following code (see comments for details):


# Check if the menu exists and get it
self.menu = self.iface.mainWindow().findChild( QMenu, '&My tools' )

# If the menu does not exist, create it!
if not self.menu:
    self.menu = QMenu( '&My tools', self.iface.mainWindow().menuBar() )
    self.menu.setObjectName( '&My tools' )
    actions = self.iface.mainWindow().menuBar().actions()
    lastAction = actions[-1]
    self.iface.mainWindow().menuBar().insertMenu( lastAction, self.menu )

# Finally, add your action to the menu
self.menu.addAction( self.action )

shapefile - Atlas map generation in QGIS using point layer only


I am trying to automate generating maps with Atlas using points only. As the snapshot shows, the data points for each village are shown by a colour. Every four villages fall under one health area. There are more than 100 health areas in my dataset. I am trying to create one map per health area. Is that possible? I know there is a way of doing this by constructing a polygon for each health area, but I would rather not do that because there is no specific border between some villages and they overlap in some areas.


enter image description here



Answer



One possible way is to create a polygon covering each group of your points, based on the field in the attribute table that defines the health areas.


To create a polygon shapefile that covers the points, you can use the concave hull plugin, which will appear in the Processing toolbox after installing the plugin.


enter image description here


For example, I have a point shapefile in which a new ID was given for each zone. The new ID field is named NEW_ID2 (in your case select the field that represents health areas).


enter image description here



Using the Processing toolbox -> Concave hull by k-nearest neighbors plugin, select the Concave hull k-nearest neighbors highlighted in blue color above and open the tool:


enter image description here



  • Input file: select the point layer (health area points)

  • Under field: select the field that represents the health areas (in this example New_ID2)

  • Method: select Create concave hulls based on field

  • Give the output polygon shapefile a name


  • Run the tool


    This is the output:





enter image description here


Use this output polygon shapefile as the coverage layer in the atlas. The concave hull tool creates a new field that identifies each polygon in the attribute table of the polygon shapefile. Use that field to define the page name in the atlas:


enter image description here


Here is the atlas output:


Page: 1


enter image description here


Page: 2


enter image description here



You can hide the polygon by selecting Hidden coverage layer as you can see in the above atlas tool image. In the exercise, I am showing the polygon so you can clearly see the coverage.
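Outside QGIS, the grouping-by-field step can be sketched in plain Python. This toy version computes a convex hull (Andrew's monotone chain) per health-area group rather than a concave hull, which is simpler but shows the idea; the point coordinates and group labels below are made up:

```python
def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def hulls_by_group(points_with_group):
    """Group (x, y) points by a field value, then hull each group -
    the same idea as 'Create concave hulls based on field'."""
    groups = {}
    for (x, y), g in points_with_group:
        groups.setdefault(g, []).append((x, y))
    return {g: convex_hull(p) for g, p in groups.items()}

villages = [((0, 0), "A"), ((1, 0), "A"), ((1, 1), "A"), ((0, 1), "A"),
            ((5, 5), "B")]
print(hulls_by_group(villages))
```

For real village clusters a concave hull hugs the points more tightly, which is why the plugin is the better tool inside QGIS; this sketch only illustrates the per-field grouping.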


dem - Seeking elevation data for UK?


I'm looking for digital elevation data for UK.


Does anybody know if there is a good service providing data for free that are of better accuracy than ASTER-DEM that is around 30m and, if not, what is the best service for paid data with accuracy around 10m?




esri geodatabase - How to update length property of feature class field in ArcGIS Desktop?


I have a feature class with a string field that I want to update the length from 10 to 25.


Is there a way or a tool to update this property without having to create a new field and using field calculator to move over the records from old to new field?




The script below does what I want; the only problem is that it pushes the updated field to the end of the table (not keeping the original field order).


http://arcscripts.esri.com/details.asp?dbid=16503



Another option that seemed promising is to use the Feature Class to Feature Class tool (access this tool by right clicking a layer in ArcCatalog and selecting Export > To Geodatabase (single)). It does create a new layer, and you can update field names and property information. It also appears that you can change the field order using the "Move Input Field Up/Down" arrows, but they seem not to work.


There has to be a solution to just edit field properties without having to create a new field.



Answer



As far as I am aware there is currently no way to make schema edits in a geodatabase without either dropping and adding fields, or deleting and reloading feature classes/tables. The latter is what I recommend in order to maintain field order.


What I normally do is:



  1. Make a backup of the original feature class

  2. Export the original feature class's schema to an XML file

  3. Modify the schema in a text editor or ArcGIS Diagrammer 10.0, 10.1 or 10.2.

  4. Delete the original feature class


  5. Import the schema back into the geodatabase

  6. Use the Append tool, Simple Data Loader or Object Loader to load the contents of the backup feature class into the newly imported feature class. See the "About loading data into existing feature classes and tables" help topic for more information.


At ArcGIS 10.1 and up there are geoprocessing tools to handle the XML import/export but at 10.0 and earlier you either have to use ArcObjects or do it manually. For an ArcObjects example see: Export XML Workspace Document


arcgis desktop - Joining blank shp with dBASE Table


What is the best way to join the following tables? I have 55 records in the second table and I would like to just "drag and drop" the data into the first one.


Shapefile table Shapefile table Database table Database table




arcpy - Generating multivalue choice list in ArcGIS using Tool Validation without using Frequency?



I'm trying to adapt a model and script combination found on ESRI's blog site titled 'Generating a multivalue choice list'.


However, I've concluded that part of the validation used in the embedded script relies on the Frequency tool in order to function properly, and that tool is only available with an Advanced license (lame). The blog post explains the workflow and where to download the models and scripts (but I'll happily post them up here upon request). As far as I can tell, the core of the functionality I'm after, generating a multivalue choice list:


enter image description here


...is predicated upon the validation script working properly. Without the validation, I'm unable to get the values from the field to appear as a list. Is there anything I can remove from this validation script to get the functionality I'm after, or is there a workaround? I'm unfamiliar with the validation process. Here is the code for the validation (I was going to post it as a code sample, but this looks like it might be easier to follow): enter image description here


[Editor note: here is the actual validation code, the image is not correct]


import arcpy

class ToolValidator(object):
    """Class for validating a tool's parameter values and controlling
    the behavior of the tool's dialog."""

    def __init__(self):
        """Setup arcpy and the list of tool parameters."""
        self.params = arcpy.GetParameterInfo()

    def initializeParameters(self):
        """Refine the properties of a tool's parameters. This method is
        called when the tool is opened."""
        return

    def updateParameters(self):
        """Modify the values and properties of parameters before internal
        validation is performed. This method is called whenever a parameter
        has been changed."""
        if self.params[1].altered:  # if the input field value changes
            if self.params[1].value:  # if the field parameter has a value
                for field in arcpy.Describe(self.params[0].value).fields:  # iterate through fields in the input dataset
                    if field.name.lower() == self.params[1].value.value.lower():  # find the field object with the same name as the field parameter
                        try:
                            if self.params[2].values:  # if this parameter has selected values
                                oldValues = self.params[2].values  # remember the selected values
                        except Exception:
                            pass
                        values = set()  # create an empty set
                        fieldname = self.params[1].value.value  # the name of the input field
                        FrequencyTable = arcpy.Frequency_analysis(self.params[0].value, r"in_memory\Frequency", fieldname, "")  # for large tables, create a frequency table
                        cursor = arcpy.SearchCursor(FrequencyTable, "", "", fieldname, "{0} A".format(fieldname))  # open a search cursor on the frequency table
                        for row in cursor:  # loop through each value
                            values.add(row.getValue(fieldname))  # add the value to the set
                        self.params[2].filter.list = sorted(values)  # set the filter list to the sorted values

                        newValues = self.params[2].filter.list
                        try:
                            if len(oldValues):  # if some values were selected
                                self.params[2].values = [v for v in oldValues if v in newValues]  # retain selected values still present in the new list
                        except Exception:
                            pass

    def updateMessages(self):
        """Modify the messages created by internal validation for each tool
        parameter. This method is called after internal validation."""
        return

Is it possible that my assumption (via testing) that the validation is the key piece is false, and that something else isn't allowing the values to be exposed as a selectable list? Many thanks in advance. Having this type of functionality will really jump-start the adoption of several key workflows I'm trying to distribute in our company!



Answer



I thought some people might find this valuable. ESRI was gracious enough to help work through this and find an alternate to the validation used in the blog post which does not require an Advanced license. While I certainly had to figure out some additional items, I cannot take credit for the validation code. But, the ends justify the means and this qualifies as the answer to my question. Here you go:


import arcpy

class ToolValidator(object):
    """Class for validating a tool's parameter values and controlling
    the behavior of the tool's dialog."""

    def __init__(self):
        """Setup arcpy and the list of tool parameters."""
        self.params = arcpy.GetParameterInfo()

    def initializeParameters(self):
        """Refine the properties of a tool's parameters. This method is
        called when the tool is opened."""
        return

    def updateParameters(self):
        """Modify the values and properties of parameters before internal
        validation is performed. This method is called whenever a parameter
        has been changed."""
        if self.params[0].value and self.params[1].value:
            self.params[2].filter.list = sorted({row[0] for row in arcpy.da.SearchCursor(self.params[0].value, self.params[1].value.value) if row[0]})

    def updateMessages(self):
        """Modify the messages created by internal validation for each tool
        parameter. This method is called after internal validation."""
        return

Using the arcpy.da.SearchCursor returns values from the chosen field very quickly considering the number of records it's searching (at least in my data). I may start a new thread to see if anyone has any ideas on how to apply a query-based filter to the validation. I hope this helps someone, but I'm glad we have an answer!
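For readers without arcpy at hand, the set comprehension at the heart of updateParameters can be illustrated with plain Python rows standing in for an arcpy.da.SearchCursor (the sample values below are made up):

```python
# Plain-Python illustration of the validation's core line: build a sorted,
# de-duplicated choice list from the first column of cursor rows, skipping
# NULLs. The rows are made-up stand-ins for a SearchCursor over a field.
rows = [("Oak",), ("Pine",), (None,), ("Birch",), ("Oak",)]
choices = sorted({row[0] for row in rows if row[0]})
print(choices)  # ['Birch', 'Oak', 'Pine']
```

Duplicates collapse via the set, NULL-like values are filtered out by the `if row[0]` clause, and `sorted()` produces the ordered filter list.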


openlayers - With a WMS server Url, how to know it is a Web map service or a Web map tile service?


I need to add a WMS map to my project, and the URL of the WMS server will be provided by users.


Are there any ways to know the type of server (WMS or WMTS) from the URL provided by users? Or to know whether the server supports tiles or not?


I want to know this because in OpenLayers 3 there are two functions to get the map: ol.source.ImageWMS and ol.source.TileWMS. I think I need to know the type of server before calling the right OpenLayers 3 function.
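One practical approach (not from the question itself) is to request the server's GetCapabilities document and inspect the root element: a WMS 1.3.0 server answers with `WMS_Capabilities` (1.1.x with `WMT_MS_Capabilities`), while a WMTS answers with `Capabilities` in the WMTS namespace. A minimal Python sketch, with illustrative XML snippets in place of real server responses:

```python
# Hedged sketch: distinguish WMS from WMTS by the root element of a
# GetCapabilities response. The XML strings in the prints below are
# illustrative stand-ins for what a real server would return.
import xml.etree.ElementTree as ET

def detect_service_type(capabilities_xml):
    root = ET.fromstring(capabilities_xml)
    tag = root.tag.split("}")[-1]  # strip a namespace prefix like {http://...}
    if tag in ("WMS_Capabilities", "WMT_MS_Capabilities"):
        return "WMS"  # 1.3.0 and 1.1.x root elements, respectively
    if tag == "Capabilities":
        return "WMTS"
    return "unknown"

print(detect_service_type('<WMS_Capabilities version="1.3.0"/>'))  # WMS
print(detect_service_type('<Capabilities xmlns="http://www.opengis.net/wmts/1.0"/>'))  # WMTS
```

You would fetch the document by appending `?SERVICE=WMS&REQUEST=GetCapabilities` (or `SERVICE=WMTS`) to the user-supplied URL; note that many WMS servers also serve tiles happily via ol.source.TileWMS, so the choice between the two OpenLayers sources is not solely determined by the service type.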




Thursday 26 December 2019

polygon - Is there a way of auto_increment for the ID column in QGIS


I'm creating a fictional map, so I need to create lots of points, lines and, of course, polygons. Later on I export my data as GeoJSON. But before that I always have to go and give every element a unique ID.


I don't need any special sorting, like the biggest polygon getting the smallest ID. I just need all polygons to have an ID at the end, without doing it manually like I have to do now.


It would be great if someone knows how to do that.



Answer



Using the field calculator is the way to go:


No IDs entered yet



  1. Digitize every feature without entering any ID.

  2. Before exporting, populate unique IDs using the field calculator with the expression '$id'.



No ID given


Some IDs already entered



  1. If some IDs have already been entered, you can use '-$Id' instead. Make sure you select only the new features, i.e. those whose value in the ID column is NULL. You can do that simply by sorting the column.

  2. Now do the steps from the pictures:


enter image description here enter image description here
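What the field calculator does in the second workflow can be mimicked in plain Python: assign new sequential IDs only to the features whose ID is still NULL, continuing after the largest existing ID (the feature dicts below are made up for illustration):

```python
# Plain-Python sketch of the "some IDs already entered" workflow above:
# give sequential IDs only to features whose id is still NULL (None),
# continuing after the largest existing ID. Feature dicts are made up.
features = [{"id": 1}, {"id": None}, {"id": 2}, {"id": None}]
next_id = max((f["id"] for f in features if f["id"] is not None), default=0) + 1
for f in features:
    if f["id"] is None:
        f["id"] = next_id
        next_id += 1
print([f["id"] for f in features])  # [1, 3, 2, 4]
```

Existing IDs are left untouched, so re-running the step after digitizing more features stays safe.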


coordinate system - Transformation data austria - slovenia


Can anyone please help with fitting together these two shapefiles? Many thanks!


https://www.dropbox.com/s/saa9yp7ecg1qxh1/Grenzen.rar




Can we request new OpenLayers WMS image for an extent without moving the map?


Can we request a new OpenLayers WMS image for an extent without moving the map center?


Reading the OpenLayers code, OpenLayers moves the map center to the new extent, and THEN does the HTTP request to get the image.


Is it possible to instruct the map to HTTP-load a certain extent and, when 'tileloaded' is fired, THEN show the new WMS image?


Edit ----------------------





  1. The problem is that I need to tell the map to zoomToExtent so that it gets a new bound.




  2. OpenLayers Map sets its center and zoom first, and then requests the new map through HTTP.




  3. The map response will be available in about 2 seconds; until then, the map view is blank (if the new bound is outside the current bound).


    if (result.zoom) {
        var bounds = result.data.map.mapBounds;

        var x1 = parseFloat(bounds.x1);
        var x2 = parseFloat(bounds.x2);
        var y1 = parseFloat(bounds.y1);
        var y2 = parseFloat(bounds.y2);

        map.zoomToExtent([x1, y1, x2, y2]);
    }


  4. What I think I need is fetchWmsBound() below.



    if (result.zoom) {
        var bounds = result.data.map.mapBounds;

        var x1 = parseFloat(bounds.x1);
        var x2 = parseFloat(bounds.x2);
        var y1 = parseFloat(bounds.y1);
        var y2 = parseFloat(bounds.y2);

        function wmsLoaded() {
            map.redraw(true);
        }

        //
        // @param { } bounds
        // @param { Function } callback to exec when the WMS is fetched
        // fetchWmsBound might also trigger a 'fetched' event
        map.wmslayer.fetchWmsBound([x1, y1, x2, y2], wmsLoaded);
    }



Filed at https://github.com/openlayers/openlayers/issues/573
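Independent of OpenLayers internals, one workaround is to build the WMS GetMap URL for the target bounding box yourself and pre-load the image before moving the map, so the tile is already in the browser cache when zoomToExtent fires. A Python sketch of the URL construction (base URL, layer name and parameter choices are illustrative):

```python
# Hedged sketch: build a WMS GetMap URL for a target extent so the image
# can be fetched (and cached) before the map is actually moved. The base
# URL, layer name, SRS and image size below are illustrative choices.
from urllib.parse import urlencode

def getmap_url(base_url, bbox, width=256, height=256, layers="layer0"):
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layers, "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # x1,y1,x2,y2
        "WIDTH": width, "HEIGHT": height, "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

print(getmap_url("http://example.com/wms", [10.0, 45.0, 11.0, 46.0]))
```

In the browser, assigning this URL to a JavaScript Image object's src and calling zoomToExtent from its onload handler approximates the fetchWmsBound() behavior sketched in the question.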




coordinate system - Choosing Projection for Gulf coast region of USA?


What are the most appropriate datum and projection I could use for the Gulf coast region - spanning Louisiana, Mississippi, Alabama, and Florida? I have to analyze block group data along the coast.




Get Attributes from 1:n relation in QGIS as Expression to use in Atlas


I need to set up way signs for something like a "hiking GIS". In the first step I ran a network analysis (using QNEAT3) to find the distance from the signs ("Way Signs") to the available destinations ("Destinations") along the trail network ("Ways"). The result is the "OD-Matrix". So far so good... To see all destination distances for every single sign, I set up a 1:n relation via the project properties. You can see all relevant results in the screenshot:


enter image description here


I'm able to see all destinations with distances in the attribute form - great! But I need to get, for every sign, all its destinations with distances. To get a real printable sign, I thought about using the Atlas to step through every sign (layer 'Way Signs' as coverage layer) and an expression text field to show the destinations with distances.


So the question is: How do I get the 1:n related attributes via expressions to use them in the Atlas (or anywhere else)?


enter image description here



Answer




A starting point with expressions would be to use the aggregate and array functions: a construct like the following will put all destinations (from the joined layer) into an array belonging to the current atlas feature:


string_to_array(
    aggregate(
        layer:='OD-Matrix',
        aggregate:='concatenate',
        expression:="destination_id",
        concatenator:=',',
        filter:="origin_id"=attribute(@atlas_feature, 'name')))

With array_get() you can access the members of the array.
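A plain-Python analogue may clarify what the expression does: concatenate the destinations whose origin matches the current atlas feature, then split the result back into a list (the OD matrix rows below are made up for illustration):

```python
# Plain-Python analogue of the aggregate()/string_to_array() expression:
# gather the destinations whose origin_id matches the current atlas
# feature. The OD matrix rows and sign names are made up.
od_matrix = [
    {"origin_id": "Sign A", "destination_id": "Lake", "distance": 2.5},
    {"origin_id": "Sign A", "destination_id": "Summit", "distance": 4.0},
    {"origin_id": "Sign B", "destination_id": "Lake", "distance": 1.1},
]
atlas_feature_name = "Sign A"
concatenated = ",".join(r["destination_id"] for r in od_matrix
                        if r["origin_id"] == atlas_feature_name)  # 'concatenate'
destinations = concatenated.split(",")  # like string_to_array()
print(destinations)  # ['Lake', 'Summit']
```

The `filter:=` argument plays the role of the `if` clause here, and indexing `destinations[0]` corresponds to `array_get(..., 0)`.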


qgis - Plotting layer given with Longitudes [0 360] to [-180 180]?


I have a world map with correct longitudes from -180° to 180°. Additionally, I have a shapefile of data from South America, which shows up at 290° instead of -70°. How can I reproject it using QGIS?


World map and data plotted at the wrong place



Answer



If you load the South American data into QGIS as WGS84, then Save As ... under a different name with the selected CRS EPSG:3857 (Web Mercator), the shape will find its correct place in South America (provided that on-the-fly reprojection is active).


In a second step, you can save the Web Mercator file to yet another name and EPSG:4326, which will reproject the data into the usual ±180° frame.


Note that this method might fail if the North or South pole is included in your data.
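Underlying this reprojection round-trip is a simple longitude wrap; if you ever need to apply it numerically outside QGIS, the mapping from [0, 360) to [-180, 180) is (a sketch, independent of any GIS tool):

```python
# Sketch of the underlying coordinate fix: wrap a longitude given in the
# [0, 360) convention into the usual [-180, 180) range.
def wrap_lon(lon):
    return ((lon + 180.0) % 360.0) - 180.0

print(wrap_lon(290.0))  # -70.0, the South America case from the question
print(wrap_lon(45.0))   # 45.0, values already in range are unchanged
```

This only shifts longitudes, which is why the Save As detour works for simple cases; geometries that straddle the antimeridian (or include the poles) need proper handling beyond a per-vertex wrap.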


arcpy - Changing output name when exporting data driven pages to JPG?

Is there a way to save the output JPG, changing the output file name to the page name instead of the page number? I mean changing the script fo...