Wednesday, 30 September 2015

Open source alternative to ArcGIS geoprocessing service concept?


I am searching for examples of implementing the logic that Esri has built with geoprocessing services: you author a custom tool (or use a system tool) >> you publish it, exposing the functionality as a web service >> you consume the web service with a client.



A classic simple example: a user clicks on the map to create a point > the coordinates are sent to the server > a buffering operation is executed at the back end > the buffer zone feature (or just an image) is sent back to the client and shown on the map canvas. I've been developing Esri GP services for the last 3 years and it has been a nice experience, particularly since version 10.1.


Are there any examples of the open source system/solution that one could use to build a similar logic?


I've googled and found SEXTANTE, but there was nothing specific on how it works or how much of the logic available through Esri is available there. 52North also seems to be working on this, but again I am missing a clean explanation of the concept.



Answer



Using Pre-existing WPS or Building Them


There is a whole description/tutorial on web processing services (WPS) found here. Most of this is going to be done using HTTP requests sent to a server like GeoServer that is hosting the process. The GeoServer link outlines the general process of hosting and calling a WPS using their software. GeoServer's WPS extension comes with the JTS Topology Suite processes, a library of common spatial functions such as buffering and intersection operations. Here is their example on executing a buffer operation using the JTS library.
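To give a flavour of the client side, here is a minimal sketch using OWSLib to call GeoServer's JTS:buffer process. The endpoint URL is hypothetical, and the input names (geom, distance) and the WKT mime type are assumptions based on GeoServer's process descriptions, so verify them with a DescribeProcess request first:

from owslib.wps import WebProcessingService, ComplexDataInput

# hypothetical GeoServer WPS endpoint
wps = WebProcessingService('http://localhost:8080/geoserver/ows')

inputs = [('geom', ComplexDataInput('POINT (10 10)',
                                    mimeType='application/wkt')),
          ('distance', '5')]
execution = wps.execute('JTS:buffer', inputs, output='result')
execution.getOutput('buffer_result.xml')  # saves the buffered geometry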


Other Ways


There are other ways to do it though. Our web team uses queries against their PostgreSQL/PostGIS database to do simple analysis, but I am not fully familiar with that approach.
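As a rough illustration of that database approach, here is a minimal sketch, assuming a local PostGIS instance and hypothetical credentials; the buffering happens entirely in SQL on the server:

import psycopg2

conn = psycopg2.connect(dbname='gis', user='gis', password='secret')
cur = conn.cursor()
# buffer a clicked point (web-mercator metres here) and return GeoJSON
cur.execute("""
    SELECT ST_AsGeoJSON(ST_Buffer(
        ST_SetSRID(ST_MakePoint(%s, %s), 3857), %s))
""", (-8238310.0, 4970071.0, 500))
print cur.fetchone()[0]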


One idea I have been starting to research involves using open-source Python libraries such as PySAL/GDAL to do the spatial analysis. You would need some JavaScript/AJAX on the client (plus whatever server-side language you prefer) to execute those scripts on your server and send the results back. It would be a more complicated option, but I believe it would give you better customization options than ESRI does.


I think you should be able to break features down into a GeoJSON string to pass as a parameter into the Python scripts. From there, you would need to convert the result back into a GeoJSON string to pass back to the client for display. Python packages such as Shapely can interpret a GeoJSON string. Others might need a work-around written up to utilize GeoJSON, or there may be a better option I haven't thought of.
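A minimal sketch of that GeoJSON round trip with Shapely (the buffer distance is an arbitrary placeholder):

import json
from shapely.geometry import shape, mapping

incoming = '{"type": "Point", "coordinates": [100.0, 0.0]}'
geom = shape(json.loads(incoming))        # GeoJSON -> Shapely geometry
result = geom.buffer(0.01)                # the server-side analysis step
outgoing = json.dumps(mapping(result))    # Shapely geometry -> GeoJSON
print outgoing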



Another option would be to use GeoTools to write a Java web application that does the spatial analysis. It also supports GeoJSON strings and has some built-in spatial analysis functions. They have some screenshots showing their various applications.


It is doable to create your own WPS or run a server-side script. It won't be as easy as ESRI makes it, but it will be free if you're using open source.


Whichever route you decide on, post an update here, since this is a great topic!


Using ArcGIS ModelBuilder to perform In-line variable substitution for input data path?



Does anyone know how/if you can use in-line variable substitution for an input path name in ModelBuilder for ArcGIS 10?


I used "get field value" to retrieve a code from one feature class, and I want to substitute that text string in two places in another input variable pathname.



For instance, if I used "Get Field Value" to retrieve the code "ABC123", I would use that to call up a raster (Make Raster Layer) with this path name: "D:\Data\Rasters\%ABC123%\el_%ABC123%"


Does anyone know if this will work?


ModelBuilder won't recognize this as a valid path name right now.




density - Finding outlier point in set of points, by distance, using QGIS?


I need to identify and eliminate the outlier points in a set of points, as depicted below. It seems like a simple task, but I cannot find the answer anywhere.



I suspect that the proper way to go is through a density analysis, but I cannot figure out how.


I am using QGIS.


OUTLIERS VISUALIZATION



Answer



An answer straight from ROSSI, Richard E., MULLA, David J., JOURNEL, Andre G. and FRANZ, Eldon H., 1992. Geostatistical tools for modeling and interpreting ecological spatial dependence. Ecological Monographs. 1 February 1992. Vol. 62, no. 2, p. 277–314. DOI 10.2307/2937096. Available (free) from: http://onlinelibrary.wiley.com/doi/10.2307/2937096/pdf



Outliers can be identified through a variety of means. One technique was described earlier: h-scattergram values that plot very far from the 45° line are likely outlier candidates. With this method it is incumbent on the researcher to investigate these possible outliers. For example, does the suspected outlier occur in an area of generally small or large values? Could the unusual value be an incorrectly coded datum? Is the suspected outlier's presence due to an environmental or organism anomaly? Only after good ecological judgment should an outlier be removed prior to variogram analysis. This process can be tedious for large, outlier-laden data sets, but it is perhaps the only legitimate means for outlier identification and removal. With so few rating categories in the present data, outliers will be particularly difficult to explicitly identify.


Many other, more automatic, outlier identification techniques have been proposed. Dowd (1984) can be consulted for reviews of many of the more popular (often called "resistant") variogram methods, i.e., variograms that are resistant to the outliers' effects. Huber (1964, 1972) provides an excellent statistical examination of outlier-resistant estimation. Some of the more popular resistant variogram techniques are: the median absolute deviation estimator (Dowd 1984, Journel 1984a), generalized distance measures (Journel 1989), median polish (Cressie 1984, 1986), the Cressie-Hawkins estimator (Cressie and Hawkins 1980), and Omre's estimator (Omre 1984).



Here is a more recent paper with other algorithms that detect spatial outliers through multiple iterations:



LU, C.-T., CHEN, D. and KOU, Y., 2003. Algorithms for spatial outlier detection. In: Proceedings of the Third IEEE International Conference on Data Mining (ICDM’03). IEEE Comput. Soc. 2003. p. 597–600. ISBN 978-0-7695-1978-4. Available from: http://europa.nvc.cs.vt.edu/~ctlu/Publication/1998-2006/ICDM-2003-P.pdf


Creating points based on distance and bearing from survey point using QGIS?


I am plotting witness tree data to calculate pre-settlement tree density. I have a shapefile with points along section lines (square miles) at each half mile. I am making another table that contains tree data - distance, bearing, and diameter - related by survey point to the existing shapefile attribute table. Is there a way to create points on the map for the trees based on the distance and bearing from each survey point?


I'm not necessarily asking for a detailed set of instructions (although if anyone wishes to provide one, I'd be grateful) but rather just whether it's possible, and a couple of keywords to continue searching for how to do it.
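It is possible; useful keywords are "forward geodetic problem" and "point from bearing and distance". A minimal sketch with pyproj (the coordinates, bearing, and distance below are hypothetical):

from pyproj import Geod

geod = Geod(ellps='WGS84')
lon, lat = -89.5, 43.1          # survey point (hypothetical)
bearing, dist_m = 45.0, 120.0   # tree bearing (degrees) and distance (metres)
tree_lon, tree_lat, _ = geod.fwd(lon, lat, bearing, dist_m)
print tree_lon, tree_lat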




openstreetmap - mapnik rendering with OSM Carto style


What do I need to do to make PNG tiles that look like the ones on the OSM tile server?



  1. Do I need to import the data into the PostGIS DB somehow WITH the OSM Carto style, or can I just import it "normally" without a style?


  2. Do I need to convert the CartoCSS style to XML? (If the answer is yes, what is the best way to do this? I've tried importing it with TileMill.)





  3. Is the last step to run the "generate_tiles.py" script?




I did manage to render tiles using OSM Bright, but it does not match the original OSM look well enough for my purpose.


Thanks in advance.



Answer



First, the OSM "default" style resides here, along with instructions for deploying it.



  1. You cannot import data into PostGIS without a style: osm2pgsql requires a style file to function. There is one in OSM Bright, and there is another in the repository linked above.

  2. Yes. npm install carto (you'll need to install npm first, obviously, with your package manager) and then carto -l project.mml > osm.xml. The -l option is not mandatory (it's for resolving layer paths), but it is a good habit.


  3. I recommend using more advanced scripts than generate_tiles. Check out polytiles and Nik4. They do not require editing their source code, at least.


Assigning common field value to segments between two points using ArcGIS for Desktop?


I am using ArcGIS 10.3 for Desktop.


I have a shapefile with polylines connecting points. Instead of having one single polyline connecting two points, there exist a number of consecutive segments connected to each other (see image below).


These segments, however, have no field values in common, so if I wanted to use Dissolve to merge all of them together it would not work the way I expect.


Is there a way to assign a common field value to all the segments between two consecutive points? This way, Dissolve could be used effectively. I am open to any kind of solution, and would prefer a Python-based one, if available.


Visual example: all segments in between point A and point B should get a common field value, so that they can be later merged. This should apply to all segments in between any two points.


[image]


This question is not a duplicate of ArcGIS 10.3: How to merge lines divided by junction points?, which I myself have asked. Rather, this question is a precursor to it, if you like.


Screenshot of the real situation. Red dots are the equivalent of Points A and B, while the segments are recognizable by the junction points in green (there is a segment in between each pair of consecutive green dots). For each red dot, I want one single incoming and one single outgoing line.


[image]




Answer



You could just give all lines a common attribute value, use Dissolve to merge them, then use Split Line at Point to recreate the segments between the red points.
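Since a Python-based route was preferred, here is a minimal ArcPy sketch of those three steps (the paths and field name are hypothetical, and check the license requirements of Split Line at Point for your version):

import arcpy

lines = r"C:\data\segments.shp"
red_points = r"C:\data\red_points.shp"

# 1) give every segment the same attribute value
arcpy.AddField_management(lines, "GRP", "SHORT")
arcpy.CalculateField_management(lines, "GRP", "1", "PYTHON_9.3")

# 2) dissolve everything on that value
arcpy.Dissolve_management(lines, r"C:\data\dissolved.shp", "GRP")

# 3) split the dissolved lines again, but only at the red points
arcpy.SplitLineAtPoint_management(r"C:\data\dissolved.shp", red_points,
                                  r"C:\data\split.shp", "1 Meters")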


arcgis desktop - Select rows where column values don't match the domain


In ArcGIS Desktop, I can use the Validate Features option in the Editor Toolbar menu to validate feature class columns against their respective domains. More info here in the Validating Features section.


However, this functionality only works on spatial feature classes, not on non-spatial tables. Additionally, even if I were using a spatial feature class, the feedback that the Validate Features function provides is too limited to be useful.


I want to select records in non-spatial tables where column values don't match their respective domains.


How can I do this?



Answer



Sample data:


USER1.A_NUMBERS_TABLE
+--------------+
| NUMBER_FIELD |
+--------------+
|            1 |
|            2 |
|            3 |
|            4 |
|            5 |
|           66 |
+--------------+




A_NUMBERS_DOMAIN_VW:

SELECT
    SUBSTR(EXTRACTVALUE(CodedValues.COLUMN_VALUE, 'CodedValue/Code'), 1, 255) AS Code,
    SUBSTR(EXTRACTVALUE(CodedValues.COLUMN_VALUE, 'CodedValue/Name'), 1, 255) AS Description
FROM
    SDE.GDB_ITEMS_VW I
JOIN
    SDE.GDB_ITEMTYPES IT
    ON I.Type = IT.UUID,
    TABLE(XMLSEQUENCE(XMLType(Definition).Extract('/GPCodedValueDomain2/CodedValues/CodedValue'))) CodedValues
WHERE
    I.NAME = 'A_NUMBERS_DOMAIN'

+------+-------------+
| CODE | DESCRIPTION |
+------+-------------+
|    1 | ONE         |
|    2 | TWO         |
|    3 | THREE       |
|    4 | FOUR        |
|    5 | FIVE        |
+------+-------------+

Solution:


Use a sub-query.


In the Select by Attributes window:


NUMBER_FIELD NOT IN (
    SELECT
        CODE
    FROM
        USER1.A_NUMBERS_DOMAIN_VW)

Where I got the idea:


I often use the sub-query from the Query for duplicate records in a feature class table example:


In the Select by Attributes window:


NUMBER_FIELD IN (
    SELECT
        NUMBER_FIELD
    FROM
        USER1.A_NUMBERS_TABLE
    GROUP BY
        NUMBER_FIELD
    HAVING Count(*) > 1 )

This is a pretty clever way to execute SQL in the Select by Attributes window, without being able to execute full SQL statements.


However, this only works in SQL-based databases such as enterprise geodatabases and personal geodatabases (not file geodatabases), as per the documentation. And I don't think it would work across different databases (querying from one database to another).




Note: In the Select By Attributes window, we are limited to using only the WHERE clause (ESRI often calls this an SQL expression). Using sub-queries in the Select By Attributes window like this is quite inefficient; sub-queries are really just a last resort. It would be much more appealing to use a join rather than a sub-query, but we can't do joins because we can't execute full SQL statements from the Select By Attributes window.



Bulk convert DBF to CSV in a folder ArcGIS 10.1 using Python


I would like to convert 1000 dbf files to CSV. All dbf files are in one folder and they have sequential names (S:\output_tables\tcb1.dbf, S:\output_tables\tcb2.dbf, S:\output_tables\tcb3.dbf, up to S:\output_tables\tcb1000.dbf).


I would like S:\output_tables\1.csv, S:\output_tables\2.csv, S:\output_tables\3.csv up to S:\output_tables\1000.csv


I would like to eventually run this in an ArcGIS 10.1 model, so ArcPy would be the preferred solution, but I'm open to other options.



Answer



I have only tested this very briefly (and with a limited variety of data), but this script demonstrates one way this might be accomplished:



import arcpy
import csv
import os
import codecs
import cStringIO

def batch_convert_dbf_to_csv(input_dir, output_dir, rename_func=None):
    """Converts shapefiles and standalone DBF tables within the input directory
    input_dir to CSV files within the output directory output_dir. An
    optional function rename_func may be used to manipulate the output file
    name."""
    # Set workspace to input directory
    arcpy.env.workspace = input_dir

    # List shapefiles and standalone DBF tables in workspace
    tables = list_tables()

    # Only proceed if there actually exists one or more shapefiles or DBF tables
    if tables:
        # Create output directory structure
        make_output_dir(output_dir)

        # Loop over shapefiles and DBF tables
        for table in tables:
            # Generate output filename
            output_name = os.path.splitext(os.path.basename(table))[0]
            if rename_func:
                output_name = rename_func(output_name)
            output_csv_file = os.path.join(output_dir,
                                           output_name + os.extsep + 'csv')

            # List input fields
            fields = list_fields(table)

            # Open input table for reading
            rows = read_rows(table, fields)

            # Set flag indicating whether we are overwriting an existing file
            output_exists = os.path.isfile(output_csv_file)

            # Attempt to create output CSV file
            try:
                write_unicode_csv(output_csv_file, rows, fields)

                # Warn if we overwrite anything
                if output_exists:
                    print 'warning: overwrote {0}'.format(output_csv_file)
                else:
                    print 'wrote {0}'.format(output_csv_file)
            except IOError:
                print 'warning: unable to create output CSV file {0}'.format(
                    output_csv_file)
    else:
        print 'No DBF files found in workspace {0}'.format(input_dir)

def list_tables():
    """Returns a list of shapefiles and standalone DBF tables in the current
    workspace."""
    tables = arcpy.ListFeatureClasses('*.shp')
    tables.extend(arcpy.ListTables('*', 'dBASE'))
    return tables

def list_fields(table):
    """Returns a list of fields in the specified table, excluding the shape
    field if present."""
    desc = arcpy.Describe(table)
    shape_field_name = desc.shapeFieldName if hasattr(
        desc, 'shapeFieldName') else ''
    return [field.name for field in desc.fields
            if field.name != shape_field_name]

def read_rows(table, fields='*'):
    """Generator function that yields the rows of a table, including only the
    specified fields."""
    with arcpy.da.SearchCursor(table, fields) as rows:
        for row in rows:
            yield row

def write_unicode_csv(output_csv, rows, header_row=None):
    """Creates a UTF-8 encoded CSV file specified by output_csv containing the
    specified rows and the optional header_row."""
    with open(output_csv, 'wb') as f:
        f.write(codecs.BOM_UTF8)  # Write Byte Order Mark character so Excel
                                  # knows this is a UTF-8 file
        csv_writer = UnicodeWriter(f, dialect='excel', encoding='utf-8')
        if header_row:
            csv_writer.writerow(header_row)
        csv_writer.writerows(rows)

def make_output_dir(path):
    """Creates the output directory structure if it does not already exist."""
    if not os.path.isdir(path):
        try:
            os.makedirs(path)
            print 'created dir {0}'.format(path)
        except OSError:
            if not os.path.isdir(path):
                raise

class UnicodeWriter:
    """
    A CSV writer which will write rows to CSV file 'f',
    which is encoded in the given encoding.
    Based on: https://docs.python.org/2/library/csv.html#examples
    """

    def __init__(self, f, dialect=csv.excel, encoding='utf-8', **kwds):
        # Redirect output to a queue
        self.queue = cStringIO.StringIO()
        self.writer = csv.writer(self.queue, dialect=dialect, **kwds)
        self.stream = f
        self.encoder = codecs.getincrementalencoder(encoding)()

    def writerow(self, row):
        # unicode() rather than str() so non-ASCII field values don't raise
        self.writer.writerow([unicode(s).encode('utf-8') for s in row])
        # Fetch UTF-8 output from the queue ...
        data = self.queue.getvalue()
        data = data.decode('utf-8')
        # ... and reencode it into the target encoding
        data = self.encoder.encode(data)
        # write to the target stream
        self.stream.write(data)
        # empty queue
        self.queue.truncate(0)

    def writerows(self, rows):
        for row in rows:
            self.writerow(row)

if __name__ == '__main__':
    # Configure script here, or modify to take parameters/arguments
    input_dir = r'path\to\input_directory'
    output_dir = r'path\to\output_directory'

    # Customize this function to change renaming logic
    def rename_func(input_name, default='output'):
        # Strips non-digits from string
        output_name = ''.join((char for char in input_name if char.isdigit()))
        # Give filename a sensible default name if there are no digits
        return output_name or default

    # Run it
    batch_convert_dbf_to_csv(input_dir, output_dir, rename_func)

This does not take any arguments/parameters so I leave that up to you. If you want to implement it as a script tool or Python toolbox, read the appropriate ESRI documentation.


It attempts some defensive coding techniques for things like mixed shapefile and standalone DBF content, omitting shape fields, non-ASCII characters, non-existent directories, and warning when it overwrites existing files, but as I said, it is not well tested, so use at your own risk!


Using Hyperlink URL in attribute table of QGIS?


I would like the URL from an attribute (in the attribute table) to act as a 'hotlink', where you can click on the URL and it opens the browser with the data. As it is right now, I have to Identify the feature, right-click on the URL, and copy/paste it into a browser.




Tuesday, 29 September 2015

gdal - gdal_merge'd files take more space


I have two 3.6 MB GeoTIFFs compressed with LZW. When I merge them with:


gdal_merge.py *.tif -o ~/out.tif  -co COMPRESS=LZW -co BIGTIFF=YES

The resulting file is 10x larger at 31 MB.



Is there some reason the filesize would increase when merging? They are a continuous region (all touching each other), in EPSG 3413 projection.


This question was asked here (How to merge raster scenes while maintaining small file size?) but not answered. That OP selected the "use compression" answer, but my compressed file is larger. I'm posting gdalinfo for one input file (first) and the final file (below).


NOTE: Oddly, when I do this for more than 2 files, or for the whole folder (800 MB), the result is 8.4 GB uncompressed, and LARGER, 11 GB, with LZW compression.


Driver: GTiff/GeoTIFF
Files: gimpdem0_0_v1.1.tif
Size is 8310, 15000
Coordinate System is:
PROJCS["unnamed",
    GEOGCS["WGS 84",
        DATUM["WGS_1984",
            SPHEROID["WGS 84",6378137,298.257223563,
                AUTHORITY["EPSG","7030"]],
            AUTHORITY["EPSG","6326"]],
        PRIMEM["Greenwich",0],
        UNIT["degree",0.0174532925199433],
        AUTHORITY["EPSG","4326"]],
    PROJECTION["Polar_Stereographic"],
    PARAMETER["latitude_of_origin",70],
    PARAMETER["central_meridian",-45],
    PARAMETER["scale_factor",1],
    PARAMETER["false_easting",1],
    PARAMETER["false_northing",1],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]]]
Origin = (-640000.000000000000000,-2905550.000000000000000)
Pixel Size = (30.000000000000000,-30.000000000000000)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  COMPRESSION=LZW
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  ( -640000.000,-2905550.000) ( 57d25'19.50"W, 63d 1'28.29"N)
Lower Left  ( -640000.000,-3355550.000) ( 55d47'53.80"W, 59d11'57.96"N)
Upper Right ( -390700.000,-2905550.000) ( 52d39'30.45"W, 63d24'19.24"N)
Lower Right ( -390700.000,-3355550.000) ( 51d38'28.63"W, 59d31'30.22"N)
Center      ( -515350.000,-3130550.000) ( 54d20'53.46"W, 61d18'11.32"N)
Band 1 Block=8310x1 Type=Int16, ColorInterp=Gray

And the merged file:



Driver: GTiff/GeoTIFF
Files: /Users/kdm/Desktop/20171011-204436.tif
Size is 8310, 30000
Coordinate System is:
PROJCS["unnamed",
    GEOGCS["WGS 84",
        DATUM["WGS_1984",
            SPHEROID["WGS 84",6378137,298.257223563,
                AUTHORITY["EPSG","7030"]],
            AUTHORITY["EPSG","6326"]],
        PRIMEM["Greenwich",0],
        UNIT["degree",0.0174532925199433],
        AUTHORITY["EPSG","4326"]],
    PROJECTION["Polar_Stereographic"],
    PARAMETER["latitude_of_origin",70],
    PARAMETER["central_meridian",-45],
    PARAMETER["scale_factor",1],
    PARAMETER["false_easting",1],
    PARAMETER["false_northing",1],
    UNIT["metre",1,
        AUTHORITY["EPSG","9001"]]]
Origin = (-640000.000000000000000,-2455550.000000000000000)
Pixel Size = (30.000000000000000,-30.000000000000000)
Metadata:
  AREA_OR_POINT=Area
Image Structure Metadata:
  COMPRESSION=LZW
  INTERLEAVE=BAND
Corner Coordinates:
Upper Left  ( -640000.000,-2455550.000) ( 59d36'29.72"W, 66d52'51.74"N)
Lower Left  ( -640000.000,-3355550.000) ( 55d47'53.80"W, 59d11'57.96"N)
Upper Right ( -390700.000,-2455550.000) ( 54d 2'25.84"W, 67d20' 7.02"N)
Lower Right ( -390700.000,-3355550.000) ( 51d38'28.63"W, 59d31'30.22"N)
Center      ( -515350.000,-2905550.000) ( 55d 3'28.16"W, 63d14'13.92"N)
Band 1 Block=8310x1 Type=Int16, ColorInterp=Gray

Answer



To put the comments into an answer:


It looks like the merge was doing a bad job at compression because no predictor was set; coupled with no tiling, the storage was really inefficient.


The solution was to use the GeoTIFF creation options PREDICTOR=2 and TILED=YES.


The final command to call is then



gdal_merge.py *.tif -o ~/out.tif  \
-co COMPRESS=LZW -co BIGTIFF=YES -co PREDICTOR=2 -co TILED=YES

EDIT


An explanation of what happens, taken from http://www.fileformat.info/format/tiff/corion-lzw.htm:



The TIFF predictor algorithm


The idea is to make use of the fact that many continuous tone images rarely vary much in pixel value from one pixel to the next. In such images, if we replace the pixel values by differences between consecutive pixels, many of the differences should be 0, plus or minus 1, and so on. This reduces the apparent information content, and thus allows LZW to encode the data more compactly.



polyline creation - Creating arcs between two points in ArcPy?


I've been working on a Python script automating the visualization of live point data. I have a table with two coordinate pairs, and I'd like to connect these two points with an arc of a circle or similar (a parabola?).


I've been able to connect the two points with a straight line, but it gives me a boring visual.


One of the major hurdles is my license level: ArcView.



Anyone have any idea how I can generate a set of points that represent the path of a line between these two points?


I can then use the points-to-polyline command.



Answer



It seems the most common problem with these types of "flow maps" is that when many lines are included, they collide to such a great extent that it makes it difficult to discern any non-obvious pattern (when reciprocal flows are considered it happens to an even greater extent). Also, the long lines tend to dominate the graphic, although it is quite possible the distribution of flows is predominately over short distances (for instance, a host of different distributions between places tend to be similar to Levy flights). I suppose this isn't necessarily a bad thing (the long lines may be more intrinsically interesting than the short lines for many phenomena), but I don't think we want to lose the forest for the trees, so to speak.


Although I don't doubt I've missed some potential "solutions" that have been proposed, I will try to sum up some of the ways individuals have tried to solve the problem in work that I have come across.


Distorting the Lines


If you peruse some of the other threads on the site you will see some examples of how people have dealt with this problem. In particular, the lines are distorted so they don't overlap with each other or other objects on the map. Whuber's answer on another similar question (already mentioned in a comment) is an example of this. A presentation by some researchers at Stanford demonstrates the same idea (Phan et al., 2005). Thanks for that presentation goes to dslamb for this answer on another thread (and all of the answers to that thread will be of interest for your question as well). I particularly find it interesting that one of the cardinal examples of this, the old immigration map by Minard, is an example of a desirable output (circa 1864!).


Given your particular use case (small number of nodes and lines), this seems sufficient. The other "solutions" I present are more intended to visualize data with many lines and many origins-destinations (although I assume they will be useful summaries for the community in general, so I continue on regardless).
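Since you only need the generated vertices, here is a minimal sketch of that line-distortion idea for your two-point case: a quadratic Bezier curve whose control point is the chord midpoint pushed out perpendicular to the chord. It is plain Python (so the ArcView license level is no obstacle), and the vertices can be fed to your points-to-polyline step; the offset factor is an arbitrary styling choice:

def arc_points(p1, p2, offset=0.2, n=20):
    # offset controls how strongly the arc bows away from the straight line
    (x1, y1), (x2, y2) = p1, p2
    cx = (x1 + x2) / 2.0 - (y2 - y1) * offset  # control point: midpoint
    cy = (y1 + y2) / 2.0 + (x2 - x1) * offset  # shifted perpendicularly
    pts = []
    for i in range(n + 1):
        t = i / float(n)
        x = (1 - t) ** 2 * x1 + 2 * (1 - t) * t * cx + t ** 2 * x2
        y = (1 - t) ** 2 * y1 + 2 * (1 - t) * t * cy + t ** 2 * y2
        pts.append((x, y))
    return pts

print arc_points((0.0, 0.0), (10.0, 5.0))[:3]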


Using Alpha Blending, Color, and Line Width/Height


The maps I listed in that same thread previously noted, Representation of network flows, are examples of these. The Facebook friends map is a good case of adjusting the alpha level of lines, so it takes many more flows to represent a darker (or brighter, in that case) connection between two places. This also de-emphasizes the longer lines, because they tend to happen more infrequently. Similar logic comes from value-by-alpha maps for polygon areas (Roth et al., 2010), which have been mentioned on this forum before.



The other map I present in that same answer uses color and non-traditional 3d-perspective arcing lines (Ratti et al., 2010). The authors used a clustering criterion to group homogeneous areas together and color-code them (so by definition the areas within a color have more similar flow patterns than between colors). The clustering criterion in and of itself could be interesting for identifying patterns in the data, although a likely problem with this, as Andrew Gelman has mentioned, is that it tells you pretty much what you already know: that places nearer to each other tend to have more connections.


Lastly, in this category I include techniques that weight the lines (similar to alpha blending) using either the line width or, in the case of the 3d perspective, the line height, to convey the volume of the flow. See Tobler's flow mapping software page for some examples in 2d (and the other article I mentioned is an example in 3d using line heights). Also on that page Tobler has a very useful article describing the problems of flow mapping and its historical application (Tobler, 1987).


Another example in 3d is this answer by mankoff on this site. This post on the Sociological Images blog shows a useful way in a flow diagram to distinguish between in-flows and out-flows (although again it works because the number of nodes is relatively small, and the nodes on the network can be laid out in an arbitrary way to reduce overplotting). Those same types of arrows (and a few others using hatchings) are also in (Tobler, 1987).


In the end though, line width and color don't really solve the over-plotting problem. The arcs in 3d help somewhat, although with more complicated flow patterns I think they will have limited utility. IMO alpha blending seems the most useful of these three in a wide variety of situations, but color and line width could/should be used in conjunction with the line distortion mentioned above.


Data Reduction


I group two types of techniques here: 1) using small multiple maps (i.e., many maps with inherently fewer objects to visualize, so overplotting is reduced), or 2) other graphical representations that are not lines but represent some of the flows via density or choropleth maps. Examples of these can be found in (Corcoran et al., 2009; Rae, 2009; Wood et al., 2010) (thanks to iant for the Rae reference). These tend to reduce the amount of visual information presented by either presenting a series of small multiple maps (or just a smaller area), or using a choropleth mapping scheme to represent some statistic (examples could be number of in-flows, number of out-flows, direction of the flows, average distance of the flows). If you have point-level data you could represent these statistics through kernel density raster maps, or aggregate them into quadrats.


When information is reduced like this, overplotting isn't as much of a problem. A very cool interactive online example is this migration map by Forbes magazine. You can only see one county at a time, but the reduction of information makes it much easier to parse the lines (and the difference between in-flows and out-flows). A recent post on the ESRI mapping blog also uses a similar technique with small multiples (they also chose a particular projection for the world map to get "pretty looking" lines, and make good use of color to further highlight different international origins). In that example it works out pretty well because the end destination is the same for all flows, but if flows could be reciprocal it probably wouldn't work out as well.


Using Other Non-map Representations of Flows


Others on this site have suggested using alternative diagrams to the actual map to represent the flows (mapping the origins and destinations in some manner other than their actual geographic locations). Examples of these are circular visualizations (such as those produced by Circos), arc diagrams (see this example on Protovis; these are also called kriskograms (Xiao & Chun, 2009)), or matrix heat maps (here is another example from the Protovis website). Another option would be to use some type of automated network layout to identify patterns in the flows (such as Graphviz is capable of). Besides Graphviz, it appears Gephi, the NetworkX Python library, and some R libraries are popular tools as well (see this answer on the stats site).


The libraries I cite are pretty cool, as they have developed interactive visualizations as well. Here is an example with a similar style to the circular graphics (although not circular!). Here is another interactive visualization using some of the line distortion techniques discussed earlier, network placement (which appears similar to circular Dorling cartograms), as well as other useful statistical summaries (I saw both of those examples originally on the information aesthetics blog).





Some other resources I think are useful are the software and articles coming from the Spatial Data Mining and Visual Analytics Lab. Also, the crime travel demand modeling in the CrimeStat program is a gentle introduction to applicable regression techniques for such flow data. Either of these tools may allow you to identify interesting correlations between the flow patterns and other geographic information. Another place to perhaps receive some useful inspiration, for either graphically displaying the data or statistical analysis, would be a recent issue of the Journal of Computational and Graphical Statistics (Volume 20, Issue 2) on examining flight arrival/departure statistics for commercial carriers in the US from 1987 to 2008 (if you are interested in handling big data this would be worthwhile to examine as well). All the articles are free and each paper has an associated poster.


In the end, the data and the medium will dictate how well some of these techniques work in reducing the visual clutter that comes along with flow data. I hope this is a useful place, though, to find ideas on how to deal with this visualization problem. If you further refine your question into what you want to accomplish, then others can give useful feedback on actual programmatic implementations (if something is not already available).






*Note: links to ungated PDF documents are included where I could find one.


How to use thematic layer for supervised classification in eCognition


I'm classifying vegetation in eCognition. From ArcMap I already have a shapefile with training polygons that I would like to use for this, but so far I have only been able to insert it as a thematic layer and use it for the segmentation process. Also, is eCognition able to handle the attribute table information where the classes for the polygons are stored, or will I have to split the shapefile with all the information into individual shapefiles?




inconsistent types in google earth engine with Export.image.toDrive()


I have computed NDVI from a Landsat 7 image, but when I want to export the result to Google Drive with Export.image.toDrive(), it fails with the error: Error: Exported bands must have compatible data types; found inconsistent types: Float32 and Float64.


How can I get around this issue?


/*-------------------------------------------------------------
FUNCTION
-------------------------------------------------------------*/

// This function adds a band representing the image timestamp.
var addTime = function(image) {
  return image.addBands(image.metadata('system:time_start')
    .divide(1000 * 60 * 60 * 24 * 365));
};

// COMPUTE NDVI OVER A COLLECTION FUNCTION
var addNDVI = function(image) {
var ndvi = image.normalizedDifference(['B4', 'B3']).rename('NDVI');
return image.addBands(ndvi);
};



/*-------------------------------------------------------------
END
-------------------------------------------------------------*/

//2001
var img2001 = ee.Image("LANDSAT/LE7_L1T/LE72040492001066EDC00").clip(zone_travail);

var withNDVI = addNDVI(img2001).select('NDVI');
var withNDVI = addTime(withNDVI);


print(withNDVI);


/*------------------------------------------------------------------------------------------
SAVE DATA IN MY DRIVE
--------------------------------------------------------------------------------------------*/


Export.image.toDrive({

image: withNDVI,
description: 'ls7_ndvi_2001',
scale: 30,
region: zone_travail
});

Answer



Cast the whole darn thing to a 32-bit float: withNDVI.float(). Of course, make sure you're not getting any overflow, but it looks safe here. (The mismatch arises because the NDVI band from normalizedDifference is Float32, while the divided timestamp band is Float64.)


Calculating area of rasters in QGIS?


I have been working with some rasters and would like to know how to calculate the area occupied by one element (value) of a raster in QGIS.



Answer




With 3 steps you can achieve what you want (a scripted alternative follows the list):



  1. Vectorize the raster based on the values you want to calculate

  2. Open your new vector layer, and create a new column. Name it 'Area'.

  3. Use the field calculator to populate the column with the variable $area
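If you prefer scripting over the vectorize step, a minimal sketch with the GDAL Python bindings (assuming a single-band raster, a projected CRS, and a hypothetical target value of 1) counts matching cells and multiplies by the cell area:

from osgeo import gdal

ds = gdal.Open('raster.tif')
gt = ds.GetGeoTransform()
data = ds.GetRasterBand(1).ReadAsArray()

cell_area = abs(gt[1] * gt[5])          # pixel width * pixel height
print (data == 1).sum() * cell_area     # area in squared map units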


windows - How to fix QGIS error "Entry Point could not be located"?


Today, after returning from vacation, I started up QGIS 1.7 and got this terrible message:



[image]


Help, anyone?



Answer



In Portugal, the citizen card software, which offers an authentication and digital signature service that works via browsers, puts older Qt DLLs in windows\system32. Renaming these stops those services from running...


You can instead copy QGIS's QtCore4.dll and QtGui4.dll to the folder where qgis.exe is located. This works without renaming the older DLLs. The reason it works is the search order Windows uses to find the DLLs required by an executable: first in line is the executable's folder, after that it searches \windows\system32, and only after that is PATH searched.


Monday, 28 September 2015

arcgis 10.0 - How to compare schemas from two File geodatabases?


We have a template geodatabase which we send out to our regional data creators. They update these geodatabases and digitize all the data from their region into the predefined feature classes.


Now that we have got the file geodatabases back, we find that many of the feature classes have often been modified. A field might have been added or removed; in some cases, the feature class itself was deleted or a new one created.


I would like to get a report indicating which feature classes and tables have changed, as well as the differing fields in the common tables and feature classes.


I have already looked at several questions such as:



But the answers given in those questions have not been helpful.
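In the meantime, here is a minimal ArcPy sketch of the kind of report being asked for, comparing only the feature class/table names and their field names and types (the geodatabase paths are hypothetical):

import arcpy

def schema(gdb):
    # map each feature class / table name to its {field name: field type} dict
    arcpy.env.workspace = gdb
    out = {}
    for name in (arcpy.ListFeatureClasses() or []) + (arcpy.ListTables() or []):
        out[name] = dict((f.name, f.type) for f in arcpy.ListFields(name))
    return out

template = schema(r'C:\data\template.gdb')
regional = schema(r'C:\data\regional.gdb')

for name in sorted(set(template) | set(regional)):
    if name not in regional:
        print '{0}: deleted'.format(name)
    elif name not in template:
        print '{0}: added'.format(name)
    else:
        t, r = template[name], regional[name]
        for fld in sorted(set(t) | set(r)):
            if fld not in r:
                print '{0}.{1}: field removed'.format(name, fld)
            elif fld not in t:
                print '{0}.{1}: field added'.format(name, fld)
            elif t[fld] != r[fld]:
                print '{0}.{1}: type changed {2} -> {3}'.format(
                    name, fld, t[fld], r[fld])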




How to create an array of repeated values on the client side using JavaScript in Google Earth Engine?


I want to do something like:



var from = [2,3]
var to = Array(from.length).fill(1)
image.remap(from,to,0)

Of course, I could just assign [1,1] to to, but from is an input of a function and it is not guaranteed to have a length of 2.


The problem is that I get an error that says Array(...).fill is not a function


This suggests that the fill function is not supported, but I thought that the pure JavaScript parts are interpreted in the browser's JavaScript engine, and only ee objects are sent to Earth Engine. My browser definitely supports fill, as it works on its test page. So does anyone know why the Code Editor in Google Earth Engine does not support full JavaScript? Also, what JavaScript version is supported by Google Earth Engine's Code Editor? And lastly, how can I actually create an array of variable length filled with 1s in Google Earth Engine's flavor of JavaScript?



Answer



You can't do it in client-side JavaScript (Rodrigo is correct in that it's a sandboxed environment for security reasons), but you can do it with a server-side function:


var to = ee.List.repeat(1, from.length)

pyqgis - How to import for QgsVectorLayerJoinInfo in QGIS3 (python)?


I am trying to join layers programmatically following the answers in the GIS.SE question - Join table field with shapefile programatically (via PyQgis)? :


# Get input (csv) and target (Shapefile) layers
shp=iface.activeLayer()
csv=iface.mapCanvas().layers()[0]


# Set properties for the join
shpField='code'
csvField='codigo'
joinObject = QgsVectorJoinInfo()
#...

But I get an error that QgsVectorJoinInfo is not defined. I tried (unsuccessfully) to import it with:


from qgis.core import QgsVectorJoinInfo


(because the documentation linked above says that QgsVectorJoinInfo is in the "core" library).


How should I import it properly or otherwise make the above code work?


(In general, how can one tell from the documentation which module to import?)



Answer



The class has been renamed from QgsVectorJoinInfo to QgsVectorLayerJoinInfo.


You now need to call each join function with their associated parameter:


#...
csvField = 'id'
shpField = 'ID'
joinObject = QgsVectorLayerJoinInfo()

joinObject.setJoinFieldName(csvField)
joinObject.setTargetFieldName(shpField)
joinObject.setJoinLayerId(csv.id())
joinObject.setUsingMemoryCache(True)
joinObject.setJoinLayer(csv)
shp.addJoin(joinObject)

Example:


[image]


OpenLayers 3: Vector Layers not Drawing during animated panning or mouse-drag events



It appears that features on an ol.layer.Vector are not painted or drawn during panning animations or mouse-drag events (in OpenLayers v3.1.1). Does anyone have a work-around?


To see the problem, take a look at: http://jsfiddle.net/StephenM/489aveLr/6/


If you zoom in on the rectangle covering Antarctica and use the mouse to pan around, you will see that the rectangle is only painted after the mouse button is released and the pan animation has completed, whichever happens last.


I found some references to this behaviour in older versions of OpenLayers, but it was apparently fixed in version 2 already. Why am I seeing it here? I have tested in Internet Explorer, Chrome, and Firefox, and the behaviour is the same.



Answer



As @ahocevar mentioned in your ticket, there is a patch for this, which will be included in version 3.2.0. Until then, there is a buffer-type approach which can be used to render more of the vector layer than the viewport + 100px. It is called renderBuffer, and can be set as a property of the vector layer.


var vectorLayer = new ol.layer.Vector({
  source: new ol.source.Vector({
    features: [new ol.Feature({ geometry: polygon })]
  }),
  renderBuffer: 1366
});

I set it to 1366, because that is the horizontal resolution of my monitor, and this way I can't scroll into an unrendered range. Or you can just merge the fix into your OpenLayers 3 library.


UPDATE:


I forgot about the panning animation. With it, the resolution of the monitor hardly makes sense as a value, but a 5000px value did the magic for me. I didn't test the performance of this workaround, though, so it might be slow with complex layers.


postgis - Nearest neighbour for every row


I'm relatively new to PostGIS and PostgreSQL. I'm trying to build a simple query involving two relations, as follows:


I have two relations, R1 and R2, both of which contain geometries (points). I need to find, for every point in R1, the closest point in R2. Say R1 is a relation of all hospitals and R2 is a relation of all medical shops; I need to find the medical shop closest to each hospital. Moreover, the geometries have SRID 4326, in case you want to use ST_Distance or similar functions. Assume I only need the medical shops within, say, 500 meters.



Answer



You'll need a correlated subquery for each row (or a LATERAL join in 9.3+... maybe). Do note also that although indices on the geometry columns will be used, it still means one query per hospital, so it will be slow.


SELECT p1.gid, p1.name, p2.gid, p2.name,
       p2.geom <-> p1.geom AS dist
FROM
  ( SELECT p1.gid AS g1,
      (SELECT p.gid
       FROM shops AS p
       ORDER BY p.geom <-> p1.geom ASC
       LIMIT 1) AS g2
    FROM hospitals AS p1
    OFFSET 0
  ) AS q
JOIN hospitals AS p1 ON q.g1 = p1.gid
JOIN shops AS p2 ON q.g2 = p2.gid
WHERE p2.geom <-> p1.geom <= 500;
-- notes: the alias "dist" cannot be referenced in WHERE, so the expression
-- is repeated; and with SRID 4326 geometries the <-> distance is in degrees,
-- so cast to geography or use projected geometries for a 500 m cutoff


I reused parts of the code I posted here, so you may want to check it in case I messed up.


How do I access GeoTransform array from gdal on the command line?



I'm using the GDAL library via OSGeo for Windows, and I was just wondering how to access the GeoTransform array for a .ecw file.



Answer



This can be found from gdalinfo, except it has three forms:




  1. If rotation / shear coefficients (adfGeoTransform[2] and adfGeoTransform[4]) are zero, the output is simplified:
    Origin = (%.15f,%.15f) % adfGeoTransform[0], adfGeoTransform[3]
    Pixel Size = (%.15f,%.15f) % adfGeoTransform[1], adfGeoTransform[5]





  2. If rotation / shear coefficients are non-zero, the full 6-coefficients are shown:
    GeoTransform =
    %.16g, %.16g, %.16g % adfGeoTransform[0], adfGeoTransform[1], adfGeoTransform[2]
    %.16g, %.16g, %.16g % adfGeoTransform[3], adfGeoTransform[4], adfGeoTransform[5]




  3. If there is no GeoTransform information, then neither of the above two forms are shown.




The order of coefficients used by GDAL is important, and is documented here.
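If the Python bindings are an option alongside the command line, here is a minimal sketch for reading the array directly (the filename is hypothetical):

from osgeo import gdal

ds = gdal.Open('image.ecw')
gt = ds.GetGeoTransform()
# gt = (origin_x, pixel_width, row_rotation,
#       origin_y, col_rotation, pixel_height)
print gt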



python - Finding coordinates of point on line joining two points at certain distance from one of points using shapely




I am new to GIS. The question I am going to ask has been asked before, but the existing answers didn't solve my problem. I am only using Python and Shapely. I need to find the coordinates of a point on a line joining two points, at a given distance from one of the points. To help illustrate, I am including the following figure:


In the figure, I need to find the coordinates (lat/lon values) of the red points at a certain distance (say d1*1.01) from the point marked as C. Given are the points C, P1, P2 and P3, with their coordinates in lat/lon format. How do I find these values (lat/lon values of the red points) using only Python and Shapely? I don't want to involve ArcPy unless there is no other option.


[image]
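A minimal sketch of the usual Shapely approach uses interpolate. Note that Shapely is planar, so with lat/lon input the distances are in degrees; the coordinates below are hypothetical:

from shapely.geometry import LineString, Point

c = Point(10.0, 50.0)    # point C (hypothetical coordinates)
p1 = Point(10.5, 50.4)   # point P1 (hypothetical)
d1 = c.distance(p1)      # planar distance (degrees for lat/lon data)

line = LineString([c, p1])
red = line.interpolate(d1 * 1.01)  # point at distance d1*1.01 from C;
                                   # interpolate clamps at the line's end,
                                   # so extend the line first if the target
                                   # lies beyond P1
print red.x, red.y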




coordinate system - Custom CRS in Qgis web - client


Hello, I'm a bit new here and I hope I even phrase the question right.


Recently I started work on a project that involves displaying maps in QGIS Web Client. At the moment I've set up the server to display the basic examples, but now I'm trying to import my own QGIS project. This specific project has a custom CRS. So far I've discovered on this site that the web client does support custom CRSs, so without hesitating I followed the guide and did the steps. The custom CRS definition I created was: Proj4js.defs["EPSG:4800"] = "+title=nevem +proj=tmerc +lat_0=0.0 +lon_0=15.0 +k=0.9999 +x_0=500000 +y_0=-5000000 +units=m +no_defs";


This CRS was copied from the project in the QGIS desktop application, and I edited it so it would match the other ones. After I run the web client I get an error in the browser console that this.resolutions is undefined. I should also mention that the layer names load properly. For the sake of testing I tried using the default CRS with my QGIS project, and it loaded the layers, but they were projected incorrectly because of the CRS.


So now the question is: how do I make this CRS work with QGIS Web Client? And what do the a and b variables mean in: Proj4js.defs["EPSG:3857"] = "+title=WGS 84 / Pseudo Mercator +proj=merc +a=6378137 +b=6378137 +lat_ts=0.0 +lon_0=0.0 +x_0=0.0 +y_0=0 +k=1.0 +units=m +nadgrids=@null +no_defs";



Answer



So, I'll start with how to set the CRS in your QGIS Web Client. First of all, I got pink tiles over my project because the default background layers can't be reprojected to my CRS; that's why I disabled the background layers, after which the client loaded without pink tiles. You can disable background layers in the GlobalOptions.js file:


var enableBingCommercialMaps = false;
var enableGoogleCommercialMaps = false;
var enableOSMMaps = false;

Now to the part where you set the CRS. First you should check whether the CRS you need is already in the CRS folder; you can find the definitions in the apache\htdocs\site\libs\proj4js\defs folder. If the CRS doesn't exist, as in my case, then you need to make your own. I used the website http://spatialreference.org/ to find the desired CRS; when you do, use the Proj4js link and copy that into your definition in the apache\htdocs\site\libs\proj4js\defs folder. After that you need to tell the client what projection to use. Simply change the line:


// EPSG projection code of your QGIS project
var authid = "EPSG:" + 3787;


After that, the client should project with the correct CRS. As for the a and b variables: in a Proj4 definition they are the ellipsoid's semi-major and semi-minor axes in metres; they are equal in EPSG:3857 because Pseudo-Mercator treats the earth as a sphere.


PS: If you are using PostGIS as your datasource, there is a possibility your CRS could be missing from spatial_ref_sys. In that case, use the PostGIS spatial_ref_sys INSERT statement link to insert your CRS into the table.


Sunday, 27 September 2015

arcgis desktop - Calculating percentage of overlap between two different layers


I have two different layers (Layer 1 and Layer 2), both made up of many thousands of polygons. Many of the polygons in Layer 1 overlap to differing degrees with polygons in Layer 2. I need to calculate how much (as a percentage) each polygon in Layer 1 overlaps with the polygons in Layer 2.
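A minimal ArcPy sketch of one common approach is to intersect the layers and compare areas. It assumes Layer 1 has a unique ID field, here hypothetically named L1_ID, and all paths are placeholders:

import arcpy

layer1 = r'C:\data\layer1.shp'
layer2 = r'C:\data\layer2.shp'
inter = r'C:\data\overlap.shp'

# intersect keeps the attributes of both inputs, including Layer 1's ID
arcpy.Intersect_analysis([layer1, layer2], inter)

# sum the intersected area per Layer 1 polygon
overlap = {}
with arcpy.da.SearchCursor(inter, ['L1_ID', 'SHAPE@AREA']) as cursor:
    for l1_id, area in cursor:
        overlap[l1_id] = overlap.get(l1_id, 0.0) + area

# percentage of each Layer 1 polygon covered by Layer 2
with arcpy.da.SearchCursor(layer1, ['L1_ID', 'SHAPE@AREA']) as cursor:
    for l1_id, area in cursor:
        print l1_id, 100.0 * overlap.get(l1_id, 0.0) / area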





algorithm - Drawing Day and Night on a Google Map


I'm looking to plot day/night on a Google map, for an arbitrary point in time. I'm familiar with generating map tiles; I'm just looking for an algorithm to tell me whether a particular point on the globe is currently in daylight or darkness, or to otherwise plot the curve of the day/night interface onto the map.


I've done some searching, but it's possible I don't even know enough about the problem domain here to know what terms to search for!


Any ideas? Doesn't have to be perfect -- basically, I'm comparing Flickr geolocation data of sunrise and sunset photos (and their "date taken" timestamps) with reality, and this is to help me visualise it.



Answer



This page gives equations good to 1 degree. It looks like this code calculates it too but I didn't actually check.
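For reference, here is a minimal sketch of the standard low-accuracy solar position formulas (the same family of equations, good to roughly a degree; it ignores the equation of time, so expect a few minutes of slack). A point is in daylight when the computed solar elevation is positive:

import math
from datetime import datetime

def is_daylight(lat, lon, utc):
    # days since the J2000.0 epoch (2000-01-01 12:00 UTC)
    days = (utc - datetime(2000, 1, 1, 12)).total_seconds() / 86400.0
    L = math.radians((280.460 + 0.9856474 * days) % 360.0)  # mean longitude
    g = math.radians((357.528 + 0.9856003 * days) % 360.0)  # mean anomaly
    # ecliptic longitude and declination of the sun
    lam = L + math.radians(1.915) * math.sin(g) + math.radians(0.020) * math.sin(2 * g)
    decl = math.asin(math.sin(math.radians(23.439)) * math.sin(lam))
    # local hour angle from the UTC time of day (ignores the equation of time)
    ha = math.radians((utc.hour + utc.minute / 60.0) * 15.0 - 180.0 + lon)
    # sine of the solar elevation angle
    sin_elev = (math.sin(math.radians(lat)) * math.sin(decl) +
                math.cos(math.radians(lat)) * math.cos(decl) * math.cos(ha))
    return sin_elev > 0

print is_daylight(51.5, -0.1, datetime.utcnow())  # London, right now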


arcgis 10.0 - Merge two polygons, attributes are added - Tool?


I have a bunch of polygons (counties) that I need to merge together. They all have various population attributes. I need to be able to selectively merge two counties together and get a single polygon with the combined population totals for those attributes. Is there a way to do this in ArcGIS?


I've tried the Merge tool, but while that works for the boundaries, it doesn't work for adding the two population totals together; you have to pick one or the other. The Dissolve tool seems to operate on all polygons sharing a single attribute, but there is no unifying attribute that determines which polygons are merged.


Thanks.



Answer




As Aaron alluded to in his comment, if any features of a layer are selected prior to using that layer as an input to a geoprocessing tool, the input is restricted to the selection.


In your particular case, you should be able to select two (or more) counties with the Select tool or the Select By Attributes/Location dialogs and then run the Dissolve tool, selecting the feature layer (not the path to the actual shapefile/feature class itself) by its name from the input features dropdown menu.


The Dissolve tool has an optional parameter, statistics_fields, where you can SUM the population.
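In ArcPy terms, a minimal sketch (the layer name, output path, and POP field are hypothetical; the selection on the layer restricts the input as described above):

import arcpy

# "counties_lyr" is a feature layer with the two counties selected
arcpy.Dissolve_management("counties_lyr", r"C:\data\merged.shp",
                          "", [["POP", "SUM"]])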


arcgis desktop - Joining shapefile table with csv


So I have a shapefile with all the world countries, and the table for this shapefile has a column named "Country Name". Each entry is unique, meaning each country takes only 1 row, each with its own polygon represented in the shapefile and its own ID.


Now I have a .csv file with all the countries again. This .csv file has the column "Country Name" too, but that column has several rows containing the same country name, because there is also a column called "Year", so each country has several row entries, one per year.


How can I join my shapefile table with this .csv table? Right now, my problem is that when joining, only the first entry from the .csv file is joined, since the shapefile table only has 1 row per country.


Any suggestions how I can do a union join, meaning that each country in my shapefile will be assigned every year found in the .csv file? I think it means that each polygon needs to be duplicated for every year entry in the .csv file.


Thanks


EDIT: According to ArcGIS Resources - joining and relating tables, "In all cases of 1:M joins, only the first matching record is joined and displayed in the layer's attribute table.", which is exactly the problem I have: only the first matching record is displayed in my shapefile table.


So I believe I now need a workaround? Since my .csv has the data by year, with each year as a row (and the same country spanning several rows, one per year), would I need to create a column for each of the years and somehow transfer the data found in each row of my .csv file into the columns of my shapefile?
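Outside ArcGIS, this one-to-many join, which duplicates each polygon once per year, is straightforward with GeoPandas. A minimal sketch with hypothetical file names, assuming the join column is spelled identically on both sides:

import geopandas as gpd
import pandas as pd

countries = gpd.read_file('countries.shp')
years = pd.read_csv('country_years.csv')

# a 1:M merge: each country geometry is repeated once per matching year row
merged = countries.merge(years, on='Country Name')
merged.to_file('countries_by_year.shp')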





spatstat - How to get the second nearest neighbor between two point patterns in R?


Is there a way to get the distances to the second nearest neighbor between two point patterns in R? The spatstat package has a function called nncross, but it only returns the nearest neighbours between two patterns, and I need the distances to the second nearest neighbours.



Answer



The function get.knnx in the FNN package can compute the N nearest neighbours between two point patterns.


x1 = cbind(runif(10),runif(10))
x2 = cbind(runif(10),runif(10))
nn = get.knnx(x1,x2,2)


Now nn$nn.index is a matrix such that nn$nn.index[i,j] is the row in x1 of the j-th nearest neighbour to row i of x2, sorted so that the nearest is [i,1] and the next neighbour is [i,2].


The function also returns the distances for you, and has some options to compute spatial indexes for very fast searches.


Result Window in ArcGIS 10.1 is not Opening


I booted up ArcMap 10.1 today and found out the Results Window was not opening up.


Background info:



  • ArcMap is running through a virtualization program at my company.

  • A similar issue came up in the past, but with the Search window not showing up.

  • IT couldn't figure out what caused the issue with the Search window not opening, so they uninstalled the virtualization program and cleared the cache associated with ArcGIS.

  • I noticed that whenever I click on Geoprocessing->Results (I have also added the Results button through customization), the right edge in the following image blinks.

  • I noticed a Windows update went through last night, so that may be the culprit? Then again, how did the Search window get borked?


I've scoured Stack Exchange and the internet and could not find any info relating to this issue. Tools run successfully and I can view the history log as mentioned here, but I really need the Results window so I can rerun various tools without filling out the parameters again.


Does anybody know of any registry keys/services that I can look into that may need to be tweaked?


I don't want to resort to IT, since I know for a fact they will uninstall everything, in turn losing all my defaults, e.g. shortcut keys, toolbars, and document customization.



Answer



I was able to get the Results Window to show up again by deleting the following files:



  • normal.gxt (%AppData%\Roaming\ESRI\Desktop10.1\ArcCatalog)


  • normal.mxt (%AppData%\Roaming\ESRI\Desktop10.1\ArcMap\Templates)

  • normal.mxd (%AppData%\Roaming\ESRI\Desktop10.1\ArcMap\Templates)


Booting up ArcMap recreates these files with default settings. I'm under the impression that one of these files was corrupt, which kept the Results window from opening.


Note: If you want to keep your customizations, it's recommended to back up your normal.mxt and normal.mxd before deleting them, so they can be restored later on. I would also suggest deleting normal.gxt first and then booting up ArcMap to see if the problem is resolved. If it persists, delete the other two files.


Sources: 1,2,3,4


Saturday, 26 September 2015

How to read Greek fonts (ISO-8859-7) in shapefile attributes within QGIS 1.8.0?


I'd like to ask whether there is a way to read Greek fonts in a shapefile's attribute table within QGIS 1.8.0.




land survey - How to Calculate North?


I am working on a field survey, and a client requirement is to differentiate various norths: magnetic north, true north, and grid north. My question is: what is the methodology for calculating north, and how can we differentiate the various types of north mentioned above?




arcmap - Interpretation of ArcGIS Kernel Density legend parameters


In ArcMap 9.3 I've used Kernel Density to map various incidents, but the resulting output doesn't display any units of measurement. Is there a good, not-too-technical source that would explain in lay terminology how to interpret the output values in terms of the input cell size and search radius?




describe feature type - Custom DescribeFeatureType response in Geoserver?


I'm using GeoServer in my web application, and I need a custom DescribeFeatureType response. What does this mean?

layer_a is a layer in GeoServer. It has Feature Type Details as follows:


[image]


A DescribeFeatureType request for this layer returns a response like the following:



xmlns:opengeo="http://opengeo.org"
xmlns:xsd="http://www.w3.org/2001/XMLSchema"
elementFormDefault="qualified"
targetNamespace="http://opengeo.org">

schemaLocation="http://localhost:8080/geoserver/schemas/gml/3.1.1/base/gml.xsd"/>

[the remaining XSD element declarations were stripped in formatting]

I want to get more description for some fields. For example, for the SQKM row, I want to return some extra description such as minValue, maxValue, etc.



Is it possible? If yes, HOW?


A suggestion: we could put our data in the column comments of the table and then define a DB datastore in GeoServer. Could we then return the table's comments with DescribeFeatureType?




Why choose open source software for remote sensing research?




If you judge by the number of questions regarding FOSS software being asked on GIS SE, many users seem to prefer FOSS over proprietary software.


I've read some articles - more precisely some personal blogs - supporting this choice. Furthermore corporate giants like Esri seem to acknowledge the open source development movement.


So, I am asking for your position on this matter.



Why do you use open source tools?


What are the advantages or disadvantages of your choice, if any?




raster - Where to obtain free 15m ASTER image for New Mexico?


I am looking for a free Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) image that will let me perform NDVI for a pilot crop health/soil moisture study, and that can go as far back as ten years or so.


It needs to have:




  • 15 m (meters) resolution

  • Temporal coverage

  • Ten years back or so

  • Suitable for performing NDVI


This will be in a small area somewhere near Albuquerque, NM.



Answer



You can easily download freely available ASTER Level 1B (at-sensor calibrated radiances) data from the USGS EarthExplorer site. A quick search for a polygon roughly covering New Mexico returns > 100 results, but individual scenes are much smaller, so for full coverage of the state you'd have to stitch them together. USGS GloVis should have the same data, though with an older interface.


USGS EarthExplorer screenshot
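Once you have a Level 1B scene, NDVI uses ASTER's VNIR band 2 (red) and band 3N (near-infrared). A minimal numpy/rasterio sketch, assuming the two bands have been exported as GeoTIFFs (file names are hypothetical, and for rigorous work the DNs should first be converted to reflectance):

import numpy as np
import rasterio

with rasterio.open("aster_b2_red.tif") as red_src, \
     rasterio.open("aster_b3n_nir.tif") as nir_src:
    red = red_src.read(1).astype("float64")
    nir = nir_src.read(1).astype("float64")

# NDVI = (NIR - Red) / (NIR + Red), guarding against division by zero
ndvi = np.where((nir + red) > 0, (nir - red) / (nir + red), 0.0)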


arcgis desktop - Project raster tool stays white at iteration in ModelBuilder?


I want to project a bunch of rasters (WGS84) to different UTM (WGS84) coordinate systems. The coordinate system depends on a field value of a point shape.


So on each raster lies a point carrying its spatial reference.


I scripted a little ModelBuilder toolchain for it, but unfortunately the Project Raster tool stays white and doesn't execute.


[screenshot: ModelBuilder toolchain]


Some detail information for the toolchain:


I extract the information about which point lies on each individual raster with the Sample tool. For each raster I then get a table with many NoData points and one point with the stored EPSG information. This point is extracted with the Table Select tool; then I get the EPSG code with the Get Field Value tool and store it as a spatial reference.


I use the spatial reference as the output coordinate system. But the Project Raster tool requires the "optional" geographic transformation, although both coordinate systems use the WGS84 reference.



[screenshot: tool dialog of Create Spatial Reference]



  • I tested the toolchain up to the projection step, and there is no error.

  • I've checked that the rasters lie in the UTM zone I want to project them to.

  • When I first run the tool, then disconnect the spatial reference from the projection and reconnect it, the tool gets colorful. But when I run the toolchain again (without validating), only one raster is projected, several times, to the same coordinate system until I cancel it.

  • At no point does any error message occur when running the tool.

  • I've changed the data type of the field value to "Coordinate System" with no effect.



Answer



It seems to be an error in the Project Raster tool itself. When I take the ArcPy script from the Esri page, change it a bit to accept the parameters, and load this script tool into my model, it works perfectly. No error message and no white tool.



import arcpy

# Script tool parameters: input raster, output raster, output coordinate system
Input_Raster = arcpy.GetParameterAsText(0)
Output_Raster = arcpy.GetParameterAsText(1)
Output_Coord = arcpy.GetParameterAsText(2)

# Project the raster using bilinear resampling
arcpy.ProjectRaster_management(Input_Raster, Output_Raster, Output_Coord, "BILINEAR")

Converting coordinates to something understandable for AutoCAD



I got this from a land title and need to convert it into coordinates for use in AutoCAD. Can anyone tell me how to do this?



BEGINNING AT A POINT MARKED "1" ON PLAN, BEING S. 29 DEG. 29’W., 1792.86 M. FROM BLLM NO. 1, ANTIPOLO CADASTRE; THENCE THENCE S. 85 DEG. 56'E., 21.00 M. TO POINT 2; THENCE S. 4-DEG. 04'W., 12.00 M. TO POINT 3; THENCE N. 85 DEG. 56'W., 21.00 M. TO POINT 4; THENCE N.4 DEG. 04'E., 12.00 M. TO THE POINT OF BEGINNING,
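The description is a metes-and-bounds traverse: each course is a quadrant bearing (e.g. S. 85°56' E.) plus a distance in metres. A hedged Python sketch of the conversion, using an arbitrary local origin for point 1 (tie it to BLLM No. 1 if you have its coordinates):

import math

def course(quadrant, deg, minutes, distance):
    """Return (dx, dy) for a quadrant bearing like 'SE' 85 deg 56 min and a distance."""
    angle = math.radians(deg + minutes / 60.0)  # angle off the N-S meridian
    dx = math.sin(angle) * distance
    dy = math.cos(angle) * distance
    if quadrant[0] == "S":
        dy = -dy
    if quadrant[1] == "W":
        dx = -dx
    return dx, dy

x, y = 0.0, 0.0  # point 1, arbitrary local origin
points = [(x, y)]
for q, d, m, dist in [("SE", 85, 56, 21.00), ("SW", 4, 4, 12.00),
                      ("NW", 85, 56, 21.00), ("NE", 4, 4, 12.00)]:
    dx, dy = course(q, d, m, dist)
    x, y = x + dx, y + dy
    points.append((x, y))
# The traverse should close back on point 1; the x,y pairs can then be
# entered in AutoCAD with LINE/PLINE (AutoCAD also accepts bearing/distance
# input directly when its angle units are set to surveyor's units).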




geolocation - GPS and GIS for real time monitoring of an object


How can I store an object's GPS coordinates in a server-side database in real time, using hardware feasible for a student, and retrieve them in real time?




  1. What are the names of suitable equipment?

  2. Any links to good resources?


The purpose of this experiment is to provide the real-time location of an object through the internet (web).



Answer



You could use a smartphone with GPS to regularly upload its location to a web server running both PostGIS and GeoServer. You can then request real-time locations through GeoServer services.
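A minimal server-side sketch of that idea, assuming Flask and psycopg2 and a hypothetical positions table: the phone (or any GPS client) POSTs its longitude/latitude, which is stored as a PostGIS point that GeoServer can then publish.

from flask import Flask, request
import psycopg2

app = Flask(__name__)
conn = psycopg2.connect("dbname=tracking user=tracker")  # hypothetical connection string

@app.route("/position", methods=["POST"])
def store_position():
    lon = float(request.form["lon"])
    lat = float(request.form["lat"])
    with conn.cursor() as cur:
        # Store the fix as a WGS84 point with a timestamp
        cur.execute(
            "INSERT INTO positions (geom, recorded_at) "
            "VALUES (ST_SetSRID(ST_MakePoint(%s, %s), 4326), now())",
            (lon, lat))
    conn.commit()
    return "ok"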


qgis - How to "Merge Selected Features" with python?


I have been looking for the answer to this question for some time now, but can't seem to find it! My problem is straightforward: I have one layer containing lots of features (only polygons), say countries, and I want to merge some features together using Python.


I know that I can select features with the mouse and hit the "Merge selected features" button, but that is too slow. I can use Python to select the features that I want, but how can I merge the selected features from Python? (The features don't have a common field or anything like that; which features are to be merged will be decided by input data.)


Help is greatly appreciated!



Answer



You can use QgsGeometry.combine( QgsGeometry ) for that.


Just loop over all your features and call


geom = geom.combine( currentFeature.geometry() )

with geom being a QgsGeometry
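Putting that together, a minimal sketch for the QGIS 2.x API (run from the Python console; it assumes the features to merge are already selected and that the first feature's attributes should survive):

from qgis.core import QgsGeometry
from qgis.utils import iface

layer = iface.activeLayer()
selected = layer.selectedFeatures()

# Union all selected geometries into one
merged_geom = QgsGeometry(selected[0].geometry())
for feature in selected[1:]:
    merged_geom = merged_geom.combine(feature.geometry())

layer.startEditing()
layer.changeGeometry(selected[0].id(), merged_geom)  # keep the first feature
for feature in selected[1:]:
    layer.deleteFeature(feature.id())                # drop the rest
layer.commitChanges()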



How to filter relation members in Overpass


My final goal is to get country borders from OSM via Overpass API. I know my country's relation object id - http://www.openstreetmap.org/relation/59065 - and can get its details with the following query:


[out:json];
rel(59065);
out body;


The relation members list contains ways (which comprise the country border) as well as some other nodes and relations. How can I filter relation members to leave only way elements?



Answer



You cannot reduce the output of the relation itself (it will always include all members). However, you can ask for only the ways that are members of the relation:


[out:json];
rel(59065);
way(r);
out geom;

Try it in overpass turbo: http://overpass-turbo.eu/s/c0l and press "Run"
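If you'd rather fetch the result programmatically, a small sketch with Python's requests library (the endpoint below is the public Overpass instance; mirrors vary):

import requests

query = """
[out:json];
rel(59065);
way(r);
out geom;
"""
resp = requests.post("https://overpass-api.de/api/interpreter", data={"data": query})
ways = [el for el in resp.json()["elements"] if el["type"] == "way"]
print(len(ways), "border ways")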



postgis - Querying points inside and outside of polygons?


In PostGIS I have two tables:


Table that contains polygons:


table a.provinces


gid | geom | name |
------------------------------
1 | 010604 ... | Champagne |

Table that contains points:


table a.stations

gid | geom | province_short |
---------------------------------
1 | 0132 ... | ch |


They are not connected with a foreign key!


I would like to get:



  1. Only points that are inside of the polygon

  2. Only points that are outside of the polygon


SRID: 3857


Something like this: [illustration: stations inside vs. outside the province polygon]


Can it be done just like this, or do I have to modify the database?
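No schema change is needed: a spatial predicate such as ST_Contains can split the stations on the fly. A hedged sketch with psycopg2 (the connection string is hypothetical; table and column names are taken from the question):

import psycopg2

conn = psycopg2.connect("dbname=mydb user=myuser")  # hypothetical connection string
cur = conn.cursor()

# 1. Stations inside any province polygon
cur.execute("""
    SELECT s.gid FROM a.stations s
    WHERE EXISTS (SELECT 1 FROM a.provinces p
                  WHERE ST_Contains(p.geom, s.geom));
""")
inside = cur.fetchall()

# 2. Stations outside all province polygons
cur.execute("""
    SELECT s.gid FROM a.stations s
    WHERE NOT EXISTS (SELECT 1 FROM a.provinces p
                      WHERE ST_Contains(p.geom, s.geom));
""")
outside = cur.fetchall()

Since both tables are already in SRID 3857, the predicate works directly; a GiST index on both geom columns keeps it fast.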





Friday, 25 September 2015

arcgis desktop - How to add item to an existing selection layer


I have a selection layer of about 150 properties in ArcGIS ArcMap, and I want to add one more building to it (see image, area outlined in red circle). The problem is that I missed this building in particular when I clicked to add all the buildings one-by-one originally.


When I click on it using the Selection Interactive tool to add it to the selection, I can't seem to make it a part of the existing selection layer called "LargeBuilding selection" (nor can I change its color to green).


I am relatively new to ArcGIS.


In the red circle is the building I want to add to the existing layer.



Answer



As far as I know, there's no way to add features to a layer created by selection. The easiest thing to do is select all the features that went into that layer, add the ones you want by manual selection or using Select by Attributes/Location with an 'add to current selection' method (see my answer at this question for working with active selections), and then create a new layer based on the updated selection to replace the incorrect one.




  1. On the top menu bar, choose Selection > Select by Location.

  2. In the Select by Location dialog, set the following: selection method - select features from, Target - LargeBuilding, Source - LargeBuilding Selection, spatial selection method... - are identical to. Click Ok.

  3. That should select all 150 buildings in the original layer that you saved to the selection layer. Using the interactive selection (Select Features button on the Tools toolbar) and holding shift, click the building you want to add in the LargeBuilding layer. This should add it to the selection. Make sure you don't click anywhere else or clear the current selection (of the other buildings) when you do this or you have to repeat those steps.

  4. Right-click the LargeBuilding layer and choose Selections > Create Layer from Selection as you did the first time to create a new layer based on the updated selection.




The alternative to this method is to add your own field to the LargeBuildings layer. You can then use that field to signify however many groups you want (in or out; set1, set2 or set3). You can add multiple copies of the same layer and put a definition query on each so that it only shows features with a particular value in that field.

With this method, whenever you edit that attribute the feature will automatically appear/disappear from the definition-queried layer when you save the edits, and you don't have to worry about selections at all. This could also be done on an existing attribute field if there's one in particular that identifies the ones you're manually selecting. By doing it with an attribute, you don't have to worry about recreating a selection to subset or modify a subset - it's stored as an attribute.


arcgis desktop - Comparing two Digital Elevation Models (DEMs) from LAS files?


I have two LiDAR files (.las): one is the original, let's say with X points, and the other is a copy of the first .las file but with Y points, where Y is less than X.


Now, I want to compare the Digital Elevation Models (DEMs) of these two .las files and visualize how different they are.


I want to get information like RMSE, standard deviation, among other types of comparison.


I would appreciate it if anyone could tell me what software and methods to use to get this comparison information.
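Once each .las file has been gridded to a DEM raster (for example with LAS Dataset to Raster in ArcGIS, or las2dem), the comparison itself is simple raster algebra. A hedged numpy/rasterio sketch, assuming two aligned GeoTIFFs with identical extent and cell size (file names are hypothetical):

import numpy as np
import rasterio

with rasterio.open("dem_original.tif") as a, rasterio.open("dem_thinned.tif") as b:
    dem_a = a.read(1, masked=True)
    dem_b = b.read(1, masked=True)

diff = dem_a - dem_b       # per-cell elevation difference; map this to visualize
valid = diff.compressed()  # drop nodata cells
rmse = np.sqrt(np.mean(valid ** 2))
print("RMSE: {:.3f}  Std. dev.: {:.3f}".format(rmse, valid.std()))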




python - Pyspatialite installation error on MacOSX (QGIS plugin)


I am using Mac OSX Sierra (10.12.1), and I reinstalled QGIS 2.14.7 (and python 2.7, after a crash and clean install of OSX). Now when I open QGIS I get an error for Pyspatialite Plugin. I found the same question for Fedora (Pyspatialite Installation Error), but the solution doesn't seem to work for my system.


My QGIS error message says:



2016-11-07T11:47:02 1   Traceback (most recent call last):
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/qgis/utils.py", line 281, in loadPlugin
__import__(packageName)
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/qgis/utils.py", line 572, in _import
mod = _builtin_import(name, globals, locals, fromlist, level)
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/__init__.py", line 29, in
from processing.tools.general import *
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/qgis/utils.py", line 572, in _import
mod = _builtin_import(name, globals, locals, fromlist, level)
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/tools/general.py", line 28, in

from processing.core.Processing import Processing
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/qgis/utils.py", line 572, in _import
mod = _builtin_import(name, globals, locals, fromlist, level)
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/core/Processing.py", line 40, in
from processing.core.GeoAlgorithm import GeoAlgorithm
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/qgis/utils.py", line 572, in _import
mod = _builtin_import(name, globals, locals, fromlist, level)
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/core/GeoAlgorithm.py", line 41, in
from processing.core.parameters import ParameterRaster, ParameterVector, ParameterMultipleInput, ParameterTable, Parameter
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/qgis/utils.py", line 572, in _import

mod = _builtin_import(name, globals, locals, fromlist, level)
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/core/parameters.py", line 33, in
from processing.tools.vector import resolveFieldIndex, features
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/qgis/utils.py", line 572, in _import
mod = _builtin_import(name, globals, locals, fromlist, level)
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/tools/vector.py", line 20, in
from processing.algs.qgis import spatialite_utils
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/qgis/utils.py", line 572, in _import
mod = _builtin_import(name, globals, locals, fromlist, level)
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/plugins/processing/algs/qgis/spatialite_utils.py", line 28, in

from pyspatialite import dbapi2 as sqlite
File "/Applications/QGIS.app/Contents/MacOS/../Resources/python/qgis/utils.py", line 572, in _import
mod = _builtin_import(name, globals, locals, fromlist, level)
ImportError: No module named pyspatialite

I've searched everywhere I can for how to install pyspatialite with pip, but in the terminal I keep getting a bailout error when I try "pip install pyspatialite", and I'm new to the terminal and to Python, so I can't figure out the message.


Error message:
Command "/usr/local/opt/python/bin/python2.7 -u -c "import setuptools, tokenize;__file__='/private/var/folders/34/fnw01lcx0r53cyz_hmfgk1d00000gn/T/pip-build-jqRZzp/pyspatialite/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))" install --record /var/folders/34/fnw01lcx0r53cyz_hmfgk1d00000gn/T/pip-ep2nth-record/install-record.txt --single-version-externally-managed --compile" failed with error code 1 in /private/var/folders/34/fnw01lcx0r53cyz_hmfgk1d00000gn/T/pip-build-jqRZzp/pyspatialite/

When I installed QGIS (from kyngchaos), I installed the list of python modules (Numpy etc) and I've now brew installed sqlite3, but I'm still getting the bailout error.



Did I incorrectly install something or am I missing a dependency, specifically for OSX?



Answer



There is a path problem in the Kyng Chaos SQLite installer (in GDAL 2.0; I don't know about the new 2.1).


[screenshot: SQLite3.framework/Versions/C/Python contains a 2.5 folder, not 2.7]


The content of the sqlite-py2.7.pth is


import sys; sys.path.insert(0,'/Library/Frameworks/SQLite3.framework/Versions/C/Python/2.7')

It is a Python .pth file (see the site module documentation), installed as /Library/Python/2.7/site-packages/sqlite-py2.7.pth and pointing to /Library/Frameworks/SQLite3.framework/Versions/C/Python/2.7.


The problem is that this folder does not exist; the installer installs /Library/Frameworks/SQLite3.framework/Versions/C/Python/2.5 instead (see figure).


Therefore, you can either:


1) rename the folder /Library/Frameworks/SQLite3.framework/Versions/C/Python/2.5 to /Library/Frameworks/SQLite3.framework/Versions/C/Python/2.7,
2) or modify the sqlite-py2.7.pth file to


import sys; sys.path.insert(0,'/Library/Frameworks/SQLite3.framework/Versions/C/Python/2.5')

The Kyng Chaos build of QGIS uses exclusively the Apple-installed Python, not Homebrew or other Python installations (/usr/local/opt/python/bin/python2.7, for example), with modules installed in /Library/Python/2.7/site-packages/ and the frameworks installed in /Library/Frameworks/. It doesn't recognize or use the Homebrew version of sqlite3 (check your QGIS Python version).


arcgis desktop - Fastest way to count the number of features in a feature class?


With the introduction of the Data Access module in arcpy (30x faster search cursors), I want to know whether counting features matching SQL criteria is faster than the traditional MakeTableView + GetCount methodology?



Answer




I have tested the solution from the answer above, and on my real-world data the difference is negligible. Contrary to the results in the other answer, my times for arcpy.MakeTableView_management and arcpy.da.SearchCursor within ArcMap are about the same.


I have tested variations with and without a query; see the code for the query version and the final measured results below:


import time
import arcpy

@staticmethod  # excerpted from a class; drop the decorator to run standalone
def query_features(feature_class, query):

    # Method 1: count rows returned by a da.SearchCursor
    time.sleep(5)  # let the CPU/RAM calm down before proceeding!
    start_time = time.clock()
    count = len(list(i for i in arcpy.da.SearchCursor(feature_class, ["OBJECTID"], query)))
    end_time = time.clock()

    arcpy.AddMessage("Method 1 finished in {} seconds".format(end_time - start_time))
    arcpy.AddMessage("{} features".format(count))

    # Method 2: MakeTableView + GetCount
    time.sleep(5)  # let the CPU/RAM calm down before proceeding!
    start_time = time.clock()
    arcpy.MakeTableView_management(feature_class, "myTableView", query)
    count = int(arcpy.GetCount_management("myTableView").getOutput(0))
    end_time = time.clock()

    arcpy.AddMessage("Method 2 in {} seconds".format(end_time - start_time))
    arcpy.AddMessage("{} features".format(count))

The results below:


No query:
Method 1 finished in 5.3616442 seconds
804140 features
Method 2 in 4.2843138 seconds
804140 features

Many results query:
Method 1 finished in 12.7124766 seconds
518852 features
Method 2 in 12.1396602 seconds
518852 features

Few results query:
Method 1 finished in 11.1421476 seconds
8 features
Method 2 in 11.2232503 seconds
8 features

arcpy - Changing output name when exporting data driven pages to JPG?

Is there a way to save the output JPG, changing the output file name to the page name, instead of page number? I mean changing the script fo...