Friday, 22 July 2016

data - Site to download all USGS Topo Maps in bulk


I am trying to create a seamless topo map WMS or TMS layer that covers the entire US (I am not interested in GeoPDFs). To do that, I need either the rasters (GeoTIFFs, etc.) or, preferably, the vectors (any vector format readable by ArcGIS or GDAL/OGR) used to generate those maps, and I want to download them in bulk (without having to click through a thousand interfaces to get each 1-degree quad).


Hard drive space is not an issue.


Any link? :)


Update: this is what the TerraServer images look like, hence my preference for the vector data.



[TerraServer tile]


The TopoOSM layer generation looks promising:

[TopoOSM layer]


but it seems the data for the entire US is not available from one location.



Answer



What about the free 24K GeoTIFF DRGs available through the Libre Map Project? All 24K DRGs for all 50 states are available there. They are collared, but I had access to GlobalMapper, which has a nice function that removes collars easily (and surely there are other ways to deal with those). They are filed nicely on the server at the Internet Archive, and the Python script below, in conjunction with wget, fetches tons of 'em quickly and easily:


import os
import shutil

# This script fetches raster images (DRGs) from the web using an os.system call to wget.
# The directory of states is at: http://libremap.org/data/
#
# Process:
# For each 24K quad in the list, it gets the tif, tfw, and fgd (metadata) files,
# then moves them to their new home before moving on to the next quad in the list.
# Data is actually stored at the Internet Archive, with a URI like so:
# http://www.archive.org/download/usgs_drg_ar_35094_a2/o35094a2.tif

wgetDir = 'C:/Program Files/wget/'   # directory wget saves files into
quads = [['32094f5', 'HARLETON', 'TX'],
         ['32094f6', 'ASHLAND', 'TX'],
         ['32094f7', 'GLENWOOD', 'TX'],
         ['32094f8', 'GILMER', 'TX']]
exts = ['tif', 'tfw', 'fgd']
url = 'http://www.archive.org/download/'
home = '//share/Imagery/EastTexas/TOPOs/24k/'

if __name__ == '__main__':
    for image in quads:
        for ext in exts:
            # Piece together our image/world file URI, so it looks like:
            # http://www.archive.org/download/usgs_drg_ar_35094_a2/o35094a2.tif
            fullurl = (url + 'usgs_drg_' + image[2].lower() + '_' +
                       image[0][:5] + '_' + image[0][5:] +
                       '/o' + image[0] + '.' + ext)
            # Fetch it with wget through an os.system call
            os.system('wget %s' % fullurl)
            # Move the file to where we want it to live, with a descriptive
            # filename, as in: AR_PRAIRIE_GROVE_o35094h3.tif
            shutil.move(wgetDir + 'o' + image[0] + '.' + ext,
                        home + image[2].upper() + '_' +
                        image[1].replace(' ', '_') +
                        '_o' + image[0] + '.' + ext)
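The URI construction in the loop above can be factored into a small standalone helper, which makes the naming pattern easier to verify before kicking off a big batch. This is just a sketch of the pattern the script uses (`drg_url` is a name I made up, not part of the original script):

```python
def drg_url(quad_id, state, base='http://www.archive.org/download/'):
    """Build the Internet Archive URI for a Libre Map Project DRG.

    quad_id is the 7-character quad code (e.g. '35094a2') and state is the
    two-letter abbreviation; the Archive item name splits the quad code
    after the fifth character, and the file itself gets an 'o' prefix.
    """
    item = 'usgs_drg_%s_%s_%s' % (state.lower(), quad_id[:5], quad_id[5:])
    return base + item + '/o' + quad_id + '.tif'

print(drg_url('35094a2', 'AR'))
# http://www.archive.org/download/usgs_drg_ar_35094_a2/o35094a2.tif
```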


I usually build the list of input quads by selecting the quads I want from a 24K topo vector footprint in ArcMap, exporting the records to a DBF (or, better yet, directly to Excel with XTools), and then building the list in Excel by concatenating the fields of interest, something like:


="['" & quad_id & "','" & quad_name & "','" & state & "'],"

I then copy the list of lists into my script or an external module and call it from there. Maybe not the most elegant method, but it works nicely. HTH.
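If the quad records are exported to CSV instead, the same list of lists can be built directly in Python, skipping the Excel concatenation step. A minimal sketch, assuming columns named `quad_id`, `quad_name`, and `state` (the column names here are assumptions, not from the original export):

```python
import csv
import io

# Sample rows as they might come out of the 24K footprint export;
# the column names are assumed for illustration.
sample = """quad_id,quad_name,state
32094f5,HARLETON,TX
32094f6,ASHLAND,TX
"""

quads = [[row['quad_id'], row['quad_name'], row['state']]
         for row in csv.DictReader(io.StringIO(sample))]
print(quads)
# [['32094f5', 'HARLETON', 'TX'], ['32094f6', 'ASHLAND', 'TX']]
```

With a real export you would replace the `io.StringIO(sample)` stand-in with an open file handle.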


