There is a certain WCS layer that I'd like to save to my local computer for analysis.
According to the GDAL WCS driver documentation:
"Accessing a WCS server is accomplished by creating a local service description xml file looking something like the following, with the coverage server url, and the name of the coverage to access. It is important that there be no spaces or other content before the <WCS_GDAL> element."
<WCS_GDAL>
  <ServiceURL>http://laits.gmu.edu/cgi-bin/NWGISS/NWGISS?</ServiceURL>
  <CoverageName>AUTUMN.hdf</CoverageName>
</WCS_GDAL>
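As I understand it, once such a service description exists GDAL should be able to open it like any other raster and copy the coverage to disk. The sketch below is only how I picture that workflow (the filenames nwgiss_autumn.xml and autumn_local.tif are placeholders of mine, and it assumes the GDAL Python bindings are installed):

from osgeo import gdal

# Open the WCS service description; GDAL's WCS driver recognises the <WCS_GDAL> file
src = gdal.Open('nwgiss_autumn.xml')
if src is None:
    raise RuntimeError('Could not open the WCS service description')

# Copy the remote coverage to a local GeoTIFF
gdal.GetDriverByName('GTiff').CreateCopy('autumn_local.tif', src)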
I tried to create the xml file but I was unsuccessful.
Does anyone have any experience with extracting WCS layers? Are there any pointers (sites, tutorials, tips) on how to create the xml file?
EDIT:
The xml file I'm using to grab the first layer for a specific time is the following:
<WCS_GDAL>
  <ServiceURL>http://dmcsee.org/cgi-bin/mapserv?map=/var/www/tmp/dmcsee_wms/dmcsee_wcs.map</ServiceURL>
  <Version>1.1.1</Version>
  <CoverageName>SPI6</CoverageName>
  <CRS>EPSG:4326</CRS>
  <PreferredFormat>gtiff</PreferredFormat>
  <GetCoverageExtra>&BBOX=10.0,31.995,48.005,50.0&time=19900101</GetCoverageExtra>
</WCS_GDAL>
But for some reason it returns an error.
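One way to narrow down whether the fault lies in the service description file or in the request itself is to issue the equivalent GetCoverage call directly as a plain URL. This is only a sketch of that idea: it uses WCS 1.0.0 parameter names, the bounding box and time from the file above, while the WIDTH/HEIGHT values and the output filename spi6_test.tiff are arbitrary choices of mine.

from urllib.request import urlretrieve  # Python 3; on Python 2 use: from urllib import urlretrieve

url = (
    'http://dmcsee.org/cgi-bin/mapserv?map=/var/www/tmp/dmcsee_wms/dmcsee_wcs.map'
    '&SERVICE=WCS&VERSION=1.0.0&REQUEST=GetCoverage'
    '&COVERAGE=SPI6&CRS=EPSG:4326&FORMAT=GTiff'
    '&BBOX=10.0,31.995,48.005,50.0&TIME=19900101'
    '&WIDTH=380&HEIGHT=120'  # assumed output grid size; any reasonable values work for a test
)
# Save the raw response; a GeoTIFF means the server accepted the request,
# an XML body is the server's error message
urlretrieve(url, 'spi6_test.tiff')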
Answer
Using Python's OWSLib, which was suggested to me, I was able to acquire the data programmatically.
Here's the script I used for future reference:
import os
from owslib.wcs import WebCoverageService  # OWSLib -> https://github.com/geopython/OWSLib
# install with: easy_install owslib

folder = 'c:\\path\\to\\folder\\'

# Connect to the WCS endpoint, speaking protocol version 1.0.0
wcs = WebCoverageService(url="http://dmcsee.org/cgi-bin/mapserv?map=/var/www/tmp/dmcsee_wms/dmcsee_wcs.map",
                         version="1.0.0")
spi6 = wcs['SPI6']  # the coverage (layer) I am interested in

# Request the coverage for every available time position and save each response as a GeoTIFF
for timep in spi6.timepositions:
    output = wcs.getCoverage('SPI6', time=[timep], bbox=(10.0, 31.995, 48.005, 50.0),
                             format='GTiff', CRS='EPSG:4326', WIDTH=380, HEIGHT=120)
    f = open(os.path.join(folder, 'spi6_' + str(timep) + '.tiff'), 'wb')
    f.write(output.read())
    f.close()
The above Python script successfully downloaded and wrote 270+ GeoTIFFs with their georeferencing intact.
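To confirm that the downloads are proper rasters and carry their georeferencing, each file can be opened with GDAL afterwards. A quick check of one file might look like the sketch below (it assumes the GDAL Python bindings are installed; the filename spi6_19900101.tiff is hypothetical, since the actual names depend on the server's time position strings):

import os
from osgeo import gdal

folder = 'c:\\path\\to\\folder\\'
ds = gdal.Open(os.path.join(folder, 'spi6_19900101.tiff'))  # hypothetical example filename
print(ds.RasterXSize, ds.RasterYSize)  # expect 380 x 120, matching WIDTH and HEIGHT
print(ds.GetGeoTransform())            # origin and pixel size derived from the requested BBOX
print(ds.GetProjection())              # should describe EPSG:4326 (WGS 84)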