I am trying to compute the global solar irradiation for a region, and the run times seem far too high.
I have as inputs a Digital Surface Model with the buildings and a Digital Terrain Model for the surrounding area.
The PROJ_INFO is
name : Lat/Lon
proj : ll
datum : wgs84
ellps : wgs84
The PROJ_UNITS are
unit : degree
units : degrees
meters : 1.0
The DSM has 8020 rows, 8782 columns, and 70,431,640 total cells; its resolution is 0:00:00.03303.
The DTM has 1080 rows, 1476 columns, and 1,594,080 total cells, with a resolution of 0:00:00.28913.
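For context, those resolutions work out to roughly 1 m (DSM) and 9 m (DTM) cells. A quick sketch of the conversion, assuming the approximate figure of 30.87 m per arc-second of latitude on WGS84 (longitude cells are narrower, shrinking with cos(latitude)):

```python
# Rough conversion of the stated resolutions from arc-seconds to metres.
# The 30.87 m/arc-second figure is an approximation for WGS84 latitude.
ARCSEC_TO_M = 30.87

dsm_res_arcsec = 0.03303
dtm_res_arcsec = 0.28913

dsm_res_m = dsm_res_arcsec * ARCSEC_TO_M   # about 1.02 m
dtm_res_m = dtm_res_arcsec * ARCSEC_TO_M   # about 8.93 m
print(f"DSM ~{dsm_res_m:.2f} m, DTM ~{dtm_res_m:.2f} m")
```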
These are the steps I took for the r.sun calculation.
Set the region to the area covered by the DTM and the resolution to that of the DSM:
g.region rast=DTM@PERMANENT res=0:00:00.03303
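Note that this region is larger than the DSM itself: the DTM extent resampled to the DSM resolution gives on the order of 120 million cells, which every subsequent module then has to process. A back-of-the-envelope check (my own arithmetic, not g.region output, so the exact counts may differ slightly due to rounding):

```python
# Estimate the computational region's cell count: the DTM extent
# (1080 x 1476 cells at 0.28913") at the DSM resolution (0.03303").
ratio = 0.28913 / 0.03303          # ~8.75 DSM cells per DTM cell
rows = round(1080 * ratio)
cols = round(1476 * ratio)
total = rows * cols
print(rows, cols, total)           # on the order of 120 million cells
```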
Overlay the DSM and the DTM with r.series to get a high-resolution elevation map:
r.series input=DSM,DTM output=Overlay method=maximum
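To illustrate what this overlay does per cell, here is a toy NumPy sketch using NaN to stand in for GRASS nulls. As far as I understand, r.series skips nulls in the aggregation by default (the -n flag would instead propagate them), which `np.fmax` mimics:

```python
import numpy as np

# Toy illustration of the overlay: per-cell maximum of two grids, with
# NaN standing in for GRASS nulls. np.fmax ignores a NaN when the other
# operand is valid, mimicking r.series' default null handling.
dsm = np.array([[10.0, np.nan], [12.0, np.nan]])  # buildings, null outside
dtm = np.array([[ 8.0,  7.0],   [ 9.0,  6.0]])    # terrain, resampled
overlay = np.fmax(dsm, dtm)
print(overlay)   # [[10. 7.] [12. 6.]]
```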
From the Overlay map, generate slope and aspect maps:
r.slope.aspect elevation=Overlay slope=Slope aspect=Aspect
Run r.sun with the following command:
r.sun -s elevin=DSM aspin=Aspect slopein=Slope glob_rad=GlobalRad day=262
The whole process ran on a system with two Intel Xeon L5430 CPUs @ 2.66 GHz and 8 GB of RAM; after 15 hours it had completed only 55%.
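At that rate the full run would take roughly 27 hours. A trivial extrapolation, assuming the per-cell cost stays constant over the remaining cells:

```python
# Extrapolate total runtime from observed progress, assuming the
# remaining 45% of cells cost about as much per cell as the first 55%.
elapsed_h = 15.0
progress = 0.55
total_h = elapsed_h / progress
print(f"~{total_h:.1f} h total, ~{total_h - elapsed_h:.1f} h remaining")
```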
What am I doing wrong?
Am I missing something, or is it normal for it to run this long?
Help me understand how I should compute solar irradiation starting from two elevation maps: a Digital Surface Model that contains the buildings and a Digital Terrain Model covering the surrounding area.