I'm learning how to manage a local PostGIS (version 2.1) database, with spatial indexes created on the tables I've imported from shapefiles (using the PostGIS Shapefile Import/Export Manager). I have imported one large shapefile, with 4.7 million polygons and a 2.1GB attribute table, into a schema.
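For what it's worth, a rough way to sanity-check the import from psql would be something like the following (myschema.parcels is just a placeholder for the actual schema-qualified table name):

-- Rough post-import sanity checks (table/schema names are placeholders):
SELECT count(*) FROM myschema.parcels;                               -- expect ~4.7 million rows
SELECT pg_size_pretty(pg_total_relation_size('myschema.parcels'));  -- size on disk, table plus indexes
SELECT f_geometry_column, srid, type
FROM geometry_columns
WHERE f_table_schema = 'myschema' AND f_table_name = 'parcels';     -- confirm geometry column, SRID, and type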
However, using QGIS 2.8.2 to load, render, and browse this table's attributes has been rather disappointing. Adding the dataset to the QGIS canvas freezes the application for a short period. The polygons render quickly, but opening the attribute table produces a small dialogue saying '____ features loaded' until all 4.7 million features have been fetched. While the attribute table is loading, QGIS also consumes a great deal of memory (~1.1GB). Because of this poor performance, I've also avoided running queries to subset rows of the attribute table.
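For context, the kind of subset query I'd eventually want to run looks roughly like this (land_use is a made-up attribute column, and the envelope coordinates and SRID are placeholders):

-- Hypothetical subset: parcels of one type inside a bounding box
SELECT gid, land_use, geom
FROM parcels
WHERE land_use = 'residential'
  AND geom && ST_MakeEnvelope(-122.45, 47.55, -122.30, 47.70, 4326);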
I've heard of other QGIS+PostGIS users who report excellent performance with tables containing >500,000 or even millions of features, but maybe my expectations are unreasonable. So while I'd like to enjoy the performance benefits of PostGIS, I'm not sure what the problem is. Here's what I've tried:
- I've created a spatial index on the table using the PostGIS Shapefile Import/Export Manager and the following SQL code (see also the index check after this list):
CREATE INDEX parcels_geom_idx ON parcels USING gist (geom);
- In QGIS, under the Data Sources tab, I have the 'Attribute table row cache' set to 10,000 (the default)
- The database is on the local host machine, which has a Core 2 Duo, 8GB of RAM, and a 240GB SSD
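One check I haven't included in the list above: after the bulk load and index creation, it might help to refresh the planner statistics and confirm that spatial filters actually use the GiST index. A rough sketch (the envelope coordinates and SRID are again placeholders):

VACUUM ANALYZE parcels;   -- refresh planner statistics after the bulk load

EXPLAIN ANALYZE
SELECT count(*)
FROM parcels
WHERE geom && ST_MakeEnvelope(-122.45, 47.55, -122.30, 47.70, 4326);
-- the plan should show an index scan or bitmap index scan on parcels_geom_idx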
Eventually I'd like to store up to 100GB of spatial data from a collection of geodatabases in a PostGIS database, but with performance problems on just a few datasets, I'm not confident about moving away from ESRI's platform toward an open-source GIS ecosystem.