I have a database of GPS points. There are no tracks, only points. I need to calculate some value for every 100 meters, but sometimes the GPS reported wrong coordinates that lie far from the real GPS points, so instead of calculating values for a small square, I have to calculate them for a really big rectangular area.
What is the best algorithm to filter wrong GPS points?
I made a screenshot to help illustrate the problem:
Answer
Run Anselin Local Moran's I against the points and throw out anything with a z-score below -1.96. It is a statistical method for locating spatial outliers. To do that, you must ensure that every point has a value related to its spatial position.
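A minimal sketch of that idea in Python, assuming the PySAL libraries (libpysal and esda) are available. The file name, the k=8 neighbour count, and the choice of each point's distance to the centroid as the required attribute are my assumptions for illustration, not part of the answer:

```python
import numpy as np
import libpysal
from esda.moran import Moran_Local

# Hypothetical input: one "x,y" pair per row.
coords = np.loadtxt("points.csv", delimiter=",")

# Local Moran's I needs an attribute tied to spatial position; here we use
# each point's distance to the overall centroid as that attribute.
values = np.linalg.norm(coords - coords.mean(axis=0), axis=1)

# k-nearest-neighbour spatial weights, row-standardised.
w = libpysal.weights.KNN.from_array(coords, k=8)
w.transform = "r"

lisa = Moran_Local(values, w, permutations=999)

# Keep points whose standardised local statistic is not a significant
# low outlier (z-score >= -1.96, per the threshold in the answer).
keep = lisa.z_sim >= -1.96
clean = coords[keep]
print(f"kept {keep.sum()} of {len(coords)} points")
```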
But in checking the tools in 10.1 after whuber's comment, I realized that if you use ArcGIS 10.1, the Grouping Analysis tool is available, which is really what you want to do.
I -think- you would want to do a grouping analysis with a Delaunay triangulation spatial constraint. The roadblock is that the number of partitioning groups must be equal to or greater than the number of disconnected groups (if any of the outliers are natural neighbors to one another); otherwise, outliers with no natural neighbors will come out of the grouping analysis with no group.
Based on that, I think Delaunay triangulation could be the basis of a filter algorithm (see the sketch below), but I am not sure yet.
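Here is one rough way a Delaunay triangulation could drive such a filter, built with scipy rather than the ArcGIS tool: construct the natural-neighbour graph and drop points whose shortest Delaunay edge is far longer than the typical spacing. The factor-times-median cutoff is an arbitrary assumption for illustration:

```python
import numpy as np
from scipy.spatial import Delaunay

def delaunay_outliers(coords, factor=3.0):
    """Flag points whose nearest natural neighbour is unusually far away."""
    tri = Delaunay(coords)

    # Shortest incident Delaunay edge length per point.
    shortest = np.full(len(coords), np.inf)
    for simplex in tri.simplices:
        for i in range(3):
            a, b = simplex[i], simplex[(i + 1) % 3]
            d = np.linalg.norm(coords[a] - coords[b])
            shortest[a] = min(shortest[a], d)
            shortest[b] = min(shortest[b], d)

    # A point much farther from its nearest natural neighbour than the
    # typical spacing is treated as a suspect GPS fix.
    cutoff = factor * np.median(shortest)
    return shortest > cutoff

coords = np.random.rand(200, 2)   # toy data; replace with real points
bad = delaunay_outliers(coords)
print(f"flagged {bad.sum()} suspect points")
```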
Another update: after digging into Partition.py, the script that runs the Grouping Analysis tool, I think it is possible to use the algorithm in there for disconnected groups combined with the NoNeighbors portion, though I am having trouble digging that part out of the script. A rough sketch of the disconnected-groups idea follows.
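This is not Partition.py's actual code, only a hypothetical illustration of the same idea: build a neighbour graph (here a simple distance-band graph) and treat very small connected components as the disconnected, no-neighbour points to filter out. The radius and minimum component size are assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.sparse import coo_matrix
from scipy.sparse.csgraph import connected_components

def small_component_outliers(coords, radius=100.0, min_size=5):
    """Flag points that fall in tiny connected components of the neighbour graph."""
    n = len(coords)
    tree = cKDTree(coords)
    pairs = np.array(sorted(tree.query_pairs(r=radius)))
    if len(pairs) == 0:
        # No point has any neighbour within the radius; flag everything.
        return np.ones(n, dtype=bool)

    # Sparse adjacency matrix of the distance-band graph.
    adj = coo_matrix((np.ones(len(pairs)), (pairs[:, 0], pairs[:, 1])), shape=(n, n))
    n_comp, labels = connected_components(adj, directed=False)

    # Points in very small components have (almost) no neighbours and are
    # likely the wrong GPS fixes described in the question.
    sizes = np.bincount(labels)
    return sizes[labels] < min_size
```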