Tuesday, August 27, 2024

Special Topics- Data Quality- Standards- Module 2

 This week we used the National Standard for Spatial Data Accuracy (NSSDA) to determine the positional accuracy of a provided road network. The positional accuracy of the test data was compared to an established data set as well as to reference points obtained from orthoimages of the study area.


To determine accuracy we chose a sample of 20 points at intersections meeting specific criteria (no fewer than 4 per quadrant, spaced at least 10% of the diagonal distance of the study area from other points). We established reference and test feature classes for each intersection.





After creating our feature classes we added XY coordinates to the data, then used Excel to calculate the RMSE and NSSDA values.
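The Excel calculation can be sketched in plain Python. The NSSDA horizontal accuracy statistic is the horizontal RMSE multiplied by 1.7308 (the 95% confidence multiplier from the NSSDA standard); the coordinates below are made-up values for illustration.

```python
import math

def nssda_horizontal(test_pts, ref_pts):
    """Return (RMSE, NSSDA horizontal accuracy at 95% confidence).

    test_pts/ref_pts are matched lists of (x, y) projected coordinates.
    """
    n = len(test_pts)
    sq_err = [(xt - xr) ** 2 + (yt - yr) ** 2
              for (xt, yt), (xr, yr) in zip(test_pts, ref_pts)]
    rmse = math.sqrt(sum(sq_err) / n)
    return rmse, 1.7308 * rmse  # NSSDA multiplier for horizontal accuracy

# hypothetical coordinate pairs in feet
test = [(10.0, 20.0), (30.5, 41.0)]
ref = [(10.3, 20.4), (30.0, 40.8)]
rmse, acc95 = nssda_horizontal(test, ref)
```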

Based on my results I determined the following about the positional accuracy of each data set:

The ABQ street data tested to 4.18 feet horizontal accuracy at the 95% confidence level.

The Street Maps USA data tested to 434.85 feet horizontal accuracy at the 95% confidence level.


Wednesday, August 21, 2024

Special Topics- Calculating Metrics for Spatial Data Quality- Module 1

 This module covered spatial precision, accuracy, and bias. We also learned how to calculate root-mean-square error (RMSE) and the cumulative distribution function.

Accuracy refers to the closeness of collected data to an accepted reference value, while precision refers to the agreement among the data points themselves.

To determine accuracy we created a feature class with a single feature that represented the average of the latitude and longitude of all the collected data. We then compared this with a provided reference point that represented the "true" value.
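This mean-versus-reference comparison can be sketched in plain Python; the coordinates below are hypothetical projected values in meters, not our actual data.

```python
import math

def horizontal_accuracy(points, reference):
    """Distance from the average (mean) position of the collected
    points to the accepted reference ("true") point."""
    mx = sum(x for x, _ in points) / len(points)
    my = sum(y for _, y in points) / len(points)
    rx, ry = reference
    return math.hypot(mx - rx, my - ry)

# hypothetical collected positions and reference point
collected = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0), (2.0, 2.0)]
accuracy = horizontal_accuracy(collected, (4.0, 5.0))
```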

To determine precision we spatially joined the average point feature class with the original point data feature class to obtain a distance attribute calculation. We calculated the distance of 50, 68 and 95 percentiles of the data from the average location, and used the buffer tool to create a visual representation of the percentiles. After determining the horizontal precision and accuracy we then determined the vertical precision and accuracy using the altitude data.
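The precision step can be sketched the same way, using a simple nearest-rank percentile convention (ArcGIS and Excel may interpolate percentiles differently, so this is an approximation of the workflow, not the exact tool behavior):

```python
import math

def precision_percentiles(points, pcts=(50, 68, 95)):
    """Distances of each point from the mean location, then the
    radius containing each requested percentile of the points."""
    mx = sum(x for x, _ in points) / len(points)
    my = sum(y for _, y in points) / len(points)
    dists = sorted(math.hypot(x - mx, y - my) for x, y in points)
    # nearest-rank percentile: smallest distance covering p% of points
    return {p: dists[min(len(dists) - 1, math.ceil(p / 100 * len(dists)) - 1)]
            for p in pcts}

# hypothetical points symmetric about their mean
radii = precision_percentiles([(1, 0), (-1, 0), (0, 2), (0, -2)])
```

These radii are what the Buffer tool then draws around the average location to visualize precision.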

Our horizontal accuracy was 2.97 meters while our horizontal precision was 4.17 meters. 

Sunday, August 4, 2024

Module 6- Least Cost and Corridor Analysis- Applications in GIS

 The second part of our final module in Applications covered least cost path and corridor analysis using cost surfaces. We worked with two scenarios. For the first scenario we analyzed slope, river crossings, and river proximity to determine the best possible route for a new pipeline within a given study area. We then took this a step further and presented a corridor analysis. To do this we reclassified our features by their cost factor and used the Cost Distance and Cost Path tools.
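Conceptually, Cost Distance and Cost Path perform a shortest-path search over the cost raster. A minimal sketch of that idea (Dijkstra's algorithm on a toy grid with 4-neighbor moves; ArcGIS also allows diagonal moves and averages adjacent cell costs, so this is a simplification, not the tool's implementation):

```python
import heapq

def least_cost_path(cost, start, end):
    """Dijkstra over a 2D cost grid; entering a cell adds its cost.
    Returns the cheapest path (list of (row, col)) and its total cost."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    prev = {}
    pq = [(dist[start], start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == end:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    path, node = [], end
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[end]

# toy cost surface: the middle row is expensive (e.g. a river)
grid = [[1, 1, 1],
        [9, 9, 1],
        [1, 1, 1]]
path, total = least_cost_path(grid, (0, 0), (2, 2))
```

The path routes around the high-cost cells, which is exactly why adding cost to river crossings steers the pipeline route away from them.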



Image 1


First we assessed slope alone. As you can see, this first path contains four water crossings.

Image 2


After adding the additional cost value for river crossings, the adjusted pathway contained only two water crossings (yellow). We then added a third raster layer to the cost analysis for river proximity, which gave us the final path in Image 2 (red).
Image 3

    Finally, we used the Corridor tool to run a corridor analysis of the data. Image 3 depicts the final least cost path overlaid on the corridor analysis.



    After establishing the basics of performing a corridor analysis we were tasked with determining the most likely corridor for black bear migration between two protected areas of Coronado National Forest. We evaluated suitability based on slope, proximity to roads, and landcover surveys. 
    After giving suitability ratings to each of my variables, I ran a weighted overlay to establish the importance of each variable for the cost surface. I then ran the cost surface through the Cost Distance tool with each protected area as a source, creating the two rasters needed for the corridor output. The corridor analysis returned a very widespread result that I wanted to narrow down, so I created an empty polygon feature class, drew a polygon around the most suitable of the corridors, and clipped my raster to that proposed corridor.
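At its core, the Corridor tool just sums the two accumulated cost-distance rasters cell by cell, giving the total cost of the cheapest path through each cell; low values form the corridor. A minimal numpy sketch with made-up values:

```python
import numpy as np

# hypothetical accumulated cost-distance rasters from each protected area
cost_from_a = np.array([[0, 2, 5],
                        [3, 4, 6],
                        [7, 8, 9]])
cost_from_b = np.array([[9, 6, 3],
                        [8, 5, 2],
                        [7, 4, 0]])

# total cost of the best path from A to B passing through each cell
corridor = cost_from_a + cost_from_b

# keep only cells within some tolerance of the cheapest corridor value
tolerance = 1
mask = corridor <= corridor.min() + tolerance
```

Thresholding the summed raster like this is essentially what narrowing a widespread corridor result comes down to (I did it by drawing and clipping to a polygon instead).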

Here is my final map:


 




Friday, August 2, 2024

Module 6- Suitability Analysis- Applications in GIS

This week we ran both vector and raster suitability analyses in ArcGIS Pro.

For Scenario 1, we were tasked with identifying suitable Mountain Lion habitat based on slope, landcover, and distance from rivers and highways. To do this we created slope and landcover Boolean grids as well as highway and river buffers. We ran the same data two different times to practice working with both raster and vector tools and methods of analysis. We then combined our layers and assessed which polygons fit our suitability requirements.
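The raster version of this combination can be sketched with numpy Boolean grids. All values below (the landcover code, slope limit, and buffer distances) are hypothetical placeholders, not the assignment's actual criteria:

```python
import numpy as np

# hypothetical input rasters, all the same shape
slope = np.array([[5, 25], [40, 8]])            # percent slope
landcover = np.array([[42, 42], [11, 42]])      # 42 = forest (suitable code)
dist_river = np.array([[100, 900], [300, 200]])     # meters to nearest river
dist_highway = np.array([[5000, 400], [6000, 7000]])  # meters to nearest highway

# Boolean grid per criterion, then combine: a cell is suitable
# only where every criterion is met
suitable = ((slope <= 30) &
            (landcover == 42) &
            (dist_river <= 500) &       # inside the river buffer
            (dist_highway >= 1000)      # outside the highway buffer
            ).astype(int)
```

In the vector workflow the same logic is done with buffers, Intersect/Union, and attribute selection instead of cell-wise Boolean algebra.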

Areas suitable for Mountain Lion habitat

For Scenario 2 we were given a study area and data and tasked with identifying how much land in the area was suitable for development. While the output of the prior exercise was just the polygons suitable for habitat, for this exercise we used a suitability rating. We reclassified our landcover classes by suitability rating, with agricultural and grassland landcover being most suitable for development and water, wetlands, and urban areas classified as least suitable. We also gave suitability ratings to the soil classes and slope, and created buffers around the rivers and roads. After reclassifying our rasters, we used the Weighted Overlay tool to give weight to each class. We ran two scenarios: one with equal weights, and an alternative scenario with more weight given to certain classes (slope) and less given to others (distance to roads). The Weighted Overlay tool gave me issues at first, until I realized it wasn't keeping the suitability ratings I had assigned to my rasters and was instead trying to reclassify them.
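A weighted overlay is just a weighted sum of the reclassified rasters, with the weights summing to 1. A numpy sketch of the alternative scenario (all scores and weights below are illustrative, not the values from the assignment):

```python
import numpy as np

# hypothetical reclassified rasters, scored 1 (least) to 9 (most suitable)
landcover = np.array([[9, 1], [5, 7]])
soils = np.array([[7, 3], [5, 9]])
slope = np.array([[9, 5], [1, 7]])
roads = np.array([[3, 9], [7, 5]])

# alternative scenario: more weight on slope, less on distance to roads
weights = {"landcover": 0.3, "soils": 0.2, "slope": 0.4, "roads": 0.1}

overlay = (weights["landcover"] * landcover +
           weights["soils"] * soils +
           weights["slope"] * slope +
           weights["roads"] * roads)

# the Weighted Overlay tool returns an integer suitability raster
suitability = np.rint(overlay).astype(int)
```

The equal-weight scenario is the same computation with every weight set to 0.25.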



GIS Portfolio

 We were tasked with creating a GIS portfolio for our internship program. It was a great opportunity to organize the work I have been doing....