AT 309 Week 6: ArcGIS Pro

 Introduction

The past two labs have revolved around ESRI software. We looked at satellite imagery in ArcGIS Earth and ESRI Landsat Explorer, but we have not looked at much UAS-gathered data. This week, the lab is an introduction to ArcGIS Pro and multispectral imagery gathered by UAS. ArcGIS Pro is another GIS program that allows users to create, manipulate, and analyze maps and models based on geographic information. The main difference between ArcGIS Earth and ArcGIS Pro is that Earth is a more simplified GIS program that uses a globe format. ArcGIS Pro can show data in both 2D and 3D, but we are working with 2D maps in this lab.

Interpreting Data Without Looking at the Images

One of the big emphases of this lab was that information can and should be interpreted from things like the metadata and image properties before looking at the images. The information in this post was not necessarily analyzed in the order I wrote it in, but I think this order allows for a deeper analysis.

Metadata - Metadata provides information about the UAS that is important when it is time to analyze the data. Some of the metadata for the flight associated with this lab is listed below. 

Vehicle: Bramor ppX

Sensor: Altum set to 1ms and 16bit TIFF

Flight Number: 2

Takeoff Time: 12:18pm

Landing Time: 12:35pm

Altitude (m): 121

Sensor Angle: nadir

Metadata is useful because it provides context about the flight that can make it easier to interpret the data. Before looking at the photos, the metadata shows that the sensor used was an Altum sensor. Additionally, the bit depth is 16-bit, which allows each pixel to store many more possible brightness values than an 8-bit image. Other relevant information, like altitude, provides context about how far up the images were taken, which may help users visualize the data better.
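To put a number on that difference, here is a quick Python check of how many distinct values an 8-bit and a 16-bit pixel can hold (general arithmetic, not anything specific to this flight):

```python
# Number of distinct values a pixel can store at each bit depth.
for bits in (8, 16):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels} possible values (0 to {levels - 1})")

# Output:
# 8-bit: 256 possible values (0 to 255)
# 16-bit: 65536 possible values (0 to 65535)
```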

Layer Properties - Once in ArcGIS Pro, there is additional information that can be viewed in the layer properties. Right-clicking any of the layers and selecting 'Properties' pulls up the 'Layer Properties' box. This box provides a lot of information similar to the metadata. One example is the X and Y cell size. Cell size gives the ground size of each pixel in the image. The lab asks for two layers' cell sizes: the blue band layer and the long-wave IR layer. The blue band layer has X and Y cell sizes of 0.056 meters and the long-wave IR band has cell sizes of 0.857 meters. This means the pixels are smaller in the blue band layer, so it should be more detailed than the long-wave IR image.
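The same cell size information can also be read outside of ArcGIS Pro. Below is a minimal Python sketch using the rasterio library; the file names blue.tif and lwir.tif are hypothetical stand-ins for the exported single-band layers, not files from the lab.

```python
import rasterio

# Hypothetical file names for the exported single-band layers.
for name in ("blue.tif", "lwir.tif"):
    with rasterio.open(name) as src:
        x_size, y_size = src.res  # cell size in map units (meters here)
        print(f"{name}: {x_size:.3f} x {y_size:.3f} m per pixel, "
              f"{src.width} x {src.height} pixels, CRS: {src.crs}")
```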

These are just a few examples of data that can be used to better understand the images taken. There is more information that could be taken into account, like the projected coordinate system. All of this data can be looked at before analysis. This allows users to have prior expectations that may help them analyze the data faster.

Spectral Bands

Because this lab is focused on multispectral imagery, it is important to know about the spectral bands and their peak reflectance. The lab provides a graph of the peak reflectance values for the individual bands on the MicaSense Altum sensor used in the mission. The peak values are listed below.

Blue: 480nm

Green: 560nm

Red: 670nm

Red Edge: 720nm

NIR: 825nm

The peak reflectance value is useful because it shows which wavelength has the highest reflection in each band. All of the values listed above are different, and a simple interpretation of this is that each band will capture different features of the scene.

Pixel Reflectance Values - Another piece of information that can be gathered without looking at the photos is the pixel reflectance value range. The table below lists the pixel reflectance range for each of the 6 bands used in this lab.


With little to no knowledge about pixel reflectance values, there is still some information that can be gathered from this. All of the bands have a similar range of roughly 0-65,000, which fits the 16-bit depth noted in the metadata, except for the long-wave IR band. When looking at the images later, there should be some noticeable difference between the long-wave IR image and all of the others.
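A value-range check like the one in the table can be reproduced in Python by reading each band and printing its minimum and maximum. This is only a sketch: the file name altum_preburn.tif and the band order are assumptions, not something specified by the lab.

```python
import rasterio

# Assumed band order in the hypothetical multiband GeoTIFF.
band_names = ["Blue", "Green", "Red", "Red Edge", "NIR", "LWIR"]

with rasterio.open("altum_preburn.tif") as src:
    for index, name in enumerate(band_names, start=1):
        band = src.read(index)
        print(f"{name}: min {band.min()}, max {band.max()}")
```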

Single Band Images

Blue Band Image (pre-burn)

All of the single band images are in black and white. This makes it easy to see what has the highest reflection in each band. In the blue band, the trails and the dirt in the fields have the highest reflection. The trees and vegetation in the fields have a low reflection. The part that stands out most here is the bright trails.

Green Band Image (pre-burn)

After looking at the blue band, the green band appears much brighter. There is little variance between the fields and the trails. The main contrast here is that the fields are light gray and the trees are darker. The blue and green bands have the lowest peak wavelengths, but the green band's peak is slightly higher (560 nm vs. 480 nm). The green band seems to reflect more of the scene than the blue band did, and that higher reflectance may be why the image looks brighter.

Red Band Image (pre-burn)

The red band image also has a high reflectance for the trails and a low reflectance for the trees. This image is more similar to the blue band than the green band. Both the blue and red band images have more contrast between the trees and trails than the green band does. Since the green band's peak wavelength sits between those of the two higher-contrast bands, it may be reflecting the widest mix of surfaces, which leads to a washed out image and makes the green band seem less useful on its own.

Red Edge Band (pre-burn)

The red edge band is similar to the green band in that it has less contrast than the others. The flat areas seem to have a higher reflectance while the trees again have a low reflectance. The biggest outlier in this band is that the trees show more internal contrast and are fairly detailed.

Near Infrared Band (pre-burn)

The NIR band is the first one that does not highlight the trails. There is more contrast in the trees, and the highest reflectance points are some of the trees and some of the plants in the fields. Up to this point, the area that varies the most between bands is the break in the trees in the top left of the images. Some of the bands show little contrast there while others show a lot.

Long-Wave Infrared Band (pre-burn)

The long-wave infrared image is the most distinct of the set. It is similar to the blue, green, and red bands in that the highest reflectance is on the trails and the lowest is on the trees. This band was expected to be different because it stood out earlier with a different cell size and pixel value range. Comparing this band to any of the others, it is easy to see that the LWIR photo is less detailed. The larger cell size seems to be the reason for the lack of detail.

Long-Wave Infrared (post-burn)

After the initial data collection, the fields were burned and more data was collected with the UAS. The long-wave IR photo taken after the burn provides further information about the pre-burn LWIR photo. LWIR is an infrared band that measures emitted heat. After the burn, the patches that were burned would be the hottest areas, which is why they are very bright while the rest of the image is nearly blacked out. Using this information to interpret the pre-burn LWIR, it seems that the trails were the hottest area before the burn. This may be because the trails were being traveled on frequently or because the bare ground was absorbing the most heat.

Multiband Composites

The photos above were all made from a single band. The various bands highlight different things and can be combined in a multiband composite. Combining bands can be useful when users want to highlight multiple things in an image.

Pre-burn Composite (1, 2, 3)

The photo above is a multiband composite using bands 1, 2, and 3. It combines the blue band (band 1), the green band (band 2), and the red band (band 3), with each band mapped to its matching display channel, which produces a true color image.
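To show what a composite is doing under the hood, here is a minimal sketch that stacks three bands into one RGB image and stretches each band to 0-255 for display. It assumes the same hypothetical altum_preburn.tif file and band order as earlier; swapping the band indices from (3, 2, 1) to (5, 3, 2) would give the false color IR combination discussed below.

```python
import numpy as np
import rasterio


def stretch(band):
    """Linearly rescale a band to 0-255 for display."""
    band = band.astype("float64")
    lo, hi = band.min(), band.max()
    if hi == lo:  # avoid dividing by zero on a flat band
        return np.zeros(band.shape, dtype="uint8")
    return ((band - lo) / (hi - lo) * 255).astype("uint8")


# Hypothetical multiband GeoTIFF ordered blue, green, red, red edge, NIR, LWIR.
with rasterio.open("altum_preburn.tif") as src:
    # True color: red channel = band 3 (red), green = band 2, blue = band 1.
    rgb = np.dstack([stretch(src.read(b)) for b in (3, 2, 1)])

# rgb is now a (rows, cols, 3) uint8 array that can be displayed or saved.
# Using (5, 3, 2) instead maps NIR, red, and green to the display channels,
# which is the false color IR combination shown later in this post.
```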

Post-burn Composite (1, 2, 3)

This is the post-burn composite with the same 1, 2, 3 band order. The obvious difference is that the burn areas are now dark brown. The green areas also seem less vibrant in the post-burn image than in the pre-burn image. This could be explained by the time of day, since the shadows are longer in the pre-burn image.

Pre-burn Composite (5, 3, 2)

The 5, 3, 2 band combination is used to create a false color IR composite, with the NIR band displayed as red, the red band as green, and the green band as blue. This combination turns vegetation various shades of red. When looking at the near infrared band (band 5) earlier, the trees had a high contrast, which is why the NIR band is used in false color IR. The high contrast in vegetation allows users to analyze vegetation health.

Post-burn Composite (5, 3, 2)

The post-burn false color IR image provides more information about how this composite represents different surfaces. The burn spots are dark green, but there are other green and blue areas in the photo. Dark green and blue represent a lack of vegetation. This information is useful because it shows which areas are devoid of vegetation.

False Color IR Composite (zoomed in).

Zooming in shows the variance in vegetation color better. Bright red vegetation is healthy and dull red is unhealthy. 

Post-burn NDVI

The NDVI is an index used to monitor vegetation health. The ArcGIS Earth lab covered NDVI, and we learned that with this color ramp, bright red areas are unhealthy vegetation and bright green areas are healthy vegetation. The trees are different shades of green and yellow, which makes sense because they were not burned. The burn area is bright red, which also makes sense.
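For reference, NDVI is calculated per pixel as (NIR - Red) / (NIR + Red), which gives values between -1 and 1, with higher values indicating healthier vegetation. A minimal Python sketch, assuming the post-burn bands live in a hypothetical file named altum_postburn.tif with red as band 3 and NIR as band 5:

```python
import numpy as np
import rasterio

# Hypothetical post-burn multiband GeoTIFF; red = band 3, NIR = band 5.
with rasterio.open("altum_postburn.tif") as src:
    red = src.read(3).astype("float64")
    nir = src.read(5).astype("float64")

# NDVI = (NIR - Red) / (NIR + Red); pixels where the sum is zero are set to 0.
denom = nir + red
ndvi = np.divide(nir - red, denom, out=np.zeros_like(denom), where=denom != 0)

print("NDVI range:", ndvi.min(), "to", ndvi.max())
```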

False Color IR (top) vs. NDVI (bottom)

One of the tools in ArcGIS Pro is the 'Swipe' tool. It allows you to quickly compare two layers without toggling one off and on. The photo above compares the false color IR composite to the NDVI. While both highlight the same areas, the NDVI washes out most of the detail in the healthy vegetation. There was a similar issue with NDVI washing out detail in the ArcGIS Earth lab, which makes the NDVI harder to analyze.

Post-burn Composite (3, 1, 5)

At the end of the lab, the class made their own composites. I chose a 3, 1, 5 composite, which uses the red, blue, and NIR bands. It is similar to the false color IR composite because there is a high contrast between the dead areas and the vegetation. The healthy vegetation is bright blue and the dead areas are bright green to yellow. This band combination seems better for highlighting dead areas than healthy vegetation.

Conclusion

The ArcGIS Pro lab was very useful because it showed me how much can be interpreted from metadata and layer properties alone. I am a beginner with spectral data, and by comparing the properties of the different layers, I was able to better understand the layers when I finally looked at them. This is an important lesson for UAS operators: the additional information allows users to better understand the data they are collecting.



