
Conducting Vegetation Surveys Using Geotagged Photographs from Smartphone and Unmanned Aerial Systems

J. S. Glueckert, A. E. Riner, J. K. Leary, K. L. Gladding, and E. C. Russell


Objective

The purpose of this publication is to describe a simple method of importing and displaying geotagged images from unmanned aerial systems (UAS) or smartphones as vector symbols that are scaled and oriented to co-register with base layer maps in a geographic information system (GIS), without the need for processing into an orthomosaic. Instructions in this publication are written for QGIS version 3.28 LTR but are easily transferable to other GIS software. These methods offer an efficient and effective way to view high-resolution aerial images and photo-point image surveys in a GIS environment, allowing for basic analysis such as annotating features or points of interest. For more information on using UAS to collect field data, refer to Ask IFAS publication AE527, “Instructions on the Use of Unmanned Aerial Vehicles (UAVs).”

Target Audience

The target audience for this publication includes Extension agents, farmers, biologists, and other field practitioners who have a basic level of familiarity using geospatial programs such as QGIS and ArcGIS.

Introduction

Photography has become invaluable for data collection in agriculture, ecology, environmental sciences, and related fields due to its affordability and ease of use with minimal training. It offers a versatile means of tracking long-term changes in land cover, monitoring species distribution, measuring crop health, and quantifying trends in land use and vegetation over time (Harsch et al. 2009; Dyrmann et al. 2021; Liu and Pattey 2010). Photographs can offer finer spatial resolution compared to satellite imagery and serve as unbiased records of past conditions. As camera technology improves and becomes more accessible, and with advancements in image processing techniques such as machine learning and deep learning, the capture of high-resolution imagery as a data collection method is expected to increase (Depauw et al. 2022).

Unmanned aerial systems specifically have been an important image acquisition technology for data collection, supporting the University of Florida Institute of Food and Agricultural Sciences (UF/IFAS) Extension mission to develop and share knowledge in agriculture, human and natural resources, and life sciences. Aerial surveys are commonly used for agricultural applications to provide high-resolution, real-time data on crop health, soil conditions, and field variability, enabling farmers to make data-driven decisions for optimizing yield and resource management (Fletcher and Singh 2020; Singh and Fletcher 2021). These precision agricultural principles can be adopted by the discipline of natural resource management in areas such as coastal monitoring, forest health, wildlife monitoring, aquatics, and invasive plant detection and management.

A UAS can be used to survey areas of interest (AOI) that may be too remote or expansive to feasibly attempt with traditional ground-based data collection. UAS can be used to survey hundreds to thousands of acres and can be equipped with various sensors, including a standard red, green, blue (RGB) digital camera, multispectral, hyperspectral, or Light Detection and Ranging (LIDAR) sensors, offering the capability to capture an array of valuable information (Kakarla and Ampatzidis 2021). Images captured by UAS are geotagged, embedding precise location data such as latitude, longitude, altitude, bearing, and heading at the image’s centroid. This information, recorded by the onboard Global Positioning System (GPS) and gimbal, is stored in the image file’s Exchangeable Image File Format (EXIF) and Extensible Metadata Platform (XMP) metadata for mapping and analysis. These images can be stitched together to create orthomosaics, which are composite images that have been corrected for distortion to form an accurate representation of the entire survey area. Orthomosaics are useful in many cases, but they require significant effort in the field. Ground control points must be installed before the flight to enhance spatial accuracy, which may be impractical for large or difficult-to-access AOIs. Additionally, a long flight time and multiple batteries are needed for a large AOI because thousands of overlapping images must be captured during a sufficiently slow flight to prevent image blur. The resulting large file sizes can be difficult to manage without advanced computing power and memory, making them challenging to share with stakeholders. For these instances, a single geotagged image, while less spatially accurate than an orthomosaic, may be just as useful. These images can be used for tasks such as estimating percent cover of a species, calculating the area of regions of interest, or counting individual plants.
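
EXIF metadata stores GPS coordinates as degrees, minutes, and seconds plus a hemisphere reference, while GIS software works in signed decimal degrees. The sketch below (with example coordinates, not from any real survey) shows the conversion that takes place when geotagged photos are brought into a GIS:

```python
# Illustrative sketch: EXIF stores GPS position as degrees/minutes/seconds
# plus a hemisphere letter (N/S/E/W); GIS software expects decimal degrees.

def dms_to_decimal(degrees, minutes, seconds, hemisphere):
    """Convert an EXIF-style DMS coordinate to signed decimal degrees."""
    decimal = degrees + minutes / 60 + seconds / 3600
    # Southern and western hemispheres are negative in decimal notation.
    return -decimal if hemisphere in ("S", "W") else decimal

# Example coordinate: 29° 38' 31.2" N, 82° 21' 18.0" W (near Gainesville, FL)
lat = dms_to_decimal(29, 38, 31.2, "N")
lon = dms_to_decimal(82, 21, 18.0, "W")
print(lat, lon)  # approximately 29.642 -82.355
```

Tools such as EXIF.tools and the QGIS import step described later perform this conversion automatically; the function is shown only to make the metadata transparent.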

While UAS are powerful tools for data collection, not all practitioners have access to this technology. However, the widespread availability of mobile phones capable of capturing geotagged imagery offers a practical alternative for capturing valuable data in smaller areas of interest. Using smartphones for photo-point surveys allows researchers and practitioners to collect replicable temporal data by taking photographs at predetermined locations with methods such as point-intercept surveys and quadrats. This approach can be effective for estimating vegetation cover, tracking species distribution, and monitoring other ecological indicators. Geotagged imagery from mobile phones has even been used to estimate chlorophyll levels in plants, which can indicate plant health and distribution in the field (Barman and Choudhury 2022). This accessible method makes it easier to conduct localized environmental monitoring and gather valuable data for ongoing studies.

Survey Methods

Tried-and-true survey methods in agriculture and natural resource management, such as quadrat sampling and line-point intercepts, have been used for decades and are well established for evaluating broad landscapes and assessing community ecology (Madsen and Wersal 2018). These methods allow for classification to the species level when assessing community richness and diversity, or for relating species composition and canopy structure along a fixed transect across a landscape. They can be paired with photo-point monitoring to document changes in vegetation at fixed locations through repeat sampling over time, typically from an oblique perspective on the ground with a small field of view.
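
To make the line-point intercept layout concrete, the sketch below generates evenly spaced sample positions along a transect. The endpoints and spacing are arbitrary example values (given here as UTM easting/northing in meters), not from any particular survey:

```python
# Sketch: evenly spaced photo points along a transect, as in the line-point
# intercept method. Endpoints are hypothetical UTM coordinates in meters.

def transect_points(start, end, n_points):
    """Return n_points (x, y) positions evenly spaced from start to end."""
    (x1, y1), (x2, y2) = start, end
    step = 1 / (n_points - 1)  # fraction of the transect between samples
    return [(x1 + (x2 - x1) * i * step, y1 + (y2 - y1) * i * step)
            for i in range(n_points)]

# A 100 m transect running due east, sampled every 25 m (5 points)
for x, y in transect_points((368000, 3280000), (368100, 3280000), 5):
    print(x, y)
```

Each generated position becomes a photo point: a waypoint for a UAS mission or a marked location for a smartphone photograph.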

These data collection procedures are often well suited to UAS capturing high-quality imagery in the nadir position (vertically downward) from a fixed GPS coordinate and altitude. Using a waypoint mission planner such as Ardupilot (Chintanadilok et al. 2022) (Table 1), UAS flight plans can be incorporated into experimental designs where coordinates are randomized throughout the AOI to create imagery akin to a plot or quadrat, or sampled along a fixed transect as in the line-point intercept method. Images captured with waypoint missions using GPS coordinates can be used for repeated measures and detection of change in an AOI over time. Alternatively, these data collection procedures can be paired with mobile GIS platforms such as ArcGIS Survey123 on tablets or smartphones to create photo-point surveys from the field.
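
Randomizing coordinates throughout the AOI can be sketched as below. The bounding box is a made-up example, and the fixed seed is an assumption chosen so the same waypoint list can be regenerated for repeat surveys:

```python
# Sketch (with hypothetical AOI bounds): generate randomized waypoint
# coordinates within a rectangular AOI, akin to random quadrat placement.
import random

def random_waypoints(n, lat_min, lat_max, lon_min, lon_max, seed=42):
    """Return n (lat, lon) pairs uniformly sampled from the bounding box."""
    rng = random.Random(seed)  # fixed seed makes the design reproducible
    return [(rng.uniform(lat_min, lat_max), rng.uniform(lon_min, lon_max))
            for _ in range(n)]

# Ten random photo points inside an example AOI near Gainesville, FL
for lat, lon in random_waypoints(10, 29.63, 29.65, -82.36, -82.34):
    print(f"{lat:.6f}, {lon:.6f}")  # paste into a mission planner as waypoints
```

The printed coordinates can be loaded into any of the mission planners in Table 1 as a waypoint list.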

Table 1. Several commercially available software options for mission planning.

Software | Website
Ardupilot Mission Planner | https://ardupilot.org/planner/
QGroundControl | https://qgroundcontrol.com/
DroneDeploy | https://www.dronedeploy.com/
Drone Harmony | https://www.droneharmony.com/
Pix4Dmapper | https://www.pix4d.com/

Software

For this methodology, we are utilizing open-source geospatial software (QGIS, version 3.28 LTR) running on Windows 11. However, these basic processes apply to other GIS programs, including ArcGIS Pro.

Image Acquisition

Most cameras on commercially available unmanned aerial platforms are RGB with Complementary Metal Oxide Semiconductor (CMOS) sensors that range from 12 to 20 megapixels (MP). Typically, smaller vegetation can be mapped at the species level at pixel resolutions of 1 centimeter (cm) ground sampling distance (GSD) or less. GSD is the distance on the ground between the centers of two consecutive pixels, calculated from the distance from the camera to the ground, the sensor width and height, the image dimensions, and the focal length of the camera. A larger GSD indicates lower spatial resolution and therefore less visible detail in the image. A GSD of 1 cm means that 1 pixel in the image represents a 1 cm × 1 cm area (1 cm²) on the ground. Lower-resolution products such as satellite imagery can have a 10- to 30-meter (m) GSD, meaning each pixel covers a 10 m to 30 m square on the ground. Visit the Pix4D GSD calculator tool for more information regarding GSD and an automated GSD calculator. The resolution of the imagery can be adjusted by simply changing the altitude above ground level (AGL) of the UAS. However, based on regulations from the Federal Aviation Administration, UAS flights are restricted to 400 ft AGL or below, generally limiting images to GSDs of roughly 3–5 cm or less. All images are encoded with EXIF and XMP metadata that describe X, Y, and Z coordinates as well as the yaw, pitch, and roll of the camera gimbal at the time the photo is taken (Figure 1). Free programs such as EXIF.tools allow the user to view this data, which is helpful when orienting the images in a GIS program.
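
The GSD calculation can be sketched as below. It follows the standard formula used by GSD calculators such as the Pix4D tool mentioned above; the camera specifications in the example are illustrative values only, so substitute the sensor width, focal length, and image width of your own camera:

```python
# Sketch of the standard GSD formula. The camera specifications below are
# example values; check your camera's documentation for actual figures.

def ground_sampling_distance(altitude_m, sensor_width_mm, focal_length_mm,
                             image_width_px):
    """Return GSD in centimeters per pixel for a nadir image."""
    # (altitude x sensor width) / (focal length x image width), scaled to cm
    return (altitude_m * sensor_width_mm * 100) / (focal_length_mm * image_width_px)

# Example: a 13.2 mm wide sensor, 8.8 mm focal length, 5472 px image width,
# flown at 100 m above ground level
gsd_cm = ground_sampling_distance(100, 13.2, 8.8, 5472)
print(f"GSD: {gsd_cm:.2f} cm/px")                            # about 2.74 cm/px
print(f"Image footprint: {gsd_cm * 5472 / 100:.0f} m wide")  # about 150 m
```

Note that halving the flight altitude halves the GSD (doubling the detail) while also halving the ground footprint of each image.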

Figure 1. Roll is the rotation around the front-to-back axis, pitch is the rotation around the side-to-side axis, and yaw is the rotation around the vertical axis.
Credit: © DesignLands/Adobe Stock, modified by A. J. Bullard, UF/IFAS

Smartphones only capture information about the latitude and longitude of the image centroid and do not record precise information about altitude or heading. To make a replicable smartphone photo-point survey, hold the phone camera parallel to the ground, directly over the point, in line with the survey transect without zoom, and try to maintain a consistent distance from the ground to your phone. This height can be measured and used in place of altitude in the GSD calculator linked above to import images into GIS at the proper scale (Figure 2). Alternatively, use a PVC quadrat of known size and line up the image width with the edges of the quadrat (Figure 3). To import the image at the proper rotation, consider downloading a digital compass app on your smartphone. The camera specifications of rear phone cameras can vary depending on the brand and version of your phone. Therefore, consult the camera specifications of your phone from your phone’s manufacturer.

Figure 2. Diagram of point-intercept method along a transect line. An observer records vegetation at regular intervals (red points) using a smartphone. This method allows for standardized and repeatable vegetation sampling.
Credit: © Reginald Diesel/Adobe Stock, modified by C. J. Romano, UF/IFAS
A white square frame made of PVC pipe placed over a transect line (yellow measuring tape).
Figure 3. Quadrat constructed from PVC placed over a transect line (yellow measuring tape) for vegetation sampling. Photo taken parallel to the ground with a smartphone camera to ensure a consistent overhead perspective.
Credit: A. E. Riner, UF/IFAS

Importing Geotagged Images into QGIS Software

  1. Start by opening the Processing Toolbox using the gear icon on the toolbar or by pressing Ctrl+Alt+T.
    • In the search bar, search for “Import geotagged photos,” which will be listed under the “Vector creation” menu (Figure 4).
    Processing Toolbox menu. Under "Vector creation," "Import geotagged photos" is marked by a gear icon at the end of the list.
    Figure 4. “Import geotagged photos” tool location within the Processing Toolbox.
    Credit: A. E. Riner, UF/IFAS
  2. Double-click on “Import geotagged photos.” In the “Import geotagged photos” window:
    • Select the “…” button next to the input folder drop-down menu.
    • Browse to the location where your photos are stored. Click on the storage folder.
    • Click “Select folder.”
    • In the “Photos [optional]” drop-down menu, you can import the photos as a temporary layer or choose “Save to File” to export them to formats such as a shapefile (.shp) or comma-separated values file (.csv).
    • Click “Run” (Figure 5).
    "Import geotagged photos" tool with inputs highlighted: "Input folder" at the top left, "Save to File" near the center of the screen, and "Run" at the bottom.
    Figure 5. “Import geotagged photos” tool with inputs.
    Credit: A. E. Riner, UF/IFAS
  3. The new layer will be saved in the Layers Panel as “Photos” by default and will display on the map as point symbols. If the Layers Panel is not visible, open the panel by right-clicking on the menu toolbar and selecting “Layers Panel” (Figure 6).
    The Panels menu is shown on the right side of the screen. "Layers Panel" is highlighted in yellow as the eighth panel from the top.
    Figure 6. Opening the Layers Panel.
    Credit: A. E. Riner, UF/IFAS
  4. Importing the photos will result in stored attributes such as file name, file pathway, altitude, latitude, longitude, and timestamp. To open the Attribute Table to view these values (Figure 7):
    • Right-click on the layer.
    • Select “Open Attribute Table.”
    • Rename the layer according to your desired naming convention by right-clicking on the layer and choosing “Rename Layer.” We recommend including the site name and date to avoid confusion when uploading multiple flight plans.
    Figure 7. Image locations overlaid on satellite imagery and the Attribute Table.
    Credit: J. S. Glueckert, UF/IFAS

Displaying Vector Points as Image Symbology

  1. To display the vector points as images, open the editing tool by right-clicking on the pencil icon and choosing “Toggle Editing.”
  2. Open the Layer Styling Panel by right-clicking on the menu toolbar and selecting “Layer Styling Panel.”
    • In the Layer Styling Panel, click on “Simple Marker.” This will display a drop-down menu for symbol layer type. Choose “Raster Image Marker” (Figure 8).
    Figure 8. Changing symbology of points.
    Credit: J. S. Glueckert, UF/IFAS
  3. Select the “Data Defined Override” button next to the blank drop-down menu.
    • Change “Field type: string” to “Field type: photo.”
    • Switch the unit to “Meters at Scale.”
    • Adjust the width based on the known GSD of your photos calculated from the camera specifications and the AGL of your flight for UAS images. For smartphone images, use the width of your quadrat or the GSD calculated from the average height of your image from the ground. The rotation of the image can be changed based on the rotation found in the XMP or EXIF data for UAS images, or a compass app on your phone for on-ground surveys (Figure 9).
    • Click “Live update” to see changes as they are made or click “Apply” when the necessary fields are filled (Figure 10).
Text reads: Symbol layer type: Raster Image Marker. Size, opacity, rotation, and offset fields are below. Three options are highlighted: "Unit: Meters at Scale," "abc photo," and "Field type: string."
Figure 9. Importing imagery as symbology at proper scale and orientation.
Credit: J. S. Glueckert, UF/IFAS
Figure 10. UAS waypoint (top) and smartphone images (bottom) imported as symbology.
Credit: J. S. Glueckert and A. E. Riner, UF/IFAS

Note: Some drones capture raw images, which are important for spectral applications such as calculating vegetation indices. However, raw files can slow down processing in QGIS due to limited GPU acceleration and are not necessary for simple assessments such as area calculation or species counts. If step 3 is slow, consider converting the imagery to JPG format first to improve performance.
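
The width entered under “Meters at Scale” in step 3 is simply the ground footprint of the photo, which is the GSD multiplied by the image width in pixels. A minimal sketch, using example values rather than specs from any particular camera:

```python
# Sketch: the marker width for "Meters at Scale" equals the ground footprint
# of the photo (GSD x image width in pixels). Example values only.

def marker_width_m(gsd_cm_per_px, image_width_px):
    """Ground width in meters covered by an image, for the Raster Image Marker."""
    return gsd_cm_per_px * image_width_px / 100  # convert cm to m

# A 5472 px wide UAS image captured at 2.74 cm/px GSD
print(marker_width_m(2.74, 5472))  # roughly 150 m

# A smartphone photo lined up with the edges of a 0.5 m PVC quadrat needs no
# calculation: enter the quadrat width (0.5) directly as the marker width.
```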

Creating Annotations and Calculating Area

  1. Click on the Layers Panel on the toolbar.
    • Hover your mouse over “Create New Layer” and select “New Shapefile Layer.”
    • Click on “…” and navigate to your working directory where you would like to save the file. It is best practice to include date and location within the file name.
    • Select the coordinate reference system (CRS) for your project. Universal Transverse Mercator (UTM) is best for working in metric units. Gainesville, Florida, is in UTM zone 17N (Figure 11).
    • To annotate polygons, select “Polygon,” and to annotate points, select “Point.” Polygons can be used for calculating area or percent coverage, whereas point data can be used for species counts.
    • Add fields of interest by typing in the field name and data type under “New Field.” For qualitative data, select “String” for the data type; for quantitative data, select “Integer” or “Decimal.” String data is text, such as species names or descriptions; integer or decimal data are numbers, such as leaf counts or injury ratings.
    • Select “Add to Fields List.”
    • Select “OK.”
    Figure 11. Creating a new shapefile tool with inputs.
    Credit: A. E. Riner, UF/IFAS
  2. Right-click the new shapefile layer under the Layers Panel.
    • Select “Toggle Editing” (Figure 12).
    A highlighted sample layer and a menu containing the "Toggle Editing" option, which is denoted by a yellow pencil icon.
    Figure 12. “Toggle Editing” location.
    Credit: A. E. Riner, UF/IFAS
  3. Select “Add Polygon Feature” from the editing toolbar (Figure 13).
    "Add Polygon Feature" on the right side of the screen. The icon consists of a green, bean-like shape on top, with a small gold square below that contains a white star-like shape.
    Figure 13. “Add Polygon Feature” location.
    Credit: A. E. Riner, UF/IFAS
  4. Click the vertices around the feature of interest. When you are finished with the annotation, right-click to finalize the shape. Enter the field information and select “OK” (Figure 14).
    A closeup of a plant outlined in red within the quadrat. A menu to the right reads, "id: 1," "Species: Chamber Bitters."
    Figure 14. Creating annotation.
    Credit: A. E. Riner, UF/IFAS
  5. Right-click the layer under your Layers Panel and select “Properties.”
    • Under the “Symbology” tab, you can change the visualization of your annotation (e.g., making the interior of the annotation transparent) (Figure 15).
    • Be sure to save layer edits when you are finished annotating.
    Figure 15. Navigating to “Symbology” to make the polygon interior transparent.
    Credit: A. E. Riner, UF/IFAS
  6. To see a list of your completed annotations, navigate to the Attribute Table (step 4 in the previous directions).
    • You can use the Field Calculator to calculate the area of each species. The area will be in square meters (m²) if the coordinate system is set to UTM.
    • Select “Open Field Calculator.”
    • Select “Create a New Field” and enter the new field name.
    • Look for the “$area” option under the “Geometry” menu (Figure 16).
    • Select “OK.”
    Figure 16. Calculating the area of a polygon with the field calculator.
    Credit: J. S. Glueckert, UF/IFAS
  7. To pan between features, right-click the feature of interest in the Attribute Table and select “Zoom to Feature” (Figure 17).
    "Zoom to Feature" is shown as the third option on the menu.
    Figure 17. Panning between features of interest.
    Credit: J. S. Glueckert, UF/IFAS
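
For annotations drawn in a projected CRS such as UTM, where coordinates are in meters, the planar area that “$area” returns can be reproduced with the shoelace formula over the polygon’s vertices. A sketch with arbitrary example coordinates:

```python
# Sketch: in a projected CRS such as UTM (coordinates in meters), planar
# polygon area comes from the shoelace formula over the vertices. The
# coordinates below are arbitrary example values.

def polygon_area_m2(vertices):
    """Planar area in square meters of a polygon given (x, y) vertices."""
    area = 0.0
    n = len(vertices)
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]  # wrap around to the first vertex
        area += x1 * y2 - x2 * y1
    return abs(area) / 2

# A 2 m x 1.5 m rectangular patch of vegetation
patch = [(0, 0), (2, 0), (2, 1.5), (0, 1.5)]
print(polygon_area_m2(patch))  # 3.0
```

Dividing an annotated species’ area by the image footprint area gives an estimate of percent cover within that photo.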

Limitations and Applications

Establishing acceptable levels of spatial accuracy is essential to ensure the effectiveness of spatial analyses. Various limitations, such as image distortion, can affect accuracy; for example, GPS readings often come with a margin of error of 1 to 2 meters. While this level of precision may be unacceptable in critical applications such as targeted herbicide application, it may be justifiable in other contexts, such as conducting species surveys or assessing plant health. Satellite imagery and orthomosaic products can be valuable tools, but they are not suitable for every application. Higher-resolution products may be required in certain situations, but large areas of interest may create computational challenges due to the size of the files generated.

Conclusion

When planned effectively, UAS waypoint missions or in-the-field smartphone photo-point missions can yield individual images that facilitate detailed analyses of various attributes. These include rapid assessments of an AOI, inspections, vegetation cover assessments, and management of invasive species. Photo-point surveys using UAS can enhance field efficiency by enabling faster travel speeds, improved access, and a broader field of view, and allowing for larger areas to be covered in less time. The capability to obtain repeated measures of an AOI more frequently and at higher spatial resolution can complement traditional survey methods, such as quadrat sampling or line-point intercepts along transects. This approach not only builds a more comprehensive dataset but also allows for the archiving of imagery, enabling more detailed analyses and retrospective assessments in the future.

References

Barman, U., and R. D. Choudhury. 2022. “Smartphone Image Based Digital Chlorophyll Meter to Estimate the Value of Citrus Leaves Chlorophyll Using Linear Regression, LMBP-ANN and SCGBP-ANN.” Journal of King Saud University — Computer and Information Sciences 34(6): 2938–2950. https://doi.org/10.1016/j.jksuci.2020.01.005

Chintanadilok, J., S. Patel, Y. Zhuang, and A. Singh. 2022. “Mission Planner: An Open-Source Alternative to Commercial Flight Planning Software for Unmanned Aerial Systems: AE576/AE576, 8/2022.” EDIS 2022(4). https://doi.org/10.32473/edis-ae576-2022

Depauw, L., H. Blondeel, E. De Lombaerde, K. De Pauw, D. Landuyt, E. Lorer, P. Vangansbeke, T. Vanneste, K. Verheyen, and P. De Frenne. 2022. “The Use of Photos to Investigate Ecological Change.” Journal of Ecology 110(6): 1220–1236. https://doi.org/10.1111/1365-2745.13876

Dyrmann, M., A. K. Mortensen, L. Linneberg, T. T. Høye, and K. Bjerge. 2021. “Camera Assisted Roadside Monitoring for Invasive Alien Plant Species Using Deep Learning.” Sensors 21(18): 6126. https://doi.org/10.3390/s21186126

Fletcher, J., and A. Singh. 2020. “Applications of Unmanned Aerial Systems in Agricultural Operation Management: Part I: Overview: AE541/AE541, 6/2020.” EDIS 2020(6). https://doi.org/10.32473/edis-ae541-2020

Harsch, M. A., P. E. Hulme, M. S. McGlone, and R. P. Duncan. 2009. “Are Treelines Advancing? A Global Meta-Analysis of Treeline Response to Climate Warming.” Ecology Letters 12(10): 1040–1049. https://doi.org/10.1111/j.1461-0248.2009.01355.x

Kakarla, S. C., and Y. Ampatzidis. 2021. “Types of Unmanned Aerial Vehicles (UAVs), Sensing Technologies, and Software for Agricultural Applications: AE565/AE565, 10/2021.” EDIS 2021(5). https://doi.org/10.32473/edis-ae565-2021

Liu, J., and E. Pattey. 2010. “Retrieval of Leaf Area Index from Top-of-Canopy Digital Photography over Agricultural Crops.” Agricultural and Forest Meteorology 150(11): 1485–1490. https://doi.org/10.1016/j.agrformet.2010.08.002

Madsen, J. D., and R. M. Wersal. 2018. “Proper Survey Methods for Research of Aquatic Plant Ecology and Management.” Journal of Aquatic Plant Management 56: 90–96.

Singh, A., and J. Fletcher. 2021. “Applications of Unmanned Aerial Systems in Agricultural Operation Management: Part II: Platforms and Payloads: AE552/AE552, 02/2021.” EDIS 2021(1). https://doi.org/10.32473/edis-ae552-2021