Vegetation monitoring using multispectral sensors – best practices and lessons learned from high latitudes

Emerging drone technologies have the potential to revolutionise ecological monitoring. The rapid technological advances in recent years have dramatically increased the affordability and ease of use of Unmanned Aerial Vehicles (UAVs) and associated sensors. Compact multispectral sensors, such as the Parrot Sequoia (Paris, France) and MicaSense RedEdge (Seattle WA, USA), capture spectrally accurate high-resolution (fine grain) imagery in the visible and near-infrared parts of the electromagnetic spectrum, complementing satellite and aircraft-based imagery. Observations of surface reflectance can be used to calculate vegetation indices such as the Normalised Difference Vegetation Index (NDVI) for productivity estimates and vegetation classification. Despite the advances in technology, challenges remain in capturing consistently high-quality data, particularly when operating in extreme environments such as the high latitudes. Here, we summarise three years of ecological monitoring with drone-based multispectral sensors in the remote Canadian Arctic. We discuss challenges, technical aspects and practical considerations, and highlight best practices that emerged from our experience, including: flight planning, factoring in weather conditions, and geolocation and radiometric calibration. We propose a standardised methodology based on established principles from remote sensing and our collective field experiences, using the Parrot Sequoia sensor as an example. With these best practices, multispectral sensors can provide meaningful spatial data that is reproducible and comparable across space and time.
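The NDVI mentioned above is a simple normalised band ratio of near-infrared and red reflectance. A minimal sketch in Python (the function name and example reflectance values are illustrative, not from the manuscript):

```python
import numpy as np

def ndvi(nir, red):
    """Normalised Difference Vegetation Index from co-registered
    reflectance arrays (values in 0-1); returns values in [-1, 1]."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    denom = nir + red
    # Guard against division by zero (e.g. nodata pixels where both bands are 0)
    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(denom > 0, (nir - red) / denom, np.nan)

# Dense vegetation reflects strongly in NIR and absorbs red light:
ndvi(nir=[[0.45]], red=[[0.05]])  # ≈ 0.8 for this pixel
```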

New multispectral camera and sensor options continue to be released as technologies develop rapidly, yet many common considerations apply to the use of these types of sensors for the collection of vegetation monitoring data, as we describe below.

With the goal of collecting comparable and reproducible drone imagery in mind, we discuss the fundamental technical background of multispectral drone sensors (Section 1), outline the proposed workflow for data collection and processing (Section 2), and conclude by reviewing the most important steps of the protocol in more detail (Sections 3-6).

Reflectance is not directly measured by multispectral imaging sensors; instead, they measure at-sensor radiance, the radiant flux received by the sensor (Figure 1). Surface reflectance is a property of the surface independent of the incident radiation (ambient light), whereas at-sensor radiance is a function of surface radiance (the flux of radiation from the surface) and atmospheric disturbance between surface and sensor (see Wang et al.). Radiometric calibration relates these at-sensor measurements to surface reflectance; we note here that this is not a calibration of the sensor itself, but a calibration of the output data.
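One widely used way to perform such a calibration of the output data is the empirical line method: a linear fit between image digital numbers (DN) and the known reflectance of calibration panels captured in the scene. A minimal sketch (function name and panel values are illustrative assumptions, not from the manuscript):

```python
import numpy as np

def empirical_line(dn, panel_dn, panel_reflectance):
    """Map raw digital numbers (DN) to surface reflectance via a
    linear fit through calibration-panel measurements.
    panel_dn / panel_reflectance: mean DN over each panel and that
    panel's known reflectance (>= 2 panels needed for gain AND offset)."""
    gain, offset = np.polyfit(panel_dn, panel_reflectance, deg=1)
    return gain * np.asarray(dn, dtype=float) + offset

# Two hypothetical panels: dark (5 % reflectance) and bright (50 %)
refl = empirical_line(dn=[[12000]], panel_dn=[3000, 30000],
                      panel_reflectance=[0.05, 0.50])  # ≈ 0.2
```

With a single panel (as with the Sequoia's standard target), only the gain can be estimated and the offset is assumed to be zero.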

Practical aspects of radiometric calibration are discussed later in Section 6.

The relationship between DN and the surface reflectance value of a pixel is also influenced by the optical apparatus and the spectral response of the sensor, which require additional corrections.

Specific research questions and scientific objectives should be used to determine the exact methods used and the data outputs required from a multispectral drone survey (Figure 2). However, using a standardised workflow will help users avoid common pitfalls that affect data quality, and thus ensure repeatable and comparable data collection through time and across sites. We suggest starting by identifying the spatial and temporal scales required to address the research question.

Manufacturer guidance, online discussion boards and email lists (such as the HiLDEN network: arcticdrones.org) can provide help and information on these technical problems.

Upon completion of the flight, image data can be retrieved from the sensors and transferred to a computer for processing. We recommend backing up the drone/sensor memory after every flight to reduce the risk of data loss due to hardware failure and crashes.
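A per-flight backup can be scripted so that each copy is verified before the memory card is cleared. A minimal sketch (directory layout and function name are illustrative assumptions):

```python
import hashlib
import shutil
from pathlib import Path

def backup_flight(card_dir, backup_dir):
    """Copy every file off the sensor's memory card and verify each
    copy with a checksum before trusting the backup."""
    card, backup = Path(card_dir), Path(backup_dir)
    backup.mkdir(parents=True, exist_ok=True)
    for src in sorted(card.rglob("*")):
        if not src.is_file():
            continue
        dst = backup / src.relative_to(card)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)  # copy2 preserves timestamps
        # Compare checksums so a failed or partial copy is caught now,
        # not after the card has been reformatted in the field.
        if (hashlib.md5(src.read_bytes()).hexdigest()
                != hashlib.md5(dst.read_bytes()).hexdigest()):
            raise IOError(f"Checksum mismatch for {src}")
```

Keeping at least two independent copies (e.g. laptop plus external drive) further reduces the risk of loss from hardware failure.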

Processing will vary with the type of sensor and software used (Figure 2).

We suggest a final quality control step (Step 10) to assess the accuracy of the geolocation and radiometric calibration of the outputs before using them in the analysis to answer the research questions. We also highlight that drone surveys can produce large amounts of data, which can create challenges for data handling and archiving. It is helpful to produce a storage and archiving plan in advance.

We recommend a minimum of 75% overlap for multispectral flights, for both side- and front-lap.
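The overlap target translates directly into a trigger distance (front-lap) and flight-line spacing (side-lap) once altitude and the sensor's field of view are fixed. A minimal sketch (the ~61° FOV used in the example is an assumed illustrative value, not a specification from the text):

```python
import math

def flight_spacing(altitude_m, fov_deg, overlap=0.75):
    """Distance between image centres (trigger distance for front-lap,
    flight-line spacing for side-lap) achieving the given overlap.
    fov_deg: the sensor's field of view along the relevant axis."""
    footprint = 2 * altitude_m * math.tan(math.radians(fov_deg) / 2)
    return footprint * (1 - overlap)

# e.g. an assumed ~61 deg horizontal FOV at 50 m altitude, 75 % side-lap:
spacing = flight_spacing(50, 61, 0.75)  # ≈ 14.7 m between flight lines
```

Most mission-planning apps perform this calculation internally, but checking it by hand helps catch unit or FOV-axis mistakes.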

We recommend recording sky conditions during the flight (Table 2).

Table 2. Sky condition codes:
2 - Thin cirrus, sun not obscured
3 - Thin cirrus, sun obscured
4 - Scattered cumulus, sun not obscured
5 - Cumulus over most of sky, sun not obscured
6 - Cumulus, sun obscured
7 - Complete cumulus cover
8 - Stratus, sun obscured
9 - Drizzle
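For consistent flight logs, the sky-condition codes can be kept as a simple lookup so metadata entries are validated at the time of recording. A minimal sketch (the mapping below reflects the codes listed for Table 2; the helper name is illustrative):

```python
# Sky-condition codes as listed for Table 2
SKY_CODES = {
    2: "Thin cirrus, sun not obscured",
    3: "Thin cirrus, sun obscured",
    4: "Scattered cumulus, sun not obscured",
    5: "Cumulus over most of sky, sun not obscured",
    6: "Cumulus, sun obscured",
    7: "Complete cumulus cover",
    8: "Stratus, sun obscured",
    9: "Drizzle",
}

def log_sky(code):
    """Return a validated, human-readable flight-log entry."""
    if code not in SKY_CODES:
        raise ValueError(f"Unknown sky code: {code}")
    return f"sky_code={code}: {SKY_CODES[code]}"
```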

In this manuscript, we suggested a standardised workflow for multispectral drone surveys and discussed the technical aspects and challenges of multispectral drone sensors, flight planning, the influence of weather and sun, as well as aspects of geolocation and radiometric calibration. We believe that these key factors, if properly accounted for, will allow the majority of multispectral drone surveys to produce data that is comparable across different study regions, plots, sensors and time. We encourage ecologists and other researchers to incorporate these methods and perspectives in their planning and data collection to promote higher data quality and allow for cross-site comparisons. Standardised procedures and practices across research groups (e.g., those developed by the HiLDEN network) have the potential to provide highly valuable baseline data that can be used to track vegetation change across sites and over time.