OPEN ACCESS | Review
Drone Syst. Appl. 11: 1–22 (2023) | dx.doi.org/10.1139/dsa-2023-0021
Fundamental practices for drone remote sensing research
across disciplines
Adam J. Mathews (a), Kunwar K. Singh (b), Anthony R. Cummings (c), and Stephanie R. Rogers (d)
(a) Department of Geography, Binghamton University, State University of New York, Binghamton, NY 13902, USA; (b) AidData, Global Research Institute, William & Mary, Williamsburg, VA 23185, USA; (c) Geospatial Information Sciences Program, School of Economic, Political and Policy Sciences, University of Texas at Dallas, Richardson, TX 75080, USA; (d) Department of Geosciences, Auburn University, Auburn, AL 36849, USA
Corresponding author: Adam J. Mathews (email: adam.mathews@binghamton.edu)
Abstract
Drone remote sensing research has surged over the last few decades as the technology has become increasingly accessible.
Relatively easy-to-operate drones put data collection directly in the hands of the remote sensing community. While an abun-
dance of remote sensing studies using drones in myriad areas of application (e.g., agriculture, forestry, and geomorphology)
have been published, little consensus has emerged regarding best practices for drone usage and incorporation into research.
Therefore, this paper synthesizes relevant literature, supported by the collective experiences of the authors, to propose ten
fundamental practices for drone remote sensing research, including (1) focus on your question, not just the tool, (2) know the
law and abide by it, (3) respect privacy and be ethical, (4) be mindful consumers of technology, (5) develop or adopt a data
collection protocol, (6) treat Structure from Motion (SfM) as a new form of photogrammetry, (7) consider new approaches to
analyze hyperspatial data, (8) think beyond imagery, (9) be transparent and report error, and (10) work collaboratively. These
fundamental practices, meant for all remote sensing researchers using drones regardless of area of interest or disciplinary
background, are elaborated upon and situated within the context of broader remote sensing research.
Key words: drones, unoccupied aircraft systems, UAS, remote sensing, geographic information science, research
Introduction
Drones have significantly impacted the remote sensing
community by providing a temporally flexible, relatively low-
cost, and easy-to-operate platform to obtain very high spatial
resolution data over small study areas. Otherwise known as
unoccupied/unmanned/uncrewed aircraft systems (UAS), un-
occupied aerial vehicles, or remotely piloted aircraft, among
other names, drones provide a new and exciting perspec-
tive for Earth observation. Drones enable research at un-
precedented geographic scales, filling a low-altitude data
collection gap between ground-based field/survey data col-
lection and moderate-altitude aerial data collection from
occupied/piloted aircraft. Not surprisingly, remote sensing research on applications ranging from forestry (Wallace et al. 2012) and precision agriculture (Matese et al. 2015) to atmospheric measurement (Hemingway et al. 2017) incorporates drones as data collection devices. Beyond research applications, drones aid basic remote sensing research through multi-scale/multi-sensor data fusion analyses (Campbell et al. 2020; Alvarez-Vanhard et al. 2021). Regardless of intended use, drones (re)connect researchers and students to the remote sensing data collection process (not including in situ measurements) that has been neglected over the past several decades due to the increased availability of free satellite and aerial imagery (Rogers et al. 2022).
Simic Milas et al. (2018) envision drones as the third-generation source of remotely sensed data after moderate-/high-altitude aircraft and Earth-orbiting satellites. The popularity and importance of drones to remote sensing are undeniable and are reflected in the publication of drone-focused textbooks (e.g., Jensen 2017; Aber et al. 2019; Green et al. 2020; Frazier and Singh 2021). Teaching and learning successes attest to the many benefits of incorporating drones into the classroom (Jeziorska 2014; Cliffe 2019; Joyce et al. 2020; Yang et al. 2020). Drones, further, provide a remote sensing platform for non-traditional users, i.e., an era of "personal remote sensing" (Jensen 2017). Some go so far as to state that drones democratize remote sensing (Carrivick et al. 2016; Frazier and Singh 2021; see survey responses from Rogers et al. 2022), enabling citizen scientists to acquire and share their own geographic data aided by web-based platforms such as OpenAerialMap (openaerialmap.org) and GeoNadir (geonadir.com).
So, why should we care? Do drones change remote sensing? If so, how? By providing very high spatial, or hyperspatial, resolution data (the most substantial offering to date, both as two-dimensional images and three-dimensional point clouds), drones open new avenues for exploration and challenge traditional remote sensing methodological approaches, indicating what some see as a paradigm shift within remote sensing (Lippitt 2015). Conceptualizing the incorporation of drones within the remote sensing model (RSM; Strahler et al. 1986) and the remote sensing communication model (Lippitt et al. 2014), Lippitt and Zhang (2018) emphasize how drones impact optical remote sensing by providing 1–2 cm ground sampling distances necessitating H-resolution approaches (i.e., scenes wherein individual objects can be identified) such as object-based image analyses (OBIA). This notion of a paradigm shift in the context of only optical data presents a compelling case. Given that drones enable data collection from a variety of other sensors beyond images (e.g., lidar and radar), the argument that they present a paradigm shift is even stronger.
The sheer volume of remotely sensed data and its level of spatial detail, though, have outpaced our ability to effectively analyze it (see Li et al. 2016; Shook et al. 2019). Other problems present themselves regarding this new scale of data collection and its relationship to the scale of conceptualization of geographic problems: a disconnect may emerge between the two because rapid advancements in drone technology have made hyperspatial and other data so readily available to researchers. Are we, then, producing too much data to actually address our research objectives and problems? Much like highly accessible geo-enabled smartphones, drones contribute a great deal to the era of geospatial big data.
While there are many useful review papers and bibliographies (Hardin and Hardin 2010; Watts et al. 2012; Colomina and Molina 2014; Pajares 2015; Cummings et al. 2017c; Mathews and Frazier 2017; Singh and Frazier 2018; Hardin et al. 2019; Tmušić et al. 2020; Mathews 2022; Nex et al. 2022)
focusing on drone remote sensing, these works (albeit with
varying foci) do not provide both straightforward and broad-
in-scope (not focused on technical specifications) guidance
for those adopting drones as a remote sensing tool to conduct
research. Other works help to frame and conceptualize the
integration of the technology into the modern remote sens-
ing field (e.g., Lippitt and Zhang 2018;Simic Milas et al. 2018),
but likewise do not provide broad practical advice on the use
of the technology. Therefore, this article provides such direc-
tion by proposing ten fundamental practices for those con-
ducting remote sensing research with drones, regardless of
research focus. There could be more than ten practices be-
cause there are myriad items to cover, but we organized this
brief list for the sake of simplicity and the utility of adoption.
We aim to provide a go-to resource with both the ten funda-
mental practices and an extensive in-text citation/reference
list for both remote sensing and non-remote sensing scien-
tists new to drones as well as those with extensive experi-
ence. Importantly, though, we do distinguish remote sensing scientists, i.e., those with extensive background in the field and actively publishing research, from non-remote sensing scientists adopting drone remote sensing methods to conduct research. Such a differentiation is increasingly important because drones are highly accessible (compared to more traditional forms of remote sensing), but without knowledge of remote sensing fundamentals, users risk significant errors in the collected data. In this way, the latter group might benefit more from this paper. We also provide a critical remote sensing perspective on the subject by posing conceptual challenges and opportunities.
Ten fundamental practices
We propose ten fundamental practices for drone remote
sensing regardless of field of study, as illustrated in Fig. 1: (1)
focus on your research question, not just the tool, (2) know
the law and abide by it, (3) respect privacy and be ethical,
(4) be mindful consumers of technology, (5) develop or adopt
a data collection protocol, (6) treat Structure from Motion
(SfM) as a new form of photogrammetry, (7) consider new ap-
proaches to analyze hyperspatial data, (8) think beyond im-
agery, (9) be transparent and report error, and (10) work col-
laboratively.
Focus on your research question, not just the
tool
Similar to geographic information systems (GIS) and global navigation satellite systems (GNSS) before them, the excitement associated with a new technology, i.e., the emergence of low-cost off-the-shelf remote sensing drones, has led to rapid adoption with little forethought regarding how such technology might alter the generation of research questions and challenge current methodological conventions. GIS is "sometimes accused of being technology driven, a technology in search of applications" (Goodchild 1992, p. 31). This accusation could very well be directed at drone technology. Those
tion could very well be directed at drone technology. Those
looking to utilize drones, a relatively new tool, in their re-
search must first, and most importantly, focus on the re-
search question and not just the technology. As much as
possible, the tool should not dictate or influence the devel-
opment of research questions, objectives, and/or hypotheses
(i.e., the scientific process should not be changed based on
using a drone). That said, given the history of remote sensing
as a field of study, the tools (from satellite-based optical sen-
sors to airborne lidar) impact and constrain what researchers
can do. Importantly, it is up to the scientist to form suitable
research objectives that align with obtainable remote sens-
ing data. After developing research questions, consider the
questions provided in Box 1 to assess whether (and to what
degree) a drone is needed to address said questions.
Focusing on the research question helps narrow down the type of imagery (or other data), seasonality of the targeted object, spatial resolution, image overlap (if imagery is being collected), and frequency of data collection. If the sole purpose of drone-captured images is to map land cover types for more than 1 hectare (>10,000 m²), high-resolution satellite imagery (e.g., PlanetScope) captured at high temporal frequency should be strongly considered, as it might serve mapping better than spending hours planning, capturing, and processing drone images. Both the area and height of the object of interest should be considered when mulling over acquiring drone imagery.
Fig. 1. The ten fundamental practices for drone remote sensing research across disciplines, starting with the importance of focusing on your research question and ending with the significance of working collaboratively. These practices are not an exhaustive list but are synthesized from peer-reviewed literature and the authors' accrued field experience.
Box 1. Questions for researchers to consider before adopting drones.
- Do you need drone-captured data to answer your research questions, or are there existing remote sensing datasets (e.g., satellite or aerial imagery, lidar) that meet the needs of your study? Consider the necessary temporal and spatial resolutions required for your specific research purpose.
- Will the spatial resolution and scale of drone data match the detail and scale of your study topic?
- Will you be able to obtain permission to fly a drone in your study area?
- Are you planning to use data for mapping or measuring objects (e.g., heights, crown width)?
- What is the smallest size of the object (e.g., wildlife such as birds or elephants; crop types such as rice or orchards via individual apple tree canopies; infrastructure) you will be mapping or measuring?
- Do seasonal changes affect the object of interest (e.g., presence/absence, growth, and location)?
- Will you create 3D data from drone-captured images using SfM photogrammetry?
- Do you have access to software (e.g., flight mission planning, SfM, image stitching/orthophoto production) crucial for capturing and processing images, both in the lab and the field?
Fig. 2. Shape and size of field observations (e.g., golf bunker, road, and golf cart) with minimum mapping unit (A) and number of 6 cm × 6 cm pixels (B), 1 m × 1 m pixels (C), and 5 m × 5 m pixels (D) for mapping these objects. Drone images of C and D are aggregated to 1 and 5 m spatial resolutions to show the number of pixels required to map field observations.
Drone-captured images, due to their sub-centimeter resolution, are optimal for counting individual birds in a colony (Afán et al. 2018) or identifying wilted crops (Dang et al. 2020) compared to commercial high-resolution satellite imagery. However, users often lose sight of the minimum spatial resolution they actually require, known as the minimum mapping unit, and continue focusing on images with the highest possible resolution that challenge their data storage and computational resources (see Fig. 2). For example, do you require drone images below sub-centimeter spatial resolution if the smallest size of the object of interest is 100 cm²? A spatial resolution between 1 and 5 cm might be optimal. As a rule of thumb, divide the smallest size of the object of interest by four to estimate the spatial resolution: the smallest feature that you can reliably identify usually needs to cover four contiguous pixels in an image. Therefore, the spatial resolution of the drone image should be 5 cm to meet the requirement of 25 cm² (see the sketch below). Plant phenology, along with seasonal changes that affect vegetation productivity and greenness, should be a part of the "focus on your question" to improve the effectiveness of spatial resolution in mapping or measuring any aspect of vegetation. Targeting a certain growth stage of a plant (e.g., flowering) when planning drone data acquisition may offer a better distinction. Focus on your question! If the end goal of a planned project is to map an object or land cover type, users should consider commercial imagery that may cover a larger area with the lowest possible geometric and radiometric errors.
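The rule of thumb lends itself to a quick calculation. The following Python sketch (the function name and values are ours, for illustration only) estimates the target ground sampling distance from the smallest object of interest under the four-contiguous-pixel criterion described above.

```python
import math

def target_gsd_cm(smallest_object_area_cm2: float, pixels_required: int = 4) -> float:
    """Estimate the ground sampling distance (pixel edge length, cm) so the
    smallest object of interest covers at least `pixels_required` pixels.
    Implements the rule of thumb from the text: divide the smallest object
    area by four, then take the square root of the per-pixel area.
    """
    pixel_area_cm2 = smallest_object_area_cm2 / pixels_required
    return math.sqrt(pixel_area_cm2)

# Example from the text: a 100 cm² object -> 25 cm² per pixel -> 5 cm GSD
print(target_gsd_cm(100.0))  # 5.0
```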
Measuring any aspect of the object of interest requires many photogrammetric considerations (i.e., % overlap [forward and side], georeferencing approach [direct or indirect with ground control points (GCPs)], flight altitude, and terrain) in the drone image acquisition. To create a precise three-dimensional (3D) structure of the object of interest using the SfM photogrammetric technique, consistent overlap between successive images (i.e., forward overlap) and between flight passes (i.e., side overlap) is one of the most important considerations (Singh and Frazier 2018). The normal forward and side overlaps are 60% and 30%, respectively. Image stitching requires only 20% overlap between images to create a mosaic for mapping, but this is insufficient to create any surface models (e.g., digital surface model (DSM)). Image stitching is the process of combining drone-captured images to create a seamless image or photomosaic. Higher levels of overlap are suggested (i.e., 85% forward and 70% side) for creating accurate point clouds and surface models (Dandois et al. 2015; Fraser and Congalton 2018). Flat terrain may require higher overlap to extract matching points for 3D point clouds. Increasing overlap beyond these values increases the number of swaths and images for the study area, thereby increasing the flight duration, image storage, and image processing time (see the sketch below). The height of the object of interest and terrain characteristics also affect the drone's cruising altitude and takeoff locations (see Thomas et al. 2020). A drone's actual altitude above ground level varies with undulating terrain. The targeted overlaps will decrease if the terrain is higher than the drone's takeoff point (e.g., DronesMadeEasy 2021). High-quality image stitching requires many surveyed GCPs or manually documented tie points to build a geometric relationship between the ground reference and images for creating a seamless image. The GCPs are a set of points on the Earth's surface with known coordinates. A suitable number of GCPs, representing landscape type (e.g., mono-stand crop, commercial forest; high relief), contribute to better image stitching for a high-quality photomosaic (see Galván Rangel et al. 2018). Tie points (e.g., road intersections, rock outcrops, crop rows, buildings, trails, etc.) are easily recognizable objects that can be identified in overlapping images to improve the alignment between images of the mosaicked imagery. Hence, focus on your question to decide these variables in advance for capturing images suitable for measuring various parameters (Abdullah 2021).
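To make these overlap trade-offs concrete, the sketch below (our own illustration, not a mission planning tool) derives the image ground footprint from basic camera geometry and converts forward/side overlap percentages into exposure spacing and flight-line separation; raising overlap from 60%/30% toward 85%/70% sharply shrinks both spacings and thus multiplies the image count. The camera values are hypothetical.

```python
def ground_footprint_m(altitude_m, focal_mm, sensor_w_mm, sensor_h_mm):
    """Nadir image footprint (width, height) in meters over flat terrain."""
    scale = altitude_m / (focal_mm / 1000.0)  # photo scale denominator
    return (sensor_w_mm / 1000.0) * scale, (sensor_h_mm / 1000.0) * scale

def spacing_m(footprint_w, footprint_h, forward_pct, side_pct):
    """Exposure spacing along track and flight-line separation across track.
    Assumes the sensor's height dimension is oriented along track."""
    along_track = footprint_h * (1 - forward_pct / 100.0)
    across_track = footprint_w * (1 - side_pct / 100.0)
    return along_track, across_track

# Hypothetical camera: 8.8 mm lens, 13.2 mm x 8.8 mm sensor, flown at 100 m AGL
w, h = ground_footprint_m(100, 8.8, 13.2, 8.8)
print(spacing_m(w, h, 60, 30))  # traditional overlaps -> (40.0, 105.0) m
print(spacing_m(w, h, 85, 70))  # SfM-style overlaps -> (15.0, 45.0) m
```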
Know the law and abide by it
The civil aviation authority (CAA) of the country within
which a researcher will operate a drone dictates how, when,
and where they can do so and for what purposes (e.g.,
Unidad Administrativa Especial de Aeronáutica Civil [Colom-
bia], South African Civil Aviation Authority, Guyana Civil Avi-
ation Authority [GCAA; Guyana], etc.). In the United States,
the Federal Aviation Administration (FAA) permits drone op-
erations under the Part 107 (or small UAS) Rule, requiring
those conducting research to obtain a “Remote Pilot” license
(FAA 2022) and follow regulations such as operating within Class G airspace (below 400 feet above ground level), maintaining a visual line of sight with the aircraft, flying at night only with anti-collision lighting, etc. In the case of
the FAA, obtaining a remote pilot license only requires pass-
ing a written exam and a background check without having
to demonstrate satisfactory ability to operate a drone. Therefore, it is imperative that new remote pilots train with experienced remote pilots. Given the amount
of time needed to obtain a license and the steep learning
curve associated with remotely controlling aircraft, some re-
searchers might rely on colleagues or consultants to collect
data for them. The onus remains on the scientist, though, to
make sure whoever operates the drone adheres to all laws
and regulations.
Importantly, though, civil aviation authority regulations
are fluid and continue to change. For example, flying at night
was only recently permitted by the FAA. Further, the FAA in-
stituted the Remote Identification (ID) rule in 2021, requiring drones to broadcast aircraft information such as the drone ID, location and altitude, velocity, takeoff location, and more during operation (FAA 2021). With the complicated integra-
tion of drones into the airspace, rules are expected to con-
tinue changing, and those using drones for remote sensing
should check for aviation authority changes/updates before
conducting any flight. Remote pilots, in most cases, have to renew licenses/certificates regularly (e.g., the FAA requires recurrent training every 24 calendar months). Be cognizant, too, that regulations vary between countries (Stöcker et al. 2017) and sometimes within countries (Hodgson and Sella-Villa 2021). That is, an FAA-issued "Remote Pilot" license is only adequate for drone flight in the United States unless another country's CAA formally declares otherwise. In another variation, in Guyana, where a remote pilot license is not required, the GCAA recently issued an advisory that affects the entire chain of events for flying a drone in that country. Depending on the class of drone (micro [100 g or less], very small [100 g and less than 2 kg], small [2 kg and less than 25 kg], medium [25 kg and less than or equal to 150 kg], and large [greater than 150 kg]), someone lacking a UAS Operator's Certificate who wishes to import a drone into the country needs a letter of importation at a cost ($4,000 [US$20], $6,000 [US$30], or $10,000 [US$51], respectively; GCAA Advisory Circular, 2022). To obtain a UAS permit/flight authorization, the cost is determined by the size of the drone. Aside from fees for importation and obtaining flight authorization, upon making an application for the issue or renewal of a UAS Operator Certificate, there is a requirement for a non-refundable deposit of the basic fee based on whether the application is for a first-time license or renewal. The GCAA guidelines have also been amended to include fees for commercial operations where "dangerous goods" may be carried by the drone. This illustrates the complexity of legal flight from country to country; researchers must not overlook the amount of time and effort needed to adhere to regulations. Fortunately, the International Civil Aviation Organization (ICAO) provides guidance to its 193 member states, which have similar regulations (ICAO 2022).
Beyond adhering to CAA rules, drone users must consider
within-country, state, and local (e.g., county/parish, town,
city, etc.) regulations as well (see Fig. 3). Typically, these local
authorities do not and cannot regulate the airspace, but other
regulations can be in place, often related to where you are
flying. For example, many public lands (e.g., city and county
parks, state preserves) have outright bans on drone usage on
their properties. Often, though, exemptions for scientific re-
search can be approved by such authorities assuming drone
operators agree to follow additional policies (i.e., fly when
the park is closed to visitors [hourly/daily], fly when an endangered bird species is no longer roosting in the park [seasonal]). Data collection plans must account for these potential
hindrances, which is especially important with remote sens-
ing that requires data capture at specific times of day, such as near noon when park visitor traffic is high. To avoid any issues in the field, drone pilots should obtain written permis-
sion from all landowners, including those adjacent to study
areas, before conducting flights. In the U.S., this also ensures
that drone operators adhere to United States v. Causby, which
granted landowners exclusive rights to airspace up to 350 feet
above the ground or ground-based structures (U.S. Supreme
Court 1945). Drone operators should maintain open commu-
nication with all stakeholders (e.g., park managers, private
landowners) regarding when flights will be conducted (in-
cluding who will be involved, i.e., an individual or team) and
if any other field data will be collected. Print out permissions
and have a copy readily available when conducting data col-
lection. The process of obtaining permission may take days,
weeks, or months, depending on the site, and this should be
considered when planning your research strategy.

Fig. 3. Rules and regulations that remote sensing scientists collecting data need to adhere to, from civil aviation authorities regulating the airspace (top) to landowners and potential spectators on the ground (bottom). All these aspects are important and should be thoroughly reviewed before operating a drone at a study site.
Permissions from your employer also need to be secured.
In the academic environment, many universities have drone
standing committees and/or review boards that issue policies
and guidelines for those using drones in research. University-
specific rules should be researched and followed. Further-
more, an application may need to be filed or additional con-
sultation with administrators may be needed prior to a flight.
Reaching out to the chair of the committee/review board (if
your university has one) is never a bad idea if you have ques-
tions or concerns.
Employers such as universities often handle insurance for drone users; insurance is highly suggested to safeguard people (and, to a lesser extent, property) and drone equipment during use. Most, if not all, remote pilots can share stories
of accidents occurring in the field due to equipment mal-
function, remote pilot error, rapidly changing weather condi-
tions, and a suite of other issues. Even the best remote pilots
face complicated circumstances that can lead to an accident.
Being insured alleviates the risks associated with operating
a drone. Some universities also require each drone to be in-
sured prior to flight.
Respect privacy and be ethical
“Emerging technology can advance faster than society as a
whole can reasonably assimilate and create new laws, poli-
cies, or ethics to govern the conduct and operation of the
technology” (Slonecker et al. 1998, p. 590). Initially, this was
the case with the FAA, which took several years to create
and implement Part 107 to regulate drones in the national
airspace. Such a lag is also apparent regarding additional poli-
cies and ethics. To protect privacy and conduct ethical drone
remote sensing research, think beyond adherence to only
CAA regulations but also in terms of obtaining permission to
be on land (e.g., public, private, and Indigenous peoples land)
and collecting data where people are present (i.e., for safety
but also how collected data might impact people and their
livelihoods). The remote sensing community must go beyond
the minimum requirements, including compliance with ethical codes provided by geospatial organizations such as the ASPRS (2014), GISCI (2022), and URISA (2003).
Further, remote sensing scientists must resist assuming
that all will benefit from the use of drone technology. Not
everyone has positive associations with the technology, and
the significance of drones used for military and surveillance
purposes cannot be overstated (see Bracken-Roche 2016;
Jackman and Brickell 2021). Drones, especially in military set-
tings, cast a long shadow over the utility of these devices
and their potential benefits to society and humanity over-
all. Scholars emphasize the negative connotations of military
drone usage and how these perceptions transfer over to and
persist in non-military settings (Coker 2013;Enemark 2013;
Cohn 2014).
Researchers should remain open to public concerns about
privacy related to drones. It is the researcher’s obligation
to ensure that everyone surrounding the flying of a drone
is aware of: (1) the research objectives and why a drone is
needed to address them, (2) what data will be obtained and
how it will be used, (3) permission being obtained from the
landowners prior to flying, and (4) an open review process for landowners/community members to view the data (and approve its use) obtained by the drone before leaving the
study site. The researcher is responsible for protecting the
integrity and privacy of the individuals captured in drone im-
agery. In other words, even though permission was obtained
prior to data collection and an initial review of imagery was
completed before leaving the field, the researcher must main-
tain the responsibility of always protecting people captured
in the data.
Unlike satellites, which fly overhead at altitudes that make them inaccessible and invisible to people on the ground, drones change this aspect of remotely sensed data collection. The entire process of drone
data acquisition provides an opportunity for people to be in-
volved. Community members can provide input on the ar-
eas where flying is permitted and the implications for the
research questions being examined, help to develop drone
flight plans/missions, review data and provide insight into
land-cover classification and analysis, and help to process im-
agery in the field to identify potential areas of concern for im-
agery quality and other issues. Therefore, drones provide an
opportunity for people’s involvement on the ground, allow-
ing the drone data acquisition process to follow accepted ethi-
cal norms while simultaneously presenting opportunities for
transforming the significance of drone data for larger societal
applications. Related to this, data sharing agreements enable drone researchers and research subjects to define who will have access to the data, what will be done with said data, and when/how the data will be shown (i.e., very high spatial resolution data can contain several forms of identifying information). Special attention should
be paid to respecting sacred sites and cultural spaces (Davis
et al. 2021), such as Indigenous landscapes. In all instances
of work in such spaces, researchers must obtain community
approval and involve community members in the research
process (Cummings et al. 2017a).
The ethical operation of drones and considerations to
protect human and non-human subjects during and post-
operation, in any setting, should be established long before
the remote pilot and team plan missions. Drone operators
should approach their work with the express knowledge and
understanding that the data they obtain will contain aspects and elements that will make the research subjects (people or landscapes) vulnerable in some way. Consequently, a re-
sponsible drone operator will anticipate the type of data that
will be obtained, what will be contained in the imagery or
other data being captured, and likely outcomes of such data
falling into the wrong hands despite the most stringent and
well-intentioned data management protocols. The operator
will therefore develop a workflow and protocols for data col-
lection and management that will allow potential negative
outcomes to be minimized. The drone operator should seek
to separate their data management and data collection proto-
cols into distinct phases and identify potential risks, as sug-
gested in Box 2.
Consider, too, the storage and handling of drone-collected
data (i.e., who will have access to the data). Cloud-based com-
panies are governed by their home countries and therefore
can be influenced by geopolitical tensions. This is well doc-
umented with the United States Government, where adop-
tion and use of Chinese-made DJI drones are largely banned
due to security concerns (Wright 2017; Puko and Ferek 2019; U.S. DOI 2020; U.S. DOD 2021). Data processing modality
is a significant consideration (i.e., desktop/local storage vs.
cloud processing/storage); for example, should the faces of
onlookers or vehicle license plates be captured within images
and processed using a cloud-based platform (e.g., Pix4Dcloud,
etc.), identifying information is then on servers. Those using
such services (scientists and professionals) typically do not
have any say in altering any data agreements with technology
companies.
Be mindful consumers of technology
As the number of drone users has increased, so have the technologies to support drone data collection. Relatively inexpensive, off-the-shelf products (e.g., drone/aircraft, sensors, apps, and SfM software) have made drones easy to adopt and allowed users to obtain high-quality data products quickly and effectively. A growing repository of help documentation exists to support drone users' endeavors (e.g., DroneDeploy 2022; Pix4D 2022). Scientists must be cautious when relying on the black box functionality associated with many off-the-shelf products with proprietary algorithms (e.g., SfM-MVS software, especially). When the user is only concerned with the inputs and outputs of the software, uncertainty may arise in the resulting datasets. The user should be aware that even seemingly minute changes in data collection procedure (e.g., altitude, overlap, image/camera settings, etc.) or processing parameters (e.g., point cloud density) can influence resulting data accuracy (O'Connor et al. 2017; Pricope et al. 2019; Young et al. 2022). Open-source software options for flight control/operations (e.g., ArduPilot and Pixhawk) and data processing (e.g., MicMac, OpenDroneMap, VisualSfM) continue to emerge and evolve (Ebeid et al. 2017) but are sometimes constrained by inconsistent updates, difficult-to-use graphical user interfaces (including language barriers), and a lack of integration between desktop and cloud platforms.
Accuracy is a critical component of mapping products produced with drones (e.g., orthophotomosaics and digital elevation models). Without knowing better, a user may expect all drone products to be created equal, but that is not the case. Scientists report a high degree of variation in the accuracy of SfM-generated elevation data depending on the type of off-the-shelf drone used (Rogers et al. 2020; Stark et al. 2021). Most off-the-shelf drones produce outputs with high relative accuracy; that is, when features within the project are compared to other features within the project, positional accuracy is very high. However, the absolute accuracy, that is, when data are compared to a true position on the Earth's surface, ranges from ±1–2 m horizontally and vertically (Thomas et al. 2020). This is problematic when overlaying drone data products with other georeferenced data layers or if you want to compare drone outputs from the same location over time. Several off-the-shelf drones now include onboard real-time kinematic (RTK) GNSS functionality to increase positional accuracy; however, this drastically increases the cost of the drone. While RTK is useful for real-time positional processing, it is not completely necessary. Users can instead place ground control targets within their mapping area, obtain geographic coordinates from those targets using a high-precision GNSS (often already available to researchers in academic settings), and then use those GCPs to improve the absolute accuracy of their drone outputs (Sanz-Ablanedo et al. 2018; Oniga et al. 2020; Liu et al. 2022).
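One way to quantify the absolute accuracy discussed above is to compute root mean square error (RMSE) at independent check points. The sketch below is illustrative only; the residuals are invented, and in practice they would be differences between model coordinates and survey-grade GNSS coordinates at each check point.

```python
import numpy as np

def rmse(errors) -> float:
    """Root mean square error of a sequence of residuals."""
    e = np.asarray(errors, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

# Hypothetical residuals (m) at four check points: model minus GNSS truth
dx = [0.03, -0.05, 0.02, 0.04]
dy = [-0.02, 0.03, -0.04, 0.01]
dz = [0.08, -0.06, 0.05, -0.09]

horizontal = rmse(np.hypot(dx, dy))  # combined XY residual per point
vertical = rmse(dz)
print(f"RMSE_xy = {horizontal:.3f} m, RMSE_z = {vertical:.3f} m")
```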
When conducting multi- and hyperspectral surveys with drones, radiometric calibration is another critical component that needs to be controlled for in the field (Fawcett and Anderson 2019; Xu et al. 2019). That is, it is controlled by the researcher and not the (sensor) manufacturer, which requires scientists to have in-depth knowledge of remote sensing fundamentals.
Box 2. Hypothetical data collection effort with significant privacy and ethical concerns.
Phase 1: Pre-mission
During the pre-mission phase, the drone operator/remote pilot must consider the research subjects and how the data they are about to obtain might negatively impact them. While flying over people is not allowed in many jurisdictions, the operator should assume that people will be captured in the imagery. Therefore, the remote pilot should explain to human subjects what data are likely to be captured and provide them with an opportunity to determine whether they will permit themselves and their spaces to be imaged. The permission process should include disclosing to human subjects that: (1) data on sensitive sites may be acquired, (2) images of people may be captured, and (3) data on property may be captured. The pre-mission assessment should also include providing details on the members of the mission team, the actions that will be taken to protect human subjects, and, where necessary, the mode of communication the team will employ. Prior to planning the drone mission, the operator should explain these details to the human subjects and obtain permission before flying over property or people. Remember, consent to fly in any space can be rescinded as desired by human subjects, including local-, national-, and international-level authorities. But if the operator is transparent about where they intend to fly, what data they are likely to obtain, and what the data will be used for, consent will likely remain in place. The pre-mission assessment should be included in the standard human subjects research protocol (e.g., IRB) that researchers need to adhere to for their work.
Phase 2: Mission
The operator must be prepared to abort a mission, if necessary, to protect lives and adhere to their research protocol. As an example, while one of the authors was completing a mission in a rural setting, a mother carrying a days-old baby suddenly appeared on the edge of the area being flown. In this case, she happened to be aware of the mission, was curious, and wanted to view the drone while it flew over her family's farm. Imagine the potential disaster if the drone did something unexpected that led to injury. Thankfully, the team operating the drone was sufficiently trained and alert to recognize the risk posed by completing the mission. Team members recognized that pre-mission protocols would be breached if these persons were present, as consent was given to fly only in the presence of the team members. The pilot was quickly alerted, and the mission was aborted. Drone operators must be aware of their operating protocols and be prepared to abort a mission at the small sacrifice of battery life and camera data storage to ensure they remain in compliance. Where first-person view capabilities are present on the drone, the operator can ensure that the data being obtained adheres to the pre-mission standards.
Phase 3: Post-mission
Upon the completion of flights, the drone operator must review the collected data with human subjects to ensure that they are comfortable with the images and data obtained. The operator should work with human subjects to review the captured data to remove any images that contain human subjects or simply do not meet the requirements of the established pre-mission planning protocols. In the production of publications (assuming involved persons approve of the use of the collected data), the operator must remove any features that may have cultural or other significance, or that may identify individuals.
Sensor-measured radiance, a portion of irradiance (or total incident energy), varies over time and space, requiring a number of corrections to obtain accurate spectral information. The same targets in a mapping scene display varied radiance values under different environmental conditions (e.g., weather/cloud cover, and aerosols), viewing angles (requiring bidirectional reflectance distribution function [BRDF] modeling; Li et al. 2018; Deng et al. 2021), and more. Thus, scientists must undertake in situ calibration using a spectral target either before, or before and after, image collection occurs, depending on the sensor employed (e.g., MicaSense 2022).
Calibration values are then used to correct the images dur-
ing post-processing to ensure that consistent spectral data (in
units of spectral reflectance) across multiple scenes or time
periods are obtained (Kelcey and Lucieer 2012). This is an es-
pecially important consideration because many optical sen-
sors developed specifically for drones are known to introduce
significant radiometric error (Huang et al. 2021). In addition
to these issues, calibration of camera lens distortion (inte-
rior orientation corrections) must be conducted prior to data
collection. It is true that drone aerial image data can be col-
lected, processed, and presented with the researcher know-
ing little about remote sensing or photogrammetry. The best
practice is to be mindful of the correct data collection param-
eters for your research target in the field and knowledgeable
about how imagery is processed with SfM software. Without
considering the calculations happening behind the scenes
or understanding the myriad flight collection and post-
processing parameters, researchers run the risk of inaccurate
results.
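A common field implementation consistent with the panel-based workflow described above is the empirical line method, which fits a linear mapping from raw digital numbers (DN) to surface reflectance using targets of known reflectance. The single-band sketch below uses invented panel values purely for illustration.

```python
import numpy as np

def empirical_line(dn_targets, reflectance_targets):
    """Fit reflectance = gain * DN + offset from calibration targets."""
    gain, offset = np.polyfit(dn_targets, reflectance_targets, 1)
    return gain, offset

# Hypothetical dark and bright panels with lab-measured reflectance
panel_dn = [520.0, 2890.0]   # mean DN over each panel in one band
panel_rho = [0.05, 0.50]     # known panel reflectance in that band
gain, offset = empirical_line(panel_dn, panel_rho)

band_dn = np.array([[600.0, 1500.0], [2400.0, 2800.0]])  # toy image band
band_reflectance = gain * band_dn + offset  # apply correction per pixel
print(band_reflectance.round(3))
```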
Beyond knowledge of data handling and processing, sometimes it becomes necessary to have knowledge of the drone itself: the characteristics of the hardware and software that
you rely on for data collection. Some drone-related stakehold-
ers have raised concerns about DJI products (Wright 2017;
Puko and Ferek 2019). Such concerns may impact the rela-
tionship the drone user has crafted with local people, includ-
ing being able to protect their privacy and other concerns.
In such instances, it may become necessary to pursue other
drone options, including building drones from scratch where
you have more control over the parts and components used
in data collection (see Cummings et al. 2017a). Knowledge
of what goes into the drone can help to deal with fears sur-
rounding the drone obtaining data that will compromise pri-
vacy and other concerns.
Table 1. Checklist for drone flight operations under U.S. FAA regulations (adapted from GeoIDEA Lab 2022).

Preplanning and flight preparation:
- Obtain a remote pilot license
- Register aircraft
- Purchase/confirm insurance coverage
- Acquire permissions from landowners and any other partners
- Check/update flight planning and other smartphone/tablet apps
- Check/update firmware on the drone and remote control
- Create flight plan(s) on flight planning app(s)
- Prepare microSD/SD cards for data storage on the drone
- Charge batteries (drone, remote controller, tablet or smartphone)
- Prepare any other field equipment: GNSS for GCPs, spectroradiometer, spectral target, etc.
- Check for NOTAMs
- Submit for LAANC approval if needed (day before or before flight)
- Weather precheck

Day of flight: Preflight
- Monitor weather prior to traveling to field site
- Travel to field site
- Confirm in person that weather is suitable for safe flight
- Inspect drone and related equipment to ensure it is ready to fly
- Prepare drone as needed (mount propellers, insert battery, etc.)
- Prepare staging area for drone takeoff/landing, ensuring no nearby obstacles (tree, overhang, etc.)
- Place drone in takeoff location
- Turn on drone and controller following manufacturer specifications
- Review flight plan on app
- Confirm correct camera settings
- Verify GNSS connection is active
- Confirm skies are clear for flight
- Proceed to takeoff

Day of flight: Flight
- After takeoff, before allowing autonomous control, hover aircraft 2 meters above ground level to confirm drone is under control and stable
- Verify controls are operating correctly
- Confirm full battery level
- Proceed to data collection mission (under manual control or autonomous mode enabled by flight planning app)
- Maintain line-of-sight with observers as needed throughout flight
- Communicate with team members throughout drone flight on progress
- When flight mission or partial mission is complete, manually or autonomously bring drone back toward staging area
- Confirm staging area is clear for landing
- Land drone
- Change batteries and repeat flights if multiple are needed

Day of flight: Postflight
- Shut down drone and controller following manufacturer recommendations
- Inspect aircraft to confirm no damage
- Log flight details for remote pilot records
- Using a laptop, confirm drone-collected image (or other) data are satisfactory
- Conduct non-drone ground-based data collection as needed
- Review data with stakeholders where applicable
Develop or adopt a data collection protocol
Drone-based data collection requires extensive planning.
Developing from scratch, adopting, or adapting a data collec-
tion protocol is a fundamental practice for conducting drone
remote sensing research. When preparing a protocol, two pri-
mary components to include (not mutually exclusive) are (1)
aircraft operation and (2) data capture. Myriad resources are
available to support the development of a protocol. Table 1
provides an example of a drone operation checklist adapted
from the GeoIDEA Lab (2022). In this case, data collection
specifics are not covered in detail, so this could be formu-
lated within another checklist or incorporated into this one.
Checklists can be drone-specific or created more generically,
such as in Table 1.
Specific checklists can be altered if you own more than one drone, especially across rotary-wing and fixed-wing platforms. Importantly, operators should compartmentalize operations as shown in Table 1, with preplanning/preparation before the data collection day, and preflight, flight, and post-
flight items on the data collection day. Preplanning entails
obtaining a remote pilot license and registering your aircraft,
as well as confirming insurance coverage, along with per-
missions, and equipment preparation. Preplanning can also
include practicing manual flight control prior to capturing data. There is a steep learning curve to flying drones (especially fixed-wing platforms), and adequate time should be allotted to ensure safe flight. Those not comfortable with flying, or wanting to concentrate on the data itself, are advised to outsource flight operations to a professional who regularly operates drones. Many universities now employ drone pilots for a variety of purposes (i.e., marketing/videography, within drone-focused academic departments, extension offices), which can fulfil this need for researchers uneasy with drone operation.
On the day of flight, preflight checks include weather monitoring prior to travel to the field site and confirmation on site, drone preparation and inspection prior to flight, locating and establishing a staging area for drone takeoff and landing (keeping in mind different requirements for rotary-wing and fixed-wing aircraft), placing the drone in the staging area and turning it on along with the remote control, conducting any necessary camera calibrations, confirming flight plan details and camera settings, verifying GNSS connection, and, lastly, checking that the immediate area and nearby skies are clear for takeoff.
The flight checklist includes actual aircraft takeoff, confirmation of full control of the aircraft and an adequate battery level for the flight mission, and then data collection. As for data collection during flight, the type of data being collected influences how to operate the aircraft (with altered checklists needed for different sensors). Most commonly, aerial images are gathered for SfM photogrammetry, which requires a flight plan that maintains consistent overlap
(front and side) between images (see Figs. 4A–4C for polygon,
grid, and double-grid patterns, respectively) and ensures ad-
equate camera angle variation. Altitude of data capture and
altitude variation throughout a flight also help to improve
eventual data quality (Zimmerman et al. 2020; Santana et al. 2021; Swayze et al. 2022). Importantly, how drone aerial
images are captured impacts the quality of the output data
(Dandois et al. 2015;Young et al. 2022). Convergent imagery
(i.e., integrating converging obliques) has proven time and
again to remove systematic error in SfM-MVS-generated data
(Wackrow and Chandler 2011).
The postflight checklist includes powering down the drone
and other equipment, inspecting equipment for damage,
logging flight details, reviewing collected data on-site, col-
lecting other ground data to support the research project
(i.e., spectroradiometer measurements of a spectral target
placed in the field prior to flight, surveying GCPs), and inter-
acting/reviewing data with community members/landowners
as needed. Data processing and analyses follow the day
of flight (i.e., SfM for image data, georeferencing for all
data).

Fig. 4. Drone-based aerial image collection using (A) polygon, (B) grid, and (C) double-grid flight plans.
Treat Structure from Motion as a new form of
photogrammetry
Imagery is by far the most collected data using drones,
especially RGB images. This involves the capture of over-
lapping aerial images for Structure from Motion-Multiview
Stereo (SfM-MVS, often simply SfM) photogrammetric pro-
cessing (see Stefanik et al. 2011; Fonstad et al. 2013; Anderson et al. 2019). While SfM is photogrammetry, it differs from traditional and more recent forms of the digital photogrammetric method. The bulk of the algorithms utilized within SfM
workflows originate from the Microsoft Photosynth project
(Snavely et al. 2008), which built upon early SfM research
(Ullman 1979) and adopted, adapted, and integrated a series
of open-source algorithms such as the scale-invariant feature
transform (Lowe 2004). Importantly, though, the SfM work-
flow developed by Snavely et al. (2008) inputs ground-based
photos (as opposed to aerial photos) posted by tourists on
online image repositories. Soon after, SfM was applied to
aerial images by the same team (Kaminsky et al. 2009). To
further densify point clouds, additional algorithms were in-
corporated, such as patch-based multiview stereo (Furukawa
and Ponce 2010) and clustering multiview stereo (Furukawa
et al. 2010), to complete the SfM-MVS process. SfM-MVS soft-
ware, whether open source (e.g., MicMac, VisualSfM, and
OpenDroneMap) or proprietary (e.g., Agisoft Metashape, and
Pix4Dmapper), relies on this suite of algorithms (with varia-
tions by platform) to process drone-collected aerial images
(see Mathews 2021 and Shahbazi 2021 for SfM processing overviews).

Fig. 5. Documented issues with SfM-MVS-generated data products such as (A) "dishing" and (B) "doming" of generated terrain surfaces due to inadequate lens distortion corrections and/or poor flight planning (i.e., overlap and angle).
In addition to being created for ground-based photos, SfM further differs from previous forms of photogrammetry due to its flexibility in using non-metric digital cameras (i.e., off-the-shelf digital cameras or other optical sensors created for use on drones). SfM even handles images collected by different cameras with different lens parameters to recreate scenes. While non-metric/uncalibrated cameras and SfM processing can create accurate geospatial data, the use of non-metric cameras does lead to predictable errors. Specifically, systematic error in SfM-generated data comes in the form of dishing and doming due to radial lens distortions (see Fig. 5; Wackrow and Chandler 2008; Wackrow and Chandler 2011; Rosnell and Honkavaara 2012; Harwin et al. 2015; Griffiths and Burningham 2019). Convergent-angled (oblique) imagery helps to address this issue (James and Robson 2014), along with adequate camera calibration (Carbonneau and Dietrich 2017). Regarding the former, research confirms the complexities of data collection (i.e., drone flight planning for overlap and image angles) and the resulting data quality (Ludwig et al. 2020; Meinen and Robinson 2020; Stöcker et al. 2020). Further, ground-based image capture for SfM assumed no movement during individual image capture; when the camera is
mounted on a moving drone, this necessitates a global shutter or corrections applied to images captured with a rolling shutter (Vautherin et al. 2016; Nex et al. 2022).
In capturing aerial photos for SfM-MVS processing, vari-
ation in flying height and camera angle is recommended,
and some might argue a degree of randomness is preferred.
This is unlike traditional and more recent forms of digital
photogrammetry, where flight planning is rigid and system-
atic. The amount of image overlap is important in flight
planning for accurate SfM data. A meta-analysis revealed
that most research utilizes greater than 75% forward over-
lap and 70% side overlap (Singh and Frazier 2018). Dandois
et al. (2015) suggest 60% forward overlap and 60%–80% side overlap. Building consensus remains difficult because different landscape types (including ground/above ground features) require particular image acquisition approaches to maximize accuracy (Joyce et al. 2018; Meinen and Robinson 2020; Stöcker et al. 2020). As opposed to traditional and prior
forms of digital photogrammetry, though, SfM requires much
higher overlap amounts between consecutive drone-collected
images.
Georeferencing techniques impact data quality. Methods include direct georeferencing (using only GNSS locations from the drone geotagged into image metadata), indirect georeferencing (using on-ground surveyed GCPs), or a combination of both (Mathews 2021). Direct georeferencing provides adequate accuracy for many, but not all, applications (Carbonneau and Dietrich 2017). Within the indirect method, many studies stress how GCP placement and density impact data quality (James et al. 2017; Galván Rangel et al. 2018; Sanz-Ablanedo et al. 2018; Oniga et al. 2020; Stöcker et al. 2020; Liu et al. 2022). Regardless of approach, ground truthing by way of survey-grade GNSS-located check points (CPs) is highly encouraged for accuracy assessment of produced models. Also, in generating three-dimensional point cloud data, researchers must transform elevation values from GNSS ellipsoidal heights to orthometric heights, as sketched below.
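That conversion is a per-point offset once a geoid model is available: orthometric height H equals ellipsoidal height h minus the geoid undulation N at that location. The sketch below uses an invented undulation value; in practice, N is interpolated from a geoid grid (e.g., a national geoid model).

```python
def orthometric_height(h_ellipsoidal_m: float, geoid_undulation_m: float) -> float:
    """H = h - N, where N is the geoid height (undulation) at the point."""
    return h_ellipsoidal_m - geoid_undulation_m

# Hypothetical point: GNSS reports h = 215.42 m; geoid model gives N = -31.80 m
print(orthometric_height(215.42, -31.80))  # 247.22 m above the geoid
```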
SfM-MVS workflows create three-dimensional point clouds
for visualization and analysis. Although point clouds are typ-
ically used to build other data products such as digital eleva-
tion models and orthophotomosaics, researchers are encour-
aged to consider analyzing the point clouds directly (focusing
on the structure in SfM). To this end, many studies have implemented comparison tools for multitemporal point clouds (or to compare SfM data with lidar; Wallace et al. 2016; Esposito et al. 2017a, 2017b) or have conducted point cloud analytics with or without the use of point RGB values (Dandois and Ellis 2013).
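As a minimal illustration of working with point clouds directly, cloud-to-cloud change between two survey epochs can be screened with nearest-neighbor distances, as sketched below on synthetic points (dedicated approaches such as those cited above are more rigorous; this only conveys the idea).

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
epoch1 = rng.uniform(0, 10, size=(1000, 3))   # synthetic first-epoch points (m)
epoch2 = epoch1 + [0.0, 0.0, 0.05]            # second epoch with 5 cm uplift

tree = cKDTree(epoch1)
dist, _ = tree.query(epoch2)  # nearest epoch-1 neighbor for each epoch-2 point
print(f"median cloud-to-cloud distance: {np.median(dist):.3f} m")  # ~0.050
```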
While SfM-MVS algorithms are suggested to have a democratizing effect on image processing (Carrivick et al. 2016), most SfM-MVS software packages remain black box and proprietary (at a cost). Further, there remains a steep learning curve to properly handle SfM-MVS-generated data products. Open-source options are increasingly available but are often less approachable. Similarly, but applied to drones in general, some propose that drones are democratizing remote sensing because of their low cost and ease of use (Lippitt and Zhang 2018). While this might be true for part of the remote sensing community, that group remains small. While drones are relatively low-cost compared to Earth-observing satellites and piloted aircraft, drone technology (including SfM software) is still cost-prohibitive for many within the remote sensing community, not to mention broader society (e.g., geospatial practitioners, citizen scientists, and hobbyists).
Consider new approaches to analyze
hyperspatial data
The detail captured with sub-centimeter hyperspatial
aerial imagery opens research avenues that were once
implausible. For example, drones allow researchers to monitor bird colonies at desired spatial and temporal scales in areas inaccessible to ground surveys, to repeatedly assess changes in wood productivity due to management decisions, and to remotely monitor the spread of pests and pathogens in forests and farms. However, many drone studies default to the "go-to" approach of plugging processed drone-captured, sub-centimeter image mosaics into supervised and unsupervised classifiers, leaving the advantages provided
by drones underexplored. OBIA provides an ideal means
by which to isolate objects of interest from drone-collected
and SfM-generated orthophotos for a variety of applications
(Ventura et al. 2018; McCarthy et al. 2021; Nababan et al. 2021). As valuable as image segmentation is, though, it is
a two-dimensional approach when 3D data are commonly
generated through the SfM process. Efforts to find a suitable
drone and compatible sensors, plan flight missions, and per-
form required image processing often take users’ attention
away from developing new approaches to analyzing hyper-
spatial resolution data by improving and advancing OBIA as
well as moving beyond it. Drones and sensors, in the end,
become fascinating tools rather than data sources to answer
research questions. To counter this problem, users should
identify and propose new approaches (e.g., time-series analy-
sis, ground surveys, data fusion, 3D image segmentation, and
more) to analyze hyperspatial resolution data.
Optimal resolution
While drone-captured hyperspatial-resolution images are
thrilling to view and work with, it is essential to estab-
lish an optimal spatial resolution that balances data volume
and mapping accuracy. Drone-captured hyperspatial resolu-
tion images result in a large volume of data that not only poses computational challenges for effectively processing and analyzing data but also shows deficiencies caused by non-relevant details and improper image stitching. The higher the spatial resolution, the more noise impacts individual pixels (increasing the importance of BRDF modeling, which is often overlooked in drone remote sensing research; Deng et al. 2021), e.g., increased reflectance from exposed soils, inconsistent brightness within and across scenes (Mathews 2015), and complex light interactions at the individual leaf scale (see Fig. 6C). The calculation of vegetation indices such as the normalized difference vegetation index (NDVI) is sensitive to soil and pixel components (Huang et al. 2021), and any such pixel deficiencies may produce erroneous vegetation indices.
tion indices. Images with a 10 cm spatial resolution might
Canadian Science Publishing
12 Drone Syst. Appl. 11: 1–22 (2023) | dx.doi.org/10.1139/dsa-2023-0021
Fig. 6. Drone-collected true color orthophoto for precision viticulture at (A) vine row, (B) partial vineyard, and (C) individual
vine scales with the operational/management scale circled in orange (per-vine) and the spatial resolution/scale of 1 cm with
white grid boundaries (per-pixel). This scale dierence results in thousands of pixels being captured for an individual vine
canopy.
be more eective in plant-level management compared to 1
cm spatial resolution (see Rogers et al. 2020). As Fig. 6 illus-
trates, drone-collected multispectral data to calculate NDVI
at a 1 cm scale (shown in true color) does not improve per-
plant management practices (assuming per-plant, precision
agriculture crop management practices) compared to data at
a 0.5 m scale. Therefore, establishing an optimal resolution
may help remove some of these deficiencies while improving
computational eciency (Singh et al. 2012).
As is the case with precision viticulture, highlighted in Fig. 6, research confirms there is no advantage to very high spatial resolution imagery when NDVI or other vegetation indices are implemented (Lamb et al. 2004). Yes, we can calculate NDVI with 1 cm pixels, but does it give us anything that aerial/satellite imagery did not already provide? These issues suggest relying less on the spectral data and more on the spatial arrangement of pixels (i.e., OBIA; Mathews 2014) and the 3D structure of SfM-MVS point clouds (i.e., volumetric analyses, point cloud analytics, and comparisons; Turner et al. 2015; Clapuyt et al. 2017; Hunt and Daughtry 2018).
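One practical way to search for an optimal resolution is to aggregate a hyperspatial raster to progressively coarser grids and compare the resulting maps against the management scale. The sketch below (plain NumPy; the 10x factor and array sizes are illustrative assumptions) performs such a block-average aggregation.

```python
# Minimal sketch: mean-aggregate a hyperspatial NDVI raster to a coarser
# grid by block averaging (pure NumPy; the 10x factor is illustrative).
import numpy as np

def block_average(raster, factor):
    """Aggregate a 2D array by an integer factor along each axis."""
    rows = raster.shape[0] - raster.shape[0] % factor  # trim partial blocks
    cols = raster.shape[1] - raster.shape[1] % factor
    trimmed = raster[:rows, :cols]
    return trimmed.reshape(rows // factor, factor,
                           cols // factor, factor).mean(axis=(1, 3))

ndvi_fine = np.random.rand(1000, 1000)      # stand-in for a 1 cm NDVI tile
ndvi_coarse = block_average(ndvi_fine, 10)  # 10x coarser (e.g., 1 cm -> 10 cm)
print(ndvi_fine.shape, ndvi_coarse.shape)   # (1000, 1000) (100, 100)
```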
Ground(like) surveys and imagery from the same
sensor
Reference data are fundamental to remote sensing appli-
cations and are regularly acquired through cost-ineffective ground surveys. Drone-based low-altitude aerial surveys are highly effective and underexplored as a replacement for
ground-based observation and measurement (e.g., quadrat
plots to determine vegetation types and quantities). Drones
can be tasked with randomly selecting plots for capturing
very high spatial resolution photos that can later be analyzed
visually to collect, estimate, and measure any aspect of the
field plot for above-surface observations. This approach pro-
vides more reference data in a shorter time frame, access to
inaccessible sites, coverage to larger extents, repeat visits of
sites, and temporal congruence between ground reference
observations and collected images (Kattenborn et al. 2019).
Time-series analyses
Flexibility in the repeat collection of images makes drone
technology well suited for time-series analyses that may offer new ways to analyze hyperspatial images. For example, NDVI shows the effect of pests and pathogens and the application of water and fertilizers on growth and productivity throughout crop growth stages (e.g., Hunt et al. 2010; Mathews 2014; Hunt and Daughtry 2018). However, as mentioned previously, NDVI at a hyperspatial scale might not always provide useful information for addressing research objectives. Monitoring the restoration of retired cranberry farms in the northeastern United States is another example that can benefit from time-series analysis (Harvey et al. 2019). While the restoration
of retired farms can contribute to sustainable agriculture,
it is essential to periodically monitor conservation progress
Table 2. Types of data captured using drones, with examples of publications.
Passive sensing:
- True color (RGB still images and/or video) (Fonstad et al. 2013; Niu et al. 2019; Schaeffer et al. 2021)
- Multispectral (Matese and Di Gennaro 2018; Guo et al. 2019; Fawcett et al. 2020)
- Hyperspectral (Nevalainen et al. 2017; Sankey et al. 2017; Nezami et al. 2020)
- Spectroscopy (Aasen et al. 2018; Bolch et al. 2021)
- Thermal (Baluja et al. 2012; Harvey et al. 2019; Heinemann et al. 2020)
Active sensing:
- Lidar (Wallace et al. 2012; Almeida et al. 2019; Pricope et al. 2020)
- Radar: ground-penetrating radar (López et al. 2022; Vergnano et al. 2022)
- Synthetic aperture radar (Bekar et al. 2022; Oré et al. 2022)
- Ultra-wideband radar (Jenssen and Jacobsen 2021; Abushakra et al. 2022)
Other sensing:
- Atmospheric: humidity, temperature, pressure, particulate matter, etc. (Hemingway et al. 2017; Bieber et al. 2020; Madokoro et al. 2021)
- Magnetic: anomaly detection (Campbell et al. 2020; Yoo et al. 2021)
- Flight data: GNSS, IMU, etc. (Jiang and Jiang 2017; Li et al. 2018)
to ensure that conservation efforts are beneficial for both
landowners and biodiversity. Knowing the utility of OBIA
with hyperspatial image data, object-based change detection
provides an ideal approach for those interested in identifying
landscape alterations (Chen et al. 2012). Other approaches in-
clude principal component analysis to detect changes in the
temporal dimension (Deng et al. 2008).
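As a minimal sketch of the PCA-based change detection idea cited above (Deng et al. 2008), the Python example below stacks two image dates, projects pixels onto principal components, and reads change from a minor component; the data are synthetic and the approach is deliberately simplified.

```python
# Minimal sketch of PCA-based change detection on a stacked two-date image
# (cf. Deng et al. 2008): minor principal components of the stack tend to
# concentrate change. Pure NumPy; the data are synthetic.
import numpy as np

def pca_scores(pixels_by_bands):
    """Project pixels (rows) onto principal components of the band space."""
    centered = pixels_by_bands - pixels_by_bands.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
    order = np.argsort(eigvals)[::-1]          # components, high variance first
    return centered @ eigvecs[:, order]

date1 = np.random.rand(100, 100)               # synthetic band, date 1
date2 = date1.copy()
date2[40:60, 40:60] += 0.5                     # simulated change patch
stack = np.column_stack([date1.ravel(), date2.ravel()])
change_map = pca_scores(stack)[:, -1].reshape(100, 100)  # minor PC = change
print(abs(change_map[50, 50]) > abs(change_map[5, 5]))   # True: patch stands out
```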
Synopsis
In sum, scientists are restricting themselves and their work
if, by default, they merely apply traditional methodologies (i.e., those developed for coarser satellite and aerial imagery) to data that are non-traditional in terms of scale. The remote sensing
community must consider drone-collected data as another
form of geospatial big data and treat it accordingly by col-
lectively outlining the challenges and opportunities afforded
by this new means to capture data.
Think beyond imagery
Drones are most often utilized to capture aerial images us-
ing optical, passive sensors with SfM-MVS as the data pro-
cessing method. However, there are many types of data that
can be collected with the help of a drone, including atmo-
spheric measurements, lidar, radar, etc. (see Table 2). It is
therefore important that researchers consider options be-
yond imagery when crafting methodologies. Of course, not
all research problems require multiple data types, but aware-
ness of the potential options is important to fully addressing
research questions. Fusion across multiple data types is en-
abled when researchers capture additional data along with
aerial imagery (e.g., lidar and multispectral imagery data fu-
sion).
Miniaturized multi- and hyperspectral cameras are suit-
able for remote sensing drone applications. However, insufficient radiometric calibration methods limit their use in quantitative remote sensing applications, meaning these cameras cannot completely replace satellite-based imagery. Radiance measured by such cameras is prone to illumination changes and sensor non-uniformities caused by the roll–pitch–yaw orientation (Smith
and Milton 1999). Radiometric calibration of the camera
requires accurate reflectance transformation (see Mathews
2015; Suomalainen et al. 2021). The process involves a procedure for the correction of dark current, flat field, spectral response, and absolute radiometric coefficients that allows for accurate conversion of the camera digital numbers
to at-sensor radiances and/or surface reflectance (Aasen et
al. 2018). Low drone flight altitudes also limit users unless in situ data are captured to correct atmospheric effects in multispectral (Guo et al. 2019; Mamaghani and Salvaggio 2019) and thermal (Heinemann et al. 2020) images; without such corrections, the data are potentially unfit to compare or fuse with satellite-based imagery. Hyperspectral sensors, specifically with one-dimensional scanners, require entirely different workflows from multispectral sensors to create geometrically correct data. In sum, quantitative remote sens-
ing applications require radiometric calibrations of sensors
and should be considered before implementing drones in the
project for data acquisition.
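A common reflectance transformation for drone cameras is the empirical line method (cf. Smith and Milton 1999): fit a linear relationship between image digital numbers over calibration targets and their field-measured reflectance, then apply it band by band. The sketch below (plain NumPy) shows the idea; the panel digital numbers and reflectances are invented for illustration.

```python
# Minimal sketch of an empirical line correction (cf. Smith and Milton 1999):
# fit reflectance ~= gain * DN + offset over calibration panels, then apply
# band-wise. Panel DNs and reflectances below are invented for illustration.
import numpy as np

def empirical_line(panel_dn, panel_reflectance):
    """Return (gain, offset) of the linear DN-to-reflectance model."""
    gain, offset = np.polyfit(panel_dn, panel_reflectance, deg=1)
    return gain, offset

panel_dn = np.array([2100.0, 19500.0, 41000.0])   # dark, grey, white panels
panel_rho = np.array([0.03, 0.22, 0.51])          # field-measured reflectance

gain, offset = empirical_line(panel_dn, panel_rho)
band_dn = np.random.randint(0, 65535, size=(100, 100)).astype(float)
band_rho = gain * band_dn + offset                # calibrated reflectance band
print(round(gain, 8), round(offset, 4))
```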
While SfM-generated point clouds are invaluable datasets
for 3D modeling and analysis (Gómez-Gutiérrez and
Gonçalves 2020), image-based SfM-MVS is not an alterna-
tive to lidar. As an active remote sensing technology, lidar
provides the benefit of collecting data within and beneath
a tree canopy and other porous features (Wallace et al.
2012). In other words, image-based SfM data are “what you
see is what you get” (i.e., if you cannot see the forest floor
in the images, you will not see it in the generated data).
Integrated lidar sensors for drones continue to decrease
in cost, weight, and power requirements (see Almeida et
al. 2019), and off-the-shelf drones even offer lidar systems (DJI 2022). Hence, if the purpose of drones is to create a topographic surface model in a partially forested area, lidar provides the best means by which to measure elevation from the top of tree canopies to ground level. Drone-based lidar for 3D modeling has enormous potential in a variety of applications (Sankey et al. 2017; Jaskierniak et al. 2021) but is often limited by data processing algorithms. Therefore, algorithmic advancements are needed to filter and extract information from very dense, small-area point clouds as well as effectively and directly compare point clouds over time (as opposed to converting these data to rasters; this goes for both lidar and SfM data; see Esposito et al. 2017a, 2017b).
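To illustrate the kind of filtering step such algorithms must perform, the sketch below (plain NumPy, with synthetic data) implements a crude grid-minimum ground filter that keeps the lowest return per grid cell; production workflows rely on far more robust filters (progressive morphological, cloth simulation, etc.), so this is only a conceptual starting point.

```python
# Minimal sketch of a grid-minimum ground filter: keep the lowest point in
# each grid cell as a crude ground candidate. Robust filters (progressive
# morphological, cloth simulation, etc.) should be preferred in practice.
import numpy as np

def grid_minimum(points_xyz, cell_m=1.0):
    """Return the lowest point per (cell_m x cell_m) column."""
    cells = np.floor(points_xyz[:, :2] / cell_m).astype(int)
    lowest = {}
    for idx, key in enumerate(map(tuple, cells)):
        if key not in lowest or points_xyz[idx, 2] < points_xyz[lowest[key], 2]:
            lowest[key] = idx
    return points_xyz[sorted(lowest.values())]

rng = np.random.default_rng(2)
cloud = rng.uniform([0, 0, 0], [20, 20, 15], size=(5000, 3))  # synthetic cloud
ground = grid_minimum(cloud, cell_m=2.0)
print(ground.shape)  # at most (100, 3) for a 20 m x 20 m extent, 2 m cells
```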
Table 3. Items to report in drone remote sensing publications for transparency and to support research reproducibility and replicability.
Drone operation:
- Aircraft specifications: platform type (fixed- or rotary-wing), make and model, GNSS accuracy, and other onboard sensors (if applicable)
- Flight log: civil aviation authority adherence procedures, altitude(s), speed, flight time, flight plan type and characteristics, app used to flight plan
- Study site details: areal coverage/survey extent, on-ground permissions obtained, landscape description, weather on data collection days
Data collection:
- Sensor specifications: type (optical, lidar, radar, etc.), make and model, details (image size and type of sensor, lidar/laser scanning speed/density), radiometric (or other) calibration procedures/atmospheric corrections
- Image characteristics: front and side overlap percentages, image angles, and number of images collected
- Ground-based data: GNSS data for GCPs and CPs (horizontal and vertical accuracy), GCP configuration (number and distribution), GNSS type (RTK, differential corrections), terrestrial laser scanning (TLS) data, spectral calibration data (spectral target and spectroradiometer measurements), etc.
Data processing:
- Software and parameters: lens distortion correction procedures, SfM software used and options selected, image stitching method, radiometric correction method, manual interventions in data production (if applicable)
- Georeferencing approach: method (direct, indirect, or both); additional corrections (if applicable)
- Data product details: imagery (spatial, spectral, radiometric, temporal resolutions), point cloud (SfM or lidar) density and/or point spacing
Data error:
- Terrain (assessed with point clouds or raster DEM/DTM/DSM): error assessment/validation method (point to point, point to raster, raster to raster), vertical error metrics including RMSE, MAE, SDE, ME
- Orthophotomosaics: horizontal accuracy, spectral inconsistencies, including level of agreement with ground-based data
- Other data: validate drone-collected data (radar, magnetic anomalies, atmospheric, etc.) by comparing those with ground-based or other known data sources
Another active remote sensing technology, radar, has been
adapted to operate from small drone platforms. Radar data
(see Table 2) support remote sensing research ranging from
vegetation and soil moisture to underground feature detec-
tion (see Abushakra et al. 2022 and López et al. 2022, respectively). Like the latter example, drone-mounted magnetic field sensors within an off-the-shelf smartphone were shown
to accurately detect buried metal anomalies (Campbell et al.
2020). Researchers are encouraged to creatively integrate at-
mospheric measurements (Hemingway et al. 2017), data from
onboard sensors (i.e., GNSS and IMU), and other low-cost sen-
sors even if intended for other uses, which was commonly
done with altered point-and-shoot digital cameras (Hunt et
al. 2010; Mathews 2015).
Several constraints aect the quality and quantity of data
from field surveys: inaccessibility of the terrain, area cover-
age, GNSS inaccuracy due to dense canopy cover (Valbuena
et al. 2010;Kaartinen et al. 2015), and human errors in iden-
tifying and/or counting objects of interest (e.g., number of
invasive plants within a plot or number of birds in a colony;
Lunetta et al. 1991; Lepš and Hadincova 1992). Discrete point
or plot field observations performed on the ground within
a forest stand may not represent continuous observations
made through sensors above the forest stand (Turner 2014;
Immitzer et al. 2018;Leitão et al. 2018). Kattenborn et al.
(2020) used a drone to collect reliable ground-reference data
on three dierent invasive plants, suggesting drones as a
promising alternative to cost-ineffective field surveys for
ground-reference data. Drones may help to overcome many
of these constraints and improve the quality and quantity of
ground reference data on broad scales. Remote sensing uses
of drones in mapping and measuring landscapes are full of
possibilities waiting to be explored.
Be transparent and report error
Increased transparency and clear communication of data and methods within drone remote sensing research are necessary, as studies have only sporadically reported flight planning details, image overlap, GCP placement, spectral calibration, etc. (Singh and Frazier 2018). Providing comprehen-
sive methodological details in publications (and open data
when able) will help to combat reproducibility and replica-
bility issues within geospatial research (Kedron et al. 2021).
Table 3 provides a broad overview of items to include in pub-
lications organized into two broad categories of drone op-
eration and data (the latter includes collection, processing,
and error as subcategories). Drone operation requires report-
ing of aircraft specifications, flight logs, and study site de-
tails (see Table 3). The data collection portion entails a de-
scription of sensor specifications, image characteristics (if
imagery was captured), and ground-based data. Publications
should inform readers of data processing components such
as software and parameters, georeferencing approach, and
data product details (see James et al. 2019 for SfM-specific
guidance).
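One low-effort way to operationalize this reporting is to record the items in Table 3 as a machine-readable file archived alongside the data. The sketch below (plain Python; the field names and values are our own illustrative choices, not an established metadata standard) shows one possible record.

```python
# Minimal sketch of a machine-readable flight/processing record mirroring
# the categories in Table 3. Field names and values are our own illustrative
# assumptions, not an established metadata standard.
import json

record = {
    "drone_operation": {
        "aircraft": "rotary-wing quadcopter (make/model here)",
        "altitude_m": 80,
        "flight_app": "flight planning app name here",
        "regulatory_adherence": "civil aviation procedures followed",
    },
    "data_collection": {
        "sensor": "20 MP RGB camera (make/model here)",
        "forward_overlap": 0.80,
        "side_overlap": 0.70,
        "gcps": {"count": 10, "gnss_type": "RTK"},
    },
    "data_processing": {
        "sfm_software": "software name and version here",
        "georeferencing": "indirect (GCPs)",
    },
    "error": {"vertical_rmse_m": None, "mae_m": None},  # fill after validation
}

print(json.dumps(record, indent=2))
```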
Reporting errors is especially important with terrain mod-
els (using image-based approaches and lidar), orthophotomo-
saics, and other data. Carrivick et al. (2016) rightly point out
inconsistent reporting of errors associated specifically with
SfM-processed terrain data. Not surprisingly, there are no
agreed-upon error metrics to disclose. This is especially im-
portant as SfM algorithms (especially proprietary ones) evolve over time, producing different results. Common metrics reported with SfM terrain data include root mean square error (RMSE) and mean absolute error (MAE; Carrivick et al. 2016). Other error metrics, though less commonly reported, include standard deviation of error (SDE) and mean error (ME). James et al. (2017) emphasize reporting RMSE but
also visualizing errors with maps and graphs. Further, how to
conduct error assessment is also important. Terrain data val-
idation methods include point-to-point, point-to-raster, and
raster-to-raster comparisons (Carrivick et al. 2016). Vertical
accuracy is assessed by comparing drone-based SfM or lidar
data to survey-grade GNSS data or terrestrial laser scanning.
Researchers must consider the issues associated with raster-
izing highly detailed, dense point clouds while still being
able to practically handle the comparison computationally.
For this fundamental practice, researchers, regardless of ap-
plication area, should report the geometric accuracy of SfM-
generated data based on validation data in the form of RMSE
and MAE.
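The metrics named above are straightforward to compute from check point residuals. The sketch below (plain NumPy, with synthetic stand-in values) derives ME, MAE, RMSE, and SDE from model-minus-GNSS elevations at CPs.

```python
# Minimal sketch of the vertical error metrics discussed above, computed
# from residuals (model minus GNSS) at check points. Values are synthetic.
import numpy as np

def vertical_error_metrics(model_z, gnss_z):
    """ME, MAE, RMSE, and SDE of model elevations against check points."""
    errors = np.asarray(model_z) - np.asarray(gnss_z)
    return {
        "ME": errors.mean(),                    # mean error (bias)
        "MAE": np.abs(errors).mean(),           # mean absolute error
        "RMSE": np.sqrt((errors ** 2).mean()),  # root mean square error
        "SDE": errors.std(ddof=1),              # standard deviation of error
    }

rng = np.random.default_rng(1)
gnss = rng.uniform(100.0, 110.0, 30)            # 30 survey-grade check points
model = gnss + rng.normal(0.02, 0.05, 30)       # model heights: bias + noise
print(vertical_error_metrics(model, gnss))
```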
Beyond these suggestions, researchers should review and closely follow geospatial data standards established by professional organizations such as the ASPRS (see Whitehead and Hugenholtz 2015). Further, staying up-to-date on the current drone remote sensing literature can provide important guidance.
Fig. 7. A conceptual framework for drones incorporating three major components (including descriptors for each): development/manufacturing/testing of drone technology, operation of unoccupied aircraft, and application of drones and drone-collected data. Contributing academic disciplines (by no means an exhaustive list) are placed outside of the triangle in proximity to their predominant focus (i.e., engineers focus on developing drone technology, whereas geographers emphasize data analysis for topographic modeling).
Work collaboratively
Collaborators come from expected and unexpected places
on- and o-campus. A fundamental practice in drone remote
sensing research is to engage with a diverse array of schol-
ars and professionals. This serves to increase scientific rigor
as well as provide an important, well-rounded perspective
on the implementation of drone technology for problem
solving. From the university/college perspective, on-campus
(and/or cross-campus) collaboration can be incredibly impor-
tant in addressing research questions. The remote sensing
community is already multi- and interdisciplinary, with con-
tributors from geography, biology, engineering, computer
science, and other fields of study (i.e., ranging from natu-
ral to social sciences and theoretical to applied sciences).
Figure 7 conceptualizes the many aspects of drone technol-
ogy, grouped as development, operation, and application, as
well as provides fields of study that contribute to these areas.
Drone development includes airframe design, propulsion,
flight testing, sensor development and refinement, robotics
integration, and system optimization (not an exhaustive list).
The development of drones is supported by engineering
subfields such as aerospace, mechanical, electrical, chemical, and computer engineering (see Austin 2010;
Barnhart et al. 2021). Engineers can enable data collection
that is otherwise not possible with off-the-shelf equipment.
Operation of the aircraft entails airspace integration, train-
ing of pilots, different flying methods (autonomous, semi-
autonomous, and/or manual), flight planning, and data col-
lection. Aviation, atmospheric science, physics, law, and pol-
icy contribute to drone operation. This is not to say that researchers from other fields cannot learn to operate a drone; rather, these fields provide the operational background needed by everyone. Application of drones is primarily
focused on the drone-collected data and its use for a par-
ticular topic of interest. The application work requires re-
mote sensing data and analysis techniques, computer vision
data processing methodologies, data visualization, and map-
ping. A wide variety of disciplines utilize drones for appli-
cations ranging from biology and environmental science to
agriculture, civil and environmental engineering, geography,
and geology. Computer science advancements help all areas,
but predominantly development and applications (i.e., in-
cluding SfM-MVS photogrammetry/computer vision). For re-
mote sensing scientists, the incorporation of drone technol-
ogy presents opportunities to collaborate with partners in
other disciplines. Researchers should not overlook the impor-
tance of fields such as science and technology studies, philos-
ophy, and ethics to engage with the critical components of
working with drones (Braun et al. 2015;Jackman and Brick-
ell 2021).
Importantly, though, synergy across disciplinary boundaries can create more impactful work (Calvario et al. 2017; Hoople et al. 2019). Practically, conducting drone remote sensing work can be time-consuming given fieldwork, the need to obtain a remote pilot certificate, and so on. Do not do it all by
yourself! A well-designed drone remote sensing team brings
together complementary experts to address complex research
problems. Not every remote sensing scientist will find op-
erating a drone easy, especially when they are likely more
concerned with data collection. This could mean teaming up
with a pilot, working with an engineer who can customize
the drone and sensor technology, and/or including an agricul-
tural specialist to better understand the crop being remotely
sensed. However, multi- and interdisciplinary research can be
challenging due to a lack of understanding of each other’s dis-
ciplines. Such collaboration can be extremely rewarding, but
collaborative relationships take time to foster. Be wary that
those from fields of study that have yet to implement drones
might view the technology as a magical solution capable of
solving any problem. Additionally, those interested in using
drones in research should appreciate that remote sensing is a
field of research and takes many years to hone specific skills;
it is very dicult to just buy a drone and be ready to collect
data (as many expect). Often, drone remote sensing experts
must break the news to potential collaborators about what
the technology can actually do. Have resources prepared to share with colleagues that explain your expertise and how it relates to your planned collaborative work.
More broadly, off-campus, the power of drones lies in their
ability to change and potentially increase participation in
the remote sensing process. Prior to drones, remotely sensed
data collection was primarily a top-down process. Managers,
academics, and government authorities, with different outcomes in mind, deployed the full range of platforms (pigeons, kites, balloons, and satellites, as examples) to obtain data. Imagine waking up one morning to see a strange de-
vice flying over your head, undoubtedly looking at your land,
not knowing who sent it or what exactly it is doing. For satel-
lites and airplanes, local people possess little power over how
and when they fly over their landscapes. The advent of drones
has the potential to change this and benefit local communi-
ties. Community-based scholars agree on the myriad benefits
of drone implementation in supporting remote sensing re-
search (Paneque-Gálvez et al. 2014, 2017; Wachowiak et al. 2017; Vargas-Ramírez and Paneque-Gálvez 2019).
It is true that the law surrounding the deployment of
drones across the world has been evolving to the point
where most countries have regulatory frameworks surround-
ing their use. But while drone users may easily meet the reg-
ulatory requirements and fly in spaces where interactions
with people are not necessary, embracing local people can
enhance the remote sensing process. Local people under-
stand their landscapes, have questions drone imagery may
be able to address, and, most critically, are curious about what will be done with the resulting data (see local engagement with
drones to examine agricultural landscapes in Cummings et
al. 2017a, 2017b). For scholars, engaging local people provides
the opportunity to address questions, thereby allowing the
research activity to have positive impacts beyond academic
articles. Researchers, like it or not, are invariably role models.
Most researchers are associated with institutions of higher
learning where peers and students observe their conduct, or
at the very least, their work is released in some form that
can influence other people’s thinking. It is therefore impera-
tive that researchers traverse their study areas and spaces in
such a manner that they leave them better than they found
them. An engaged and informed local community, however
defined, will more than likely bear the best testimony of the
impacts of the drone data collection and handling processes
and serve to enhance the credibility of drone remote sens-
ing. In this way, Bennett et al. (2022) highlight the importance of engaging situated knowledge and empowering marginalized actors within remote sensing work. Al-
though many would agree with this sentiment, few drone re-
mote sensing works have adopted this approach. Cummings
et al. (2017a) provide an important example of collaborative
drone remote sensing with Indigenous communities in ru-
ral Guyana, specifically focusing on empowerment through
resource management practices. More broadly, though, liter-
ature on drone-based countermapping also illustrates how
mapping practices can be inclusive and empowering (see
Radjawali and Pye 2017; Radjawali et al. 2017).
Remember, you do not have to learn it all when it comes to
drones because there is often too much to cover. Be open to
learning and working with others. Get involved and support
the work of your colleagues. Acknowledge the work of your
colleagues and share authorship with groups you work with,
including landowners and community members who con-
tribute to your projects (see Cummings et al. 2017a,2017b).
Summary
Simply put, drones are impacting and will continue to impact and change the field of remote sensing. But will we, the remote sensing community, change with them? The growing body of
drone remote sensing literature, both peer-reviewed articles
and books, provides evidence that we are already doing so.
However, there remain a number of inconsistencies among
scientists doing drone remote sensing work, from not re-
porting vertical error in SfM-based digital terrain models
to glossing over the airspace regulatory framework within
which data were collected and adopting analysis techniques
meant for coarser-scale geospatial data. Remote sensing and
other scientists using drones should follow these fundamen-
tal practices to better complete their research and more
clearly communicate their findings:
1) focus on your question, not just the tool [confirm your
problem needs drone data!],
2) know the law and abide by it [acquire your remote pilot
license as well as land and any other permissions in all
cases],
3) respect privacy and be ethical [protect people and cul-
tural spaces],
4) be mindful consumers of technology [know the limi-
tations of black box software and beware of accuracy
promises],
5) develop or adopt a data collection protocol [make check-
lists and plan accordingly],
6) treat SfM as a new form of photogrammetry [variation in
angle and altitude is key to SfM modeling],
7) consider new approaches to analyze hyperspatial data
[create methodologies specifically for drone-collected
data],
8) think beyond imagery [integrate multiple types of data
from lidar, radar, and others],
9) be transparent and report error [share your findings in
their entirety, including error], and
10) work collaboratively [partner on- and o-campus and
bring together multiple disciplines and perspectives to
conduct research].
By being mindful of these practices, the remote sensing
community can more meaningfully advance remote sens-
ing methods and generate knowledge as supported by drone
technology.
Acknowledgements
We acknowledge many remote sensing colleagues over the
years who have contributed to important conversations at
conferences, meetings, and through other interactions about
the integration of drone technology into remote sensing.
These colleagues especially inspired the last and one of the
most important fundamental practices of “work collabora-
tively”.
Article information
History dates
Received: 18 April 2023
Accepted: 5 September 2023
Accepted manuscript online: 6 September 2023
Version of record online: 4 October 2023
Copyright
© 2023 The Author(s). This work is licensed under a Creative
Commons Attribution 4.0 International License (CC BY 4.0),
which permits unrestricted use, distribution, and reproduc-
tion in any medium, provided the original author(s) and
source are credited.
Author information
Author ORCIDs
Adam J. Mathews https://orcid.org/0000-0002-5577-4308
Kunwar K. Singh https://orcid.org/0000-0002-9788-1822
Anthony R. Cummings https://orcid.org/0000-0003-0902-6883
Stephanie R. Rogers https://orcid.org/0000-0002-3972-3228
Author contributions
Conceptualization: AJM
Methodology: AJM, KKS, ARC, SRR
Project administration: AJM
Visualization: AJM, KKS
Writing – original draft: AJM, KKS, ARC, SRR
Writing – review & editing: AJM, KKS, ARC, SRR
References
Aasen, H., Honkavaara, E., Lucieer, A., and Zarco-Tejada, P.J. 2018. Quan-
titative remote sensing at ultra-high resolution with UAV spec-
troscopy: a review of sensor technology, measurement procedures,
and data correction workflows. Remote Sens. 10(7): 1091. doi:10.3390/
rs10071091.
Abdullah, Q. 2021. Mission planning for capturing UAS imagery. In Fun-
damentals of capturing and processing drone imagery and data.
Edited by A.E. Frazier and K.K. Singh. CRC Press (Taylor & Francis
Group), New York. ISBN: 9780367245726.
Aber, J.S., Marzo, I., Ries, J.B., and Aber, S.E.W. 2019. Small-
format aerial photography and UAS imagery: principles, techniques,
and geoscience applications. 2nd ed. Elsevier, Oxford, UK. ISBN:
9780128129425
Abushakra, F., Jeong, N., Elluru, D.N., Awasthi, A., Kolpuke, S., Luong,
T., et al. 2022. A miniaturized ultra-wideband radar for UAV remote
sensing applications. IEEE Microw. Wirel. Compon. Lett. 32(3): 198–
201. doi:10.1109/LMWC.2021.3129153.
Afán, I., Máñez, M., and Díaz-Delgado, R. 2018. Drone monitoring of
breeding waterbird populations: the case of the glossy ibis. Drones,
2: 42. doi:10.3390/drones2040042.
Almeida, D.R.A., Broadbent, E.N., Zambrano, A.M.A., Wilkinson, B.E., Fer-
reira, M.E., Chazdon, R., et al. 2019. Monitoring the structure of forest
restoration plantations with a drone-lidar system. Int. J. Appl. Earth
Obs. Geoinf. 79: 192–198. doi:10.1016/j.jag.2019.03.014.
Alvarez-Vanhard, E., Corpetti, T., and Houet, T. 2021. UAV & satellite syn-
ergies for optical remote sensing applications: a literature review. Sci.
Remote Sens. 3: 100019. doi:10.1016/j.srs.2021.100019.
Anderson, K., Westoby, M.J., and James, M.R. 2019. Low-budget topo-
graphic surveying comes of age: structure from motion photogram-
metry in geography and the geosciences. Prog. Phys. Geogr. Earth En-
viron. 43(2): 163–173. doi:10.1177/0309133319837454.
ASPRS (American Society for Photogrammetry and Remote
Sensing). 2014. ASPRS Code of Ethics. Available from http://www.asprs.org/a/publications/pers/2014journals/PERS_December_2014/HTML/files/assets/common/downloads/page0056.pdf (last accessed 2 August 2022).
Austin, R. 2010. Unmanned aircraft systems: UAVS design, development,
and deployment. Wiley, Chichester, UK.
Baluja, J., Diago, M.P., Balda, P., Zorer, R., Meggio, F., Morales, F., and
Tardaguila, J. 2012. Assessment of vineyard water status variability by
thermal and multispectral imagery using an unmanned aerial vehicle
(UAV). Irrig. Sci. 30(6): 511–522. doi:10.1007/s00271-012-0382-9.
Barnhart, R.K., Marshall, D.M., and Shappee, E. (Editors). 2021. Introduc-
tion to unmanned aircraft systems. 3rd ed. CRC Press, Boca Raton, FL.
ISBN: 9780367366599.
Bekar, A., Antoniou, M., and Baker, C.J. 2022. Low-cost, high-resolution,
drone-borne SAR imaging. IEEE Trans. Geosci. Remote Sens. 60:
5208811. doi:10.1109/TGRS.2021.3085235.
Bennett, M.M., Chen, J.K., Alvarez León, L.F., and Gleason, C.J. 2022.
The politics of pixels: a review and agenda for critical remote
sensing. Prog. Hum. Geogr. 46(3): 729–752. doi:10.1177/
03091325221074691.
Bieber, P., Seifried, T.M., Burkart, J., Gratzl, J., Kasper-Giebl, A., Schmale,
D.G., and Grothe, H. 2020. A drone-based bioaerosol sampling system
to monitor ice nucleation particles in the lower atmosphere. Remote
Sens. 12(3): 552. doi:10.3390/rs12030552.
Bolch, E.A., Hestir, E.L., and Khanna, S. 2021. Performance and feasibility
of drone-mounted imaging spectroscopy for invasive aquatic vegeta-
tion detection. Remote Sens. 13(4): 582. doi:10.3390/rs13040582.
Bracken-Roche, C. 2016. Domestic drones: the politics of verticality and
the surveillance industrial complex. Geogr. Helv. 71: 167–172. doi:10.
5194/gh-71-167-2016.
Braun, S., Friedewald, M., and Valkenburg, G. 2015. Civilizing drones – military discourses going civil? Sci. Technol. Stud. 28(2): 73–87.
doi:10.23987/sts.55351.
Calvario, G., Sierra, B., Alarcón, T.E., Hernandez, C., and Dalmau, O. 2017.
A multi-disciplinary approach to remote sensing through low-cost
UAVs. Sensors, 17(6): 1411. doi:10.3390/s17061411.
Campbell, M.J., Dennison, P.E., Tune, J.W., Kannenberg, S.A., Kerr, K.L.,
Codding, B.F., and Anderegg, W.R.L. 2020. A multi-sensor, multi-scale
approach to mapping tree mortality in woodland ecosystems. Remote
Sens. Environ. 245: 111853. doi:10.1016/j.rse.2020.111853.
Carbonneau, P.L., and Dietrich, J. 2017. Cost-effective non-metric pho-
togrammetry from consumer-grade sUAS: implications for direct geo-
referencing of structure from motion photogrammetry. Earth Surf.
Processes Landforms, 42: 473–486. doi:10.1002/esp.4012.
Carrivick, J.L., Smith, M.W., and Quincey, D.J. 2016. Structure from motion in the geosciences. Wiley-Blackwell, Chichester, UK.
Chen, G., Hay, G.J., Carvalho, L.M.T., and Wulder, M.A. 2012. Object-based
change detection. Int. J. Remote Sens. 33(14): 4434–4457. doi:10.1080/
01431161.2011.648285.
Clapuyt, F., Vanacker, V., Schlunegger, F., and Van Oost, K. 2017. Unrav-
elling earth flow dynamics with 3-D time series derived from
UAV-SfM models. Earth Surf. Dyn. 5: 791–806. doi:10.5194/
esurf-5-791-2017.
Clie, A.D. 2019. Evaluating the introduction of unmanned aerial ve-
hicles for teaching and learning in geoscience fieldwork education.
J. Geogr. Higher Educ. 43(4): 582–598. doi:10.1080/03098265.2019.
1655718.
Cohn, M. 2014. Drones and targeted killing: legal, moral, and geopolitical
issues. Olive Branch Press, New York.
Coker, C., 2013. Warrior Geeks: how 21st century technology is changing
the way we fight and think about war. Columbia University Press,
New York.
Colomina, I., and Molina, P. 2014. Unmanned aerial systems for photogrammetry and remote sensing: a review. ISPRS J. Photogramm. Remote Sens. 92: 79–97. doi:10.1016/j.isprsjprs.2014.02.013.
Cummings, A.R., Cummings, G.R., Hamer, E., Moses, P., Norman, Z.,
Captain, V., et al. 2017a. Developing a UAV-based monitoring pro-
gram with indigenous peoples. J. Unmanned Veh. Syst. 5(4): 115–125.
doi:10.1139/juvs-2016-0022.
Cummings, A.R., Karale, Y., Cummings, G.R., Hamer, E., Moses, P., Nor-
man, Z., and Captain, V. 2017b. UAV-derived data for mapping change
on a swidden agriculture plot: preliminary results from a pilot study.
Int. J. Remote Sens. 38(8–10): 2066–2082. doi:10.1080/01431161.2017.
1295487.
Cummings, A.R., McKee, A., Kulkarni, K., and Markandey, N. 2017c. The
rise of UAVs. Photogramm. Eng. Remote Sens. 83(4): 317–325. doi:10.
14358/PERS.83.4.317.
Dandois, J.P., and Ellis, E.C. 2013. High spatial resolution three-
dimensional mapping of vegetation spectral dynamics using com-
puter vision. Remote Sens. Environ. 136: 259–276. doi:10.1016/j.rse.
2013.04.005.
Dandois, J.P., Olano, M., and Ellis, E.C. 2015. Optimal altitude, overlap,
and weather conditions for computer vision UAV estimates of forest
structure. Remote Sens. 7(10): 13895–13920. doi:10.3390/rs71013895.
Dang, L.M., Wang, H., Li, Y., Min, K., Kwak, J.T., Lee, O.N., et al. 2020.
Fusarium wilt of radish detection using RGB and near infrared images
from unmanned aerial vehicles. Remote Sens. 12: 2863. doi:10.3390/
rs12172863.
Davis, D.S., Bua, D., Rasolondrainy, T., Creswell, E., Anyanwu, C.,
Ibirogba, A., et al. 2021. The aerial panopticon and the ethics of
archaeological remote sensing in sacred cultural spaces. Archaeol.
Prospect. 28(3): 305–320. doi:10.1002/arp.1819.
Deng, J.S., Wang, K., Deng, Y.H., and Qi, G.J. 2008. PCA-based land-use
change detection and analysis using multitemporal and multisen-
sor satellite data. Int. J. Remote Sens. 29(16): 4823–4838. doi:10.1080/
01431160801950162.
Deng, L., Chen, Y., Zhao, Y., Zhu, L., Gong, H.-L., Guo, L.-J., and Zou, H.-Y. 2021. An approach for reflectance anisotropy retrieval from UAV-
based oblique photogrammetry hyperspectral imagery. Int. J. Appl.
Earth Observ. Geoinf. 102: 102442. doi:10.1016/j.jag.2021.102442.
DJI. 2022. Zenmuse L1. Available from https://www.dji.com/zenmuse-l1
[accessed 24 May 2022].
DroneDeploy. 2022. Drone industry resources. Available from https://www.dronedeploy.com/resources/ [accessed 25 October 2022].
DronesMadeEasy. 2021. Overlap management. Available from https://support.dronesmadeeasy.com/hc/en-us/articles/207743803-Overlap-Management [accessed 25 October 2022].
Ebeid, E., Skriver, M., and Jin, J. 2017. A survey on open-source flight control platforms of unmanned aerial vehicle. In 2017 Euromicro Conference on Digital System Design. pp. 396–402. doi:10.1109/DSD.2017.30.
Enemark, C. 2013. Armed drones and the ethics of war: military virtue
in a post-heroic age. Routledge, Kentucky.
Esposito, G., Mastrorocco, G., Salvini, R., Oliveti, M., and Starita, P. 2017a.
Application of UAV photogrammetry for the multi-temporal estima-
tion of surface extent and volumetric excavation in the Sa Pigada
Bianca open-pit mine, Sardinia, Italy. Environ. Earth Sci. 76: 103.
doi:10.1007/s12665-017-6409-z.
Esposito, G., Salvini, R., Matano, F., Sacchi, M., Danzi, M., Somma, R.,
and Troise, C. 2017b. Multitemporal monitoring of a coastal land-
slide through SfM-derived point cloud comparison. Photogramm.
Rec. 32(160): 459–479. doi:10.1111/phor.12218.
FAA (Federal Aviation Administration). 2021. Part 89 remote identifica-
tion of unmanned aircraft. Available from https://www.ecfr.gov/current/title-14/chapter-I/subchapter-F/part-89 [accessed 24 May 2022].
FAA (Federal Aviation Administration). 2022. Unmanned aircraft sys-
tems (UAS). Available from https://www.faa.gov/uas/ [accessed 11 April
2022].
Fawcett, D., and Anderson, K. 2019. Investigating impacts of calibra-
tion methodology and irradiance variations on lightweight drone-
based sensor derived surface reflectance products. In Proceedings of
SPIE: Remote Sensing for Agriculture, Ecosystems, and Hydrology,
XXI 111490D. doi:10.1117/12.2533106.
Fawcett, D., Panigada, C., Tagliabue, G., Boschetti, M., Celesti, M., Evdoki-
mov, A., et al. 2020. Multi-scale evaluation of drone-based multispec-
tral surface reflectance and vegetation indices in operational condi-
tions. Remote Sens. 12(3): 514. doi:10.3390/rs12030514.
Fonstad, M.A., Dietrich, J.T., Courville, B.C., Jensen, J.L., and Carbonneau,
P.E. 2013. Topographic structure from motion: A new development
in photogrammetric measurement. Earth Surf. Processes Landforms,
38(4): 421–430. doi:10.1002/esp.3366.
Fraser, B.T., and Congalton, R.G. 2018. Issues in unmanned aerial systems
(UAS) data collection of complex forest environments. Remote Sens.
10: 908. doi:10.3390/rs10060908.
Frazier, A.E., and Singh, K.K. (Editors). 2021. Fundamentals of capturing
and processing drone imagery and data. CRC Press, New York. ISBN:
9780367245726
Furukawa, Y., and Ponce, J. 2010. Accurate, dense and robust multiview
stereopsis. IEEE Trans. Pattern Anal. Mach. Intell. 32(8): 1362–1376.
doi:10.1109/tpami.2009.161.
Furukawa, Y., Curless, B., Seitz, S.M., and Szeliski, R. 2010. Towards
internet-scale multi-view stereo. In Proceedings of the 2010 IEEE
Computer Society Conference on Computer Vision and Pattern
Recognition, San Francisco, CA, USA. pp. 1434–1441. doi:10.1109/
CVPR.2010.5539802.
Galván Rangel, J.M., Gonçalves, G.R., and Antonio Pérez, J. 2018. The
impact of number and spatial distribution of GCPs on the posi-
tional accuracy of geospatial products derived from low-cost UASs.
Int. J. Remote Sens. 39(21): 7154–7171. doi:10.1080/01431161.2018.
1515508.
GeoIDEA Lab (Auburn University Geosciences). 2022. Mission checklist UAV (P4P). Available from https://geolightbulbcom.files.wordpress.com/2022/10/uas-mission-checklist-p4p.docx [accessed 3 November 2022].
GISCI (Geographic Information Systems Certification Institute). 2022.
The GIS Certification Institute: code of ethics. Available from https://
www.gisci.org/Portals/0/Ethics/CodeOfEthics_PR.pdf [accessed 31 July
2022].
Gómez-Gutiérrez, A., and Gonçalves, G.R. 2020. Surveying coastal cliffs
using two UAV platforms (multirotor and fixed-wing) and three dif-
ferent approaches for the estimation of volumetric changes. Int. J. Re-
mote Sens. 41(21): 8143–8175. doi:10.1080/01431161.2020.1752950.
Goodchild, M.F. 1992. Geographical information science. Int. J. Geogr. Inf.
Syst. 6(1): 31–45. doi:10.1080/02693799208901893.
Green, D.R., Gregory, B.J., and Karachok, A. (Editors). 2020. Unmanned
aerial remote sensing: UAS for environmental applications. CRC Press
(Taylor & Francis Group), New York. ISBN: 9781482246070.
Griths, D., and Burningham, H. 2019. Comparison of pre- and self-
calibrated camera calibration models for UAS-derived nadir imagery
for a SfM application. Prog. Phys. Geogr. 43(2): 215–235. doi:10.1177/
0309133318788964.
Guo, Y., Senthilnath, J., Wu, W., Zhang, X., Zeng, Z., and Huang, H. 2019.
Radiometric calibration for multispectral camera of dierent imag-
ing conditions mounted on a UAV platform. Sustainability, 11: 978.
doi:10.3390/su11040978.
Hardin, P.J., and Hardin, T.J. 2010. Small-scale remotely piloted vehicles
in environmental research. Geogr. Compass, 4(9): 1297–1311. doi:10.
1111/j.1749-8198.2010.00381.x.
Hardin, P.J., Lulla, V., Jensen, R.R., and Jensen, J.R. 2019. Small unmanned
aerial systems (sUAS) for environmental remote sensing: challenges
and opportunities revisited. GIScience Remote Sens. 56(2): 309–322.
doi:10.1080/15481603.2018.1510088.
Harvey, M.C., Hare, D.K., Hackman, A., Davenport, G., Haynes, A.B.,
Helton, A., et al. 2019. Evaluation of stream and wetland restora-
tion using UAS-based thermal infrared mapping. Water, 11: 1568.
doi:10.3390/w11081568.
Harwin, S., Lucieer, A., and Osborn, J. 2015. The impact of calibration
method on the accuracy of point clouds derived using unmanned
aerial vehicle multi-view stereopsis. Remote Sens. 7(9): 11933–11953.
doi:10.3390/rs70911933.
Heinemann, S., Siegmann, B., Thonfeld, F., Muro, J., Jedmowski, C.,
Kemna, A., et al. 2020. Land surface temperature retrieval for agricul-
tural areas using a novel UAV platform equipped with a thermal in-
frared and multispectral sensor. Remote Sens. 12: 1075. doi:10.3390/
rs12071075.
Hemingway, B.L., Frazier, A.E., Elbing, B.R., and Jacob, J.D. 2017. Verti-
cal sampling scales for atmospheric boundary layer measurements
from small unmanned aircraft systems (sUAS). Atmosphere, 8(9): 176.
doi:10.3390/atmos8090176.
Hodgson, M.E., and Sella-Villa, D. 2021. State-level statutes governing un-
manned aerial vehicle use in academic research in the United States.
Int. J. Remote Sens. 42(14): 5366–5395. doi:10.1080/01431161.2021.
1916121.
Hoople, G., Choi-Fitzpatrick, A., and Reddy, E. 2019. Drones for good:
interdisciplinary project-based learning between engineering and
peace studies. Int. J. Eng. Educ. 35(5): 1378–1391.
Huang, S., Tang, L., Hupy, J.P., Wang, Y., and Shao, G. 2021. A commen-
tary on the use of normalized difference vegetation index (NDVI) in
the era of popular remote sensing. J. For. Res. 32(1): 1–6. doi:10.1007/
s11676-020-01155-1.
Hunt, E.R., and Daughtry, C.S.T. 2018. What good are unmanned aircraft
systems for agricultural remote sensing and precision agriculture?
Int. J. Remote Sens. 39(15–16): 5345–5376. doi:10.1080/01431161.
2017.1410300.
Hunt, E.R., Hively, W.D., Fujikawa, S.J., Linden, D.S., Daughtry, C.S.T.,
and McCarty, G.W. 2010. Acquisition of NIR-Green-blue digital pho-
tographs from unmanned aircraft for crop monitoring. Remote Sens.
2: 290–305. doi:10.3390/rs2010290.
ICAO (International Civil Aviation Organization). 2022. ICAO model UAS
regulations – introduction to ICAO model regulations and advisory
circulars. Available from https://www.icao.int/safety/UA/Pages/ICAO-
Model-UAS-Regulations.aspx [accessed 12 April 2022].
Immitzer, M., Böck, S., Einzmann, K., Vuolo, F., Pinnel, N., Wallner,
A., and Atzberger, C. 2018. Fractional cover mapping of spruce and
pine at 1 ha resolution combining very high and medium spatial
resolution satellite imagery. Remote Sens. Environ. 204: 690–703.
doi:10.1016/j.rse.2017.09.031.
Jackman, A., and Brickell, K. 2021. ‘Everyday droning’: towards a femi-
nist geopolitics of the drone-home. Prog. Hum. Geogr. doi:10.1177/
03091325211018745.
James, M.R., and Robson, S. 2014. Mitigating systematic error in to-
pographic models derived from UAV and ground-based image net-
works. Earth Surf. Processes Landforms, 39: 1413–1420. doi:10.1002/
esp.3609.
James, M.R., Chandler, J.H., Eltner, A., Fraser, C., Miller, P.E., Mills, J.P.,
et al. 2019. Guidelines on the use of structure-from-motion pho-
togrammetry in geomorphic research. Earth Surf. Processes Land-
forms, 44(10): 2081–2084. doi:10.1002/esp.4637.
James, M.R., Robson, S., d’Oleire-Oltmanns, S., and Niethammer, U. 2017.
Optimising UAV topographic surveys processed with structure-from-
motion: ground control quality, quantity and bundle adjustment. Ge-
omorphology, 280: 51–66. doi:10.1016/j.geomorph.2016.11.021.
Jaskierniak, D., Lucieer, A., Kuczera, G., Turner, D., Lane, P.N.J., Benyon,
R.G., and Haydon, S. 2021. Individual tree detection and crown delin-
eation from unmanned aircraft system (UAS) LiDAR in structurally
complex mixed species eucalypt forests. ISPRS J. Photogramm. Re-
mote Sens. 171: 171–187. doi:10.1016/j.isprsjprs.2020.10.016.
Jensen, J.R. 2017. Drone aerial photography and videography: Data col-
lection and image interpretation. eBook.
Jenssen, R.O.R., and Jacobsen, S.K. 2021. Measurement of snow wa-
ter equivalent using drone-mounted ultra-wide-band radar. Remote
Sens. 13(13): 2610. doi:10.3390/rs13132610.
Jeziorska, J. 2014. Unmanned aerial vehicle – a tool for acquiring spatial
data for research and commercial purposes. New course in the ge-
ography and cartography curriculum in higher education. Int. Arch.
Photogramm. Remote Sens. Spatial Inf. Sci. XL–6: 37–42. doi:10.5194/
isprsarchives-XL-6-37-2014.
Jiang, S., and Jiang, W. 2017. On-board GNSS/IMU assisted feature extrac-
tion and matching for oblique UAV images. Remote Sens. 9(8): 813.
doi:10.3390/rs9080813.
Joyce, K.E., Duce, S., Leahy, S.M., Leon, J., and Maier, S.W., 2018. Principles
and practice of acquiring drone-based image data in marine environ-
ments. Mar. Freshwater Res. 70(7): 952–963. doi:10.1071/MF17380.
Joyce, K.E., Meiklejohn, N., and Mead, P.C.H. 2020. Using minidrones to
teach geospatial technology fundamentals. Drones, 4(3): 57. doi:10.
3390/drones4030057.
Kaartinen, H., Hyyppä, J., Vastaranta, M., Kukko, A., Jaakkola, A., Yu, X.,
et al. 2015. Accuracy of kinematic positioning using global satellite
navigation systems under forest canopies. Forests, 6(9): 3218–3236.
doi:10.3390/f6093218.
Kaminsky, R.S., Snavely, N., Seitz, S.T., and Szeliski, R. 2009. Alignment
of 3D point clouds to overhead images. In Proceedings of the 2009
IEEE Computer Society Conference on Computer Vision and Pattern
Recognition Workshops, Miami, FL, USA. Vol. 1. pp. 63–70. doi:10.
1109/CVPRW.2009.5204180.
Kattenborn, T., Eichel, J., Wiser, S., Burrows, L., Fassnacht, F.E., and
Schmidtlein, S. 2020. Convolutional Neural Networks accurately pre-
dict cover fractions of plant species and communities in Unmanned
Aerial Vehicle imagery. Remote. Sens. Ecol. Conserv. 6: 472–486.
doi:10.1002/rse2.146.
Kattenborn, T., Lopatin, J., Forster, M., Braun, A.C., and Fassnacht, F.E.
2019. UAV data as alternative to field sampling to map woody invasive
species based on combined Sentinel-1 and Sentinel-2 data. Remote
Sens. Environ. 227: 61–73. doi:10.1016/j.rse.2019.03.025.
Kedron, P., Li, W., Fotheringham, S., and Goodchild, M. 2021. Repro-
ducibility and replicability: opportunities and challenges for geospa-
tial research. Int. J. Geogr. Inf. Sci. 35(3): 427–445. doi:10.1080/
13658816.2020.1802032.
Kelcey, J., and Lucieer, A. 2012. Sensor correction of a 6-band multi-
spectral imaging sensor for UAV remote sensing. Remote Sens. 4(5):
1462–1493. doi:10.3390/rs4051462.
Lamb, D.W., Hall, A., Louis, J., and Frazier, P. 2004. Remote sensing
for vineyard management. Does (pixel) size really matter? Aust. NZ
Grapegrower Winemaker, 485: 139–142.
Leitão, P., Schwieder, M., Pötzschner, F., Pinto, J.R.R., Teixeira, A.M.C., Pe-
droni, F., et al. 2018. From sample to pixel: multi-scale remote sensing
data for upscaling aboveground carbon data in heterogeneous land-
scapes. Ecosphere, 9(8): e02298. doi:10.1002/ecs2.2298.
Lepš, J., and Hadincová, V. 1992. How reliable are our vegetation analy-
ses? J. Veg. Sci. 3(1): 119–124. doi:10.2307/3236006.
Li, D., Zheng, H., Xu, X., Lu, N., Yao, X., Jiang, J., et al. 2018. BRDF effect on
the estimation of canopy chlorophyll content in paddy rice from UAV-
based hyperspectral imagery. In 2018 IEEE International Geoscience
and Remote Sensing Symposium (IGARSS), Valencia, Spain. pp. 6464–
6467. doi:10.1109/IGARSS.2018.8517684.
Li, S., Dragicevic, S., Antón Castro, F., Sester, M., Winter, S., Coltekin,
A., et al. 2016. Geospatial big data handling theory and methods: a
review and research challenges. ISPRS J. Photogramm. Remote Sens.
115: 119–133. doi:10.1016/j.isprsjprs.2015.10.012.
Lippitt, C.D. 2015. Remote sensing from small unmanned platforms:
a paradigm shift. Environ. Pract. 17(3): 235–236. doi:10.1017/
S1466046615000204.
Lippitt, C.D., and Zhang, S. 2018. The impact of small unmanned airborne
platforms on passive optical remote sensing: a conceptual perspec-
tive. Int. J. Remote Sens. 39(15–16): 4852–4868. doi:10.1080/01431161.
2018.1490504.
Lippitt, C.D., Stow, D.A., and Clarke, K.C. 2014. On the nature of models
for time-sensitive remote sensing. Int. J. Remote Sens. 35(18): 6815–
6841. doi:10.1080/01431161.2014.965287.
Liu, X., Lian, X., Yang, W., Wang, F., Han, Y., and Zhang, Y. 2022. Accuracy
assessment of a UAV direct georeferencing method and impact of the
configuration of ground control points. Drones, 6: 30. doi:10.3390/
drones6020030.
López, Y.A., García-Fernández, M., Álvarez-Narciandi, G., and Las-Heras
Andrés, F. 2022. Unmanned aerial vehicle-based ground-penetrating
radar systems: a review. IEEE Geosci. Remote Sens. Mag. 10(2): 66–86.
doi:10.1109/MGRS.2022.3160664.
Lowe, D. 2004. Distinctive image features from scale-invariant keypoints.
Int. J. Comput. Vision, 60(2): 91–110. doi:10.1023/B:VISI.0000029664.
99615.94.
Ludwig, M., Runge, C.M., Friess, N., Koch, T.L., Richter, S., Seyfried,
S., et al. 2020. Quality assessment of photogrammetric methods—a
workflow for reproducible UAS orthomosaics. Remote Sens. 12(22):
3831. doi:10.3390/rs12223831.
Lunetta, R.S., Congalton, R.G., Fenstermaker, L.K., Jensen, J.R., McGwire,
K.C., and Tinney, L.R. 1991. Remote sensing and geographic informa-
tion system data integration: Error sources and research issues. Pho-
togramm. Eng. Remote Sens. 57(6): 677–687.
Madokoro, H., Kiguchi, O., Nagayoshi, T., Chiba, T., Inoue, M., Chiyonobu,
S., et al. 2021. Development of drone-mounted multiple sensing sys-
tem with advanced mobility for In situ atmospheric measurement: a
case study focusing on PM2.5 local distribution. Sensors, 21(14): 4881.
doi:10.3390/s21144881.
Mamaghani, B., and Salvaggio, C. 2019. Multispectral sensor calibration
and characterization for sUAS remote sensing. Sensors, 19: 4453.
doi:10.3390/s19204453.
Matese, A., and Di Gennaro, S.F. 2018. Practical applications of a mul-
tisensor UAV platform based on multispectral, thermal and RGB
high resolution images in precision viticulture. Agriculture, 8(7): 116.
doi:10.3390/agriculture8070116.
Matese, A., Toscano, P., Di Gennaro, S.F., Genesio, L., Vaccari, F.P., Prim-
icerio, J., et al. 2015. Intercomparison of UAV, aircraft and satellite re-
mote sensing platforms for precision viticulture. Remote Sens. 7(3):
2971–2990. doi:10.3390/rs70302971.
Mathews, A.J. 2014. Object-based spatiotemporal analysis of
vine canopy vigor using an inexpensive UAV remote sensing
system. J. Appl. Remote Sens. 8(1): 085199, 1-17. doi:10.1117/1.JRS.8.
085199.
Mathews, A.J. 2015. A practical UAV remote sensing methodology to gen-
erate multispectral orthophotos for vineyards: estimation of spectral
reflectance using compact digital cameras. Int. J. Appl. Geospatial Res.
6(4): 65–87. doi:10.4018/ijagr.2015100104.
Mathews, A.J. 2021. Structure from motion (SfM) workflow for processing
drone imagery. In Fundamentals of capturing and processing drone
imagery and data. Edited by A.E. Frazier and K.K. Singh. CRC Press
(Taylor & Francis Group), New York. ISBN: 9780367245726.
Mathews, A.J. 2022. Unoccupied aircraft systems. In Oxford bibliogra-
phies: geography. Edited by B. Warf. Oxford University Press.
Mathews, A.J., and Frazier, A.E. 2017. Unmanned aerial systems. In Ge-
ographic information science & technology body of knowledge. 2nd
quarter 2017 ed. Edited by J.P. Wilson. doi:10.22224/gistbok/2017.2.4.
McCarthy, E.D., Martin, J.M., Boer, M.M., and Welbergen, J.A. 2021. Drone-
based thermal remote sensing provides an effective new tool for mon-
itoring the abundance of roosting fruit bats. Remote Sens. Ecol. Con-
serv. 7: 461–474. doi:10.1002/rse2.202.
Meinen, B.U., and Robinson, D.T. 2020. Mapping erosion and deposi-
tion in an agricultural landscape: optimization of UAV image ac-
quisition schemes for SfM-MVS. Remote Sens. Environ. 239: 111666.
doi:10.1016/j.rse.2020.111666.
MicaSense. 2022. Best practices: collecting data with MicaSense sensors.
Available from https://support.micasense.com/hc/en-us/articles/224893167-Best-practices-Collecting-Data-with-MicaSense-Sensors [accessed 25 October 2022].
Nababan, B., Mastu, L.O.K., Idris, N.H., and Panjaitan, J.P. 2021.
Shallow-water benthic habitat mapping using drone with object
based image analyses. Remote Sens. 13(21): 4452. doi:10.3390/
rs13214452.
Nevalainen, O., Honkavaara, E., Tuominen, S., Viljanen, N., Hakala, T.,
Yu, X., et al. 2017. Individual tree detection and classification with
UAV-based photogrammetric point clouds and hyperspectral imag-
ing. Remote Sens. 9(3): 185. doi:10.3390/rs9030185.
Nex, F., Armenakis, C., Cramer, M., Cucci, D.A., Gerke, M., Honkavaara,
E., et al. 2022. UAV in the advent of the twenties: where we stand
and what is next. ISPRS J. Photogramm. Remote Sens. 184: 215–242.
doi:10.1016/j.isprsjprs.2021.12.006.
Nezami, S., Khoramshahi, E., Nevalainen, O., Pölönen, I.,
and Honkavaara, E. 2020. Tree species classification of
drone hyperspectral and RGB imagery with deep learning
convolutional neural networks. Remote Sens. 12(7): 1070.
doi:10.3390/rs12071070.
Niu, Y., Zhang, L., Zhang, H., Han, W., and Peng, X. 2019. Estimat-
ing above-ground biomass of maize using features derived from
UAV-based RGB imagery. Remote Sens. 11(11): 1261. doi:10.3390/
rs11111261.
O’Connor, J., Smith, M.J., and James, M.R. 2017. Cameras and settings for
aerial surveys in the geosciences: optimising image data. Prog. Phys.
Geogr. 41(3): 325–344. doi:10.1177/0309133317703092.
Oniga, V.E., Breaban, A.I., Pfeifer, N., and Chirila, C. 2020. Determining
the suitable number of ground control points for UAS images georef-
erencing by varying number and spatial distribution. Remote Sens.
12: 876. doi:10.3390/rs12050876.
Oré, G., Alcântara, M.S., Góes, J.A., Teruel, B., Oliveira, L.P., Yepes, J.,
et al. 2022. Predicting sugarcane harvest date and productivity with
a drone-borne tri-band SAR. Remote Sens. 14(7): 1734. doi:10.3390/
rs14071734.
Pajares, G. 2015. Overview and current status of remote sensing applica-
tions based on unmanned aerial vehicles (UAVs). Photogramm. Eng.
Remote Sens. 81(4): 281–329. doi:10.14358/PERS.81.4.281.
Paneque-Gálvez, J., McCall, M.K., Napoletano, B.M., Wich, S.A., and Koh,
L.P. 2014. Small drones for community-based forest monitoring: an
assessment of their feasibility and potential in tropical areas. Forests,
5(6): 1481–1507. doi:10.3390/f5061481.
Paneque-Gálvez, J., Vargas-Ramírez, N., Napoletano, B.M., and Cum-
mings, A. 2017. Grassroots innovation using drones for indigenous
mapping and monitoring. Land, 6: 86. doi:10.3390/land6040086.
Pix4D. 2022. Pix4D documentation. Available from https://support.pix4d.com/ [accessed 25 October 2022].
Pricope, N.G., Halls, J.N., Mapes, K.L., Baxley, J.B., and Wu, J.J.
2020. Quantitative comparison of UAS-borne LiDAR systems for
high-resolution forested wetland mapping. Sensors, 20: 4453. doi:10.
3390/s20164453.
Pricope, N.G., Mapes, K.L., Woodward, K.D., Olsen, S.F., and Baxley, J.B.
2019. Multi-sensor assessment of the effects of varying processing
parameters on UAS product accuracy and quality. Drones, 3: 63.
doi:10.3390/drones3030063.
Puko, T., and Ferek, K.S. 2019. Interior department grounds aerial drone
fleet, citing risk from Chinese manufacturers. Wall Street J. Available
from https://www.wsj.com/articles/interior-dept-grounds-aerial-drone-fleet-citing-risk-from-chinese-manufacturers-11572473703
[accessed 2 August 2022].
Radjawali, I., and Pye, O. 2017. Drones for justice: inclusive technology
and river-related action research along the Kapuas. Geogr. Helv. 72:
17–27. doi:10.5194/gh-72-17-2017.
Radjawali, I., Pye, O., and Flitner, M. 2017. Recognition through
reconnaissance? Using drones for counter-mapping in
Indonesia. J. Peasant Stud. 44(4): 817–833. doi:10.1080/03066150.
2016.1264937.
Rogers, S., Livingstone, W., and Manning, I. 2020. Comparing the spa-
tial accuracy of digital surface models from four unoccupied aerial
systems: photogrammetry versus LiDAR. Remote Sens. 12(17): 2806.
doi:10.3390/rs12172806.
Rogers, S., Singh, K.K., Mathews, A.J., and Cummings, A.R. 2022. Drones
and geography: who is using them and why? Prof. Geogr. doi:10.1080/
00330124.2021.2000446.
Rosnell, T., and Honkavaara, E. 2012. Point cloud generation from aerial
image data acquired by a quadrocopter type micro unmanned aerial
vehicle and a digital still camera. Sensors, 12: 453–480. doi:10.3390/
s120100453.
Sankey, T., Donager, J., McVay, J., and Sankey, J.B. 2017. UAV
lidar and hyperspectral fusion for forest monitoring in the
southwestern USA. Remote Sens. Environ. 195: 30–43. doi:10.1016/
j.rse.2017.04.007.
Santana, L.S., Ferraz, G.A.E.S., Marin, D.B., Barbosa, B.D.S., Dos Santos,
L.M., Ferraz, P.F.P., et al. 2021. Influence of flight altitude and control
points in the georeferencing of images obtained by unmanned aerial
vehicle. Eur. J. Remote Sens. 54(1): 59–71. doi:10.1080/22797254.2020.
1845104.
Sanz-Ablanedo, E., Chandler, J.H., Rodríguez-Pérez, J.R., and Ordóñez,
C. 2018. Accuracy of unmanned aerial vehicle (UAV) and SfM pho-
togrammetry survey as a function of the number and location of
ground control points used. Remote Sens. 10: 1606. doi:10.3390/
rs10101606.
Schaeer, S.E., Jiménez-Lizárraga, M., Rodriguez-Sanchez, S.V., Cuellar-
Rodríguez, G., Aguirre-Calderón, O.A., Reyna-González, A.M., and Es-
cobar, A. 2021. Detection of bark beetle infestation in drone im-
agery via thresholding cellular automata. J. Appl. Remote Sens. 15(1):
016518. doi:10.1117/1.JRS.15.016518.
Shahbazi, M. 2021. Professional drone mapping. In Unmanned aerial sys-
tems. Edited by A. Koubaa and A.T. Azar. Academic Press (Elsevier),
London. pp. 439–464. ISBN: 9780128202760.
Shook, E., Bowlick, F.J., Kemp, K.K., Ahlqvist, O., Carbajales-Dale, P., DiB-
iase, D., et al. 2019. Cyber literacy for GIScience: toward formal-
izing geospatial computing education. Prof. Geogr. 71(2): 221–238.
doi:10.1080/00330124.2018.1518720.
Simic Milas, A., Cracknell, A.P., and Warner, T.A. 2018. Drones—the third
generation source of remote sensing data. Int. J. Remote Sens. 39(21):
7125–7137. doi:10.1080/01431161.2018.1523832.
Singh, K.K., and Frazier, A.E. 2018. A meta-analysis and review of un-
manned aircraft system (UAS) imagery for terrestrial applications. Int.
J. Remote Sens. 39(15–16): 5078–5098. doi:10.1080/01431161.2017.
1420941.
Singh, K.K., Vogler, J.B., Shoemaker, D.A., and Meentemeyer, R.K. 2012.
LiDAR-Landsat data fusion for large-area assessment of urban land
cover: balancing spatial resolution, data volume and mapping accu-
racy. ISPRS J. Photogramm. Remote Sens. 74: 110–121. doi:10.1016/j.
isprsjprs.2012.09.009.
Slonecker, E.T., Shaw, D.M., and Lillesand, T.M. 1998. Emerging legal and
ethical issues in advanced remote sensing technology. Photogramm.
Eng. Remote Sens. 64(6): 589–595.
Smith, G.M., and Milton, E.J. 1999. The use of the empirical line method
to calibrate remotely sensed data to reflectance. Int. J. Remote Sens.
20(13): 2653–2662. doi:10.1080/014311699211994.
Snavely, N., Seitz, S.M., and Szeliski, R. 2008. Modeling the world from
internet photo collections. Int. J. Comput. Vision, 80(2): 189–210.
doi:10.1007/s11263-007-0107-3.
Stark, M., Heckmann, T., Piermattei, L., Dremel, F., Kaiser, A., Machowski,
P., et al. 2021. From consumer to enterprise grade: how the choice
of four UAS impacts point cloud quality. Earth Surf. Processes Land-
forms, 46: 2019–2043. doi:10.1002/esp.5142.
Stefanik, K.V., Gassaway, J.C., Kochersberger, K., and Abbott, A.L. 2011.
UAV-based stereo vision for rapid aerial terrain mapping. GISci. Re-
mote Sens. 48(1): 24–49. doi:10.2747/1548-1603.48.1.24.
Stöcker, C., Bennett, R., Nex, F., Gerke, M., and Zevenbergen, J. 2017. Re-
view of the current state of UAV regulations. Remote Sens. 9(5): 459.
doi:10.3390/rs9050459.
Stöcker, C., Nex, F., Koeva, M., and Gerke, M. 2020. High-quality UAV-
based orthophotos for cadastral mapping: guidance for optimal
flight configurations. Remote Sens. 12(21): 3625. doi:10.3390/
rs12213625.
Strahler, A.H., Woodcock, C.E., and Smith, J.A. 1986. On the nature of
models in remote sensing. Remote Sens. Environ. 20(2): 121–139.
doi:10.1016/0034-4257(86)90018-0.
Suomalainen, J., Oliveira, R.A., Hakala, T., Koivumäki, N., Markelin, L.,
Näsi, R., and Honkavaara, E. 2021. Direct reflectance transformation
methodology for drone-based hyperspectral imaging. Remote Sens.
Environ. 266: 112691. doi:10.1016/j.rse.2021.112691.
Swayze, N.C., Tinkham, W.T., Creasy, M.B., Vogeler, J.C., Hoffman, C.M.,
and Hudak, A.T. 2022. Influence of UAS flight altitude and speed
on aboveground biomass prediction. Remote Sens. 14: 1989. doi:10.
3390/rs14091989.
Thomas, A.F., Frazier, A.E., Mathews, A.J., and Cordova, C.E. 2020. Im-
pacts of abrupt terrain changes and grass cover on vertical accuracy
of UAS-SfM derived elevation models. Pap. Appl. Geogr. 6(4): 336–351.
doi:10.1080/23754931.2020.1782254.
Tmušić, G., Manfreda, S., Aasen, H., James, M.R., Gonçalves, G., Ben-Dor, E., et al. 2020. Current practices in UAS-based environmental monitoring. Remote Sens. 12(6): 1001. doi:10.3390/rs12061001.
Turner, D., Lucieer, A., and De Jong, S.M. 2015. Time series analysis of
landslide dynamics using an unmanned aerial vehicle (UAV). Remote
Sens. 7: 1736–1757. doi:10.3390/rs70201736.
Turner, W. 2014. Sensing biodiversity: sophisticated networks are re-
quired to make the best use of biodiversity data from satellites
and in situ sensors. Science, 346(6207): 301–302. doi:10.1126/science.
1256014.
U.S. DOD (Department of Defense). 2021. Department statement on DJI
systems. Available from https://www.defense.gov/News/Releases/Release/Article/2706082/department-statement-on-dji-systems/ [accessed
2 August 2022].
U.S. DOI (Department of the Interior). 2020. Secretarial order 3379: temporary cessation of non-emergency unmanned aircraft systems fleet operations. Available from https://www.doi.gov/sites/doi.gov/files/elips/documents/signed-so-3379-uas-1.29.2020-508.pdf [accessed 2 August 2022].
U.S. Supreme Court. 1945. United States v. Causby, 328 U.S. 256. Avail-
able from https://tile.loc.gov/storage-services/service/ll/usrep/usrep328/usrep328256/usrep328256.pdf [accessed 26 July 2023].
Ullman, S. 1979. The interpretation of structure from motion. Proc. R.
Soc. Lond. 203(1153): 405–426. doi:10.1098/rspb.1979.0006.
URISA (Urban and Regional Information Systems Association). 2003. GIS
code of ethics. Available from https://www.urisa.org/about-us/gis-code-of-ethics/ [accessed 31 July 2022].
Valbuena, R., Mauro, F., Rodriguez-Solano, R., and Manzanera, J.A. 2010.
Accuracy and precision of GPS receivers under forest canopies in a
mountainous environment. Spanish J. Agric. Res. 8(4): 1047–1057.
doi:10.5424/sjar/2010084-1242.
Vargas-Ramírez, N., and Paneque-Gálvez, J. 2019. The global emer-
gence of community drones (2012–2017). Drones, 3: 76. doi:10.3390/
drones3040076.
Vautherin, J., Rutishauser, S., Schneider-Zapp, K., Choi, H.F., Chovan-
cova, V., Glass, A., and Strecha, C. 2016. Photogrammetric accu-
racy and modeling of rolling shutter cameras. ISPRS Ann. Pho-
togramm. Remote Sens. Spatial Inf. Sci. III–3: 139–146. doi:10.5194/
isprsannals-III-3-139-2016.
Ventura, D., Bonifazi, A., Gravina, M.F., Belluscio, A., and Ardizzone,
G. 2018. Mapping and classification of ecologically sensitive marine
habitats using unmanned aerial vehicle (UAV) imagery and object-
based image analysis (OBIA). Remote Sens. 10(9): 1331. doi:10.3390/
rs10091331.
Vergnano, A., Franco, D., and Godio, A. 2022. Drone-borne ground-
penetrating radar for snow cover mapping. Remote Sens. 14(7): 1763.
doi:10.3390/rs14071763.
Wachowiak, M.P., Walters, D.F., Kovacs, J.M., Wachowiak-Smolíková, R.,
and James, A.L. 2017. Visual analytics and remote sensing imagery
to support community-based research for precision agriculture in
emerging areas. Comput. Electron. Agric. 143: 149–164. doi:10.1016/
j.compag.2017.09.035.
Wackrow, R., and Chandler, J.H. 2008. A convergent image configuration
for DEM extraction that minimises the systematic effects caused by
an inaccurate lens model. Photogramm. Rec. 23: 6–18. doi:10.1111/j.
1477-9730.2008.00467.x.
Wackrow, R., and Chandler, J.H. 2011. Minimising systematic error sur-
faces in digital elevation models using oblique convergent imagery.
Photogramm. Rec. 26: 16–31. doi:10.1111/j.1477-9730.2011.
00623.x.
Wallace, L., Lucieer, A., Malenovský, Z., Turner, D., and Vopěnka, P. 2016. Assessment of forest structure using two UAV techniques: a comparison of airborne laser scanning and structure from motion (SfM) point clouds. Forests, 7: 62. doi:10.3390/f7030062.
Wallace, L., Lucieer, A., Watson, C., and Turner, D. 2012. Development of
a UAV-lidar system with application to forest inventory. Remote Sens.
4(6): 1519–1543. doi:10.3390/rs4061519.
Watts, A.C., Ambrosia, V.G., and Hinkley, E.A. 2012. Unmanned aircraft systems in remote sensing and scientific research: classification and considerations of use. Remote Sens. 4: 1671–1692. doi:10.3390/rs4061671.
Whitehead, K., and Hugenholtz, C.H. 2015. Applying ASPRS accuracy
standards to surveys from small unmanned aircraft systems (UAS).
Photogramm. Eng. Remote Sens. 81(10): 787–793. doi:10.14358/PERS.
81.10.787.
Wright, T. 2017. U.S. Army puts a halt to its use of Chinese-made DJI drones. Smithsonian Mag. Available from https://www.smithsonianmag.com/air-space-magazine/us-army-stops-using-chinese-made-dji-drones-180964324/ [accessed 2 August 2022].
Xu, K., Gong, Y., Fang, S., Wang, K., Lin, Z., and Wang, F. 2019. Radio-
metric calibration of UAV remote sensing image with spectral angle
constraint. Remote Sens. 11: 1291. doi:10.3390/rs11111291.
Yang, B., Hawthorne, T.L., Hessing-Lewis, M., Duffy, E.J., Reshitnyk, L.Y., Feinman, M., and Searson, H. 2020. Developing an introductory UAV/drone mapping training program for seagrass monitoring and research. Drones, 4(4): 70. doi:10.3390/drones4040070.
Yoo, L.-S., Lee, J.-H., Lee, Y.-K., Jung, S.-K., and Choi, Y. 2021. Application
of a drone magnetometer system to military mine detection in the
demilitarized zone. Sensors, 21(9): 3175. doi:10.3390/s21093175.
Young, D.J., Koontz, M.J., and Weeks, J. 2022. Optimizing aerial imagery
collection and processing parameters for drone-based individual tree
mapping in structurally complex conifer forests. Methods Ecol. Evol.
13: 1447–1463. doi:10.1111/2041-210X.13860.
Zimmerman, T., Jansen, K., and Miller, J. 2020. Analysis of UAS flight al-
titude and ground control point parameters on DEM accuracy along
a complex, developed coastline. Remote Sens. 12: 2305. doi:10.3390/
rs12142305.
Additional resources
GeoTED-UAS, https://vsgc.odu.edu/geoted-uas/
Guyana Civil Aviation Authority (GCAA). (2022). GCAA Advi-
sory Circular: Unmanned Aerial Vehicles (UAVs): AC No:
GCAA AC/UAV/001
ICAO (International Civil Aviation Organization). The ICAO UAS Toolkit. https://www.icao.int/safety/UA/UASToolkit
James, M. R., Chandler, J. H., Eltner, A., Fraser, C.,
Miller, P. E., Mills, J. P., Noble, T., Robson, S., and
Lane, S. N. (2019) Guidelines on the use of structure-
from-motion photogrammetry in geomorphic research.
Earth Surface Processes and Landforms 44: 2081–2084.
doi:10.1002/esp.4637.
NCAT (National Center for Autonomous Technologies), https://ncatech.org/starting-a-drone-program/
UASTEP (Unmanned Aircraft System Operations Technician
Education Program). https://uastep.org/
UNAVCO, https://www.unavco.org/projects/project-support/
uas-support/uas-support.html
USGS National Uncrewed Systems Office, https://www.usgs.gov/programs/national-land-imaging-program/science/national-uncrewed-systems-office
U.S. FAA (Federal Aviation Administration). https://www.faa.
gov/uas/