Unmanned Aerial Vehicles (UAVs) Collision Avoidance
This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/ACCESS.2020.3000064, IEEE Access
Date of publication xxxx 00, 0000, date of current version xxxx 00, 0000.
Digital Object Identifier 10.1109/ACCESS.2017.DOI
ABSTRACT Moving towards autonomy, unmanned vehicles rely heavily on state-of-the-art collision
avoidance systems (CAS). A great deal of work is being done to make CASs as safe and reliable as
possible, necessitating a comparative study of the recent work in this important area. The paper provides a
comprehensive review of collision avoidance strategies used for unmanned vehicles, with the main emphasis
on unmanned aerial vehicles (UAVs). It is an in-depth survey of different collision avoidance techniques,
which are explained by category along with a comparative analysis of the considered approaches with
respect to different scenarios and technical aspects. This also includes a discussion on the use of different
types of sensors for collision avoidance in the context of UAVs.
INDEX TERMS autonomous aerial vehicles, autonomous vehicles, collision avoidance, active and passive
sensors, optimisation-based, force-field based, sense and avoid, geometry based
VOLUME 4, 2016 1
This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.
ally around the globe [16], [17]. Airplanes are safer; reports from CNN and the Aviation Safety Network show that the number of annual deaths caused by commercial flight accidents is in the range of a few hundred [18], [19] on average. However, helicopter and private/military airplane accidents are not included in these statistics. The data on causes of fatal accidents from January 1960 to December 2015 compiled by planecrashinfo.com shows that about 58% of the accidents were due to human error [20]. This human factor can be minimised by integrating intelligent decision-making capabilities such as obstacle detection, collision avoidance, and path planning with the autopilot system to make the system more autonomous. In that way, intelligent autonomous collision avoidance methods can significantly contribute to making airplanes even safer and saving human lives. Moreover, with the increasing usage of unmanned vehicles, and especially the exponential increase in the applications of UAVs in public areas and our everyday lives, the need for intelligent and highly reliable collision avoidance systems is obvious and indisputable from the viewpoint of public safety. In contrast with collision avoidance in cars, UAVs have the ability to reach difficult and dangerous areas while posing no potential danger to humans. Hence, UAVs should be designed to be completely autonomous and able to fly without colliding with other objects, which requires fundamental research [21].

detection, is the first step for any collision avoidance system. In this phase, sensors are utilised in order to perceive the environment and detect obstacles. There are many different types of sensors available on the market, but they can all be categorised as either active or passive sensors based on the principle of their functionality (see Section 2). Active sensors have their own source, which transmits light or emits a wave, and read the reflected back-scatter. On the other hand, passive sensors only read the energy discharged by the object, or coming from another source, e.g. sunlight, and reflected by the object. An action for collision avoidance can be categorised into four major approaches: geometric, in which location and velocity information of the node/UAV and obstacles is utilised, usually by simulating the trajectories, to perform the reformation of nodes to avoid the collision; force-field, in which attractive/repulsive forces are manipulated for collision avoidance; optimisation-based, through which the already known parameters of obstacles are used to optimise the route; and sense and avoid, through which run-time decisions for avoidance are made on the basis of obstacle detection.

Collision avoidance systems range from simply warning the operator of the vehicle [22] to the complex process of autonomously controlling the system, either completely or partially, to avoid the collision. The actuation can be either applying brakes or steering the vehicle away from the detected obstacle. Initially, the research in the field was based on advanced highways (ground vehicles), which provided a good base for advancements also in the areas of intelligent aerial and surface vehicles [23], [24]. The authors in [25] provide an interesting approach in classifying collision avoidance into global and local path planning problems. The global/conventional path planning reacts to changes in the environment and generates the optimal routes while keeping the whole environment under consideration, whereas in collision avoidance, also referred to as local path planning, the changes in the environment are dealt with locally as they are detected, and an avoidance maneuver is performed accordingly to avoid the collisions and get back to the originally planned path.
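The global-versus-local split can be made concrete with a small sketch. The following Python fragment is purely illustrative (the function names and the fixed sidestep distance are ours, not from the surveyed works): the vehicle keeps following its precomputed global waypoint unless an obstacle enters the detection range, in which case a local detour point is generated before it rejoins the original path.

```python
import math

def local_avoidance_step(position, waypoint, obstacle, d_range, sidestep=2.0):
    """Return the next target point: the global waypoint, or a local
    detour perpendicular to the current leg when an obstacle is within
    detection range of the vehicle. Purely illustrative."""
    px, py = position
    ox, oy = obstacle
    if math.hypot(ox - px, oy - py) >= d_range:
        return waypoint  # no threat: keep following the global plan
    # Obstacle detected: sidestep perpendicular to the current leg;
    # the caller later resumes heading for the original waypoint
    # (local path planning).
    wx, wy = waypoint
    leg = math.hypot(wx - px, wy - py)
    ux, uy = (wx - px) / leg, (wy - py) / leg
    return (px + ux * sidestep - uy * sidestep,
            py + uy * sidestep + ux * sidestep)

# Obstacle far away: the global waypoint is returned unchanged.
print(local_avoidance_step((0, 0), (10, 0), (50, 50), d_range=5.0))
```

A hybrid planner would run such a reactive step inside the loop that tracks the globally optimal route, which is the arrangement described in [25].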
navigate without being explicitly controlled [26]. Due to their ability to work in a collaborative and cooperative manner, swarms of UAVs are gaining even more attention in the research community. The deployment of swarms or multiple UAVs adds significant advantages over single UAVs and is in demand in vast and diverse areas, for instance in military or commercial use, search and rescue, traffic monitoring, threat detection especially at borders, and atmospheric research [27]–[29]. In a challenging dynamic environment, tasks may become increasingly difficult for UAVs due to on-board payload limitations (e.g., sensors, batteries), power constraints, reduced visibility due to bad weather (e.g., rain, dust), and complications in remote monitoring. The robotics community is striving hard to address these challenges and to bring the technology to a level suited for the demanding environments, ensuring success and safe navigation of the unmanned vehicles [30]–[32]. Obstacle detection and collision avoidance are among the most challenging issues for autonomous vehicles and become even more critical in dynamic environments with multiple UAVs and moving obstacles [33]–[35].

In an autonomous drone/swarm of drones, a collision is said to have taken place between a drone and any other object, i.e., another drone or an external object or obstacle, when the distance between them is less than the predetermined collision radius R_c [36]. The collision radius and detection range are illustrated in Figure 2. The condition can be mathematically expressed as:

||r_u − r_o|| < R_c    (1)

where r_u and r_o are the position vectors of the drone and the object, respectively.

An object is detected, i.e., obstacle detection takes place, when the distance between the drone and the object is less than the detection range and the object is in the field of view of the on-board sensor system. This can be mathematically expressed as:

||r_u − r_o|| < d_Range, with the object inside the FOV    (2)

where d_Range is the detection range radius and FOV is the field of view of the drone, dependent on the equipped sensor system.

A collision avoidance system (CAS) for an unmanned vehicle is responsible for ensuring that no collisions happen with any obstacle, whether moving or stationary. A CAS, in order to be able to do that, must address the following questions:
• How to detect an obstacle and determine its attributes, e.g., its velocity, size, and position
• How to determine if the object is approaching and there is a risk of collision
• How to perform actual collision avoidance based on the calculations done

There may be different descriptions of CASs stressing different sides of the system, but basically a CAS is composed of sense, detect, and collision avoidance, as shown in Figure 3. The first step is to sense, in which the system observes its surroundings or the environment. As soon as some point of interest, i.e., an obstacle, comes within range, the detection phase of the system tries to assess the risk. Based on this, the collision avoidance module does the necessary calculations to compute the amount of deviation needed from the original path in order to avoid the potential collision. As soon as the calculations have been done, the system performs the necessary maneuver to successfully avoid the obstacle.

The different subcategories, and the comparison among the different categories and the aforementioned subcategories, are discussed in more detail in each corresponding section. The rest of the paper is organised as follows. Section 2 briefly overviews and explains obstacle detection, and passive and active sensors are discussed in detail. Section 3 focuses on collision avoidance approaches and the environmental effects in detail. The available methods and solutions are discussed in Section 4 along with the conclusion.

II. PERCEPTION: OBSTACLE DETECTION
Perception is the first step in any collision avoidance system. In order to detect obstacles, the drone should be able to perceive its surroundings and environment. For this, it needs to be equipped with one or more sensors working as a perception unit [37]. For remote sensing systems, sensors such as imaging sensors with diverse resolutions are the essential components. The usage of sensors is quite diverse, depending on the needs. Some of the sensors that can be used for observation are LiDAR, visual cameras, thermal or IR cameras, and solid-state or optomechanical devices [38], [39]. The types of sensors are fundamentally divided based on their spectral sensitivity and the electromagnetic spectrum (see Figure 4) of the bands used by remote sensing systems [40].

In order to detect an obstacle, different types of sensors are used, which can be mainly categorised into two sets:
• Passive Sensors
• Active Sensors

A. PASSIVE SENSORS
Passive sensors are the ones that detect the energy discharged by the objects or the scenery under observation. Most of the passive sensors employed in sensing applications are optical or visual cameras, thermal or infrared (IR) cameras, and spectrometers [41], [42]. There are different types of cameras which work at different wavelengths (see Figure 4), for instance, visible light, infrared (short-wave, near-wave, mid-wave, and long-wave infrared), and the ultraviolet (UV) band. In [43], the authors present a methodology for tracking and real-time detection of a vehicle using acoustic signals. From noisy data, they extract robust spatial features and then process them through sequential state estimation to acquire the output. They verify the proposed methodology with practical acoustic data.
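Conditions (1) and (2) translate directly into code. The sketch below is a minimal illustration (the function names are ours, not from the cited works); the field-of-view test assumes, for simplicity, a planar sensor cone of half-angle fov_half around the sensor's boresight direction.

```python
import math

def in_collision(r_u, r_o, R_c):
    """Condition (1): collision if ||r_u - r_o|| < R_c."""
    return math.dist(r_u, r_o) < R_c

def is_detected(r_u, r_o, boresight, d_range, fov_half):
    """Condition (2): detected if the object is closer than the
    detection range AND inside the sensor's field of view, modelled
    here as a cone of half-angle fov_half (radians) in the plane."""
    dx, dy = r_o[0] - r_u[0], r_o[1] - r_u[1]
    if math.hypot(dx, dy) >= d_range:
        return False
    angle = math.atan2(dy, dx)
    # Smallest signed deviation from the boresight direction.
    off_boresight = abs((angle - boresight + math.pi) % (2 * math.pi) - math.pi)
    return off_boresight <= fov_half

drone, obj = (0.0, 0.0), (8.0, 1.0)
print(in_collision(drone, obj, R_c=2.0))   # too far apart for a collision
print(is_detected(drone, obj, boresight=0.0, d_range=20.0,
                  fov_half=math.radians(30)))  # close enough and in the FOV
```

A CAS would evaluate the detection predicate continuously during the sense phase and use the collision predicate (with a safety margin) as the trigger for the avoidance maneuver.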
reads the reflected signal [57], [58]. Their ability to penetrate the atmosphere in most conditions is due to the fact that the majority of such sensors work in the microwave portion of the spectrum. Examples of ranging sensors are LiDARs [59], radars [60], sonar or ultrasonic sensors [61], [62], and active infrared sensors [63], [64]. Such sensors have a fast response, require less processing power, can scan larger areas quickly, are less affected by the weather and lighting conditions, and can accurately return the parameters of interest of the obstacles, such as distance and angles. The authors in [65] use millimetre-wave (MMW) radar. In their system, by observing the echoes produced by radar signals, the distance between the object and the vehicle is calculated for detecting and tracking the objects. The performance is also evaluated in different weather conditions and for different distances. Although the radar-based solutions are appealing, they are either too expensive or too heavy as a payload for smaller robots such as battery-operated UAVs [66], [67].

1) Radar
A radio detection and ranging (radar) sensor functions by transmitting a radio signal which, upon encountering an object, bounces off of it back to the radar. From the time it took for the signal to bounce back, the distance between the object and the radar is calculated. Radar systems have been around for decades; they have good resistance to weather conditions and hence are also applied to airborne systems. Although airborne radar systems are quite expensive, they are commonly used to provide data due to their accuracy.

Radars are based on either continuous waves or pulsed waves. A continuous-wave radar, as the name suggests, emits a continuous stream of linearly modulated signals (also known as frequency-modulated signals), while a pulsed-wave radar emits powerful and short bursts of signals and hence suffers from a blind spot, in contrast to the continuous-wave radars [68]. Radars are also used to detect the motion of objects, such as their speeds. For instance, if an object is moving towards the radar, the frequency of the echo or bounced-off signal increases, and the change in the frequency is used to calculate the speed at which the object is moving [69]. Microwave radar sensors are insensitive to weather conditions but have a relatively low frequency band and therefore do not provide a sufficient angular resolution. Millimetre-wave radar sensors, however, have benefits such as a finer angular resolution and small size but are sensitive to weather conditions [70]. The angular resolution is dependent on the aperture size of the antenna but can be enhanced to some extent by increasing the frequency.

Radars are appropriate for outdoor applications due to their immunity against environmental conditions, such as the ability to operate irrespective of lighting conditions or overcast weather, and their wide range coverage. However, only obstacle detection can be done; exact reconstruction of an object's dimensions is not possible with radars due to their low output resolution [71]. The authors in [69] used a small-sized radar for acquiring real-time range under all weather environments. The setup is composed of a small-sized radar sensor and an obstacle collision avoidance system (OCAS) processor. The data generated by the radar, such as the velocity of the obstacles, azimuth angles, and range of the obstacle, is used by OCAS to calculate the avoidance criteria and send commands to the flight controller to perform the necessary maneuvers to avoid collisions. They evaluated the performance of the system: at the required detection range, the probability of detecting an obstacle is more than 90%. For collision avoidance, four different scenarios were used to analyse its performance, and the results showed that even if there is an error in the radar data, the successful collision avoidance probability is more than 85% due to the defined safety margins.

In [72], the authors provide a comprehensive study of the advantages of using radar sensors with UAVs for obstacle detection and for the calculation of other attributes of the detected obstacle, such as its velocity and angular information, using multichannel radars. Furthermore, using forward-facing radars and radar's simultaneous multitarget ranging capability, the detection of targets in the wide angular range of ±60° in azimuth is shown by experimental results. In [73], in order to implement the proposed autonomous collision avoidance strategy, the authors utilised an Ultra-Wideband (UWB) collocated MIMO radar. One of the key benefits of radar cognition is the ability to adapt the UWB-MIMO radar transmission waveform to provide improved detection; to guide the UAV, it provides an approximation of the collision points. In [74], the authors study radar systems for sense and avoid on UAVs, as they are among the most reliable all-weather sensors that precisely provide the ranging and closing speeds. A detailed analysis of three radar bands, i.e., the S (3 GHz), Ka (35 GHz), and X (10 GHz) bands, is provided, and the advantages and disadvantages are discussed for each individual band. After studying the radar bands for the sense and avoid technique, the authors concluded that the X-band is the most favourable solution due to its ease of installation, as it can be integrated in the UAV frame without extra volume, its cost, and its performance, such as its ability to provide good angular accuracy. In [75], the authors investigate the performance of radar sensors and propose the design of a prototype miniature lightweight X-band radar sensor for UAVs, owing to the capability of radar sensors to provide comprehensive identification and detection of the targets/obstacles for the UAVs. The Doppler shift caused by the propulsion of the UAV is used for the reliable detection of the targets and subsequently utilised to enhance the maneuvers of the UAV for avoiding collisions. The authors claim that this detection and identification process is scalable and can be used for larger vehicles as well.

2) LiDAR
A light detection and ranging (LiDAR) sensor works in a very similar way to radars. LiDAR sensors operate in two parts: one part emits laser pulses onto the surface(s) and the other reads their reflection to measure the time it took for each
pulse to bounce back, in order to calculate the distance. Data collection using LiDAR is fast and also extremely accurate. LiDAR systems have become more affordable during the past two decades. Furthermore, over the years LiDAR sensors have become much smaller, more compact, and lighter in weight compared with the earlier versions and are now feasible for mounting on small and micro UAVs as well [71], [76]. The systems based on LiDAR, especially 1D and 2D LiDAR sensors, are more economical than radars. The authors in [59] tested their developed system under various conditions with good accuracy, with different types of laser scanner mounted on a vehicle. 3D LiDARs, also known as 2-axis LiDARs, are conventional sensors for 3D mapping or 3D obstacle detection [77]. Due to the continuous movement and ranging of LiDAR, motion warping present in the acquired data makes the usage of these LiDARs strenuous. The authors in [78] suggest that a way to overcome this is by incorporating other sensors along with LiDAR. Exact pose estimation of objects can be made with 3D LiDARs only. The authors in [79] present a solution for distortion caused by the motion, by extracting intensity images from the 3D LiDAR scans and matching the visual features.

Since LiDAR uses a short wavelength, it has the ability to detect small objects and can reconstruct a monochrome 3D image of the environment. LiDAR's major weakness is that it cannot detect transparent objects such as clear glass. Therefore, LiDAR needs to be accompanied by another sensor, such as an ultrasonic sensor, that can overcome this issue.

3) Sonar
Ultrasonic sensors work on the principle of emitting sound waves and listening for their reflection back from an object to calculate the distance between the object and the sensor [80], [81]. The sound waves are generated at a frequency too high for the human hearing band, i.e., 25-50 kHz [82]. The basic principle used to calculate the distance is similar to the one used by radars or LiDARs, i.e., emit a wave, wait until the wave bounced off an object arrives, and calculate the distance based on how long it took for the wave to reflect back, using this simple formula:

d = (v · t) / 2    (3)

where d is the distance, v is the speed of the wave, and t is the time of flight.

Ultrasonic sensors are readily available and are much cheaper than most of the other ranging sensors available on the market. Unlike LiDARs, ultrasonic sensors are unaffected by the transparency of the object; for instance, LiDARs have difficulty in detecting clear glass, while ultrasonic sensors are not affected by the colour of the objects. However, if the object reflects the sound wave in a different direction than the receiver, or if its material has sound-absorbing characteristics, the sonic sensor's readings will be unreliable.

TABLE 1: Sensor attribute comparison for Obstacle Detection: short (0-100 m), medium (100 - 1000 m), long (> 1000 m)

Table 1 shows that all the sensors have some limitations and strengths over the others, making it evident that no one specific sensor can cover the collision avoidance problem comprehensively. More than one sensor can be used to cover a larger area and make up for blind spots, or multiple sensor types can be fused together, where the weakness of one sensor can be counterbalanced by the other(s).

According to Table 1, it can be understood that active sensors mainly have better accuracy in contrast with inactive or passive sensors. However, active sensors have higher power consumption compared with passive sensors, as active sensors first transmit the signal and only afterwards capture the data for computation, while passive sensors rely on some external power source for transmission, such as sunlight or the object's own source, and only read the signal for computational purposes. Another important comparison is the processing requirement. The data captured via active sensors is directed data, i.e., the data is specific to the detection; it does not contain unnecessary data, unlike what exists in, e.g., cameras. This makes the processing of active sensor data easier compared with passive sensor data. A further consequence is that the computational power needed for processing the data is lower for active sensors than for passive sensors. For instance, in the case of camera(s), due to the heavy computations for image processing and for filtering the unnecessary information from the image, the processing power is higher compared with a LiDAR sensor, whose data is directed.

The other interesting analysis based on Table 1 is the effect of noise, e.g., weather conditions or light sensitivity, on the data. Active sensors, because they direct the captured data and have their own source for transmitting waves, are less prone to noise than passive sensors. For instance, LiDAR or ultrasonic sensors work in most environmental conditions with or without daylight; cameras, on the other hand, need optimal lighting to be able to create an image of the environment.

Furthermore, using a passive sensor alone for the detection of obstacles is highly questionable, as the algorithm designed may not be able to distinguish between one or more objects in the scenery and can thereby cause collisions. One major example of such an issue is the fatal accident involving a Tesla car, whose collision avoidance system failed to distinguish between a brightly-lit sky and a tractor trailer [83].

III. COLLISION AVOIDANCE

FIGURE 5: Reactive Collision Avoidance

In general, collision avoidance approaches work on either of two principles: reactive or deliberative planning.
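Before turning to the avoidance approaches, the time-of-flight relation of Eq. (3), which is common to the active ranging sensors surveyed above, can be sketched in a few lines. The Doppler-based speed estimate uses the standard two-way shift approximation v = Δf·c / (2·f0); the numeric values below are rounded illustrations, not figures from the cited works.

```python
def tof_distance(v_wave, t_flight):
    """Eq. (3): distance = (wave speed * round-trip time) / 2."""
    return v_wave * t_flight / 2.0

def doppler_speed(f_shift, f_carrier, v_wave):
    """Radial speed of a target from the two-way Doppler shift:
    v = (delta_f * wave speed) / (2 * carrier frequency)."""
    return f_shift * v_wave / (2.0 * f_carrier)

# Ultrasonic example: sound at ~343 m/s, echo after 20 ms -> 3.43 m.
print(tof_distance(343.0, 0.020))
# X-band radar example (10 GHz carrier): a 667 Hz Doppler shift
# corresponds to a target closing at roughly 10 m/s (c ~= 3e8 m/s).
print(doppler_speed(667.0, 10e9, 3e8))
```

The same tof_distance relation applies to radar and LiDAR by substituting the speed of light for the speed of sound, which is why the three sensor families share the ranging principle discussed in this section.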
Figure 5 shows that in reactive control the agent/robot gathers information about its surroundings using local on-board sensors and reacts based on that information. It allows for rapid response to sudden changes in the environment. However, reactive control can lead to a local minimum, may get stuck in it, and may require another navigational technique to overcome this problem.

In deliberative planning, shown in Figure 6, the agent senses and updates the environmental map. Once the map has been updated, an optimal, collision-free path is calculated keeping the initial goal as a reference, and that optimal route plan is then executed. For this, an accurate map of the environment is needed for it to work well, which requires more computational power for all the required computations. This approach, as such, is not suitable for dynamic environments in which variables change over time. Hence, a hybrid approach, which can switch between the reactive and deliberative modes depending on the environmental needs, is more appropriate.

FIGURE 6: Deliberative Collision Avoidance

Collision avoidance algorithms can be categorised into the following major methods: 1) geometric methods, which work by computing the distance between the agent/UAV and the obstacle utilising information such as the velocities of both the UAV and the obstacle and the location of the obstacle [84]–[87]; 2) force-field methods, in which the main idea is inspired by the attractive or repulsive electric forces that exist among charged objects. In a swarm of drones, each UAV node is considered a charged particle, and attractive or repulsive forces between them and the obstacles are used to generate the path or route to be taken [4], [88]; 3) optimisation-based methods, which aim at finding the optimal or near-optimal solutions for path planning and the motion characteristics of each drone w.r.t. the other drones and obstacles. These techniques rely on static objects, with known locations and sizes, for calculating the efficient route within a finite time period [89], [90]; and 4) sense-and-avoid methods, which mainly focus on reducing the computational cost, with short response time, by simplifying the process of collision avoidance to an individual detection and avoidance of obstacles for each drone, deviating the drone from its original path when needed, independently of the other drones' plans [36], [91], [92]. Each method is explained in more detail in the following sections.

A. GEOMETRIC METHODS
Geometric approaches rely on the analysis of geometric attributes to make sure that the defined minimum distances between agents, e.g. UAVs, are not breached. This is accomplished by computing the time to collision, utilising the distances between the UAVs and their velocities. If Automatic Dependent Surveillance Broadcast (ADS-B) sensing is used to obtain the mentioned attributes, the usage of the method is restricted due to the sensitivity of ADS-B to noise. It is also classified as cooperative sensing, as ADS-B needs cooperation between UAVs. However, if a UAV is equipped with a vision-based sensor that can detect an obstacle's location, size, and velocity using a passive device, a non-cooperative sensing methodology is obtained, drastically increasing the amount of on-board processing required [87], [93]–[95].

To optimally solve the problem of collision between two aircraft, the authors in [96] present an analytical approach for a planar case to resolve that conflict. Utilising the geometric characteristics of the trajectories, closed-form analytical solutions are acquired for optimal combinations of commands for resolving the conflict. Minimum deviation from the normal flight plan is achieved by minimising the velocity vector changes. In [97], using a mixed geometric and collision cone approach along with information such as the coordinates and velocities of the aircraft, conflict avoidance in a 3D environment is achieved. However, for the most general cases, the authors rely on numerical optimisation methods and acquire analytical results for special cases only.

In [98], the authors study geometry-based collision avoidance strategies for a swarm of UAVs. The proposed approach uses line-of-sight vectors in combination with relative velocity vectors while considering the dynamic constraints of a formation. By calculating a collision envelope, each UAV can determine the available direction for avoiding a
collision and decide whether the formation can be kept while avoiding collisions. For cooperative UAVs in a 3D environment, [99] proposes a method for providing a selected UAV with an optimal flight path. Considering changes only in vertical directions, the authors use an integration equation of distance, track adjustment costs, and time, under certain restrictions such as performance and distance constraints, to generate an optimal flight path to be navigated upon. An approach in which tracking control is integrated with a geometric collision avoidance method is proposed in [100]. Upon detection of obstacles, the obstacles with the highest risk are first selected. Then, a boundary sphere is generated for each obstacle to define the safe and risk areas, and tangent lines from the UAV to the sphere, together with information on the direction of the UAV's movement, are used to calculate a collision detection angle to determine the best direction of deviation to avoid the possible collision.

In [101], the authors presented a new methodology, the Fast Geometric Avoidance Algorithm (FGA), based on kinematics, the probability of collisions, and navigational limitations, combining geometric avoidance with the selection of the start time for critical avoidance. In a multiple-obstacle scenario, instead of avoiding the obstacles simultaneously, FGA can assign different threat levels to obstacles based on the critical time for avoidance and avoid them sequentially, hence increasing the avoidance success rate. Simulation results, in the same environment, showed a comprehensive reduction in the computational time for FGA as compared to other similar way-point generation methods.

In [102], the authors proposed a methodology for guiding UAVs from mission start to destination whilst avoiding collisions with any obstacles in their path and keeping track of the pre-defined trajectory. In order to achieve this optimally, the authors propose combining the collision avoidance control with the trajectory control of the system while solving these tasks independently and later combining them by a

and the obstacles. In dynamic environments, these attributes of the obstacles are not known in advance. In [106], the authors present the idea of placing a potential field around the robot rather than around the obstacles. In [107], the authors propose an artificial potential field for finding the shortest path between starting and destination points. A robot is repelled from and attracted towards obstacle and target points, respectively, due to the repulsive and attractive forces generated by those respective points. Based on these two types of forces, the robot calculates the aggregate force, which then determines the characteristics of the robot's motion. A major drawback of this method is that, for symmetric environments, it is very sensitive to local minima and therefore does not necessarily lead to a globally optimised solution.

In [104], the authors proposed a novel artificial potential field approach called "enhanced curl-free vector field" for generating optimal collision-free routes under dynamic conditions with multiple obstacles, where other UAVs are considered as moving obstacles as well. Instead of utilising the conventional potential field, in this approach an enhanced curl-free vector field, i.e., a conservative field, is used by generating the field around the obstacle and determining the field vectors, i.e., the direction of the curl-free vector, based on the velocity vector of the dynamic obstacles, the angle of the corresponding position vector from the UAV to the obstacle, and the path angle of the UAV. The usability of this approach was tested with simulations; however, this approach still needs to be validated in 3D environments with static and dynamic variables.

In [108], the authors present an optimised artificial potential field algorithm to provide smooth and safe trajectories for UAVs in a 3D space. The proposed optimised artificial potential field (APF) algorithm provides an improvement over the traditional APF algorithms by considering other UAVs and their interactions as part of the method. The algorithm sees other UAVs as dynamic obstacles while planning navigation
designed movement strategy. Making the computations sim- towards the target. The authors simulated various scenarios to
plified and faster as collision avoidance control is provoked test their algorithm for unreachable target problem that exists
only in the presence of obstacles. A tracking control law is in the classical APF algorithm. Furthermore, the optimised
designed by computing the tracking errors, from the geomet- navigation was also tested through simulations where the al-
rical relation between the UAV and pre-defined trajectory, to gorithm allows the UAV to plan at every instant while taking
make sure that the UAV stays as close to the reference as into account the obstacles, other UAVs, and the destination.
possible. Similarly, upon detection of a possible collision, In [109], a vehicular collision avoidance algorithm based
collision avoidance control calculates the risk zones and on artificial potential fields is presented. By relying on the
angles to calculate the best avoidance maneuver. However, dimensions and also on the shape of the potential fields
the effectiveness of the proposed approach was tested under of the obstacles/vehicles, the algorithm can appropriately
static conditions and more work is required to verify and guide a vehicle either to slow down or accelerate to pass
validate the usability in dynamic environments. another vehicle, depending on the vehicles’ velocities and the
surrounding traffic. However, this method has its limitations.
B. FORCE-FIELD METHODS For instance, complex maneuvers can take place around other
Force-field methods, also known as potential field methods, vehicles due to local minima. Moreover, the step size used for
use the concept of a repulsive or attractive force field either to the time function has to be precisely adjusted, because a too
repel an agent/robot from an obstacle or to attract it towards large time step can cause collisions or unstable behaviour.
a target [94], [103]–[105]. The position where the obstacles The authors in [110] propose a 1D virtual force field
lie in the environment and their shape must be known, as methodology for detection of moving obstacles. They claim
this approach relies on the motion and geometry of the robot that the problem of efficiency loss in a traditional obstacle
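The attractive/repulsive construction used by the potential field methods above can be sketched in a few lines (a generic 2D illustration with made-up gains and influence radius, not the formulation of any cited work). Note how two obstacles placed symmetrically about the line to the goal cancel each other's lateral push, which is exactly how the local-minimum problem arises:

```python
from math import hypot

def apf_force(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=5.0):
    """Aggregate artificial potential field force at `pos` (2D tuples).

    The attractive term is the negative gradient of 0.5*k_att*||pos - goal||^2;
    each obstacle closer than the influence radius d0 contributes the negative
    gradient of 0.5*k_rep*(1/d - 1/d0)^2. All gains are illustrative.
    """
    fx = k_att * (goal[0] - pos[0])
    fy = k_att * (goal[1] - pos[1])
    for ox, oy in obstacles:
        dx, dy = pos[0] - ox, pos[1] - oy   # points away from the obstacle
        d = hypot(dx, dy)
        if 0.0 < d < d0:                    # only obstacles inside the influence radius repel
            c = k_rep * (1.0 / d - 1.0 / d0) / d**3
            fx += c * dx
            fy += c * dy
    return fx, fy

# Two obstacles placed symmetrically about the straight line to the goal:
# their lateral pushes cancel, leaving only the (possibly reversed) forward
# component -- the classic local-minimum situation.
force = apf_force((3.0, 0.0), (10.0, 0.0), [(5.0, 1.0), (5.0, -1.0)])
```

The robot then simply steps along the aggregate force direction at each control cycle, which is what makes the method reactive but also what lets it stall wherever the forces sum to zero.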
The authors in [110] propose a 1D virtual force field methodology for the detection of moving obstacles. They claim that the problem of efficiency loss in a traditional obstacle force field (OFF) method is due to the lack of taking the obstacles' motion into account. This can be resolved by the proposed prediction-based obstacle force field method. Focusing on unmanned ground vehicles (UGV), the approach equips a UGV with a frequency modulated continuous wave radar to determine the predicted obstacle force field (POFF), accommodating the problem of moving obstacles and thereby addressing the major weakness of conventional 1D virtual force field algorithms. Using the obstacle's velocity, the time-to-collision is calculated, based on which the approach predicts the estimated point of impact and generates the POFF.

In [111], the authors consider a robot to be a particle in a force field. Upon insertion of the robot into the potential field, the repulsive forces generated by the obstacles repel the robot away from them, and the attractive forces generated by the target attract the robot towards it. Experimental results through simulations showed that the response of this approach can be fast and reactive in static environments; further work is required in analysing the response of this approach in dynamic environments. Furthermore, the proposed algorithm does not tackle the problem of local minima, where the sum of the attractive and repulsive forces is zero.

C. OPTIMISATION BASED METHODS
Optimisation based methods rely on calculation of the avoidance trajectory based on geographical information. Probabilistic search algorithms aim to provide the best search areas based on the available uncertain information. To address the high computational complexity of these algorithms, several optimisation methods have been developed, such as ant-inspired algorithms, genetic algorithms, Bayesian optimisation, gradient descent based methods, particle swarm optimisation, greedy methods, and local approximations. In [112], for instance, the authors use a minimum time search algorithm with ant colony optimisation to ensure successful calculation of optimised collision-free search paths for UAVs under communication-related constraints.

Focusing on unmanned surface vehicles (USV), the authors in [113] discuss collision detection and path planning methods by considering global and local path planners, analysing the most common techniques from classical graph search theory as well as intelligent methods like artificial neural networks and evolutionary algorithms. The authors highlight the inadequacy of existing methods by concluding that almost none of the existing approaches appropriately address sea or weather conditions and/or involve the dynamics of the vessel when the path is generated. Hence, further studies are needed in this area.

In [114], the authors present an algorithm that predicts the next coordinates of a UAV based on the set of possible commands it is going to execute in a short period of time. The algorithm formulates a cost function for the optimal trajectory by considering the target coordinates and the current position of the UAV. Based on this cost function, the best set of future commands is selected. Then, a collision detection method is applied, and if a potential collision is found, the next best set of commands is chosen and evaluated similarly. The process can involve several recalculations of the cost function to eventually find the optimal collision-free solution.

In [115], the authors propose a new methodology, based on particle swarm optimisation, for path planning of autonomous vehicles in unknown environments. In this approach, the data on the environment gathered by the sensors is utilised by assigning different weights to different types of territories, and based on those weights the algorithm classifies different possibilities of navigating through the terrain. The algorithm then selects the optimal path based on this classification.

D. SENSE & AVOID METHODS
Sense-and-avoid methods mainly focus on reducing the computational power needed, with short response times, by simplifying the process of collision avoidance to the individual detection and avoidance of obstacles, controlling the path of each drone in a swarm without knowledge of the plans of the other drones. In a formation, the location of each drone w.r.t. the other drones is defined, and the collision avoidance process deals with individual path planning for the drones in order to avoid possible crashes both between drones within the swarm and between drones and external obstacles in the environment. Sense and avoid based collision avoidance is known for its ability to react quickly, and it is therefore an appropriate method for dynamic environments. In this approach, an agent/robot is equipped with different types of sensors such as LiDAR, sonar, and radar. For instance, radar reacts quickly to any object that comes within the detection range of the sensor, even though it cannot see the details of the object [36], [116], [117].

A 2D LiDAR based approach, proposed in [118], presents a methodology where the objects are classified into two categories, static or dynamic. The algorithm is also capable of approximating the velocities of the dynamic obstacles. The proposed algorithm is demonstrated to be efficient compared with similar existing methodologies in terms of required computational power and memory.

The authors in [119] use a computer vision technique for detecting animals and avoiding collisions with them. They used more than 2200 images to train their system and performed tests based on video clips of animals on highways. The algorithm provides satisfactory results with 82.5% accuracy and successfully detects animals in order to avoid collisions. However, the proposed solution is highly speed dependent and will not help in preventing collisions at speeds exceeding 35 km/h. In fact, at higher speeds, it may not be able to detect objects at all. Furthermore, the provided solution can perform very poorly in bad weather conditions, in low or too bright lighting, in foggy conditions, as well as in shiny (highly reflective) surroundings.
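The speed dependence reported for systems like [119] follows from a simple stopping-distance budget: the vehicle must detect, react and brake within the sensor's effective range. The sketch below solves v·t_react + v²/(2a) ≤ r for the maximum safe speed; the 15 m range, 0.5 s pipeline latency and 6 m/s² deceleration are illustrative assumptions, not values taken from [119]:

```python
from math import sqrt

def max_safe_speed(detection_range_m, reaction_time_s, decel_mps2):
    """Largest speed (m/s) at which the vehicle can still stop within the
    sensor's detection range, i.e. v*t_react + v^2/(2*a) <= range.

    Rearranged as the quadratic v^2/(2a) + v*t - r = 0, whose positive
    root is v = a * (-t + sqrt(t^2 + 2r/a)).
    """
    a, t, r = decel_mps2, reaction_time_s, detection_range_m
    return a * (-t + sqrt(t * t + 2.0 * r / a))

# e.g. 15 m effective detection range, 0.5 s pipeline latency, 6 m/s^2 braking:
v = max_safe_speed(15.0, 0.5, 6.0)   # ~10.7 m/s, i.e. roughly 39 km/h
```

Any increase in processing latency or drop in effective range (fog, glare, low light) shrinks this bound further, which is consistent with the limitations listed above.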
In [120], the authors use five ultrasonic (US) sensors along with a predefined neural network module in MATLAB to triangulate and detect the precise position and shape of objects. They consider three differently shaped objects for their testing. Furthermore, the five US sensors used in their solution are more than required for locating a detected object, as the precise 2D location can be found using only two US sensors, and the third dimension (depth) can be found by adding a third US sensor. Moreover, their results are satisfactory only when the objects are regularly shaped; they report that their neural network is not able to correctly identify objects with irregular shapes.

In [3], the authors use low-cost sensors (US sensors and IR scanners) to develop a simple solution for obstacle detection and collision avoidance. They employ inertial and optical flow sensors as a distance derivative for reference to get better data fusion. The resulting solution has a low computational cost, saving memory and computing time, and it enables a UAV to efficiently avoid collisions without any need for simultaneous localisation and mapping.

In [121], the authors fuse a US sensor with a binocular stereo vision camera to implement object detection and avoidance. A new path is calculated by an algorithm based on the Rapidly exploring Random Tree (RRT) scheme, using stereo vision as the main approach to detect obstacles around a UAV. The US sensor is utilised especially in situations where the camera fails to detect the obstacles [121].

In [122], a real-time 3D vehicle detection method (RT3D) is proposed, using a pure LiDAR point cloud to predict the location, orientation and size of vehicles. The authors use a pre-RoI pooling (region of interest pooling) convolution technique to pre-process most of the data in order to maximise efficiency. Furthermore, to increase the detection accuracy of the location, orientation and size of vehicles, they also propose a pose-sensitive feature map design activated by the relative poses of vehicles. Using the KITTI benchmark data-set [123], [124], they demonstrate that the designed RT3D system delivers a competitive accuracy compared with existing state-of-the-art methods, reportedly being also the first approach that completes detection within 0.09 s, i.e., in a time shorter than the scan period of mainstream LiDAR sensors.

The author in [125] proposes a 3D reactive obstacle avoidance technique. The algorithm detects an obstacle in a UAV's path, makes the craft hover at its position, calculates the best escape route, and then instructs the UAV accordingly. The proposed method was efficient in the detection of various obstacles such as trees and communication towers, with the mean collision time being 0.08 ms and the mean escape point search time being 0.49 ms. The method is demonstrated using stereo vision and laser-based sensing schemes. The limitation of the methodology is the on-board memory for 3D maps: the escape point search can only be done within the bounds of the saved map, so increasing the size of the map will increase the efficiency of the escape point search.

In [126], the authors propose a solution in which the possible paths are represented by lines with different colours. A robot with the ability to distinguish between various colours can then autonomously select the desired line to reach the target. This system is viable neither in dynamic environments nor in bad lighting conditions. Furthermore, the robot is totally dependent on the visibility of the lines and does not take into account the presence of an obstacle on a line itself, lacking dynamic capabilities completely in such situations.

In [127], the author proposes to equip vehicles with adaptive cruise control along with a collision avoidance system in such a way that collisions with other vehicles are autonomously avoided by braking at slower speeds and by steering at higher speeds. In [128], forward-looking cameras are used for real-time obstacle detection and avoidance. The presented fuzzy control based method is in principle applicable to different types of unmanned vehicles; in the paper, it is experimented on a small quadrotor UAV. The authors use a camera mounted in front of the UAV to avoid collisions via visual servoing through image processing. In this approach, the collected data is wirelessly sent to a laptop for further processing, where obstacles are marked with specific colours, and this information is then employed to guide the UAV around the obstacles. The algorithm avoids the obstacles by pushing them to either the left or right side of the image. A potential problem in this setup is that communication delays between the drone and the controlling computer can lead to an accident in situations where an obstacle is very close or moves rapidly towards the UAV.

In [129], the authors propose a methodology which uses two cameras for detecting obstacles in the range of 30 to 100 meters and up to a speed of about 22 km/h. In order to differentiate between the sea and the sky, this approach relies on the sea-sky line and assumes that the obstacles are moving in a regular manner. Different filters are applied to detect the obstacles. A limitation of the scheme is that it does not take into account rough sea waves, haphazardly moving obstacles and overcast situations.

In [130], a simulated UAV equipped with a LiDAR sensor is inspected using a feed-forward based algorithm. The UAV is mainly controlled by the operator, and the algorithm estimates the path of the UAV by using the current inputs from the operator and the future inputs for a predefined period of time. The algorithm checks for any possible collisions with objects and diverts the UAV from the original path when needed, keeping it as close to the operator's input as possible.

E. ENVIRONMENTAL EFFECTS
Environmental disturbances exist in all industrial systems and have a huge impact especially on UAVs; they are therefore one of the key factors in the design of stability controllers for such systems. The environmental disturbance affecting, for example, safe and controlled landing of UAVs under dynamic conditions such as oscillatory or moving platforms [131], maintenance of the geometric configuration of multiple or swarms of UAVs [132], or wind effects [133], is estimated by the designed controller, and a feedback control action is then taken based on that estimate. The different methodologies and algorithms designed to deal with such uncertainties have the common goal of estimating the uncertainties or disturbances in order to design a compensation controller that minimises their effect on the system. Such methods are also referred to as disturbance/uncertainty estimation and attenuation (DUEA) [134]–[136].
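The estimate-then-compensate loop described above can be illustrated with a toy 1D example (a generic disturbance-observer pattern with illustrative gains, not the controller of any cited work): an unknown constant wind force enters a velocity loop, is estimated from the prediction error between the observer and the measurement, and is subtracted from the control input.

```python
def simulate(wind=2.0, steps=200, dt=0.05, k_p=4.0, l_gain=5.0):
    """1D velocity control with an unknown additive wind force: v' = u + wind.

    The observer propagates its own velocity estimate and integrates the
    prediction error (v - v_hat) into a disturbance estimate d_hat, which
    the controller cancels. All gains are illustrative.
    """
    v, v_hat, d_hat, v_ref = 0.0, 0.0, 0.0, 1.0
    for _ in range(steps):
        u = k_p * (v_ref - v) - d_hat                        # feedback + compensation
        v += (u + wind) * dt                                 # true plant (wind unknown to controller)
        v_hat += (u + d_hat + l_gain * (v - v_hat)) * dt     # observer prediction
        d_hat += l_gain * (v - v_hat) * dt                   # disturbance estimate update
    return v, d_hat
```

After a few seconds of simulated time the disturbance estimate converges to the true wind force and the velocity settles at its reference, which is the behaviour the DUEA-style controllers cited above aim for under far richer disturbance models.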
In [136], in order to optimise the coverage in urban areas in the presence of obstacles, the authors proposed a method of triangular mesh generation which also considers the wind field and performs online adjustments accordingly to minimise the energy loss due to the identified wind field. For wind field identification, the proposed methodology analyses the behaviour of the wind vector statistically; for sequencing/re-sequencing of way-points and optimisation of trajectories, it is then added to next generation autonomous UAS flight management systems. An 11% improvement in energy consumption is reported in the presence of wind, while in the presence of gusts of wind an energy efficiency of up to 9% is reported in the results.

In [137], the authors proposed an active disturbance rejection control (ADRC) guidance law for collision avoidance of UAVs to tackle the instability caused by disturbances such as wind, sensor noise, and unknown obstacle acceleration. The designed ADRC controller was overlapped with the collision avoidance as a stabilising feedback control system. The stability of the nonlinear ADRC is proved using simulations, and the results show that the designed technique can deal with multiple disturbances.

In [138], the authors developed a two-mode controller to tackle extreme winds that may take the UAV out of its stability bounds. The designed controller functions in normal mode if the thrust and sensor limitations are not exceeded by the environmental conditions; in case they are, the controller switches to the drift mode, in which a drift frame is generated based on the UAV's thrust, drag and wind estimation, and stabilising trajectories are generated. The stabilised trajectory is generated by the UAV from the inertial frame trajectory tracking requirement in the drift frame. The authors validated the performance of the designed controller through simulations by comparing a controller-equipped UAV with a UAV that does not utilise the drift mode.

IV. DISCUSSION AND CONCLUSION
In the previous sections, we presented a comprehensive literature review on collision avoidance systems and strategies used for unmanned vehicles. As any collision avoidance system needs a means to sense or perceive its surroundings, we also analysed the different types of sensors relevant to unmanned vehicles, classifying them into active and passive devices in the traditional manner. The considered collision avoidance approaches were divided into four main categories: geometric methods, force-field methods, optimisation based methods, and sense and avoid methods. These different classes of approaches have benefits and trade-offs that are assessed and summarised in this section.

An active sensor has its own transmitter, a source of energy, for emitting a wave with a given range of wavelengths, and a receiver for reading incoming waves reflected back from objects in the environment. A passive sensor, in contrast, only detects the light or energy discharged or reflected by objects, relying on an external source of energy to be present. For instance, a camera, a passive sensor, relies on an external light source to illuminate the scenery for it to work properly, whereas LiDAR, an active sensor, emits its own laser pulses onto the scene under observation and reads the back-scatter for further processing. Therefore, the accuracy of the data provided by a camera depends on the quality and intensity of an external light source, while LiDAR does not have such a limitation.

As active sensors contain both a transmitter and a receiver, they in general consume more power than passive sensors that just read data. On the other hand, active sensors capture directed data, i.e., reflected versions of the signals emitted by the sensors themselves, which simplifies the data processing phase significantly. In the case of passive sensors such as visual cameras, the computational requirements are very high, because the raw image data needs to be thoroughly filtered and processed to find the relevant points of interest. Consequently, a camera based collision avoidance approach has a high computational cost, making it challenging especially for scenarios where very fast object detection and decision making is needed. On the other hand, in appropriate lighting conditions, it can provide more detailed information on the environment than an approach based on an active sensor such as LiDAR, sonar or radar. Having said that, lower processing needs (i.e., faster response times) and better tolerance against difficult lighting and weather conditions make ranging systems more suitable for efficient collision avoidance compared with camera based methods.

The discussed collision avoidance approaches can be compared from different perspectives and by defining different evaluation metrics. The evaluation metrics are generally determined based on the expected goals of the algorithm in its use case and the limitations of the platform. Each collision avoidance algorithm has its own pros and cons w.r.t. the different evaluation metrics that make the algorithm suitable for a specific use case. An overview of the advantages and disadvantages of the most common methods in the state of the art is shown in Table 2. To illustrate a general comparison among different aspects of the algorithms, we have categorised the algorithms independently based on ten evaluation metrics that are depicted in the table and explained as follows:

The first metric is real-time performance (RTP): The RTP of sense and avoid and geometric methods is better than that of the force-field and optimisation methods, as sense and avoid does not require much processing to react to changes in the environment, i.e., approaching obstacles. Geometric methods are likewise fast and computationally light. However, a disadvantage of geometric methods compared to sense&avoid is that their computation time and algorithmic complexity are highly dependent on the algorithm implementation.
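As an illustration of why geometric checks are computationally light, the core collision-cone test reduces to a distance and two inverse trigonometric evaluations per obstacle (a minimal 2D sketch with a circular safety zone around the obstacle; not the algorithm of any specific cited work):

```python
from math import asin, acos, hypot

def heading_in_collision_cone(uav, vel, obstacle, safety_radius):
    """True if the current velocity direction points inside the cone formed
    by the tangent lines from the UAV to the obstacle's safety circle.
    Assumes a nonzero velocity vector; all inputs are 2D tuples.
    """
    rel_x, rel_y = obstacle[0] - uav[0], obstacle[1] - uav[1]
    dist = hypot(rel_x, rel_y)
    if dist <= safety_radius:
        return True                                   # already inside the safety zone
    half_angle = asin(safety_radius / dist)           # cone half-angle from tangent lines
    speed = hypot(*vel)
    cos_bearing = (rel_x * vel[0] + rel_y * vel[1]) / (dist * speed)
    bearing = acos(max(-1.0, min(1.0, cos_bearing)))  # angle between velocity and line of sight
    return bearing < half_angle
```

A heading just outside the cone is collision-free for a static obstacle, so the avoidance maneuver can be chosen as the smallest deviation that moves the bearing past the tangent line, which is why such methods need no map or pre-planning.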
TABLE 2: Performance comparison between state-of-the-art collision avoidance approaches: real-time performance (RTP), velocity constraint (VC), static and dynamic environment (SDE), deadlock (DL), swarm compatibility (SC), robustness (R), 3D compatibility (D), communication dependence (CD), escape trajectories (ET), pre-mission path planning (PPP); ✓ = yes, ✗ = no

CA Approach     References               RTP  VC  SDE  DL  SC  R   D    CD  ET  PPP
Geometric       [98] [101] [139] [140]   ✓    ✓   ✓    ✗   ✓   ✓   3D   ✗   ✓   ✗
                [141]                    ✓    ✓   ✗    ✗   ✓   ✗   3D   ✓   ✓   ✓
                [142]                    ✓    ✗   ✗    ✗   ✓   ✓   3D   ✓   ✓   ✓
                [102] [143] [144]        ✓    ✓   ✓    ✗   ✓   ✓   3D   ✓   ✓   ✓
Force-Field     [104] [108]              ✓    ✓   ✓    ✗   ✓   ✓   3D   ✓   ✓   ✓
                [110]                    ✓    ✓   ✓    ✗   ✓   ✓   1D   ✗   ✓   ✓
                [107] [111]              ✓    ✗   ✗    ✓   ✗   ✗   2D   ✗   ✓   ✓
                [145]                    ✓    ✗   ✓    ✗   ✓   ✓   2D   ✗   ✓   ✓
Optimisation    [113] [114] [146]        ✗    ✗   ✗    ✗   ✗   ✗   2D   ✓   ✓   ✓
                [115]                    ✓    ✗   ✓    ✗   ✓   ✓   2D   ✓   ✓   ✓
                [147]                    ✓    ✓   ✓    ✗   ✓   ✓   3D   ✓   ✓   ✓
Sense & Avoid   [118] [3]                ✓    ✓   ✓    ✓   ✓   ✓   3D   ✗   ✓   ✗
                [120] [129]              ✓    ✗   ✗    ✓   ✗   ✗   3D   ✗   ✗   ✗
                [122]                    ✓    ✓   ✓    ✗   ✗   ✓   3D   ✗   ✗   ✓
                [125]                    ✓    ✗   ✓    ✗   ✓   ✓   3D   ✗   ✓   ✗
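Table 2 can also be read programmatically: treating each row as a feature record makes it straightforward to shortlist approaches against a mission profile (a toy illustration transcribing a few rows; the keys abbreviate the metrics named in the table caption):

```python
# Each record mirrors one row of Table 2 (subset of columns; True = checkmark).
TABLE2 = [
    {"approach": "Geometric",     "refs": "[98][101][139][140]", "rtp": True,  "sde": True,  "cd": False, "ppp": False},
    {"approach": "Force-Field",   "refs": "[107][111]",          "rtp": True,  "sde": False, "cd": False, "ppp": True},
    {"approach": "Optimisation",  "refs": "[113][114][146]",     "rtp": False, "sde": False, "cd": True,  "ppp": True},
    {"approach": "Sense & Avoid", "refs": "[118][3]",            "rtp": True,  "sde": True,  "cd": False, "ppp": False},
]

def shortlist(rows, **required):
    """Names of the approaches whose fields match every keyword requirement."""
    return [r["approach"] for r in rows
            if all(r[k] == v for k, v in required.items())]

# Mission profile: real-time reaction in a dynamic environment, no pre-mission planning:
candidates = shortlist(TABLE2, rtp=True, sde=True, ppp=False)
```

For this profile the filter keeps the geometric and sense&avoid rows, matching the qualitative discussion of the metrics below.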
The second metric is velocity constraint (VC), i.e., whether the velocity of the obstacles is taken into consideration: According to the reviewed literature, it is easier to handle VC using sense&avoid and geometric approaches, whereas force-field and optimisation methods are more suitable for pre-defined planning and do not take the UAV dynamics into account at each interval.

The third metric is static and dynamic environment (SDE): For handling dynamic environments, the sense&avoid approach is the easiest and lightest, since it relies on local computations to react to any changes observed by the on-board sensor system and can work both indoors and outdoors in static or dynamic environments. Force-field methods do not perform well in narrow passages, and in dynamic environments they commonly get stuck in local minima. Optimisation, on the other hand, is best suited to static environments, as it requires pre-planning and has to re-optimise the whole routine for any changes detected. It also requires more memory to store large map areas for better optimisation.

The fourth metric is deadlock (DL): Optimisation and geometric methods do not have the deadlock/local-minimum issue. Force-field methods can get stuck in a local minimum; sense&avoid methods do not handle this issue locally and require an additional methodology to tackle it.

The fifth metric is swarm compatibility (SC): All the mentioned approaches can be utilised for large teams of UAVs. However, the sense&avoid method requires the assistance of an additional algorithm for handling the communication between the UAVs.

The sixth metric is robustness (R): All mentioned approaches are capable of being robust depending on the way they are implemented.

The seventh metric is dimensions (D): For sense&avoid, geometric, and optimisation methods, there is a considerable body of work handling 3D environments, while many researchers are still focusing on testing the feasibility of utilising force-field methods for 3D dynamic environments.

The eighth metric is communication dependence (CD): Sense&avoid methods have no communication dependence, as they work and take decisions locally without communicating with other UAVs or systems. Some of the discussed force-field literature relies on communication with other UAVs while most of it does not, showing that force-field methods do not rely much on CD; it depends on the model and implementation. The other approaches, however, do rely on communication with other nodes/UAVs.

The ninth metric is escape trajectories (ET): The escape trajectories offered by the different approaches can be summarised as follows: sense&avoid methods generate escape trajectories at run-time and locally; the escape trajectories of optimisation methods are pre-defined based on the chosen optimised path; force-field methods derive escape trajectories from the force field that provides the attraction/repulsion; and geometric methods have protocol based escape trajectories.

The tenth metric is pre-mission path planning (PPP): Sense&avoid and geometric methods do not require pre-mission path planning; in geometric methods, path planning is done based on the collision cone and the velocity obstacle. Optimisation and force-field methods require pre-mission path planning to perform optimally.

Based on the discussion and our understanding, we provide a summarised attribute table, shown in Table 3. Among the approaches, the geometric and force-field methods have the highest complexity level in terms of algorithm design (computational cost). The optimisation based methods are of medium complexity, while the sense and avoid approaches rank lowest for complexity in this comparison.

The geometric and force-field approaches are communication dependent, i.e., they rely on close interaction between the individual agents/robots constituting a swarm.
TABLE 3: CAA comparison w.r.t. summarised attributes: indoors (In), outdoors (Out)
misation based methods are quite static and have therefore no Furthermore, further research and development can be
concept of communication. The sense and avoid approaches directed on the extension and validation of the developed
are not communication dependent either as they are based on algorithms in 3-dimensional environments with dynamic
local sensing of the environment and local processing of the constraints bringing the simulations closer to real world
information in each individual agent/robot separately. environments and moving towards the real-time testing. For
According to Table 3, it is quite evident that among the instance, the 3-D collision avoidance algorithm designed
approaches the optimisation based collision avoidance meth- in [95], collision avoidance and navigation using transla-
ods are suitable only for static environments, since the whole tional coordinates in [116], formation control and collision
environment needs to be known in detail, and the optimal so- avoidance in [36], efficiency of the designed controller for
lutions are discovered based on high-definition maps and pre- countering the environmental disturbances in [133], can be
defined coordinates. Hence, they require pre-mission path further extended and tested under various realistic scenarios.
planning unlike the other considered approaches.
The sense and avoid methods run locally and do not require any pre-planning, are the most robust among the considered approaches, and are suitable not only for static indoor and outdoor environments but also for dynamic indoor and outdoor environments. In contrast, the force-field methods are only suitable for static indoor or outdoor environments, as they require more processing time and do not provide appropriate results for dynamic environments on their own, without the help of other approaches.

Based on the literature studied, there is a clear trade-off between computational time requirements, complexity, optimal solution requirements, pre-mission path planning, and the ability to adapt to static/dynamic environments. Depending on the operational requirements of the scenario in which the algorithm is to be deployed, the appropriate algorithm needs to be selected, or more than one collision avoidance technique can be combined (e.g., a two-layered collision avoidance strategy [148]) to meet the given needs. Moreover, to ensure the safety of the UAVs, the deployment of sense and avoid methods, which are the simplest among the considered approaches and robust, with low data overheads and low response times, would be a safe choice in all kinds of environments for avoiding static/dynamic obstacles locally. However, a more efficient path planning algorithm needs to be integrated along with them to make sure the UAV does not get stuck in a local minimum and manages to reach its destination after avoiding collisions. Furthermore, since the sense and avoid approach does not depend on any external communications, reacts immediately to any changes in the environment, has quick response times, and incurs low data overheads, it can be used as a failsafe/standalone approach to ensure the safety of the UAVs, especially in highly dynamic environments, where situations can change rapidly and a high degree of adaptivity and flexibility is of utmost importance.

REFERENCES
[1] A. Mcfadyen and L. Mejias, "A survey of autonomous vision-based see and avoid for unmanned aircraft systems," Progress in Aerospace Sciences, vol. 80, pp. 1–17, 2016. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0376042115300208
[2] C. Goerzen, Z. Kong, and B. Mettler, "A Survey of Motion Planning Algorithms from the Perspective of Autonomous UAV Guidance," Journal of Intelligent and Robotic Systems, vol. 57, no. 1, p. 65, Nov. 2009. [Online]. Available: https://doi.org/10.1007/s10846-009-9383-1
[3] N. Gageik, P. Benz, and S. Montenegro, "Obstacle detection and collision avoidance for a uav with complementary low-cost sensors," IEEE Access, vol. 3, pp. 599–609, 2015.
[4] M. Senanayake, I. Senthooran, J. C. Barca, H. Chung, J. Kamruzzaman, and M. Murshed, "Search and tracking algorithms for swarms of robots: A survey," Robotics and Autonomous Systems, vol. 75, pp. 422–434, 2016. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0921889015001876
[5] S. Milani and A. Memo, "Impact of drone swarm formations in 3d scene reconstruction," in 2016 IEEE International Conference on Image Processing (ICIP), Sep. 2016, pp. 2598–2602.
[6] N. Mohamed, J. Al-Jaroodi, I. Jawhar, A. Idries, and F. Mohammed, "Unmanned aerial vehicles applications in future smart cities," Technological Forecasting and Social Change, p. 119293, 2018. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0040162517314968
[7] H.-M. Huang and E. R. Messina, "Autonomy levels for unmanned systems (alfus) framework, volume ii: Framework models initial version," Tech. Rep., 2007.
[8] H. Chen, X. Wang, and Y. Li, "A survey of autonomous control for uav," in 2009 International Conference on Artificial Intelligence and Computational Intelligence, vol. 2, 2009, pp. 267–271.
[9] W. Zhang, G. Zelinsky, and D. Samaras, "Real-time accurate object detection using multiple resolutions," in 2007 IEEE 11th International Conference on Computer Vision, Oct 2007, pp. 1–8.
[10] S. N. Shinde and S. Chorage, "Unmanned ground vehicle," International Journal of Advanced Engineering, Management and Science, vol. 2, no. 10, 2016.
[11] J. Iqbal, S. M. Pasha, K. Baizid, A. A. Khan, and J. Iqbal, "Computer vision inspired real-time autonomous moving target detection, tracking and locking," Life Science Journal, vol. 10, no. 4, pp. 3338–3345, 2013.
[12] H. Chao, Y. Cao, and Y. Chen, "Autopilots for small unmanned aerial vehicles: A survey," International Journal of Control, Automation and Systems, vol. 8, no. 1, pp. 36–44, Feb. 2010. [Online]. Available: https://doi.org/10.1007/s12555-010-0105-z
[13] C. Zhuge, Y. Cai, and Z. Tang, "A novel dynamic obstacle avoidance algorithm based on collision time histogram," Chinese Journal of Electronics, vol. 26, no. 3, pp. 522–529, 2017.
[14] H. Chao, Y. Cao, and Y. Chen, "Autopilots for small fixed-wing unmanned air vehicles: A survey," in 2007 International Conference on Mechatronics and Automation. IEEE, 2007, pp. 3144–3149.
[15] A. Vijayavargiya, A. Sharma, Anirudh, A. Kumar, A. Kumar, A. Yadav, A. Sharma, A. Jangid, and A. Dubey, "Unmanned Aerial Vehicle," Imperial Journal of Interdisciplinary Research, vol. 2, no. 5, May 2016. [Online]. Available: http://www.imperialjournals.com/index.php/IJIR/article/view/733
[16] "WHO | Global status report on road safety 2013." [Online]. Available: http://www.who.int/violence_injury_prevention/road_safety_status/2013/en/
[17] "WHO | World report on road traffic injury prevention." [Online]. Available: https://www.who.int/violence_injury_prevention/publications/road_traffic/world_report/en/
[18] "Is 2014 the deadliest year for flights? Not even close." [Online]. Available: http://www.cnn.com/interactive/2014/07/travel/aviation-data/
[19] "Aviation Safety Network releases 2018 airliner accident statistics," Jan. 2019. [Online]. Available: https://news.aviation-safety.net/2019/01/01/aviation-safety-network-releases-2018-airliner-accident-statistics/
[20] "Accident statistics." [Online]. Available: http://www.planecrashinfo.com/cause.htm
[21] D. H. Shim, H. Chung, H. J. Kim, and S. Sastry, "Autonomous exploration in unknown urban environments for unmanned aerial vehicles," in Proc. AIAA GNC Conference, 2005.
[22] R. J. Kiefer, D. K. Grimm, B. B. Litkouhi, and V. Sadekar, "Collision avoidance system," Jul. 17 2007, US Patent 7,245,231.
[23] A. Vahidi and A. Eskandarian, "Research advances in intelligent collision avoidance and adaptive cruise control," IEEE Transactions on Intelligent Transportation Systems, vol. 4, no. 3, pp. 143–153, Sep. 2003.
[24] Z. Liu, Y. Zhang, C. Yuan, L. Ciarletta, and D. Theilliol, "Collision avoidance and path following control of unmanned aerial vehicle in hazardous environment," Journal of Intelligent & Robotic Systems, vol. 95, no. 1, pp. 193–210, 2019.
[25] A. Mujumdar and R. Padhi, "Evolving philosophies on autonomous obstacle/collision avoidance of unmanned aerial vehicles," Journal of Aerospace Computing, Information, and Communication, vol. 8, no. 2, pp. 17–41, 2011. [Online]. Available: https://doi.org/10.2514/1.49985
[26] A. Foka and P. Trahanias, "Real-time hierarchical pomdps for autonomous robot navigation," in IJCAI Workshop on Reasoning with Uncertainty in Robotics, 2005.
[27] R. Murray, "Recent research in cooperative control of multi-vehicle systems," Journal of Dynamic Systems, Measurement and Control, vol. 129, pp. 571–598, 09 2007.
[28] G. Ladd and G. Bland, "Non-military applications for small uas platforms," in AIAA Infotech@ Aerospace Conference and AIAA Unmanned...Unlimited Conference, 2009, p. 2046.
[29] L. He, P. Bai, X. Liang, J. Zhang, and W. Wang, "Feedback formation control of uav swarm with multiple implicit leaders," Aerospace Science and Technology, vol. 72, pp. 327–334, 2018. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S1270963816309816
[30] S. S. Esfahlani, "Mixed reality and remote sensing application of unmanned aerial vehicle in fire and smoke detection," Journal of Industrial Information Integration, 2019. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S2452414X18300773
[31] K. P. Valavanis, "Unmanned aircraft systems: the current state-of-the-art," 2016.
[32] C. A. Wargo, G. C. Church, J. Glaneueski, and M. Strout, "Unmanned aircraft systems (uas) research and future analysis," in 2014 IEEE Aerospace Conference. IEEE, 2014, pp. 1–16.
[33] C. Zhuge, Y. Cai, and Z. Tang, "A novel dynamic obstacle avoidance algorithm based on collision time histogram," Chinese Journal of Electronics, vol. 26, no. 3, pp. 522–529, 2017.
[34] X. Wang, V. Yadav, and S. N. Balakrishnan, "Cooperative uav formation flying with obstacle/collision avoidance," IEEE Transactions on Control Systems Technology, vol. 15, no. 4, pp. 672–679, July 2007.
[35] S. Huang, R. S. H. Teo, and K. K. Tan, "Collision avoidance of multi unmanned aerial vehicles: A review," Annual Reviews in Control, vol. 48, pp. 147–164, 2019. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S1367578819300598
[36] J. N. Yasin, M. H. Haghbayan, J. Heikkonen, H. Tenhunen, and J. Plosila, "Formation maintenance and collision avoidance in a swarm of drones," in Proceedings of the 3rd International Symposium on Computer Science and Intelligent Control, ser. ISCSIC '19. New York, NY, USA: ACM, 09 2019.
[37] C. H. Everett, "Survey of collision avoidance and ranging sensors for mobile robots," Robotics and Autonomous Systems, vol. 5, no. 1, pp. 5–67, 1989. [Online]. Available: http://www.sciencedirect.com/science/article/pii/0921889089900419
[38] S. U. Kamat and K. Rasane, "A survey on autonomous navigation techniques," in 2018 Second International Conference on Advances in Electronics, Computers and Communications (ICAECC), Feb 2018, pp. 1–6.
[39] C. A. Wargo, G. C. Church, J. Glaneueski, and M. Strout, "Unmanned aircraft systems (uas) research and future analysis," in 2014 IEEE Aerospace Conference, March 2014, pp. 1–16.
[40] C. Toth and G. Jóźków, "Remote sensing platforms and sensors: A survey," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 115, 2016. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0924271615002270
[41] J. Kim, S. Hong, J. Baek, E. Kim, and H. Lee, "Autonomous vehicle detection system using visible and infrared camera," in 2012 12th International Conference on Control, Automation and Systems, Oct 2012, pp. 630–634.
[42] C. R. Wang and J. J. Lien, "Automatic vehicle detection using local features—a statistical approach," IEEE Transactions on Intelligent Transportation Systems, vol. 9, no. 1, pp. 83–96, March 2008.
[43] M. Mizumachi, A. Kaminuma, N. Ono, and S. Ando, "Robust sensing of approaching vehicles relying on acoustic cue," in 2014 International Symposium on Computer, Consumer and Control, June 2014, pp. 533–536.
[44] M. B. van Leeuwen and F. C. A. Groen, "Vehicle detection with a mobile camera: spotting midrange, distant, and passing cars," IEEE Robotics & Automation Magazine, vol. 12, no. 1, pp. 37–43, March 2005.
[45] G. Recchia, G. Fasano, D. Accardo, A. Moccia, and L. Paparone, "An optical flow based electro-optical see-and-avoid system for uavs," in 2007 IEEE Aerospace Conference, March 2007, pp. 1–9.
[46] F. Kóta, T. Zsedrovits, and Z. Nagy, "Sense-and-avoid system development on an fpga," in 2019 International Conference on Unmanned Aircraft Systems (ICUAS), June 2019, pp. 575–579.
[47] A. Mcfadyen, A. Durand-Petiteville, and L. Mejias, "Decision strategies for automated visual collision avoidance," in 2014 International Conference on Unmanned Aircraft Systems (ICUAS). IEEE, 2014, pp. 715–725.
[48] R. Beard and J. Saunders, "Reactive vision based obstacle avoidance with camera field of view constraints," in AIAA Guidance, Navigation and Control Conference and Exhibit, 2008, p. 7250.
[49] S. Saha, A. Natraj, and S. Waharte, "A real-time monocular vision-based frontal obstacle detection and avoidance for low cost uavs in gps denied environment," in 2014 IEEE International Conference on Aerospace Electronics and Remote Sensing Technology, Nov 2014, pp. 189–195.
[50] L. Mejias, S. McNamara, J. Lai, and J. Ford, "Vision-based detection and tracking of aerial targets for uav collision avoidance," in 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Oct 2010, pp. 87–92.
[51] S. A. S. Mohamed, M.-H. Haghbayan, J. Heikkonen, H. Tenhunen, and J. Plosila, "Towards real-time edge detection for event cameras based on lifetime and dynamic slicing," in Proceedings of the International Conference on Artificial Intelligence and Computer Vision (AICV2020), A.-E. Hassanien, A. T. Azar, T. Gaber, D. Oliva, and F. M. Tolba, Eds. Cham: Springer International Publishing, 2020, pp. 584–593.
[52] T. Lee, D. Yi, and D. Cho, "A monocular vision sensor-based obstacle detection algorithm for autonomous robots," Sensors, vol. 16, no. 3, p. 311, 2016.
[53] A. U. Haque and A. Nejadpak, "Obstacle avoidance using stereo camera," CoRR, vol. abs/1705.04114, 2017.
[54] D. Falanga, S. Kim, and D. Scaramuzza, "How fast is too fast? the role of perception latency in high-speed sense and avoid," IEEE Robot. Autom. Lett., vol. 4, no. 2, pp. 1884–1891, Apr. 2019.
[55] W. Hartmann, S. Tilch, H. Eisenbeiss, and K. Schindler, "Determination of the uav position by automatic processing of thermal images," International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 39, p. B6, 2012.
[56] iRobot, "Roomba vacuum cleaning robot." [Online]. Available: http://www.irobot.com/For-the-Home/Vacuum-Cleaning/Roomba.aspx
[57] M. Rouse and M. Haughn, "What is active sensor? - definition from whatis.com." [Online]. Available: https://internetofthingsagenda.techtarget.com/definition/active-sensor
[58] "7. active sensors - european space agency." [Online]. Available: https://www.esa.int/Education/7._Active_sensors
[59] F. Nashashibi and A. Bargeton, "Laser-based vehicles tracking and classification using occlusion reasoning and confidence estimation," in 2008 IEEE Intelligent Vehicles Symposium, June 2008, pp. 847–852.
[60] H.-J. Cho and M.-T. Tseng, "A support vector machine approach to cmos-based radar signal processing for vehicle classification and speed estimation," Mathematical and Computer Modelling, vol. 58, no. 1, pp. 438–448, 2013, Financial IT Security and 2010 International Symposium on Computational Electronics. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0895717712003020
[61] F. Zhang, J. Chen, H. Li, Y. Sun, and X. S. Shen, "Distributed active sensor scheduling for target tracking in ultrasonic sensor networks," Mob. Netw. Appl., vol. 17, no. 5, pp. 582–593, Oct. 2012. [Online]. Available: http://dx.doi.org/10.1007/s11036-011-0311-9
[62] H. Li, D. Miao, J. Chen, Y. Sun, and X. Shen, "Networked ultrasonic sensors for target tracking: An experimental study," in GLOBECOM 2009 - 2009 IEEE Global Telecommunications Conference, Nov 2009, pp. 1–6.
[63] L. Korba, S. Elgazzar, and T. Welch, "Active infrared sensors for mobile robots," IEEE Transactions on Instrumentation and Measurement, vol. 43, no. 2, pp. 283–287, April 1994.
[64] G. Benet, F. Blanes, J. Simó, and P. Pérez, "Using infrared sensors for distance measurement in mobile robots," Robotics and Autonomous Systems, vol. 40, no. 4, pp. 255–266, 2002. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0921889002002713
[65] C. Blanc, R. Aufrère, L. Malaterre, J. Gallice, and J. Alizon, "Obstacle detection and tracking by millimeter wave radar," IFAC Proceedings Volumes, vol. 37, no. 8, pp. 322–327, 2004, IFAC/EURON Symposium on Intelligent Autonomous Vehicles, Lisbon, Portugal, 5-7 July 2004. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S1474667017319961
[66] B. Korn and C. Edinger, "Uas in civil airspace: Demonstrating 'sense and avoid' capabilities in flight trials," in 2008 IEEE/AIAA 27th Digital Avionics Systems Conference, Oct 2008, pp. 4.D.1-1–4.D.1-7.
[67] M. P. Owen, S. M. Duffy, and M. W. M. Edwards, "Unmanned aircraft sense and avoid radar: Surrogate flight testing performance evaluation," in 2014 IEEE Radar Conference, May 2014, pp. 0548–0551.
[68] E. B. Quist and R. W. Beard, "Radar odometry on fixed-wing small unmanned aircraft," IEEE Transactions on Aerospace and Electronic Systems, vol. 52, no. 1, pp. 396–410, February 2016.
[69] Y. K. Kwag and C. H. Chung, "Uav based collision avoidance radar sensor," in 2007 IEEE International Geoscience and Remote Sensing Symposium, July 2007, pp. 639–642.
[70] Y. K. Kwag and J. W. Kang, "Obstacle awareness and collision avoidance radar sensor system for low-altitude flying smart uav," in The 23rd Digital Avionics Systems Conference (IEEE Cat. No.04CH37576), vol. 2, Oct 2004, pp. 12.D.2–121.
[71] S. A. S. Mohamed, M. Haghbayan, T. Westerlund, J. Heikkonen, H. Tenhunen, and J. Plosila, "A survey on odometry for autonomous navigation systems," IEEE Access, vol. 7, pp. 97 466–97 486, 2019.
[72] P. Hügler, F. Roos, M. Schartel, M. Geiger, and C. Waldschmidt, "Radar taking off: New capabilities for uavs," IEEE Microwave Magazine, vol. 19, no. 7, pp. 43–53, 2018.
[73] Y. A. Nijsure, G. Kaddoum, N. Khaddaj Mallat, G. Gagnon, and F. Gagnon, "Cognitive chaotic uwb-mimo detect-avoid radar for autonomous uav navigation," IEEE Transactions on Intelligent Transportation Systems, vol. 17, no. 11, pp. 3121–3131, 2016.
[74] S. Kemkemian, M. Nouvel-Fiani, P. Cornic, P. L. Bihan, and P. Garrec, "Radar systems for 'sense and avoid' on uav," in 2009 International Radar Conference "Surveillance for a Safer World" (RADAR 2009), 2009, pp. 1–6.
[75] A. Moses, M. J. Rutherford, M. Kontitsis, and K. P. Valavanis, "Uav-borne x-band radar for collision avoidance," Robotica, vol. 32, no. 1, pp. 97–114, 2014.
[76] J. Zhang and S. Singh, "Loam: Lidar odometry and mapping in real-time," in Robotics: Science and Systems, vol. 2, 2014, p. 9.
[77] A. Nüchter, K. Lingemann, J. Hertzberg, and H. Surmann, "6d slam—3d mapping outdoor environments," Journal of Field Robotics, vol. 24, no. 8-9, pp. 699–722, 2007. [Online]. Available: https://onlinelibrary.wiley.com/doi/abs/10.1002/rob.20209
[78] J. Zhang and S. Singh, "Visual-lidar odometry and mapping: low-drift, robust, and fast," in 2015 IEEE International Conference on Robotics and Automation (ICRA), May 2015, pp. 2174–2181.
[79] C. H. Tong, S. Anderson, H. Dong, and T. D. Barfoot, "Pose interpolation for laser-based visual odometry," Journal of Field Robotics, vol. 31, no. 5, pp. 731–757, 2014. [Online]. Available: https://onlinelibrary.wiley.com/doi/abs/10.1002/rob.21537
[80] A. Tahir, J. Böling, M.-H. Haghbayan, H. T. Toivonen, and J. Plosila, "Swarms of unmanned aerial vehicles — a survey," Journal of Industrial Information Integration, vol. 16, p. 100106, 2019. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S2452414X18300086
[81] S. Chaulya and G. Prasad, "Chapter 2 - mine transport surveillance and production management system," in Sensing and Monitoring Technologies for Mines and Hazardous Areas, S. Chaulya and G. Prasad, Eds. Elsevier, 2016, pp. 87–160. [Online]. Available: http://www.sciencedirect.com/science/article/pii/B9780128031940000027
[82] J. M. Armingol, J. Alfonso, N. Aliane, M. Clavijo, S. Campos-Cordobés, A. de la Escalera, J. del Ser, J. Fernández, F. García, F. Jiménez, A. M. López, M. Mata, D. Martín, J. M. Menéndez, J. Sánchez-Cubillo, D. Vázquez, and G. Villalonga, "Chapter 2 - environmental perception for intelligent vehicles," in Intelligent Vehicles, F. Jiménez, Ed. Butterworth-Heinemann, 2018, pp. 23–101. [Online]. Available: http://www.sciencedirect.com/science/article/pii/B9780128128008000023
[83] "Source: Tesla suspects camera failure in crash," Jul 2016. [Online]. Available: https://eu.detroitnews.com/story/business/autos/2016/07/29/tesla-crash-failure/87754264/
[84] H. Shakhatreh, A. H. Sawalmeh, A. Al-Fuqaha, Z. Dou, E. Almaita, I. Khalil, N. S. Othman, A. Khreishah, and M. Guizani, "Unmanned aerial vehicles (uavs): A survey on civil applications and key research challenges," IEEE Access, vol. 7, pp. 48 572–48 634, 2019.
[85] A. Chakravarthy and D. Ghose, "Obstacle avoidance in a dynamic environment: a collision cone approach," IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, vol. 28, no. 5, pp. 562–574, Sep. 1998.
[86] A. Alexopoulos, A. Kandil, P. Orzechowski, and E. Badreddin, "A comparative study of collision avoidance techniques for unmanned aerial vehicles," in 2013 IEEE International Conference on Systems, Man, and Cybernetics, Oct 2013, pp. 1969–1974.
[87] Payal, Akashdeep, and C. Raman Singh, "A summarization of collision avoidance techniques for autonomous navigation of uav," in Proceedings of UASG 2019, K. Jain, K. Khoshelham, X. Zhu, and A. Tiwari, Eds. Cham: Springer International Publishing, 2020, pp. 393–401.
[88] B. M. Albaker and N. A. Rahim, "Unmanned aircraft collision detection and resolution: Concept and survey," in 2010 5th IEEE Conference on Industrial Electronics and Applications, June 2010, pp. 248–253.
[89] H. Pham, S. A. Smolka, S. D. Stoller, D. Phan, and J. Yang, "A survey on unmanned aerial vehicle collision avoidance systems," CoRR, vol. abs/1508.07723, 2015. [Online]. Available: http://arxiv.org/abs/1508.07723
[90] N. E. Smith, R. Cobb, S. J. Pierce, and V. Raska, Optimal Collision Avoidance Trajectories via Direct Orthogonal Collocation for Unmanned/Remotely Piloted Aircraft Sense and Avoid Operations. [Online]. Available: https://arc.aiaa.org/doi/abs/10.2514/6.2014-0966
[91] B. M. Albaker and N. A. Rahim, "A survey of collision avoidance approaches for unmanned aerial vehicles," in 2009 International Conference for Technical Postgraduates (TECHPOS), Dec 2009, pp. 1–7.
[92] X. Prats, L. Delgado, J. Ramírez, P. Royo, and E. Pastor, "Requirements, issues, and challenges for sense and avoid in unmanned aircraft systems," Journal of Aircraft, vol. 49, pp. 677–687, 05 2012.
[93] J. Park, H. Oh, and M. Tahk, "Uav collision avoidance based on geometric approach," in 2008 SICE Annual Conference, Aug 2008, pp. 2122–2126.
[94] A. Mujumdar and R. Padhi, "Nonlinear geometric and differential geometric guidance of uavs for reactive collision avoidance," Journal of Guidance, Control, and Dynamics, vol. 34, p. 69, 07 2009.
[95] C. Y. Tan, S. Huang, K. K. Tan, and R. S. H. Teo, "Three dimensional collision avoidance for multi unmanned aerial vehicles using velocity obstacle," Journal of Intelligent & Robotic Systems, vol. 97, no. 1, pp. 227–248, 2020.
[96] K. Bilimoria, "A geometric optimization approach to aircraft conflict resolution," in 18th Applied Aerodynamics Conference, 2000, p. 4265. [Online]. Available: https://arc.aiaa.org/doi/abs/10.2514/6.2000-4265
[97] J. Goss, R. Rajvanshi, and K. Subbarao, "Aircraft conflict detection and resolution using mixed geometric and collision cone approaches," in AIAA Guidance, Navigation, and Control Conference and Exhibit, 2004, p. 4879. [Online]. Available: https://arc.aiaa.org/doi/abs/10.2514/6.2004-4879
[98] J. Seo, Y. Kim, S. Kim, and A. Tsourdos, "Collision avoidance strategies for unmanned aerial vehicles in formation flight," IEEE Transactions on Aerospace and Electronic Systems, vol. 53, no. 6, pp. 2718–2734, Dec 2017.
[99] J. Tang, L. Fan, and S. Lao, "Collision avoidance for multi-uav based on geometric optimization model in 3d airspace," Arabian Journal for Science and Engineering, vol. 39, no. 11, pp. 8409–8416, Nov 2014. [Online]. Available: https://doi.org/10.1007/s13369-014-1368-0
[100] L. N. N. T. Ha, D. H. P. Bui, and S. K. Hong, "Nonlinear control for autonomous trajectory tracking while considering collision avoidance of uavs based on geometric relations," Energies, vol. 12, no. 8, 2019. [Online]. Available: https://www.mdpi.com/1996-1073/12/8/1551
[101] Z. Lin, L. Castano, E. Mortimer, and H. Xu, "Fast 3d collision avoidance algorithm for fixed wing uas," Journal of Intelligent & Robotic Systems, vol. 97, no. 3, pp. 577–604, 2020.
[102] Ha, Bui, and Hong, "Nonlinear control for autonomous trajectory tracking while considering collision avoidance of uavs based on geometric relations," Energies, vol. 12, no. 8, p. 1551, Apr 2019. [Online]. Available: http://dx.doi.org/10.3390/en12081551
[103] O. Khatib, "Real-time obstacle avoidance for manipulators and mobile robots," in Proceedings. 1985 IEEE International Conference on Robotics and Automation, vol. 2, March 1985, pp. 500–505.
[104] D. Choi, K. Lee, and D. Kim, Enhanced Potential Field-Based Collision Avoidance for Unmanned Aerial Vehicles in a Dynamic Environment, 2020. [Online]. Available: https://arc.aiaa.org/doi/abs/10.2514/6.2020-0487
[105] M. Radmanesh, M. Kumar, P. H. Guentert, and M. Sarim, "Overview of path-planning and obstacle avoidance algorithms for uavs: A comparative study," Unmanned Systems, vol. 06, no. 02, pp. 95–118, 2018. [Online]. Available: https://doi.org/10.1142/S2301385018400022
[106] A. A. Holenstein and E. Badreddin, "Collision avoidance in a behavior-based mobile robot design," in Proceedings. 1991 IEEE International Conference on Robotics and Automation, April 1991, pp. 898–903 vol.1.
[107] J. Oroko and G. Nyakoe, "Obstacle avoidance and path planning schemes for autonomous navigation of a mobile robot: A review," Proceedings of Sustainable Research and Innovation Conference, pp. 314–318, 2014. [Online]. Available: http://sri.jkuat.ac.ke/ojs/index.php/proceedings/article/view/237
[108] J. Sun, J. Tang, and S. Lao, "Collision avoidance for cooperative uavs with optimized artificial potential field algorithm," IEEE Access, vol. 5, pp. 18 382–18 390, 2017.
[109] M. T. Wolf and J. W. Burdick, "Artificial potential functions for highway driving with collision avoidance," in 2008 IEEE International Conference on Robotics and Automation, May 2008, pp. 3731–3736.
[110] C. Y. Kim, Y. H. Kim, and W.-S. Ra, "Modified 1d virtual force field approach to moving obstacle avoidance for autonomous ground vehicles," Journal of Electrical Engineering & Technology, vol. 14, no. 3, pp. 1367–1374, May 2019. [Online]. Available: https://doi.org/10.1007/s42835-019-00127-8
[111] A. Azzabi and K. Nouri, "Path planning for autonomous mobile robot using the potential field method," in 2017 International Conference on Advanced Systems and Electric Technologies (IC_ASET), Jan 2017, pp. 389–394.
[112] S. Pérez-Carabaza, J. Scherer, B. Rinner, J. A. López-Orozco, and E. Besada-Portas, "Uav trajectory optimization for minimum time search with communication constraints and collision avoidance," Engineering Applications of Artificial Intelligence, vol. 85, pp. 357–371, 2019. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0952197619301411
[113] R. Polvara, S. Sharma, J. Wan, A. Manning, and R. Sutton, "Obstacle avoidance approaches for autonomous navigation of unmanned surface vehicles," Journal of Navigation, vol. 71, no. 1, pp. 241–256, 2018.
[114] E. Boivin, A. Desbiens, and E. Gagnon, "Uav collision avoidance using cooperative predictive control," 2008 16th Mediterranean Conference on Control and Automation, pp. 682–688, 2008.
[115] S. Biswas, S. G. Anavatti, and M. A. Garratt, "A particle swarm optimization based path planning method for autonomous systems in unknown terrain," in 2019 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT), July 2019, pp. 57–63.
[116] J. N. Yasin, S. Mohamed, M.-H. Haghbayan, J. Heikkonen, H. Tenhunen, and J. Plosila, "Navigation of autonomous swarm of drones using translational coordinates," in Advances on Practical Applications of Agents and Multi-Agent Systems. Springer, 2020.
[117] X. Yu and Y. Zhang, "Sense and avoid technologies with applications to unmanned aircraft systems: Review and prospects," Progress in Aerospace Sciences, vol. 74, pp. 152–166, 2015. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0376042115000020
[118] M. Wang, H. Voos, and D. Su, "Robust online obstacle detection and tracking for collision-free navigation of multirotor uavs in complex environments," in 2018 15th International Conference on Control, Automation, Robotics and Vision (ICARCV), Nov 2018, pp. 1228–1234.
[119] S. U. Sharma and D. J. Shah, "A practical animal detection and collision avoidance system using computer vision technique," IEEE Access, vol. 5, pp. 347–358, 2017.
[120] M. C. De Simone, Z. B. Rivera, and D. Guida, "Obstacle avoidance system for unmanned ground vehicles by using ultrasonic sensors," Machines, vol. 6, no. 2, 2018. [Online]. Available: http://www.mdpi.com/2075-1702/6/2/18
[121] Y. Yu, W. Tingting, C. Long, and Z. Weiwei, "Stereo vision based obstacle avoidance strategy for quadcopter uav," in 2018 Chinese Control And Decision Conference (CCDC), June 2018, pp. 490–494.
[122] Y. Zeng, Y. Hu, S. Liu, J. Ye, Y. Han, X. Li, and N. Sun, "Rt3d: Real-time 3-d vehicle detection in lidar point cloud for autonomous driving," IEEE Robotics and Automation Letters, vol. 3, no. 4, pp. 3434–3440, Oct 2018.
[123] A. Geiger, P. Lenz, and R. Urtasun, "Are we ready for autonomous driving? the kitti vision benchmark suite," in 2012 IEEE Conference on Computer Vision and Pattern Recognition, June 2012, pp. 3354–3361.
[124] J. Fritsch, T. Kühnl, and A. Geiger, "A new performance measure and evaluation benchmark for road detection algorithms," in 16th International IEEE Conference on Intelligent Transportation Systems (ITSC 2013), Oct 2013, pp. 1693–1700.
[125] S. Hrabar, "Reactive obstacle avoidance for rotorcraft uavs," in 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, Sep. 2011, pp. 4967–4974.
[126] K. M. Hasan, A. Al-Nahid, K. J. Reza, S. Khatun, and M. R. Basar, "Sensor based autonomous color line follower robot with obstacle avoidance," in 2013 IEEE Business Engineering and Industrial Applications Colloquium (BEIAC), April 2013, pp. 598–603.
[127] S. Kanarachos, "A new method for computing optimal obstacle avoidance steering manoeuvres of vehicles," International Journal of Vehicle Autonomous Systems, vol. 7, pp. 73–95, 12 2009.
[128] M. A. Olivares-Mendez, P. Campoy, I. Mellado-Bataller, and L. Mejias, "See-and-avoid quadcopter using fuzzy control optimized by cross-entropy," in 2012 IEEE International Conference on Fuzzy Systems, June 2012, pp. 1–7.
[129] H. Wang, Z. Wei, S. Wang, C. S. Ow, K. T. Ho, and B. Feng, "A vision-based obstacle detection system for unmanned surface vehicle," in 2011 IEEE 5th International Conference on Robotics, Automation and Mechatronics (RAM), Sep. 2011, pp. 364–369.
[130] D. Bareiss, J. van den Berg, and K. K. Leang, "Stochastic automatic collision avoidance for tele-operated unmanned aerial vehicles," in 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Sep. 2015, pp. 4818–4825.
[131] A. Tahir, J. Böling, M. H. Haghbayan, and J. Plosila, "Navigation system for landing of swarm of autonomous drones on a movable surface," in 34th International Conference on Modeling and Simulation (ECMS), ser. ECMS 20, 2020.
[132] A. Tahir, J. Boling, M. H. Haghbayan, and J. Plosila, "Comparison of linear and nonlinear methods for distributed control of a hierarchical formation of uavs," IEEE Access, 2020.
[133] M. B. Rhudy, J. N. Gross, and Y. Gu, Stochastic Wind Modeling and Estimation for Unmanned Aircraft Systems, 2019. [Online]. Available: https://arc.aiaa.org/doi/abs/10.2514/6.2019-3111
[134] J. Han, "From pid to active disturbance rejection control," IEEE Transactions on Industrial Electronics, vol. 56, no. 3, pp. 900–906, 2009.
[135] W. Chen, J. Yang, L. Guo, and S. Li, "Disturbance-observer-based control and related methods—an overview," IEEE Transactions on Industrial Electronics, vol. 63, no. 2, pp. 1083–1095, 2016.
[136] L. Rodriguez, F. Balampanis, J. A. Cobano, I. Maza, and A. Ollero, "Wind efficient path planning and reconfiguration of uas in future atm,"
in Twelfth USA/Europe Air Traffic Management Research and Development Seminar (ATM2017), Seattle, Washington, USA, 2017.
[137] N. Zhang, W. Gai, G. Zhang, and J. Zhang, "An active disturbance rejection control guidance law based collision avoidance for unmanned aerial vehicles," Aerospace Science and Technology, vol. 77, pp. 658–669, 2018. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S1270963817323015
[138] K. Cole and A. M. Wickenheiser, "Trajectory generation for UAVs in unknown environments with extreme wind disturbances," arXiv preprint arXiv:1906.09508, 2019.
[139] C. Tan, S. Huang, K. Tan, R. Teo, W. Liu, and F. Lin, "Collision avoidance design on unmanned aerial vehicle in 3D space," Unmanned Systems, vol. 6, Sep. 2018.
[140] Y. Jenie, E.-J. Van Kampen, C. De Visser, J. Ellerbroek, and J. Hoekstra, "Three-dimensional velocity obstacle method for uncoordinated avoidance maneuvers of unmanned aerial vehicles," Journal of Guidance, Control, and Dynamics, pp. 1–12, Jul. 2016.
[141] D. Bareiss and J. van den Berg, "Reciprocal collision avoidance for robots with linear dynamics using LQR-obstacles," May 2013, pp. 3847–3853.
[142] J. van den Berg, D. Wilkie, S. Guy, M. Niethammer, and D. Manocha, "LQG-obstacles: Feedback control with collision avoidance for mobile robots with motion and sensing uncertainty," in Proceedings IEEE International Conference on Robotics and Automation, May 2012, pp. 346–353.
[143] E. Anderson, "Quadrotor implementation of the three-dimensional distributed reactive collision avoidance algorithm," Ph.D. dissertation, University of Washington, 2011.
[144] P. Conroy, D. Bareiss, M. Beall, and J. van den Berg, "3-D reciprocal collision avoidance on physical quadrotor helicopters with on-board sensing for relative positioning," Nov. 2014.
[145] S. Roelofsen, D. Gillet, and A. Martinoli, "Reciprocal collision avoidance for quadrotors using on-board visual detection," in 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2015, pp. 4810–4817.
[146] S. Gabriela and I. Andrei, "Automated conflict resolution in air traffic management," INCAS BULLETIN, vol. 9, pp. 91–104, Mar. 2017.
[147] H. Zhu and J. Alonso-Mora, "Chance-constrained collision avoidance for MAVs in dynamic environments," IEEE Robotics and Automation Letters, vol. 4, no. 2, pp. 776–783, 2019.
[148] Z. Zhang, S. Zhao, and X. Wang, "Research on collision avoidance of fixed-wing UAV," in Proceedings of the 2019 4th International Conference on Automation, Control and Robotics Engineering, ser. CACRE2019. New York, NY, USA: Association for Computing Machinery, 2019. [Online]. Available: https://doi.org/10.1145/3351917.3351933

SHERIF A. S. MOHAMED received the BA degree in Electrical, Electronics and Communication Engineering from Ain Shams University, Egypt, in 2011, and the MS degree in Electronics and Information Engineering from Kunsan National University, South Korea, in 2016. He is currently a Ph.D. student at the University of Turku, Finland. His research interests include vision-based navigation algorithms for autonomous vehicles, embedded systems, swarm intelligence, and machine learning.

MOHAMMAD-HASHEM HAGHBAYAN received the BA degree in computer engineering from Ferdowsi University of Mashhad, the MS degree in computer architecture from the University of Tehran, Iran, and the PhD degree with honours from the University of Turku, Finland. Since 2018, he has been a postdoctoral researcher and lecturer at the University of Turku, Finland. His research interests include high-performance, energy-efficient architectures for autonomous systems and artificial intelligence. He has several years of experience working in industry, designing IP cores as well as developing research tools.

JUKKA HEIKKONEN has been a professor of computer science at the University of Turku, Finland, since 2009. His current research, as the head of the Algorithms and Computational Intelligence (ACI) research group, is related to data analytics, machine learning, and autonomous systems. He has worked at top-level research laboratories and Centers of Excellence in Finland and in international organizations (European Commission, Japan) and has led many international and national research projects. He has authored more than 150 scientific articles.