
Intelligent Environments 2020
C.A. Iglesias et al. (Eds.)
© 2020 The authors and IOS Press.
This article is published online with Open Access by IOS Press and distributed under the terms of the Creative Commons Attribution Non-Commercial License 4.0 (CC BY-NC 4.0).
doi:10.3233/AISE200026

Automated Drone-Based Aircraft Inspection

Soufiane BOUARFA and Joselito SERAFICO
Abu Dhabi Polytechnic, Al Ain, United Arab Emirates
Corresponding author: Soufiane Bouarfa, Abu Dhabi Polytechnic, P.O. Box 66844, Al Ain, UAE; e-mail: soufiane.bouarfa@outlook.com

Abstract. Deep learning combined with autonomous drones is increasingly seen as
an enabler of automated aircraft inspection that can support engineers in detecting
and classifying a wide range of defects. This can help increase the accuracy of
damage detection, reduce aircraft downtime, and prevent inspection accidents.
However, a key challenge with neural networks is that their stability is not yet well
understood, mainly due to their large number of dimensions and the complexity of
their shapes. This paper illustrates this challenge through a use case that applies
Mask R-CNN to detect aircraft dents. The results show that environmental
factors such as raindrops can lead to false positives. The paper also proposes
test scenarios that the developers of the drone-based inspection concept need to
consider in order to increase its reliability.

Keywords. Reliability, aircraft inspection, drones, convolutional neural network

1. Introduction

The aircraft maintenance inspection process has barely evolved over the last 40
years despite rapid advances in technology. It is not only time consuming, since
work platforms, ground support equipment, and anti-fall straps must be prepared before
inspection can start, but also dangerous, as reports of injuries during inspection are not
uncommon. Automating the inspection process can therefore increase workplace safety
by reducing incidents, and cut maintenance-related costs (man-hours, equipment,
training, PPE, etc.), which remain the second largest cost for airlines after fuel.
In addition, automation enables a more objective assessment of damage, since
different inspectors can reach different conclusions. This would help prevent failures
to detect critical damage, as occurred with Aloha Airlines Flight 243 [1] and, more
recently, the Virgin Australia Regional Airlines ATR72 [2].
There is little doubt that computer vision will soon revolutionize aircraft inspection,
as it already has in other domains that rely on visual assessment. This is not
surprising given that object detection errors by machines decreased from 26% in 2011
to only 3% in 2016, below the human error rate of 5% [3]. The main driver
behind these improvements is deep learning, which has had a significant impact on robotic
perception since the introduction of AlexNet in 2012. In medical imaging diagnosis, for
instance, the technology has matured to the point that the FDA has recently approved
several use cases [4]. In the automotive industry, companies such as Tesla and Waymo
are working towards fully driverless cars enabled by computer vision systems that can
detect various objects around the car. In production and manufacturing environments,
computer vision is used for the external assessment of product quality and of equipment
such as tanks, pressure vessels, and pipes. In agriculture, computer vision algorithms
are integrated with drones that can scan large fields in a matter of minutes; the
collected images are processed to help farmers make informed decisions about their
crops, covering soil and crop conditions so that stress or disease can be monitored [5].
Recently, the aviation community has started to develop the drone-based aircraft
inspection concept. For instance, Ubisense and MRO Drone [6,7] have developed an
inspection system that has been tested by Easyjet and is planned to be rolled out across
Easyjet's European bases. KLM Engineering & Maintenance is also experimenting
with drones to inspect its aircraft, and the research program is currently in its second
trial phase. Another initiative is the Air-Cobot project [8], which aims at automating
aircraft visual inspection and involves various academic and industrial partners,
including Airbus. In the same vein, the authors have recently developed a
Convolutional Neural Network to detect aircraft dents [9]. All these efforts aim at
achieving high accuracy in defect detection and classification. However, since sensors
can be unreliable, several questions remain: How well would the system perform
in real life, outside the hangar? How can it be improved further, knowing that even a
small object detection error could contribute to an aircraft accident if a defect goes
unnoticed? And how can the confidence of aircraft engineers [10,11] in
drone-based aircraft inspection be increased?
These are challenging questions: most research efforts in deep learning
focus on improving detection accuracy, but lag when it comes to the safety and
stability of neural networks and to handling various types of uncertainty. This is mainly
due to a limited understanding of deep learning technologies, even among AI experts,
which can in turn be explained by the large number of dimensions and the complexity
of the shapes of neural networks. In addition, there has been little emphasis on standards
and methodologies that could lead to stable and reliable intelligent environments [12].
To tackle this problem, this paper proposes test scenarios that the aviation community
needs to consider in order to make drone-based inspection more reliable. The scenarios
can be used to improve the stability of the neural networks, strengthen the robustness
of the decision-support system, and validate the concept.
This paper is organized as follows. Section 1 provides the introduction. Section 2
presents the motivation behind automating aircraft inspection. Section 3 presents a
use case applying Mask R-CNN to detect dents and illustrates some of the challenges.
Section 4 proposes test scenarios that the developers of the automated drone-based
inspection concept need to consider in order to increase its reliability. The conclusion
is provided in Section 5.

2. Why Automate Aircraft Inspection?

The aircraft inspection process is a recurrent process that needs to be conducted at
every flight cycle. The level of inspection depends on several factors. For instance, if
the aircraft was subject to an abnormal event, a thorough inspection for potential damage
is required. Examples of abnormal events include bird strikes, lightning strikes, hard
landings, and encounters with turbulent air. Table 1 shows examples of the inspections
required for the King Air following abnormal events.
Table 1. Areas to be inspected after abnormal conditions [13].

Automated aircraft inspection aims at automating the visual inspection process
normally carried out by aircraft engineers, i.e. detecting defects that are visible on the
aircraft skin, which are usually structural defects. These include dents, lightning strike
damage, surface finish defects, fastener defects, corrosion, and cracks, to name a few.
Automatic defect detection can be enabled by a drone-based system that scans the
aircraft and detects and classifies a wide range of defects in a very short time.
Eliminating the manual process can have a significant impact on aircraft operators,
with benefits including but not limited to:
• Reduction of time spent on maintenance: The drone can quickly reach difficult
places such as the flight control surfaces on the wings and empennage. This
reduces man-hours and preparation time, since engineers would otherwise need
heavy equipment such as cherry pickers to inspect these areas closely. Inspection
time can be reduced even further if the drone-based system is able to assess
the severity of the damage and the affected aircraft structure with reference to
the aircraft manuals.
• Reduction in safety incidents and PPE-related costs: Engineers no longer need
to work at heights or expose themselves to hazardous areas, e.g. in the case of
dangerous aircraft conditions or the presence of toxic chemicals. This also
reduces spending on Personal Protective Equipment.
• Reduction in Aircraft-On-Ground (AOG) time: Savings on inspection time
can lead to reductions of up to 70% in turnaround times.
• Reduction in decision time: Defect detection will be faster and more accurate
than the current visual inspection process. For example, it takes operators
between 6 and 8 hours to find lightning strike damage; this can be reduced
to about one hour with an automated drone-based system. Such time savings
free aircraft engineers from dull tasks and allow them to focus on more
important ones. This is especially desirable given the projected demand for
aircraft engineers in various regions of the world, according to a recent Boeing study.
• Objective damage assessment and reduction of human error: If the dataset
used by the neural network is annotated by a team of experts who have to reach
consensus on what is damage and what is not, then defect detection will be
more objective. Consequently, human errors such as failing to detect critical
damage (e.g. due to fatigue or time pressure) will be prevented.
• Augmentation of novices' skills: It takes a novice about 10,000 hours to become an
experienced inspector. A decision-support system can significantly
augment the skills of novices.

3. Automated Defect Detection Using Mask R-CNN

To demonstrate the concept of automated drone-based inspection, the authors have
applied Mask R-CNN to automatically detect aircraft dents [9]. Mask R-CNN is an
instance segmentation model that produces a pixel-wise delineation of the object class
of interest. Obtaining instance segmentation for a given image requires two main tasks.
The first is bounding-box-based object detection (the localization task), which uses an
architecture similar to Faster R-CNN [14]. The only difference in Mask R-CNN is the
Region of Interest (ROI) step: instead of ROI pooling, it uses ROI Align, which preserves
pixel-to-pixel alignment of the ROIs and prevents information loss. The second is
semantic segmentation, which segments individual objects at the pixel level within a
scene, irrespective of their shapes. Semantic segmentation uses a Fully Convolutional
Network that creates binary masks within the detected bounding boxes through a
pixel-wise classification of each region. Mask R-CNN is trained by minimizing the
combined loss of these tasks.
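
For readers who want to reproduce this kind of setup, the following minimal sketch shows how an off-the-shelf Mask R-CNN implementation (here torchvision's, which is not necessarily the exact framework used in [9]) could be configured for a single "dent" class and run at the 73% confidence threshold reported later in Table 2; the helper names and settings are illustrative assumptions, not the authors' code.

    # Minimal sketch (not the authors' exact pipeline from [9]): configure an
    # off-the-shelf Mask R-CNN for one foreground class ("dent") and run inference.
    import torch
    import torchvision
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
    from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

    NUM_CLASSES = 2        # background + "dent"
    CONF_THRESHOLD = 0.73  # confidence threshold reported in Table 2

    def build_dent_model():
        # Start from a COCO-pretrained Mask R-CNN with a ResNet-50 FPN backbone.
        model = torchvision.models.detection.maskrcnn_resnet50_fpn(pretrained=True)
        # Replace the box head so it predicts our single defect class.
        in_features = model.roi_heads.box_predictor.cls_score.in_features
        model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
        # Replace the mask head accordingly.
        in_features_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
        model.roi_heads.mask_predictor = MaskRCNNPredictor(in_features_mask, 256, NUM_CLASSES)
        return model

    def detect_dents(model, image):
        """image: float tensor of shape (3, H, W) scaled to [0, 1]."""
        model.eval()
        with torch.no_grad():
            output = model([image])[0]  # dict with 'boxes', 'labels', 'scores', 'masks'
        keep = output["scores"] >= CONF_THRESHOLD
        return output["boxes"][keep], output["masks"][keep]
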
The neural network was trained on 55 photos containing aircraft dents. Because
the dataset was small, a 10-fold cross-validation approach was used [15]: the dataset
was split into 10 equally sized parts, 9 of which were used for training and 1 for
testing, giving 10 different combinations of training and test sets. The performance
of each fold was evaluated using precision and recall (see Table 2). During the initial
15 epochs of training, the ResNet backbone weights were kept constant while the head
layers of Mask R-CNN, which include the Region Proposal Network, the masking
branch, and the bounding-box branch, were trained and fine-tuned. Training then
continued for another 5 epochs with the ResNet backbone unfrozen as well. A notable
improvement in both precision and recall was observed, which can be explained by
the fact that the ResNet backbone acts as a feature extractor, so training it leads to
more true positives.
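
A hedged sketch of this staged schedule (freeze the backbone, train the heads, then unfreeze and continue) combined with the 10-fold split is given below; the epoch counts follow the paper, while the optimizer settings and the train_one_epoch / evaluate_fold helpers are hypothetical placeholders.

    # Sketch of the staged training schedule described above, under assumed helpers:
    # train_one_epoch(model, loader, optimizer) and evaluate_fold(model, loader).
    import numpy as np
    import torch
    from sklearn.model_selection import KFold

    def run_cross_validation(dataset, build_model_fn, make_loader_fn):
        """build_model_fn() returns a fresh Mask R-CNN; make_loader_fn(dataset, idx)
        returns a DataLoader over the given indices. Both are assumed helpers."""
        kfold = KFold(n_splits=10, shuffle=True, random_state=42)
        fold_metrics = []
        for train_idx, test_idx in kfold.split(np.arange(len(dataset))):
            model = build_model_fn()
            train_loader = make_loader_fn(dataset, train_idx)
            test_loader = make_loader_fn(dataset, test_idx)

            # Stage 1: freeze the ResNet backbone, train only the heads (15 epochs).
            for p in model.backbone.parameters():
                p.requires_grad = False
            head_params = [p for p in model.parameters() if p.requires_grad]
            optimizer = torch.optim.SGD(head_params, lr=1e-3, momentum=0.9)
            for _ in range(15):
                train_one_epoch(model, train_loader, optimizer)

            # Stage 2: unfreeze the backbone and continue for 5 more epochs.
            for p in model.backbone.parameters():
                p.requires_grad = True
            optimizer = torch.optim.SGD(model.parameters(), lr=1e-4, momentum=0.9)
            for _ in range(5):
                train_one_epoch(model, train_loader, optimizer)

            fold_metrics.append(evaluate_fold(model, test_loader))  # precision/recall per fold
        return fold_metrics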

Table 2. Average results over the 10 folds. Precision = TP / (TP + FP), Recall = TP / (TP + FN); TP: true positives, FP: false positives, FN: false negatives. Confidence threshold = 73%.

Performance    Training head only (15 epochs)    Training head (15 epochs) + ResNet (5 epochs)
Train size     49.5                              49.5
Test size      5.5                               5.5
TP             5.7                               6.9
FP             3.8                               3.0
FN             6.1                               5.4
Precision      53.6%                             69.13%
Recall         46.2%                             57.32%
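
As an illustration only: per-fold precision and recall can be computed from the raw counts obtained at the 73% confidence threshold and then averaged across folds (presumably why the averaged percentages in Table 2 do not equal the ratios of the averaged TP/FP/FN counts). How predictions are matched to ground-truth dents to obtain the counts is assumed to happen upstream.

    # Illustration: per-fold precision/recall from raw counts, then averaged.
    from statistics import mean

    def precision_recall(tp, fp, fn):
        precision = tp / (tp + fp) if (tp + fp) else 0.0
        recall = tp / (tp + fn) if (tp + fn) else 0.0
        return precision, recall

    def average_over_folds(fold_counts):
        """fold_counts: list of (TP, FP, FN) tuples, one per fold, counted at the
        73% confidence threshold."""
        scores = [precision_recall(*counts) for counts in fold_counts]
        avg_precision = mean(p for p, _ in scores)
        avg_recall = mean(r for _, r in scores)
        return avg_precision, avg_recall
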
Analysis of the results shows that factors such as lighting and environmental conditions
can mislead the model. For instance, raindrops and rivets can be confused with dents
(Figure 1). Therefore, additional training data containing these anomalies is
required, and these types of scenarios need to be included in the physical testing of
the drone.

Figure 1. False-positive examples from the Fold 10 test set, where raindrops and rivets are confused with dents.
The manually labeled photo is on the left; the prediction is shown on the right.
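
One low-cost way to probe this failure mode, complementary to collecting more real photos, is to augment the training set with synthetic lighting and blur perturbations. The sketch below uses standard torchvision transforms as an assumed, rough stand-in for the conditions discussed here; it is not part of the pipeline in [9], and realistic raindrop artefacts would still require dedicated augmentations or real imagery.

    # Sketch: synthetic lighting/blur perturbations as a crude proxy for the
    # environmental effects discussed above (not part of the original pipeline in [9]).
    import torchvision.transforms as T

    environmental_augment = T.Compose([
        T.ColorJitter(brightness=0.5, contrast=0.4, saturation=0.3),  # lighting / diurnal changes
        T.GaussianBlur(kernel_size=5, sigma=(0.1, 2.0)),              # mild defocus / wet-lens effect
        T.RandomAdjustSharpness(sharpness_factor=2, p=0.3),           # occasional harsh highlights
    ])

    # Example usage on a PIL image `img`:
    # augmented = environmental_augment(img)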

4. Test Scenarios

This section presents different scenarios which can be used to improve the reliability of
automated drone-based aircraft inspection. These scenarios can also be used to design
neural network architectures specifically tailored to the aircraft inspection problem.

4.1. Environmental & Diurnal Effects

Environmental effects such as rain, sand, and salt can drastically affect object detection
performance. As shown in [9], detecting defects under such conditions can be
challenging for the drone, as it remains challenging even for humans. Equipping the
drone with advanced scanning hardware might help resolve this problem. In addition,
diurnal effects such as changes in light and temperature can also affect defect
detection [9] (Figures 2-3). This could be an issue if the drone scans the aircraft from
a fixed angle, since aircraft engineers usually inspect aircraft parts from different
angles in order not to miss critical damage. A potential solution is to use multi-drone
teaming and swarming with the help of light beams.
Figure 2. The gaps could be due to loose fasteners at the edge of the skin lap. The defect can easily be missed
if not inspected from a work stand. Lighting plays an important role in damage detection. Photo taken at the
Abu Dhabi Polytechnic hangar.

Figure 3. Dents on an engine cowling of a Falcon 20 at the Abu Dhabi Polytechnic hangar.

4.2. Allowable Damage

Not all defects detected by human operators must be repaired. When an aircraft
engineer detects a dent in, for instance, the rear fairing skin, he performs various
reasoning steps (Figure 4): checking whether the structure is primary or secondary,
and consulting the aircraft documentation to determine whether the defect is allowable
or must be repaired. Engineers also look at the type of material affected (e.g. chemically
milled section, composite structure, or laminated honeycomb), as different materials
have different properties. Therefore, to reduce false positives, the drone should be able
to distinguish between allowable and non-allowable damage.
Figure 4. Damage examination and evaluation by a human inspector.

Consider, for instance, a scratch on the Falcon 20. Its importance depends on the
nature of the scratched material, the shape of the scratch, and its depth [16]. Scratches
with sharp edges and a triangular or trapezoidal bottom are the worst. Deep scratches
usually remove the protective coating and reduce the cross-section of the stressed
material. The drone should be able to precisely measure the depth of the scratch using
advanced scanning hardware and compare it to the thickness of the protective coating
shown in Table 3. A scratch shallower than the coating thickness (less than 0.04 mm)
is considered negligible. It is prohibited to smooth out such scratches, as doing so
could thin the protective coating and lead to corrosion. If the scratch is deeper than
0.04 mm, repair actions are needed; these include eliminating the 'notching effect',
protecting against corrosion, and patching the scratched area. Another example is a
dent: an allowable dent must not exceed a specific length (Figure 5) and must be free
from sharp creases, gouges, or cracks. Similar requirements exist for other types of
damage, e.g. cracks, localized impact, corrosion, and wear.

Table 3. Coating thicknesses on the Falcon 20.

Material nominal thickness [mm]    Aluminum coating thickness per face [1/100 mm]
0.3 to 0.6                         4 to 6
0.8 to 1.6                         4 to 6
2 to 3.5                           4 to 8
4 to 6                             4 to 10
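
As an illustration of the kind of rule-based check that could sit on top of the detector, the hedged sketch below encodes the scratch-depth logic described above against Table 3. The thresholds mirror the text, but the interface and material lookup are hypothetical, and any real disposition decision must follow the applicable Structural Repair Manual [16].

    # Hedged sketch of the scratch-depth rule described above; interface and
    # material lookup are hypothetical, real decisions must follow the SRM [16].

    # Minimum aluminum coating thickness per face from Table 3, converted from
    # 1/100 mm to mm (4/100 mm = 0.04 mm for every sheet-gauge range).
    MIN_COATING_MM = {
        (0.3, 0.6): 0.04,
        (0.8, 1.6): 0.04,
        (2.0, 3.5): 0.04,
        (4.0, 6.0): 0.04,
    }

    def classify_scratch(depth_mm, sheet_thickness_mm):
        """Return 'negligible' or 'repair required' for a measured scratch depth."""
        coating_mm = next(
            (c for (lo, hi), c in MIN_COATING_MM.items() if lo <= sheet_thickness_mm <= hi),
            0.04,  # fall back to the 0.04 mm threshold quoted in the text
        )
        if depth_mm < coating_mm:
            # Shallower than the protective coating: negligible, and it must not be
            # polished out (that would thin the coating and invite corrosion).
            return "negligible"
        # Deeper than the coating: eliminate the notching effect, protect against
        # corrosion, and patch the scratched area.
        return "repair required"

    # Example: classify_scratch(0.02, 1.0) -> 'negligible'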

The above requirements show that the drone algorithm should incorporate relevant data
from the aircraft Structural Repair Manual, Integrated Parts Catalogue, and Aircraft
Maintenance Manual. The challenge is to assess and evaluate the damage in the same
way an expert does. Therefore, experiments with the drone-based inspection system
should include various scenarios of both allowable and non-allowable damage.

Figure 5. Allowable dent [17].

4.3. Rare/Unknown Damage

Not all defects on aircraft have been encountered before (e.g. Figure 6). The drone
should therefore also be able to detect unknown or very rare defects. This can be
achieved with unsupervised anomaly detection using Generative Adversarial
Networks (GANs), which makes it possible to detect defects that have never been
seen before or occur only rarely. This is important when it is unclear what an anomaly
will look like, or when there is no labeled data with which to train an image classifier.
By training a GAN only on normal aircraft pictures that contain no defects, it learns
what a healthy aircraft looks like and flags anything unusual. Flagged regions can then
be double-checked by operators, who take action if necessary.
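
A hedged sketch of this idea, loosely following AnoGAN-style latent-space search rather than a method validated in this paper: after training a generator G only on defect-free image patches, an anomaly score for a new patch can be obtained by searching for the latent code that best reconstructs it and measuring the remaining residual. G and its input convention are assumptions.

    # Hedged sketch of AnoGAN-style anomaly scoring (not validated in this paper):
    # G is assumed to be a generator already trained only on defect-free patches
    # and to accept latent codes of shape (N, latent_dim).
    import torch

    def anomaly_score(G, patch, latent_dim=128, steps=200, lr=0.01):
        """Optimize a latent code so G reconstructs `patch` (tensor of shape (C, H, W));
        the residual that cannot be reconstructed is returned as the anomaly score."""
        G.eval()
        for p in G.parameters():
            p.requires_grad_(False)  # only the latent code is optimized
        z = torch.randn(1, latent_dim, requires_grad=True)
        optimizer = torch.optim.Adam([z], lr=lr)
        target = patch.unsqueeze(0)
        for _ in range(steps):
            optimizer.zero_grad()
            loss = torch.mean(torch.abs(G(z) - target))  # L1 reconstruction error
            loss.backward()
            optimizer.step()
        with torch.no_grad():
            score = torch.mean(torch.abs(G(z) - target)).item()
        return score  # high score: patch looks unlike any defect-free training image

    # Patches scoring above a threshold calibrated on held-out defect-free images
    # would be flagged for review by an engineer.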

Figure 6. Example of a rare defect: gunshot damage on an A330-300 hit by a bullet in Congo on April 11,
2020.
5. Conclusion

Automated drone-based aircraft inspection is a promising approach to further optimize
aircraft maintenance operations. The concept could lead to substantial cost savings for
aircraft operators, as less time is spent on maintenance. Inspection risk can also be
reduced, as engineers would no longer need to work at heights. However, significant
research effort is still needed to test the concept under various conditions and make it
more reliable. This paper has proposed test scenarios for system designers to consider
in further developing the concept, and has connected them with requirements that the
automated drone-based system should satisfy: 1) the ability to detect and classify
defects under different environmental and diurnal conditions; 2) the ability to
distinguish between allowable and non-allowable damage, thereby reducing false
positives; and 3) the ability to detect rare or unknown damage that has never been
encountered before.

References

[1] Wikipedia. Aloha Airlines Flight 243. 2020 May 14. Available from: https://en.wikipedia.org/wiki/Aloha_Airlines_Flight_243
[2] Aerossurance. ATR72 Missed Damage: Maintenance Lessons. 2009 Oct 12. Available from: http://aerossurance.com/safety-management/atr72-missed-damage/
[3] Gina S. Google Brain chief: AI tops humans in computer vision, and healthcare will never be the same. 2017 Sep 27. Available from: https://siliconangle.com/2017/09/27/google-brain-chief-jeff-dean-ai-beats-humans-computer-vision-healthcare-will-never/
[4] Carfagno J. 5 FDA Approved Uses of AI in Healthcare. 2019 Jul 18. Available from: https://www.docwirenews.com/docwire-pick/future-of-medicine-picks/fda-approved-uses-of-ai-in-healthcare/
[5] PrecisionHawk. Drone-based Aerial Intelligence in Precision Agriculture. 2019 Jul 5. Available from: https://www.precisionhawk.com/hubfs/PrecisionHawk%20PrecisionAnalytics%20Agriculture%20Solution%20Brief%202019.pdf
[6] MRO Drone. Drone-based aircraft damage inspection system. 2020 May 13. Available from: https://www.mrodrone.net/
[7] Ubisense. Ubisense and MRO Drone launch world's first smart hangar solution. 2018 Apr 11. Available from: https://ubisense.com/ubisense-and-mro-drone-launch-worlds-first-smart-hangar-solution/
[8] Jovancevic I, Larnier S, Orteu JJ, Sentenac T. Automated exterior inspection of an aircraft with a pan-tilt-zoom camera mounted on a mobile robot. Journal of Electronic Imaging, Society of Photo-Optical Instrumentation Engineers. 2015 Nov 30; 24(6): 061110. Available from: https://www.spiedigitallibrary.org/journals/Journal-of-Electronic-Imaging/volume-24/issue-6/061110/Automated-exterior-inspection-of-an-aircraft-with-a-pan-tilt/10.1117/1.JEI.24.6.061110.short?SSO=1
[9] Bouarfa S, Dogru A, Arizar R, Aydogan R, Serafico J. Towards Automated Aircraft Maintenance Inspection. A Use Case of Detecting Aircraft Dents Using Mask R-CNN. AIAA SciTech Forum, paper 2020-0389. 2020 Jan 5. Available from: https://doi.org/10.2514/6.2020-0389
[10] Hornos MJ, Rodriguez-Dominguez C. Increasing user confidence in intelligent environments. Journal of Reliable Intelligent Environments. 2018 May 25; 4: 71-73. Available from: https://link.springer.com/article/10.1007/s40860-018-0063-4
[11] Corno F. User expectations in intelligent environments. Journal of Reliable Intelligent Environments. 2018 Jul 26; 4: 189-198. Available from: https://link.springer.com/article/10.1007/s40860-018-0068-z
[12] Augusto JC, Coronato A. Introduction to the Inaugural Issue of the Journal of Reliable Intelligent Environments. Journal of Reliable Intelligent Environments. 2015 May 1; 1: 1-10. Available from: https://link.springer.com/article/10.1007/s40860-015-0005-3
[13] Textron Aviation Inc. Beechcraft King Air 90 Series Structural Inspection and Repair Manual, Chapter 51. 2020 Jan 1, Revision D2. Available from: https://ww2.txtav.com
[14] Ren S, He K, Girshick R, Sun J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. Computer Vision and Pattern Recognition. 2016 Jan 6. Available from: arXiv:1506.01497
[15] Alpaydin E. Introduction to Machine Learning. The MIT Press. 2010. Available from: https://dl.acm.org/citation.cfm?id=1734076
[16] Avions Marcel Dassault. Fan Jet Falcon Structural Repair Manual, Chapter 51-10-11. 1974 Jun.
[17] Boeing. Structural Repair Manual 727, Chapter 51-40-6, p. 9. 2000 Nov 25.
