Ultra-High-Resolution Mapping of Posidonia Oceanica (L.) Delile Meadows Through Acoustic, Optical Data and Object-Based Image Classification
Journal of Marine Science and Engineering
Article
Sante Francesco Rende 1, * , Alessandro Bosman 2 , Rossella Di Mento 1 , Fabio Bruno 3 ,
Antonio Lagudi 4 , Andrew D. Irving 5 , Luigi Dattola 6 , Luca Di Giambattista 2,7 ,
Pasquale Lanera 1 , Raffaele Proietti 1 , Luca Parlagreco 1 , Mascha Stroobant 8 and
Emilio Cellini 9
1 ISPRA, Italian National Institute for Environmental Protection and Research, Via Vitaliano Brancati 60,
00144 Rome, Italy; rossella.dimento@isprambiente.it (R.D.M.); pasquale.lanera@isprambiente.it (P.L.);
raffaele.proietti@isprambiente.it (R.P.); luca.parlagreco@isprambiente.it (L.P.)
2 Istituto di Geologia Ambientale e Geoingegneria, Consiglio Nazionale delle Ricerche (CNR-IGAG),
Via Eudossiana 18, 00184 Rome, Italy; alessandro.bosman@cnr.it (A.B.); luca.digiambattista@uniroma1.it (L.D.G.)
3 Department of Mechanical, Energy and Management Engineering—DIMEG, University of Calabria,
87036 Arcavacata di Rende, Italy; fabio.bruno@unical.it
4 3D Research Srl, 87036 Quattromiglia, Italy; antonio.lagudi@3dresearch.it
5 Coastal Marine Ecosystems Research Centre, Central Queensland University,
Norman Gardens, QLD 4701, Australia; a.irving@cqu.edu.au
6 ARPACAL—Regional Agency for the Environment—Centro di Geologia e Amianto, 88100 Catanzaro, Italy;
l.dattola@arpacal.it
7 Sapienza University of Rome—DICEA, Via Eudossiana 18, 00184 Rome, Italy
8 Research and Technological Transfer Area, Research Projects Unit, University of Florence,
50121 Florence, Italy; mascha.stroobant@unifi.it
9 ARPACAL—Regional Agency for the Environment—CRSM, Regional Marine Strategy Centre,
88100 Catanzaro, Italy; e.cellini@arpacal.it
* Correspondence: francesco.rende@isprambiente.it
Received: 20 July 2020; Accepted: 17 August 2020; Published: 22 August 2020
Abstract: In this study, we present a framework for seagrass habitat mapping in shallow (5–50 m) and
very shallow water (0–5 m) by combining acoustic, optical data and Object-based Image classification.
The combination of satellite multispectral images acquired from 2017 to 2019, together with Unmanned
Aerial Vehicle (UAV) photomosaic maps, high-resolution multibeam bathymetry/backscatter and
underwater photogrammetry data, provided insights on the short-term characterization and
distribution of Posidonia oceanica (L.) Delile, 1813 meadows in the Calabrian Tyrrhenian Sea. We used
a supervised Object-based Image Analysis (OBIA) processing and classification technique to create
a high-resolution thematic distribution map of P. oceanica meadows from multibeam bathymetry,
backscatter data, drone photogrammetry and multispectral images that can be used as a model for
classification of marine and coastal areas. As a part of this work, within the SIC CARLIT project, a field
application was carried out in a Site of Community Importance (SCI) on Cirella Island in Calabria
(Italy); different multiscale mapping techniques have been performed and integrated: the optical and
acoustic data were processed and classified by different OBIA algorithms, i.e., k-Nearest Neighbors’
algorithm (k-NN), Random Tree algorithm (RT) and Decision Tree algorithm (DT). These acoustic
and optical data combinations were shown to be a reliable tool to obtain high-resolution thematic
maps for the preliminary characterization of seagrass habitats. These thematic maps can be used
for time-lapse comparisons aimed at quantifying changes in seabed coverage, such as those caused by
anthropogenic impacts (e.g., trawl fishing and boat anchoring), to assess blue carbon
sinks, and might be useful for future seagrass habitat conservation strategies.
Keywords: Pléiades-HR satellite images; multibeam bathymetry; UAV; seafloor mapping; habitat
mapping; object-based image analysis (OBIA); seagrass; Mediterranean Sea; Tyrrhenian Sea
1. Introduction
Figure 1. (a) Three-dimensional (3D) reconstruction of seafloor morphology and water column data from multibeam bathymetry (Multibeam Echo-Sounder (MBES)) for characterizing the matte and the canopy height of P. oceanica. (b) Detail of the acoustic fan view collected by multibeam, showing the canopy height and the matte of P. oceanica meadows on a sandy seafloor.
Figure 2. Diagram and comparison between different data sources and their resolution at the seafloor, for seabed and habitat mapping: Multispectral Satellite Image (MSSI); High-resolution Multibeam Echo-Sounder System (MBES); Development Vehicle for Scientific Survey (DEVSS); Unmanned Aerial Vehicle (UAV).
J. Mar. Sci. Eng. 2020, 8, 647 4 of 25
However, MBES surveys are limited in very shallow water areas, especially at depths of about 5 m or less, where vessel navigation can be difficult and dangerous and the swath coverage is very narrow (generally 3–4 times the water depth); this significantly increases surveying time and costs.
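The swath rule of thumb quoted above translates directly into survey effort. A minimal sketch of this trade-off (the 3.5× coverage factor and 20% line overlap are illustrative assumptions, not values from this survey):

```python
import math

def swath_width(depth_m, coverage_factor=3.5):
    """Achievable swath width (m); MBES swaths span roughly 3-4x the water depth."""
    return coverage_factor * depth_m

def n_survey_lines(area_width_m, depth_m, overlap=0.2, coverage_factor=3.5):
    """Parallel lines needed to cover a strip, with overlap between adjacent swaths."""
    effective = swath_width(depth_m, coverage_factor) * (1.0 - overlap)
    return math.ceil(area_width_m / effective)

# A 500 m wide strip needs few lines at 40 m depth but many at 4 m depth,
# which is why very-shallow MBES coverage is slow and costly.
lines_deep = n_survey_lines(500.0, 40.0)    # 5 lines
lines_shallow = n_survey_lines(500.0, 4.0)  # 45 lines
```

The roughly tenfold increase in line count at one tenth of the depth is the practical reason optical platforms (UAV, ASV) take over in the 0–5 m zone.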
1.4. Application of Unmanned Aerial Vehicles (UAVs) and Autonomous Surface Vehicles (ASVs) Systems in
Seagrass Habitat Mapping
Marine observation techniques are usually divided into two major categories: (1) remotely
acquired data and (2) field measurements that are required for validation. The first category includes
satellite and aerial images, while the second includes techniques of ground-truth sampling like
underwater images, videos, underwater photogrammetry, in situ measurements and sampling
procedures. For seagrass habitat assessment, very fine resolution Unmanned Aerial Vehicle (UAV)
imagery has been effective for density coverage mapping and for detecting changes in small patches
and landscape features; these assessments would not have been possible with satellite or aerial
photography [10,35–38]. However, the application of UAVs for mapping and monitoring of seagrass
habitats has been limited by the optical characteristic of the water (e.g., turbidity), and the environmental
conditions (e.g., solar elevation angle, cloud cover and wind speed) during image acquisition [39].
As such, most research has been confined to clear shallow tropical waters [40,41] or small subsets
of exposed intertidal seagrass beds in temperate regions [36,37]. When it comes to Autonomous
Surface Vehicles (ASVs), they are considered promising approaches in the marine science community.
An immediate advantage is related to the collection of ground-truth data (optic and acoustic) with
single-beam echosounders and images from underwater photogrammetry cameras, which ensures
very high accuracy in very shallow waters [9,42,43]. In the future, this approach could replace diver-based seagrass investigations (i.e., snorkeling or scuba diving [11]), especially for field campaigns that would otherwise be impractical or dangerous.
maps through the analysis of satellite images [56]. The UAV images and OBIA classifications have also been used together for the mapping of seagrass habitats [37,39,40,57].
2. Materials and Methods

In this work, an integrated approach has been followed for the detection and mapping of seagrass. In particular, multispectral images from satellite, aerial and underwater digital images from UAV/ASV, Underwater Towed Video Camera Systems (UTCS) and acoustic data from underwater sonar technology have been used to map the seagrass meadows and monitor their extent and condition (Figure 2).
2.1. Study Sites and Geomorphological Characterization
Fieldwork was carried out at Cirella Island, a Site of Community Importance in the Southern Tyrrhenian Sea (Lat 39°41′54.93″ N, Long 15°48′8.01″ E). Four kinds of surveys were carried out in 2018 and 2019 (Multibeam, UAV, Development Vehicle for Scientific Survey (DEVSS) and UTCS). Cirella Island is located on the continental shelf at a distance of 600 m from the coast and 2.5 km from the shelf break (Figure 3). The island is composed of an emerged portion of about 0.07 km² and has a greater submerged portion of about 0.3 km²; both portions belong to a carbonate unit of limestone rocks from the Cretaceous [58]. The deeper seabed morphologies (from about 27 m to 41 m water depth) around Cirella Island are characterized by a semiflat bottom with a low gradient of about 1–1.5° toward the shelf break (Figure 3a), while in shallow water the morphologies are quite rugged, with a local gradient up to 70°. These morphologies are produced by rocky outcrops and by an extensive matte of P. oceanica.

P. oceanica is located both on sediment, represented mainly by sand, and on rocky outcrops. Overall, the coverage of P. oceanica is about 0.54 km², of which about 65% is located on rocky outcrops. The matte height is quite variable due to the rocky outcrop (maximum height: 1.5 m).

Around Cirella Island, the depth and distribution of P. oceanica is uneven. On the western side, it is located from a depth of 15 m, while on the eastern side it is located from a depth of 3 m. These depths reflect the prevailing direction of the waves. Indeed, most of the waves come from the W and SW sectors (caused by Ponente and Sirocco winds, respectively, as shown in the elaboration of the directional wave data analyzed from the NOAA database from 1979 to 2009). This suggests that their effect in shallow water is more relevant in the western sector of the study area, which is exposed to both predominant wave directions.
Figure 4. Pléiades image correction: (a) True Color Image (TCI); (b) Pléiades image after atmospheric correction; (c) Pléiades image after water column correction.
The second processing step performed a correction to compensate for the effects of light intensity attenuation as depth increases. This correction was made using the procedure proposed by [60] and by identifying and choosing homogeneous sandy bottoms at different depths. The aim was to identify, through appropriate regressions, the coefficients that allow correlating the reflectance values of the bottom pixels to the depth (Figure 4c).
The water column correction was carried out with the ERDAS IMAGINE software via the Spatial Modeler tool [61].
The bands of the Pléiades satellite image used to obtain the water correction coefficients are blue, green and red: in this specific case, three bands originated from the blue–green, blue–red and green–red pairs have been obtained from image processing. The three resulting bands were then combined to create a new color image that was considered more appropriate for the classification phase (compared to the unprocessed original image).
Table 1. Number of ground-truth data derived from the acoustic MBES data, and collected through the Underwater Towed Camera Systems (UTCS), the ASV DEVSS and the UAV.
2.4. UAV Survey and Processing for Digital Terrain and Marine Model Generation
The images were collected across Cirella Island in July 2019 via a Parrot Anafi Work UAV. This model has an onboard camera with a 1/2.4-inch CMOS sensor which captures 21-megapixel images (.jpeg format) and an f/2.4 lens with an 84° field of view. Automated flights were carried out with the Pix4D Capture free drone flight planning mobile application with an 80% overlap on both axes, and a flight altitude that ranged between 60 and 75 m depending on the total size of the surveyed site. In total, 4 photogrammetric flights were performed in order to cover the north/south and east/west sides of the island. A total of 360 frames were captured during a continuous flight (Figure 5). The survey images were processed with Pix4D Mapper software [63] by using 11 ground control points located around
the island between 0 and 10 m altitude. The mosaic image and the DEM were processed at 0.02 m and 0.3 m resolution, respectively. In particular, 112 frames were used to collect, by UAV direct observation, the ground-truth reference data for training and validating the classification algorithm (Table 1).
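The mosaic resolution can be sanity-checked against the flight altitude with the standard pinhole ground-sample-distance formula. The sensor geometry below (sensor width, focal length, image width) uses placeholder 1/2.4-inch-class values, not the Anafi's datasheet figures:

```python
def ground_sample_distance(altitude_m, sensor_width_mm, focal_length_mm, image_width_px):
    """Ground footprint of a single pixel (m) for a nadir-pointing pinhole camera."""
    return (sensor_width_mm * altitude_m) / (focal_length_mm * image_width_px)

# Placeholder geometry (NOT Anafi datasheet values): 6.3 mm sensor width,
# 4.0 mm focal length, 5344 px image width.
gsd_low = ground_sample_distance(60.0, 6.3, 4.0, 5344)   # finest, at 60 m
gsd_high = ground_sample_distance(75.0, 6.3, 4.0, 5344)  # coarsest, at 75 m
```

Since GSD grows linearly with altitude, the lower bound of the flight envelope sets the finest achievable mosaic resolution; with these placeholder values both ends of the 60–75 m range land near the 0.02 m/px reported above.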
Figure 5. (a) Unmanned Aerial Vehicle (Anafi Parrot Work UAV) survey performed from a small boat; (b) GNSS surveys carried out along the coast of Cirella Island to identify the 11 ground control points; (c) UAV mosaic image of Cirella Island.
2.5. Image Ground-Truth Data

The ASV robotic platform DEVSS (DEvelopment Vehicle for Scientific Survey), developed by the 3D Research private research company [64], and the Underwater Towed Camera Systems (UTCS) were used, in the very shallow water area, to collect the ground-truth reference data for training and validating the classification algorithm (Table 1) [65]. The ASV was equipped with a GoPro Hero 4 Black model, which is a consumer-brand high-definition sports camera with a 12 MP HD CMOS sensor, 1/2.5″ size [64]. The GoPro Hero 4 Black records at different video and photo resolutions and Field-Of-View (FOV). In this work, we used the camera (set in time-lapse mode) with 12 MP widescreen 1080p and a FOV of 108°. The camera was positioned face-down in order to obtain vertical images at the same height from the bottom. In the shallow water area, time-lapse photos were recorded by using an UTCS platform, equipped with a caudal fin in order to reduce pitch and roll movements and stabilize video acquisition, a SeaViewer's Sea-Drop™ 6000 high-definition underwater video camera with a surface console, and two GoPro Hero 3+ cameras with a 12 MP HD CMOS sensor, f/2.8, 170° FOV [66] (Figure 6b).
Figure 6. (a) Autonomous Surface Vehicle called DEvelopment Vehicle for Scientific Survey (DEVSS); (b) Underwater Towed Video Camera Systems (UTCS).
The combined sampling of acoustic and optical data was carried out in the shallow water coastal area (depth <10 m) that surrounds Cirella Island. In order to acquire overlapping pictures, ensuring
Figure 7. (a) Sample image before and (b) after the application of the image enhancement algorithm on a single frame of the underwater photogrammetric survey. Orthogonal view of the 3D point clouds of three sample areas: (c) area size: 35 m × 7 m, 13 million 3D points; (d) area size: 40 m × 7 m, 15 million 3D points; (e) area size: 33 m × 9 m, 14 million 3D points. The map between (c) and (d) shows the location of the submerged transects with respect to Cirella Island.
In general, all the ground-truth data used in this work were carefully checked for possible overlaps between the training and validation samples. The minimum distance, calculated between any training and validation samples, was about 25 m, and this affects only 10% of the total dataset.
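The overlap check described above amounts to a nearest-distance scan between the two sample sets. A brute-force sketch (coordinates in metres; function names are illustrative):

```python
import math

def min_separation(train_xy, valid_xy):
    """Smallest training-to-validation distance (brute force; fine for ~10^3 points)."""
    return min(math.dist(t, v) for t in train_xy for v in valid_xy)

def close_fraction(train_xy, valid_xy, threshold_m=25.0):
    """Fraction of validation samples within threshold_m of any training sample."""
    near = sum(1 for v in valid_xy
               if any(math.dist(t, v) < threshold_m for t in train_xy))
    return near / len(valid_xy)
```

For larger datasets the same check is usually done with a spatial index (e.g., a k-d tree) rather than the quadratic scan shown here.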
2.6. OBIA Segmentation and Classification

All collected data, i.e., DEMs, backscatter intensity map, UAV data and multispectral images data, were processed in the OBIA process [69] by eCognition Essentials 1.3 software, using a classification algorithm [70] (Figure 8). The multibeam bathymetry data were converted into secondary features: Slope, Northness, Eastness, Curvature and Terrain Roughness Index (TRI) using the Morphometry Tool in SAGA (System for Automated Geoscientific Analyses) [71], as shown in Table 2 and Figure 9.
Figure 9. Bathymetry data products used to classify the seabed morphologies and acoustic facies: Multibeam Digital Elevation Model (DEM), backscatter intensity map and secondary features: Terrain Roughness Index (TRI), Aspect, Curvature and Slope obtained from postprocessing of bathymetric data.
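As a rough illustration of how such secondary features are derived from a bathymetric grid (a simplified stand-in for the SAGA Morphometry Tool, not its exact algorithms), slope and a simple ruggedness index can be computed with finite differences:

```python
import math

def slope_deg(dem, i, j, cell=1.0):
    """Maximum-gradient slope (degrees) at interior cell (i, j), central differences."""
    dz_dx = (dem[i][j + 1] - dem[i][j - 1]) / (2.0 * cell)
    dz_dy = (dem[i + 1][j] - dem[i - 1][j]) / (2.0 * cell)
    return math.degrees(math.atan(math.hypot(dz_dx, dz_dy)))

def roughness(dem, i, j):
    """Mean absolute depth difference to the 8 neighbours (a simple TRI variant)."""
    z = dem[i][j]
    diffs = [abs(dem[i + di][j + dj] - z)
             for di in (-1, 0, 1) for dj in (-1, 0, 1) if (di, dj) != (0, 0)]
    return sum(diffs) / len(diffs)

# A plane dipping 1 m per 1 m cell in x has a 45 degree slope everywhere:
tilted = [[0.0, 1.0, 2.0], [0.0, 1.0, 2.0], [0.0, 1.0, 2.0]]
```

Flat sandy bottoms yield near-zero roughness while matte edges and rocky outcrops score high, which is why TRI is so informative for separating these classes.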
The multiresolution segmentation algorithm, performed by the eCognition Essential software, was used to identify homogeneous objects. The process of multiresolution segmentation was carried out by considering the following parameters: Scale Factor, Shape, Smoothness and Compactness. In shallow water, the multiresolution segmentation algorithm was used to generate objects with similar information by using only the backscatter intensity and TRI features. The bathymetry, Slope, Aspect and Curvature features were excluded from this first segmentation procedure. The multiresolution segmentation algorithm was thus used to generate objects with similar information by using only the most important features selected. This decision was taken since the use of all features (primary and secondary) had created an excessive disturbance effect and the segmentation results were not adapted to the real shapes of the objects.

In order to identify and remove nonimportant features from all the input layers, a feature selection algorithm (i.e., the R package Boruta algorithm) was used to assess their relevance in determining the thematic classes. The Boruta algorithm is built on a random forest classification algorithm. With the wrapper algorithm, present in the Boruta package, we selected all the relevant features by comparing the importance of the original attributes with the importance reached in a random way, estimated using permutations (i.e., shadows). Only the features whose importance was higher than that of the randomized features were considered important [72]. The Boruta algorithm result gave a list of attributes according to their importance, separating them into "confirmed" or "rejected" features. The features without a decision at the end of the analysis were classified as "provisional". In the following classification step, we used the 10 most important attributes considered as "confirmed" by the Boruta algorithm. Therefore, the Boruta test allows identification of the most important and nonredundant features to train a model, improving the learning process timing and the accuracy of the final classification map (Figure 8).

The orthophotos generated from UAV data covering the area around Cirella Island (seabed depth from 0 to 12.5 m) were treated separately by the eCognition software due to the lack of data acquired from multibeam surveys in very shallow water. Therefore, two different projects within the eCognition Essential software were created. The first included all data acquired by multibeam in shallow water areas (8–30 m) and the second included only the orthophotos acquired by UAV in very shallow water areas (1–12.5 m; Table 1). We tested the performance of the three different supervised classification algorithms available in the eCognition Essential software: k-NN, RT and DT. The k-NN algorithm is a method for classifying objects by a majority ranking of its neighbors, with the object being assigned to the class most common among its k-nearest neighbors. DT learning is a method commonly used in data mining where a series of decisions are made to segment the data into homogeneous subgroups through a tree system with
branches. Random tree is a combination of tree predictors where each tree depends on the values of a
random vector, sampled independently and with the same distribution for all trees in the forest [73].
All trees are trained with the same features but on different training sets, which are generated from the
original training set. It aggregates the ranks from different decision trees in order to assign the final
class of the test object. The ground-truth data acquired with the direct interpretation of multibeam data,
by means of the ASV DEVSS (both images and 3D point clouds) and the UTCS platform were used to
assign specific class values to the segmented objects. In the sampled area, five different classes were
identified: Fine sediment-FS, Coarse sediment-CS, Rock-R, P. oceanica meadows-P and Cystoseira-Cy.
Some samples were identified through other techniques in order to have homogeneous ground-truth
data coverage. Along the perimeter of the island, in the very shallow water areas, ground-truth data
were taken directly via visual identification during UAV survey, while for deeper areas that could not
be reached by the ASV DEVSS, some ground-truth data were selected based on the study of multibeam
bathymetry, backscatter and water column data acquired during the multibeam survey and by direct
observation with the UTCS platform. Overall, 920 ground-truth data were used in this study and
divided into two different groups (Training set and Validation set): one to perform the training step and
the second to evaluate the classification accuracy and Kappa coefficients [74–76] (Table 3 and Figure 3).
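The overall accuracy and Kappa coefficient used in the evaluation step follow directly from a confusion matrix of validation objects. A minimal sketch; the class counts below are invented, not the study's values.

```python
def overall_accuracy(matrix):
    """Fraction of validation objects on the diagonal (correctly classified)."""
    total = sum(sum(row) for row in matrix)
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    return correct / total

def cohen_kappa(matrix):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    total = sum(sum(row) for row in matrix)
    p_o = overall_accuracy(matrix)
    p_e = sum(
        sum(matrix[i]) * sum(row[i] for row in matrix)
        for i in range(len(matrix))
    ) / total ** 2
    return (p_o - p_e) / (1 - p_e)

# Rows: reference class; columns: predicted class (toy counts).
cm = [[50, 10],
      [5, 35]]
print(round(overall_accuracy(cm), 2))  # 0.85
print(round(cohen_kappa(cm), 2))       # 0.69
```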
Table 3. Main thematic classes of seabed and number of ground-truth data collected through the ASV DEVSS, UTCS, UAVs and derived by the acoustic data.
3. Results
Multibeam and photogrammetric (UAV) DEMs were merged via Global Mapper software using the acoustic data and the raw point cloud data, respectively. The point clouds obtained by Pix4D software were georeferenced by comparing the position of the Ground Control Points detected along the coastline and on the island, while the bathymetric data collected with the multibeam were used to correct the altitude in the marine area. This operation was carried out with the Quality Control tools of the LiDAR module of the Global Mapper software, which allow comparison and/or 3D correction of the height of point cloud data against known control points, and report statistics about subsets of points.
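The height-correction step amounts to estimating a vertical bias between the photogrammetric cloud and the known control points and removing it; Global Mapper's LiDAR module handles this (and the reporting) internally. A simplified sketch with invented heights:

```python
def correct_cloud(cloud_heights, control_heights):
    """Shift the cloud by the mean vertical offset to the known
    control-point heights and report the residual RMSE."""
    diffs = [c - g for c, g in zip(cloud_heights, control_heights)]
    bias = sum(diffs) / len(diffs)
    corrected = [z - bias for z in cloud_heights]
    residuals = [c - g for c, g in zip(corrected, control_heights)]
    rmse = (sum(r * r for r in residuals) / len(residuals)) ** 0.5
    return corrected, bias, rmse

# Point-cloud heights sampled at four GCPs vs. surveyed heights (m).
cloud = [10.32, 8.11, 5.47, 2.95]
gcps = [10.10, 7.90, 5.25, 2.74]
corrected, bias, rmse = correct_cloud(cloud, gcps)
print(round(bias, 3))  # 0.215
```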
The acoustic profiles and point clouds were gridded (0.3 m), generating an integrated digital land and sea model. Overall, the merged soundings and point clouds showed a vertical sub-decimetric accuracy (Figure 10).
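Gridding the merged soundings at 0.3 m reduces to averaging the points that fall into each cell (a production gridder would also interpolate empty cells). A minimal sketch with invented soundings:

```python
from collections import defaultdict

def grid_points(points, cell=0.3):
    """Average z of all (x, y, z) points falling in each cell of a
    regular grid with the given cell size (metres)."""
    sums = defaultdict(lambda: [0.0, 0])
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        sums[key][0] += z
        sums[key][1] += 1
    return {key: total / n for key, (total, n) in sums.items()}

# Toy soundings: two fall in the same 0.3 m cell, one in a neighbour.
soundings = [(0.10, 0.10, -5.0), (0.20, 0.15, -5.2), (0.40, 0.10, -6.0)]
dem = grid_points(soundings)
print(dem[(0, 0)])  # mean of -5.0 and -5.2
```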
J. Mar. Sci. Eng. 2020, 8, 647 13 of 25
In the shallow water areas, the most relevant results were obtained by using the following multiresolution segmentation parameters: scale segmentation 20; Shape 0.1, compactness 0.5 and scale slider min. value 10, scale slider max. value 100. These parameters were selected after several tests carried out with different settings, comparing the segmentation objects created by the algorithm with those of the real morphology. Figure 10 shows the two Areas of Interest (AOI) from which we have extracted the two examples of segmentation and classification processes. The results are shown in Figure 11.
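The effect of the scale parameter on segmentation can be illustrated with a toy greedy merge on a 1-D profile: a larger scale tolerates more internal heterogeneity and therefore produces fewer, larger objects. This is only a schematic of the idea, not eCognition's multiresolution algorithm, and the profile values are invented.

```python
def segment_profile(values, scale):
    """Greedy 1-D segmentation: extend the current segment while the
    spread (max - min) of its values stays within `scale`; a larger
    scale therefore yields fewer, larger objects."""
    segments, current = [], [values[0]]
    for v in values[1:]:
        if max(current + [v]) - min(current + [v]) <= scale:
            current.append(v)
        else:
            segments.append(current)
            current = [v]
    segments.append(current)
    return segments

# Toy backscatter profile: two homogeneous patches with a sharp break.
profile = [0.80, 0.82, 0.81, 0.30, 0.32, 0.31]
print(len(segment_profile(profile, scale=0.05)))  # 2 objects
print(len(segment_profile(profile, scale=0.60)))  # 1 object
```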
The acquired ground-truth data were used to assign specific class values to the segmented objects (Figure 11).
Figure 11. Shallow water classification process from multibeam data: (a) backscatter intensity map; (a') multiresolution segmented objects image and training and validation set; (a'') RT classified map (green: P. oceanica, light brown: Fine sediment, gray: Coarse sediment); (a''') RT final classified map after merging and smoothing objects. Very shallow water classification process from UAV aerial photogrammetry: (b) Orthomosaic UAV image; (b') Orthomosaic segmented into image objects and training and validation set; (b'') k-NN classified map (green: P. oceanica, blue: Cystoseira, pale brown: Rock); (b''') k-NN final classified map after merging and smoothing objects. See Figure 10 for location.
All statistical information about the features (primary and secondary), associated with the objects classified from 766 ground-truth data (training and validation set), were extracted from the eCognition Essential software and analyzed with the Boruta R package (version 6.0.0) in order to assess their importance in the assignment of the corresponding class (Figure 12) [77]. Classification tests performed with more than 10 attributes in the training phase did not significantly increase the accuracy of the classification. The most relevant features emerging from the Boruta test are, in order of importance: the mean backscatter intensity, the mean Aspect, the general Curvature standard deviation, the mean Pléiades Blue band, the mean of Total Curvature, the mean of Slope, the Aspect standard deviation, the Slope standard deviation, the TRI standard deviation and the mean of Eastness. Figure 12 shows the results of the Boruta test.
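Boruta's core idea is to compare each feature's importance against randomized "shadow" copies of the features and keep only features that beat the best shadow. The sketch below uses absolute correlation with the class label as a stand-in importance measure (the actual package uses random-forest importances over many iterations) and invented toy data.

```python
import random

def correlation(xs, ys):
    """Pearson correlation, used here as a toy importance measure."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def boruta_sketch(features, labels, seed=0):
    """Keep features whose importance reaches the best score obtained
    by any shuffled (shadow) copy of a feature."""
    rng = random.Random(seed)
    shadow_scores = []
    for values in features.values():
        shuffled = values[:]
        rng.shuffle(shuffled)  # shadow copy: same values, random order
        shadow_scores.append(abs(correlation(shuffled, labels)))
    threshold = max(shadow_scores)
    # Ties (up to rounding) count as a hit in this toy version.
    return [name for name, values in features.items()
            if abs(correlation(values, labels)) + 1e-12 >= threshold]

labels = [0, 0, 0, 1, 1, 1]
features = {
    "backscatter": [0.1, 0.2, 0.15, 0.8, 0.9, 0.85],  # tracks the label
    "noise": [0.4, 0.1, 0.9, 0.3, 0.7, 0.2],          # does not
}
print(boruta_sketch(features, labels))  # ['backscatter']
```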
Figure 12. Results of the Boruta feature selection algorithm. Blue boxplots correspond to minimal, average and maximum Z score of a shadow attribute. M: mean; SD: standard deviation.
Table 4. Accuracy assessment for the DT, k-NN and RT supervised classification algorithm (shallow
water area) for 3 different combinations: (A) Pléiades image; (B) Pléiades image-Backscatter-Bathymetry;
(C) Pléiades image-Backscatter-Bathymetry-Secondary features.
Combination A (Pléiades image):
DT: overall accuracy 67.83%, K: 0.48; RT: overall accuracy 66.78%, K: 0.47; k-NN: overall accuracy 71.33%, K: 0.48.
Class (User's/Producer's accuracy): DT; RT; k-NN
(P) 75.36%/85.25%; 72.67%/89.34%; 70.31%/73.77%
(R) 83.33%/33.33%; 83.33%/33.33%; 100.00%/33.33%
(FS) 84.21%/55.56%; 87.80%/50.00%; 75.35%/74.31%
(CS) 10.64%/100.00%; 10.42%/100.00%; 18.18%/40.00%
Combination B (Pléiades image-Backscatter-Bathymetry):
DT: overall accuracy 83.61%, K: 0.73; RT: overall accuracy 91.80%, K: 0.85; k-NN: overall accuracy 82.38%, K: 0.69.
Class (User's/Producer's accuracy): DT; RT; k-NN
(P) 95.45%/77.78%; 96.97%/88.89%; 90.22%/76.85%
(R) 28.57%/80.00%; 42.11%/80.00%; 42.11%/80.00%
(FS) 100.00%/88.43%; 100.00%/95.04%; 89.92%/88.43%
(CS) 23.81%/100.00%; 45.45%/100.00%; 21.43%/60.00%
Combination C (Pléiades image-Backscatter-Bathymetry-Secondary features):
DT: overall accuracy 88.57%, K: 0.80; RT: overall accuracy 99.63%, K: 0.99; k-NN: overall accuracy 86.94%, K: 0.77.
Class (User's/Producer's accuracy): DT; RT; k-NN
(P) 94.95%/86.24%; 100.00%/99.07%; 94.95%/86.24%
(R) 43.75%/70.00%; 100.00%/100.00%; 43.75%/70.00%
(FS) 94.74%/89.26%; 99.31%/100.00%; 94.74%/89.26%
(CS) 25.00%/80.00%; 100.00%/100.00%; 25.00%/80.00%
In the very shallow water area, the same workflow was performed to generate the classified habitat map (Figure 8). The UAV orthophoto was imported into the eCognition software and a multiresolution segmentation algorithm was applied in order to identify objects. The most relevant segmentation results were
obtained by using the following parameter settings: scale 300; Shape 0.1, compactness 0.5 and scale
slider min. value: 50, scale slider max. value: 600 (Figure 11). These parameters were selected after
several tests carried out with different settings comparing the shapes created by the algorithm with
those of the real morphologies of the seabed.
A greater scale was used for the orthophotos due to the higher image resolution (0.03 m).
Data available for the OBIA classification derived only from the RGB orthophoto image, therefore the
Boruta algorithm was not used. In the very shallow water area, 3 different classes were identified
(Cystoseira (Cy), Rock (R), P. oceanica meadows (P); Table 2). The supervised classification algorithms
(k-NN, RT and DT) were tested, but in this case, the best classification result in terms of accuracy
was obtained when using the k-NN algorithm, with an overall accuracy equal to 95.24% and Kappa
coefficient 0.92 (Table 5). With regard to the k-NN classification algorithm, all classes (Cystoseira,
Rock and P. oceanica) showed small differences between the Producer’s and User’s accuracy values
(Table 5).
Table 5. Accuracy assessment for the DT, k-Nearest Neighbors (k-NN) and RT supervised classification algorithms (very shallow water area).
DT: overall accuracy 74.6%, K: 0.61; RT: overall accuracy 77.78%, K: 0.65; k-NN: overall accuracy 95.24%, K: 0.92.
Class (User's/Producer's accuracy): DT; RT; k-NN
(P) 71%/65.38%; 66.70%/100%; 100.00%/100.00%
(R) 62.50%/75%; 88%/35.00%; 87.00%/100.00%
(Cy) 100.00%/88.24%; 100%/94.12%; 100.00%/82%
The two best-classified habitat maps (very shallow and shallow maps) selected on the basis of accuracy were merged into one very high-resolution classified map (Figures 13 and 14).
Figure 13. Comparison between seafloor mapping techniques using optical GoPro images collected by (UTCS) and Autonomous Surface Vehicle (DEVSS) and: (a–d) high-resolution multibeam bathymetry, (a'–d') backscatter intensity data, (a''–d'') multispectral reflectance of Pléiades images. The data collected in the study area show different types of seafloor: sediments (from fine to coarse), rocks and P. oceanica. The GoPro images were used to interpret and calibrate the morphological features while using the eCognition software. See Figure 14 for location.
More specifically, about 51 hectares of P. oceanica were mapped, while 119 hectares and 1.70 hectares were, respectively, identified for the Fine and Coarse sediment, and 0.45 hectares for Cystoseira.
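The reported surface areas follow directly from the classified raster: the number of cells per class times the cell area, converted to hectares. A toy sketch with an invented grid and cell size:

```python
from collections import Counter

def class_areas_ha(classified_grid, cell_size_m):
    """Area per class in hectares: cell count x cell area / 10,000."""
    counts = Counter(label for row in classified_grid for label in row)
    cell_area = cell_size_m ** 2
    return {label: n * cell_area / 10_000 for label, n in counts.items()}

# Toy 3x4 classified grid with 10 m cells (100 m^2 each).
grid = [["P", "P", "FS", "FS"],
        ["P", "P", "FS", "CS"],
        ["P", "R", "FS", "CS"]]
areas = class_areas_ha(grid, cell_size_m=10)
print(areas["P"])  # 5 cells x 100 m^2 = 500 m^2 = 0.05 ha
```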
4. Discussion
The main objective of this work was to perform benthic habitat mapping by using several data
acquisition platforms and OBIA algorithms and to develop a high-resolution seagrass mapping digital
elevation model. OBIA algorithms are now stable and powerful approaches to use for classifying
benthic habitats and to produce accurate maps [78–82]. In this work, an integrated methodological
approach has been followed for multisensor data fusion (acoustic and optical) with different degrees
of resolution. This approach might be useful, if necessary, for mapping the P. oceanica meadows,
but also seabed geomorphological features and furthermore, to estimate the carbon sequestration
and storage of the seagrass ecosystem [83]. Generally, surveying techniques are used individually,
and show several limitations such as spatial coverage in very shallow waters (e.g., Multibeam) or poor
resolving capacity (e.g., satellite images) that might not allow a complete characterization of the spatial
and temporal extent of P. oceanica meadows in deeper areas. Most benthic seagrass habitat mapping studies examine a single data source, and only a few attempts have been made to combine multiple data sources [84,85]. High spatial resolution satellite imagery (2 m or smaller) alone has proven to
be unable to produce adequate accuracy for fine descriptive detailed maps [51] (a fact that is also
confirmed by this study). High-resolution multispectral and hyperspectral imagery can be useful
in discriminating habitat community size, though not at a fine level of detail, but its moderate spatial resolution
might limit its broader application especially as the depth of the seabed increases, and reflectance
and radiance are absorbed [86]. The present study highlights how the OBIA classification did not
provide a satisfactory result in terms of thematic accuracy by using only the Pléiades satellite images.
Indeed, satellite images can be effective for the mapping of P. oceanica meadows, only in conditions of
high water transparency and in the presence of excellent spatial and spectral resolution. However,
the combination of acoustic bathymetric data (DEM and backscatter) with optical data (e.g., multispectral satellite) has proven to improve the final classification performance. The set of optical and
bathymetric acoustic data combined with secondary features showed the best classification results and
this has also been confirmed by other studies [14,35,87]. The present work, specific on high-resolution
seagrass mapping, highlights how RT seems to be the OBIA algorithm with the best classification
performance. Similar answers on RT have been highlighted also by Janowski [47] on the automatic
classification of benthic habitats [78,80,81]. As far as the k-NN classifier is concerned, it has a shorter history of application in marine habitat mapping studies [79], and in general, the performances of the k-NN classifier were almost always moderate or fair. The RT algorithm in this study proved to be very effective in generating accurate classifications. Instead, the DT
classification algorithm has always shown the lowest accuracy. The classification of the orthophoto
produced with the UAV showed instead the best accuracy with the k-NN algorithm, which, although
not much used for marine habitat mapping, has nevertheless obtained an excellent result compared to
the DT and RT algorithms. The OBIA object classification, as highlighted in existing literature [45],
represents an effective tool to obtain robust thematic maps.
The Boruta feature selection algorithm showed promising results on seagrass habitat
mapping [23,66]. The results confirmed the usefulness of applying this feature selection method in
seagrass habitats mapping [47]. The multiresolution segmentation scale requires careful adjustment of
the value and represents a very important setting for correct classification by OBIA and has a significant
impact on the results of seagrass habitat classification [48]. Multiresolution segmentation represents
a very important step in the whole process, so defining the wide range of parameters, first of all,
the scale factor, the shape, the Smoothness and Compactness, is of paramount importance since they
can determine the good progress of the classification. In this work, even if only five thematic classes
have been used (Fine sediment (FS), Coarse sediment (CS), Rock (R) P. oceanica meadows (P) and
Cystoseira (Cy), Figure 13), it has been necessary to increase the number of ground-truth data for
training and validation in order to improve the answers of the classification algorithms. In order to
obtain a relatively high number of ground-truth data, different sampling platforms have been integrated
and used, such as acquisitions with ASV, UAV, UTCS and direct observation and interpretation of
the backscatter and DEM data, obtained by the Multibeam acquisition (Figure 9). This integrated
technique could be useful to reduce the time and costs for the acquisition of ground-truth data.
The accuracy of habitat classification might be affected by multiple factors, including habitat
heterogeneity, bathymetry and water column attenuation [88]. This study highlights how the integration
of satellite imagery with ultrahigh spatial resolution UAV aerial imagery and bathymetric acoustic data shows a good performance in habitat classification [24]. The integrated multisource technique represents, indeed, an improved solution to map benthic habitats with high degrees of spatial accuracy. Therefore, the higher the quality and spatial resolution of the data, the better the performance of the segmentation algorithm and the resulting OBIA classification [45,83,88], and consequently the generation of thematic maps. A high-resolution seabed classification map has been obtained for producing high-quality habitat-classified maps (Multilayer data and UAVs data). Two maps (shallow and deep water classified area) have been selected based on an accuracy assessment and have been merged into a unique and complete habitat map (Figures 13 and 14).
Figure 14. Seabed map classification of Cirella Island overlapped over high-resolution multibeam bathymetry. The map shows part of the analyzed areas in order to provide an example of the mapping outcomes. The inset at the bottom right shows the circular histogram of the wave directions and significant wave height (Hs) plotted using the directional wave time series recorded from 1979 to 2009, from the NOAA Wave Watch III model [89] for the Mediterranean Sea at the Lat 41.33° and Long 12.50° coordinates.
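Building the circular histogram in the Figure 14 inset amounts to counting wave records per directional sector; a minimal sketch (the directions in degrees are invented, and the real plot also stacks Hs classes per sector):

```python
def direction_histogram(directions_deg, sectors=8):
    """Count records per directional sector (sector 0 centred on north)."""
    width = 360 / sectors
    counts = [0] * sectors
    for d in directions_deg:
        # Shift by half a sector so each bin is centred on its axis.
        counts[int(((d + width / 2) % 360) // width)] += 1
    return counts

# Toy directional record: mostly westerly waves.
record = [270, 265, 280, 275, 90, 180]
print(direction_histogram(record))  # [0, 0, 1, 0, 1, 0, 4, 0]
```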
At this stage, in the shallow classification, the integration of the Pléiades satellite image, bathymetric and backscatter data has shown that the contribution of the optical component of the satellite data alone did not confer any clear improvements in the classification procedure. Therefore, for future implementations of the method, the team will evaluate how the different types of satellite sensors and the different characteristics of satellite images, in terms of spatial and spectral resolution, can improve the results of the multilevel classification.
Finally, the adopted methodological approach has proven its ability to extract information on geomorphological data related to marine ecosystems from the coast to the deep areas, in a georeferenced 3D environment. This provides important geoinformation on the extension of the three-dimensional coverage (area extensions and volumes) of the benthic macrohabitats, necessary for the analysis of data such as blue carbon.
5. Conclusions
The integration of different methodological techniques (Multibeam, UAV, DEVS, UTCS and
multispectral image) represents a rapid and effective method for high-resolution seafloor and habitat
mapping from the coastline to deeper water. The geophysical and optical techniques, if correctly
processed, allow generation of high resolution integrated terrestrial and marine digital elevation
models that can also be used for the analysis of the physical environment both in the geological context
and in oceanographic modeling. Furthermore, these processed data can be used for time-lapse analysis
aimed at verifying seabed changes, such as the loss of seagrass habitats, migration of sediments (erosional
and depositional processes), as well as the anthropogenic impacts caused by fishing activities or
leisure boating.
The processing carried out from the multisensor (optical–acoustic) data fusion has significantly
improved the resolution of the mapping of the P. oceanica meadows mainly along the upper limit,
especially in shallower areas where data acquisition, performed with UAV orthophoto images, is more likely to be valid. The best results of the Object-based Image classification were achieved with combined
processing of DEM bathymetry, backscatter, secondary data and optical satellite data. The worst and
most inaccurate results of the Object-based classification were obtained when processing relied only on the Pléiades satellite image.
Based on the increasing use of thematic maps for habitat and the current interest in using
seagrass extension as a monitoring indicator, this digital cartographic method improves the quality
(limits of benthic facies) of the final maps, returning high-resolution products with high spatial and
thematic accuracy.
The integration of multiple acquisition methods (satellite sensor, UAV, DEVS, UTCS and Multibeam) allows mapping of the full extension of the P. oceanica meadows from very shallow waters down to the lower limit. It improves the cost-efficiency of the monitoring according to the quality response of every single sensor (acoustic and/or optical), and reduces the execution time of the acoustic surveys with the Multibeam in very shallow water areas, which generally require higher costs and more time. This mapping technique may represent, within the Marine Strategy Framework Directive (MSFD-2008/56/EC) and the EU Habitat Directive (92/43/EEC), a valid methodology to determine the habitat extent and condition of P. oceanica meadows and also to quantify the carbon sinks and capture rates of seagrass meadows.
Author Contributions: Conceptualization, S.F.R., A.B.; methodology, S.F.R., A.B. and R.D.M.; software, A.B.,
R.D.M. and L.D.; formal analysis, A.B., S.F.R., R.D.M., A.L. and L.D.; investigation, A.B. and R.D.M.; resources, P.L.;
data curation, A.B., S.F.R. and R.D.M.; writing—original draft preparation, S.F.R., A.B., and L.D.G.; writing—review
and editing, S.F.R., A.B., F.B., A.D.I., M.S. and L.P.; visualization, S.F.R., A.B. and R.P.; supervision, S.F.R., A.B. and
L.D.G.; project administration, S.F.R. and A.B.; funding acquisition, E.C. All authors have read and agreed to the
published version of the manuscript.
Funding: This research has been carried out within SIC CARLIT (“I SITI DI IMPORTANZA COMUNITARIA
DELLA CALABRIA “SIC MARINI”), a project financed by the ROP Calabria 2014-2020 ERDF-ESF funds by
the Department of Environment of the Calabria Region and co-financed at 50% with ARPACAL funds with the
management-organizational coordination of the CRSM—Regional Marine Strategy Centre. This research has also
been used for the development of the methodology for quantification of seagrass carbon sink in the SeaForest
LIFE 17CCM/IT 000121 project.
Acknowledgments: The Authors would like to thank Lucia Gigante and Stefano Papa (ISPRA), Salvatore Barresi
and Alfredo Amoruso (CRSM—ARPACAL) for the administrative support. We gratefully acknowledge the
support of the management board of the ASTREA Research Vessel: Giuseppe Cosentino, Luigi Manzueto,
Zeno Amicucci (ISPRA), and Giuseppe Coppola (ARGO). Last but not least the Authors appreciate the unknown
referee’s valuable and profound comments.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Green, E.P.; Short, F.T.; Frederick, T. World Atlas of Seagrasses; University of California Press: Berkeley, CA,
USA, 2003.
2. Den Hartog, C.; Kuo, J. Taxonomy and Biogeography of Seagrasses. In Seagrasses: Biology, Ecologyand
Conservation; Springer: Dordrecht, The Netherlands, 2007; pp. 1–23. [CrossRef]
3. EEC. Council Directive 92/43/EEC of 21 May 1992 on the conservation of natural habitats and of wild fauna
and flora. Off. J. Eur. Commun. 1992, 206, 7–50.
4. Serrano, O.; Kelleway, J.J.; Lovelock, C.; Lavery, P.S. Conservation of Blue Carbon Ecosystems for Climate
Change Mitigation and Adaptation. In Coastal Wetlands, 2nd ed.; Elsevier: Amsterdam, The Netherlands;
Oxford, UK; Cambridge, MA, USA, 2019; pp. 965–996. [CrossRef]
5. Orth, R.J.; Carruthers, T.J.B.; Dennison, W.C.; Duarte, C.M.; Fourqurean, J.W.; Heck, K.L.; Randall Hughes, A.;
Kendrick, G.A.; Kenworthy, W.J.; Olyarnik, S.; et al. A global crisis for seagrass ecosystems. Bioscience 2006,
56, 987–996. [CrossRef]
6. Duarte, L.D.S.; Machado, R.E.; Hartz, S.M.; Pillar, V.D. What saplings can tell us about forest expansion over
natural grasslands. J. Veg. Sci. 2006, 17, 799–808. [CrossRef]
7. Turner, S.J.; Hewitt, J.E.; Wilkinson, M.R.; Morrisey, D.J.; Thrush, S.F.; Cummings, V.J.; Funnell, G. Seagrass
patches and landscapes: The influence of wind-wave dynamics and hierarchical arrangements of spatial
structure on macrofaunal seagrass communities. Estuaries 1999, 22, 1016–1032. [CrossRef]
8. Lathrop, R.G.; Montesano, P.; Haag, S. A multi scale segmentation approach to mapping seagrass habitats
using airborne digital camera imagery. Photogramm. Eng. Remote Sens. 2006, 72, 665–675. [CrossRef]
9. O’Neill, J.D.; Costa, M. Mapping eelgrass (Zostera marina) in the Gulf Islands National Park Reserve of
Canada using high spatial resolution satellite and airborne imagery. Remote Sens. Environ. 2013, 133, 152–167.
[CrossRef]
10. Hogrefe, K.; Ward, D.; Donnelly, T.; Dau, N. Establishing a baseline for regional scale monitoring of eelgrass
(Zostera marina) habitat on the lower Alaska Peninsula. Remote Sens. 2014, 6, 12447–12477. [CrossRef]
11. Reshitnyk, L.; Robinson, C.L.K.; Dearden, P. Evaluation of WorldView-2 and acoustic remote sensing for
mapping benthic habitats in temperate coastal Pacific waters. Remote Sens. Environ. 2014, 153, 7–23.
[CrossRef]
12. Traganos, D.; Aggarwal, B.; Poursanidis, D.; Topouzelis, K.; Chrysoulakis, N.; Reinartz, P. Towards
Global-Scale Seagrass Mapping and Monitoring Using Sentinel-2 on Google Earth Engine: The Case Study
of the Aegean and Ionian Seas. Remote Sens. 2018, 10, 1227. [CrossRef]
13. Finkl, C.W.; Makowski, C. The Biophysical Cross-shore Classification System (BCCS): Defining Coastal
Ecological Sequences with Catena Codification to Classify Cross-shore Successions Based on Interpretation
of Satellite Imagery. J. Coast. Res. 2020, 36, 1–29. [CrossRef]
14. Hossain, M.S.; Bujang, J.S.; Zakaria, M.H.; Hashim, M. The application of remote sensing to seagrass
ecosystems: An overview and future research prospects. Int. J. Remote Sens. 2015, 36, 61–114. [CrossRef]
15. Pham, T.D.; Yokoya, N.; Bui, D.T.; Yoshino, K.; Friess, D.A. Remote Sensing Approaches for Monitoring
Mangrove Species, Structure, and Biomass: Opportunities and Challenges. Remote Sens. 2019, 11, 230.
[CrossRef]
16. Green, E.P.; Mumby, P.J.; Edwards, A.J.; Clark, C.D. Remote Sensing Handbook for Tropical Coastal Management;
Unesco: Paris, France, 2000; pp. 1–316.
17. Zoffoli, M.L.; Frouin, R.; Kampel, M. Water Column Correction for Coral Reef Studies by Remote Sensing.
Sensors 2014, 14, 16881–16931. [CrossRef] [PubMed]
18. Kenny, A.; Cato, I.; Desprez, M.; Fader, G.; Schüttenhelm, R.; Side, J. An overview of seabed-mapping
technologies in the context of marine habitat classification. ICES J. Mar. Sci. 2003, 60, 411–418. [CrossRef]
19. Brown, C.; Blondel, P. Developments in the application of multibeam sonar backscatter for seafloor habitat
mapping. Appl. Acoust. 2009, 70, 1242–1247. [CrossRef]
20. Pergent, G.; Monnier, B.; Clabaut, P.; Gascon, G.; Pergent-Martini, C.; Valette-Sansevin, A. Innovative method
for optimizing Side-Scan Sonar mapping: The blind band unveiled. Estuar. Coast. Shelf Sci. 2017, 194, 77–83.
[CrossRef]
21. Le Bas, T.; Huvenne, V. Acquisition and processing of backscatter data for habitat mapping–comparison of
multibeam and sidescan systems. Appl. Acoust. 2009, 70, 1248–1257. [CrossRef]
J. Mar. Sci. Eng. 2020, 8, 647 22 of 25
22. De Falco, G.; Tonielli, R.; Di Martino, G.; Innangi, S.; Parnum, I.M. Relationships between multibeam
backscatter, sediment grain size and Posidonia oceanica seagrass distribution. Cont. Shelf Res. 2010, 30,
1941–1950. [CrossRef]
23. Lacharité, M.; Brown, C.; Gazzola, V. Multisource multibeam backscatter data: Developing a strategy for the
production of benthic habitat maps using semi-automated seafloor classification methods. Mar. Geophys. Res.
2018, 39, 307–322. [CrossRef]
24. Gumusay, M.U.; Bakirman, T.; Tuney Kizilkaya, I.; Aykut, N.O. A review of seagrass detection, mapping and
monitoring applications using acoustic systems. Eur. J. Remote Sens. 2019, 52, 1–29. [CrossRef]
25. Micallef, A.; Le Bas, T.P.; Huvenne, V.A.; Blondel, P.; Hühnerbach, V.; Deidun, A. A multi-method approach
for benthic habitat mapping of shallow coastal areas with high-resolution multibeam data. Cont. Shelf Res.
2012, 39, 14–26. [CrossRef]
26. Held, P.; Schneider von Deimling, J. New Feature Classes for Acoustic Habitat Mapping—A Multibeam
Echosounder Point Cloud Analysis for Mapping Submerged Aquatic Vegetation (SAV). Geosciences 2019,
9, 235. [CrossRef]
27. Bosman, A.; Casalbore, D.; Anzidei, M.; Muccini, F.; Carmisciano, C. The first ultra-high resolution
Marine Digital Terrain Model of the shallow-water sector around Lipari Island (Aeolian archipelago, Italy).
Ann. Geophys. 2015, 58, 1–11. [CrossRef]
28. Bosman, A.; Casalbore, D.; Romagnoli, C.; Chiocci, F. Formation of an ‘a’ā lava delta: Insights from time-lapse
multibeam bathymetry and direct observations during the Stromboli 2007 eruption. Bull. Volcanol. 2014, 76,
1–12. [CrossRef]
29. Tecchiato, S.; Collins, L.; Parnum, I.; Stevens, A. The influence of geomorphology and sedimentary processes
on benthic habitat distribution and littoral sediment dynamics: Geraldton, Western Australia. Mar. Geol.
2015, 359, 148–162. [CrossRef]
30. Wölfl, A.C.; Snaith, H.; Amirebrahimi, S.; Devey, C.W.; Dorschel, B.; Ferrini, V.; Huvenne, V.A.I.; Jakobsson, M.;
Jencks, J.; Johnston, G.; et al. Seafloor Mapping—The Challenge of a Truly Global Ocean Bathymetry.
Front. Mar. Sci. 2019, 6, 283. [CrossRef]
31. Clarke, J.H.; Lamplugh, M.; Czotter, K. Multibeam water column imaging: Improved wreck least-depth
determination. In Proceedings of the Canadian Hydrographic Conference, Halifax, NS, Canada, 6–9 June
2006; pp. 5–9.
32. Colbo, K.; Ross, T.; Brown, C.; Weber, T. A review of oceanographic applications of water column data from
multibeam echosounders. Estuar. Coast. Shelf Sci. 2014, 145, 41–56. [CrossRef]
33. Dupré, S.; Scalabrin, C.; Grall, C.; Augustin, J.; Henry, P.; Celal Şengör, A.; Görür, N.; Namık Çağatay, M.;
Géli, L. Tectonic and sedimentary controls on widespread gas emissions in the Sea of Marmara: Results from
systematic, shipborne multibeam echo sounder water column imaging. J. Geophys. Res. Solid Earth 2015, 120,
2891–2912. [CrossRef]
34. Bosman, A.; Romagnoli, C.; Madricardo, F.; Correggiari, A.; Fogarin, S.; Trincardi, F. Short-term evolution of
Po della Pila delta lobe from high-resolution multibeam bathymetry (2013–2016). Estuar. Coast. Shelf Sci.
2020, 233, 106533. [CrossRef]
35. Doukari, M.; Batsaris, M.; Papakonstantinou, A.; Topouzelis, K. A Protocol for Aerial Survey in Coastal
Areas Using UAS. Remote Sens. 2019, 11, 1913. [CrossRef]
36. Barrell, J.; Grant, J. High-resolution, low altitude aerial photography in physical geography: A case study
characterizing eelgrass (Zostera marina L.) and blue mussel (Mytilus edulis L.) landscape mosaic structure.
Prog. Phys. Geogr. 2015, 39, 440–459. [CrossRef]
37. Duffy, J.P.; Pratt, L.; Anderson, K.; Land, P.E.; Shutler, J.D. Spatial assessment of intertidal seagrass meadows
using optical imaging systems and a lightweight drone. Estuar. Coast. Shelf Sci. 2018, 200, 169–180. [CrossRef]
38. Makri, D.; Stamatis, P.; Doukari, M.; Papakonstantinou, A.; Vasilakos, C.; Topouzelis, K. Multi-scale seagrass
mapping in satellite data and the use of UAS in accuracy assessment. In Proceedings of the Sixth International
Conference on Remote Sensing and Geoinformation of the Environment, Proc. SPIE 10773, Paphos, Cyprus,
6 August 2018. [CrossRef]
39. Nahirnick, N.K.; Reshitnyk, L.; Campbell, M.; Hessing-Lewis, M.; Costa, M.; Yakimishyn, J.; Lee, L. Mapping
with confidence; delineating seagrass habitats using Unoccupied Aerial Systems (UAS). Remote Sens. Ecol.
Conserv. 2019, 5, 121–135. [CrossRef]
40. Ventura, D.; Bonifazi, A.; Gravina, M.F.; Belluscio, A.; Ardizzone, G. Mapping and Classification of
Ecologically Sensitive Marine Habitats Using Unmanned Aerial Vehicle (UAV) Imagery and Object-Based
Image Analysis (OBIA). Remote Sens. 2018, 10, 1331. [CrossRef]
41. Casella, E.; Collin, A.; Harris, D.; Ferse, S.; Bejarano, S.; Parravicini, V.; Hench, J.L.; Rovere, A. Mapping coral
reefs using consumer-grade drones and structure from motion photogrammetry techniques. Coral Reefs 2017,
36, 269–275. [CrossRef]
42. Carlson, D.; Fürsterling, A.; Vesterled, L.; Skovby, M.; Pedersen, S.; Melvad, C.; Rysgaard, S. An affordable
and portable autonomous surface vehicle with obstacle avoidance for coastal ocean monitoring. Hardwarex
2019, 5, e00059. [CrossRef]
43. Alvsvåg, D. Mapping of a Seagrass Habitat in Hopavågen, Sør-Trøndelag, with the Use of an Autonomous
Surface Vehicle Combined with Optical Techniques. Master’s Thesis, NTNU, Gjøvik, Norway, 2017.
44. Blaschke, T. Object based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65,
2–16. [CrossRef]
45. Diesing, M.; Mitchell, P.; Stephens, D. Image-based seabed classification: What can we learn from terrestrial
remote sensing? ICES J. Mar. Sci. 2016, 73, 2425–2441. [CrossRef]
46. Janowski, L.; Tęgowski, J.; Nowak, J. Seafloor mapping based on multibeam echosounder bathymetry and
backscatter data using Object-Based Image Analysis: A case study from the Rewal site, the Southern Baltic.
Oceanol. Hydrobiol. Stud. 2018, 47, 248–259. [CrossRef]
47. Janowski, L.; Trzcinska, K.; Tegowski, J.; Kruss, A.; Rucinska-Zjadacz, M.; Pocwiardowski, P. Nearshore
benthic habitat mapping based on multi-frequency, multibeam echosounder data using a combined
object-based approach: A case study from the Rowy Site in the Southern Baltic Sea. Remote Sens. 2018,
10, 1983. [CrossRef]
48. Wicaksono, P.; Aryaguna, P.A.; Lazuardi, W. Benthic Habitat Mapping Model and Cross Validation Using
Machine-Learning Classification Algorithms. Remote Sens. 2019, 11, 1279. [CrossRef]
49. Mohamed, H.; Nadaoka, K.; Nakamura, T. Assessment of machine learning algorithms for automatic benthic
cover monitoring and mapping using towed underwater video camera and high-resolution satellite images.
Remote Sens. 2018, 10, 773. [CrossRef]
50. Menandro, P.S.; Bastos, A.C.; Boni, G.; Ferreira, L.C.; Vieira, F.V.; Lavagnino, A.C.; Moura, R.; Diesing, M.
Reef Mapping Using Different Seabed Automatic Classification Tools. Geosciences 2020, 10, 72. [CrossRef]
51. Benfield, S.L.; Guzman, H.M.; Mair, J.M.; Young, J.A.T. Mapping the distribution of coral reefs and
associated sublittoral habitats in Pacific Panama: A comparison of optical satellite Sensors and classification
methodologies. Int. J. Remote Sens. 2007, 28, 5047–5070. [CrossRef]
52. Leon, J.; Woodroffe, C.D. Improving the synoptic mapping of coral reef geomorphology using object-based
image analysis. Int. J. Geogr. Inf. Sci. 2011, 25, 949–969. [CrossRef]
53. Phinn, S.R.; Roelfsema, C.M.; Mumby, P.J. Multi-scale, object-based image analysis for mapping geomorphic
and ecological zones on coral reefs. Int. J. Remote Sens. 2012, 33, 3768–3797. [CrossRef]
54. Wahidin, N.; Siregar, V.P.; Nababan, B.; Jaya, I.; Wouthuyzen, S. Object-based image analysis for coral reef
benthic habitat mapping with several classification algorithms. Procedia Environ. Sci. 2015, 24, 222–227.
[CrossRef]
55. Roelfsema, C.M.; Lyons, M.; Kovacs, E.M.; Maxwell, P.; Saunders, M.I.; Samper-Villarreal, J.; Phinn, S.R.
Multi-temporal mapping of seagrass cover, species and biomass: A semi-automated object based image
analysis approach. Remote Sens. Environ. 2014, 150, 172–187. [CrossRef]
56. Siregar, V.P.; Agus, S.B.; Subarno, T.; Prabowo, N.W. Mapping Shallow Waters Habitats Using OBIA
by Applying Several Approaches of Depth Invariant Index in North Kepulauan Seribu. In Proceedings
of the IOP Conference Series: Earth and Environmental Science, The 4th International Symposium on
LAPAN-IPB Satellite for Food Security and Environmental Monitoring, Bogor, Indonesia, 9–11 October 2017;
IOP Publishing: Bristol, UK, 2018; Volume 149, p. 012052. [CrossRef]
57. Papakonstantinou, A.; Stamati, C.; Topouzelis, K. Comparison of True-Color and Multispectral Unmanned
Aerial Systems Imagery for Marine Habitat Mapping Using Object-Based Image Analysis. Remote Sens. 2020,
12, 554. [CrossRef]
58. Amodio-Morelli, L.; Bonardi, G.; Colonna, V.; Dietrich, D.; Giunta, G.; Ippolito, F.; Liguori, V.; Lorenzoni, S.;
Paglionico, A.; Perrone, V.; et al. L’Arco Calabro-Peloritano nell’orogene appenninico-maghrebide [The Calabrian-Peloritan Arc in the Apennine-Maghrebide orogen].
Mem. Soc. Geol. Ital. 1976, 17, 1–60.
86. Dattola, L.; Rende, S.; Dominici, R.; Lanera, P.; Di Mento, R.; Scalise, S.; Cappa, P.; Oranges, T.; Aramini, G.
Comparison of Sentinel-2 and Landsat-8 OLI satellite images vs. high spatial resolution images (MIVIS and
WorldView-2) for mapping Posidonia oceanica meadows. In Proceedings of the Remote Sensing of the Ocean,
Sea Ice, Coastal Waters, and Large Water Regions, Proc. SPIE 10784, Berlin, Germany, 10 October 2018;
Volume 10784. [CrossRef]
87. Pham, T.D.; Xia, J.; Ha, N.T.; Bui, D.T.; Le, N.N.; Takeuchi, W. A Review of Remote Sensing Approaches for
Monitoring Blue Carbon Ecosystems: Mangroves, Seagrasses and Salt Marshes during 2010–2018. Sensors
2019, 19, 1933. [CrossRef]
88. Li, J.; Schill, S.R.; Knapp, D.E.; Asner, G.P. Object-Based Mapping of Coral Reef Habitats Using Planet Dove
Satellites. Remote Sens. 2019, 11, 1445. [CrossRef]
89. Ardhuin, F.; Rogers, E.; Babanin, A.; Filipot, J.F.; Magne, R.; Roland, A.; Van der Westhuysen, A.; Queffeulou, P.;
Lefevre, J.; Aouf, L.; et al. Semiempirical dissipation source functions for ocean waves. Part I: Definition,
calibration, and validation. J. Phys. Oceanogr. 2010, 40, 1917–1941. [CrossRef]
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).