Sensors 23 00814 v4
Article
Vehicle and Driver Monitoring System Using On-Board and
Remote Sensors
Andres E. Campos-Ferreira 1 , Jorge de J. Lozoya-Santos 1 , Juan C. Tudon-Martinez 1, *,
Ricardo A. Ramirez Mendoza 1 , Adriana Vargas-Martínez 1 , Ruben Morales-Menendez 1 and Diego Lozano 2
1 School of Engineering and Science, Tecnologico de Monterrey, Av. E Garza Sada 2501,
Monterrey 64849, Mexico
2 School of Engineering and Technologies, Universidad de Monterrey, Av. I Morones Prieto 4500 Pte.,
San Pedro Garza Garcia 66238, Mexico
* Correspondence: jc.tudon@tec.mx
Abstract: This paper presents an integrated monitoring system for the driver and the vehicle in a single case study that is easy to configure and replicate. On-board vehicle sensors and remote sensors are combined to build algorithms for estimating polluting emissions, fuel consumption, driving style and driver's health. The main contribution of this paper is the analysis of interactions among the monitored features, highlighting the influence of the driver on the vehicle performance and vice versa. This analysis was carried out experimentally using one vehicle with different drivers and routes and implemented on a mobile application. Compared to commercial driver and vehicle monitoring systems, this approach is not customized, uses classical sensor measurements, and is based on simple algorithms that have already been proven, but not in an interactive environment with other algorithms. In the design procedure of this global vehicle and driver monitoring system, a principal component analysis was carried out to reduce the variables used in the training/testing algorithms, with the objective of decreasing the data transferred via Bluetooth among the devices used: a biometric wristband, a smartphone and the vehicle's central computer. Experimental results show that the proposed vehicle and driver monitoring system correctly predicts the fuel consumption index with 84% accuracy, the polluting emissions with 89%, and the driving style with 89%. Moreover, interesting correlations between the driver's heart condition and vehicular traffic have been found in this analysis.

Keywords: ADAS; driver monitoring; fuel consumption; driving style; emissions

Citation: Campos-Ferreira, A.E.; Lozoya-Santos, J.d.J.; Tudon-Martinez, J.C.; Ramirez-Mendoza, R.A.; Vargas-Martinez, A.; Morales-Menendez, R.; Lozano, D. Vehicle and Driver Monitoring System Using On-Board and Remote Sensors. Sensors 2023, 23, 814. https://doi.org/10.3390/s23020814

Academic Editor: Andrey V. Savkin

Received: 9 November 2022; Revised: 21 December 2022; Accepted: 26 December 2022; Published: 10 January 2023

Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction

The car is the most used means of transportation; around 90 million cars are manufactured worldwide every year. This high production demand motivates the vehicle industry to improve the whole lifecycle of its products. Improvements are made in vehicle design, assembly processes, and vehicle technologies, but customer services and the environmental footprint have been the most changing and demanding areas in recent years.

Revenues from mobility services are projected to increase in the coming years; therefore, the Original Equipment Manufacturers (OEMs) are likely to turn into producers of autonomous driving and smarter mobility services. Advanced Driver Assistance Systems (ADAS) are smart algorithms that monitor external factors such as the road, the amount of light, and the distance to other cars. These systems alert the driver when there is a latent risk. In addition, an ADAS focuses on monitoring the usual parameters that cause accidents, such as drowsiness, fatigue, or distraction while driving. These driver monitoring systems range from contact methods, e.g., using an electroencephalogram (EEG), electrocardiogram (ECG), electromyography or galvanic skin response, to contact-less methods such as eye tracking, head movement and facial expressions through camera detection. In the automotive industry, almost all OEMs have developed their own vehicle monitoring systems and/or driving assistance systems, mainly to reduce the environmental footprint, called eco-driving algorithms. BMW, Ford (EcoBoost), and Mazda (Skyactiv) have focused efforts on improving engine efficiency by up to 30%. In addition, Fiat (eco-driving scoring), Honda (Drive assist), Subaru, Toyota, and Nissan have implemented panel indicators that give the driver feedback on whether the engine performance is eco-friendly, based on vehicle signals and inputs.
On the other hand, the research community in vehicle technologies has focused on developing smart algorithms and methodologies to improve vehicle efficiency and monitoring and to improve the driver experience. Recent research has used different types of devices for data collection and developed different approaches to assess vehicle performance or driving style, but rarely the interaction between both. The smartphone's data-retrieval capacity and the continuous improvement of its processors and internal sensors have made it a reliable device for data collection. Several authors have developed research on vehicular technologies using smartphones as sensing and data acquisition devices for vehicles [1–3]. Other authors have used more sophisticated sensors, such as LiDARs, smart cameras, Inertial Measurement Units (IMU), etc., to develop ADAS or eco-driving algorithms [4]. In fact, new research and commercial systems for vehicle monitoring and driver assistance are usually based on several modern sensors of different natures (radars, LiDARs, video cameras, sonars, GPS, etc.), whose cost is very high and whose maintenance after a crash is still an open research area.
Interesting control strategies have been proposed in recent years for novel ADAS [5], such as Lane Keeping Assistance Systems (LKAS), Pedestrian Safety Systems (PSS), Collision Warning Systems (CWS), Cruise Control Systems (CCS), Night Vision Systems (NVS), etc. Recently, new developments focused on shared control between the vehicle and the driver have been proposed as a path toward a higher level of automated control; for instance, the robust control system proposed in [6] for LKAS using a human-in-the-loop vehicle system, or the linear parameter varying approach presented in [7] for vehicle lateral control incorporating driver performance estimations. In [8], it is established that the control benefits of any ADAS can be context dependent, particularly on traffic, and that the use of a vehicle and driver monitoring system is necessarily required.
For vehicle assessment or driver monitoring, there exist several approaches. Some
of these approaches use real vehicles for non-risk maneuvers or validated simulation
platforms for effective risk tests [9]. Phenomenological algorithms are used to calculate
energy expenditure [10,11], or emissions [12]. Statistical algorithms and modeling have
also been used when calculating driving events and energy consumption [13–15]. More
recently, machine learning algorithms are being used to explore the possibility of obtaining
better and more robust models for ADAS implementation, e.g., [16] uses decision trees and
neural networks to identify four types of driving style; authors in [3,17,18] make use of
fuzzy logic in combination with other machine learning algorithms to improve the event
detection of risk maneuvers. In addition, interesting results have demonstrated that the
driver’s health condition can be monitored when the human is part of a vehicle control
loop [19,20].
The aforementioned approaches have in some cases been proven on experimental platforms with good performance. Some of these methods cannot be replicated easily due to their complex structures for embedded implementation, their use of specialized sensors, or their customized configurations (communication protocols, hardware, etc.). Usually, these methods have been developed for a single purpose and have not been studied in an interactive environment with other algorithms. In this sense, the main contribution of this paper is to present a detailed analysis of the interactions among the most important features of a vehicle monitoring and driver assistance system, highlighting the influence of the driver on the vehicle performance and vice versa. The following aspects stress the contribution:
• A single case study is used to analyze in detail the global interactions among fuel consumption, CO2 emissions, driving style and driver health condition in real time.
• The monitoring algorithms use a reduced data set (32% smaller than in the literature) according to a principal component analysis (PCA), in order to decrease the data transferred via Bluetooth among the devices used.
• For easy replicability, three non-invasive devices are required: (1) an On-Board Diagnostic (OBD) connector, (2) a smartphone and (3) a biometric wristband. All of them are connected to a single data acquisition system to collect data, process the algorithms and display the results in real time, even on a mobile application.
This proposed monitoring and driver assistance system can be used for developing Naturalistic Driving (ND) studies to improve the driver's understanding of ADAS functionality and encourage its usage [21,22].
The outline of this work is as follows. Section 2 presents a detailed review of the state of the art in vehicle and driver assessment, highlighting the most used monitoring algorithms for driving style, fuel consumption, CO2 emissions, and driver health condition. Section 3 describes the methodology used in the proposed monitoring system for the driver and the vehicle, considering data-driven models focused on smartphone and OBD unit measurements. Section 4 is devoted to presenting the statistical results of a PCA that allows the definition of the signals related to the vehicle's and driver's key performances and their possible correlations. Then, Section 5 presents and discusses the results of the proposed human–vehicle interaction monitoring system based on the error indexes defined in the design methodology. Finally, Section 6 concludes the work, summarizes the contributions and proposes future work.
for driver assistance systems named advanced driver monitoring for assistance system (ADMAS). This framework is driver-oriented and helps improve the security of the vehicle.
Several factors may affect the driver's performance; distractions, fatigue, aggressive driving style, and weather are the most influential [28]. Distraction is the major cause of reported car accidents and stems from activities such as texting, listening to music, eating, or looking at off-road zones [26]. When the driver engages in multiple activities, the brain starts concentrating on many tasks at once, which leads to less concentration on the road. Driver distraction has been tackled by several researchers; some propose monitoring systems using behavioral, physiological, and vehicle signals. The studies in [28] used an EEG and an ECG to monitor the driver and detect possible distraction.
Fatigue is related to the human's physical or mental weariness. Prolonged driving, monotonous driving, highly demanding external activities, and late-night driving are examples of fatigue causes. To ensure road safety, fatigue should be detected by the vehicle, which should activate the necessary actions in a timely manner. Fatigue detection follows two types of approaches: subjective tests and physiological methods. Subjective tests are answered by the driver, and their results depend on the driver's truthfulness. Commonly used tests are the Epworth Sleepiness Scale (ESS), the Multiple Sleep Latency Test (MSLT), and the Stanford Sleepiness Scale (SSS) [29]. On the other hand, physiological methods offer an objective way to evaluate fatigue. Several techniques have been explored, such as biometric evaluation using EEG signals to find changes in cerebral activity during driving [30], ECG [31], and eye tracking based on installed cameras [32]. Other research works compute fatigue indirectly by analyzing vehicle data [33].
Regarding the vehicle's performance, researchers and automotive manufacturers have mainly focused on monitoring fuel consumption and polluting emissions. In the next paragraphs, recent research is presented, categorized by the main features used to monitor the driver and vehicle performances: energy consumption, polluting emissions and driving style.
Black-box models are more related to machine learning algorithms and neural networks. They often provide great accuracy, but they do not provide any explanation of the results, nor is it easy to understand how their different features interact. In [41], the performance of back-propagation neural networks and radial basis neural networks for predicting average fuel consumption can be observed. Machine learning algorithms [42], deep neural networks [43], and support vector machines [44] have also been used for calculating fuel consumption.
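In contrast to these black-box predictors, even a transparent one-variable baseline can be fitted from logged data. The sketch below (pure Python, with illustrative numbers that are not measurements from this study) fits fuel economy against average speed by ordinary least squares; black-box models trade this kind of interpretability for accuracy on richer feature sets.

```python
def fit_linear(xs, ys):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical sample: average speed [km/h] vs. fuel economy [km/L].
speeds = [20.0, 40.0, 60.0, 80.0, 100.0]
economy = [8.0, 11.0, 13.0, 14.0, 14.5]
slope, intercept = fit_linear(speeds, economy)

def predict_economy(speed_kmh):
    """Predict fuel economy [km/L] from average speed [km/h]."""
    return slope * speed_kmh + intercept
```

Unlike a neural network, the fitted slope directly states how many km/L are gained per additional km/h of average speed in the sampled range.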
2.2. Emissions
CO2 emissions from transport consist of gases released by the combustion of fuel for all transport activity. In addition, the transport sector emits other pollutants in smaller quantities that also result from the internal combustion engine: methane, volatile organic compounds (VOCs), nitrogen oxides (NOx), sulfur dioxide (SO2), carbon monoxide (CO) and fluorinated gases. According to the European Union (EU) emissions report in [45], the transport sector (road, naval, air, and railway) is responsible for nearly 30% of the EU's total CO2 emissions, and 72% of transport emissions come from road transportation alone. In addition, as part of the efforts to reduce CO2 emissions from transport, the EU has set a new goal of reducing transport emissions by 60% by 2050 compared to 1990 levels.
Researchers argue the need to monitor emissions at a high spatiotemporal resolution. Some approaches combine air pollution data with the traffic flow of a specific area [46]. Many models have been developed to predict emissions of CO2, as well as CO and NOx gases. Statistical models have been implemented based on measurements from monitoring stations. In [47], a fuzzy logic model was used to calculate emissions in Tehran. In [48], a statistical approach based on generalized additive models was used to forecast air pollutants in Hong Kong. Recently, machine learning algorithms have also been used: the authors in [49] developed a model to estimate hourly traffic emissions near roads using a neural network algorithm, and in [50] neural networks and metaheuristic optimization techniques are used to predict traffic emissions.
Another way is to monitor the vehicle's emissions in the air directly [51,52] using Portable Emissions Measurement Systems (PEMS). However, these devices are usually too complex and expensive for large-scale reproduction. A different approach calculates the emissions indirectly. One popular model of this kind is the Comprehensive Modal Emissions Model [53], which depends basically on speed and data accuracy. Other authors use the OBD interface to obtain data from the Electronic Control Unit (ECU) about combustion emissions and combine it with GPS positioning to determine the level of emissions on a specified route [54]. Sabiron et al. [55] proposed a solution to monitor the environmental footprint using smartphone data and inherent vehicle characteristics.
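A minimal sketch of such an indirect computation, assuming the commonly cited stoichiometric factor of roughly 2.31 kg of CO2 per litre of gasoline (an approximation; the exact constants of the EU-standard model in [73] are not reproduced here):

```python
# Approximate stoichiometric factor: burning 1 L of gasoline releases
# roughly 2.31 kg of CO2 (an assumed round value for illustration).
CO2_G_PER_LITRE = 2310.0

def co2_g_per_km(fuel_litres, distance_km):
    """Indirect CO2 estimate [g/km] from fuel burned over a trip."""
    if distance_km <= 0:
        raise ValueError("distance must be positive")
    return fuel_litres * CO2_G_PER_LITRE / distance_km
```

For example, a trip burning 4 L over 50 km would be rated at about 185 g/km, without any direct gas measurement.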
uses driver-related variables such as EEG, ECG, steering wheel movement, acceleration, and braking. The third estimation combines the two previous ones; it is the most complex, since the driver and the vehicle must be monitored at the same time. The result is compared to the qualitative evaluation of one or more people, and then the driving style can be classified. Other researchers perform unsupervised classification by grouping drivers with similar driving styles depending on their driving performance.
Driving behavior information can have several applications. An application can give feedback or warnings to the driver about different events such as fuel consumption, dangers, and recommendations, among others [10,34,59,60]. The information could also be used by external users, e.g., car insurance companies, to determine the accident risk and culpability for each event [61,62]. Another tendency is the high personalization of vehicles and the driving experience, using the data for driver recognition [63], driver monitoring [64,65], smartphone use [66], etc.
Zheng et al. [67] focused on unsupervised driving-behavior assessment of vehicle performance through smartphone sensors. The problem with smartphones is that they are not fixed to the car and can change position. By proposing data filtering and a coordinate-transform approach based on data from the accelerometer and gyroscope of a smartphone, the authors could determine the driving performance for five typical events: right turn, left turn, gas-hit, brake-hit, and forward driving. On the other hand, [3] claims that current methods are highly dependent on previous smartphone calibration and on a fixed position to obtain reliable information. Thus, the authors in [3] propose an adaptive fuzzy classifier in which the threshold used to determine the driver behavior is adapted online. The classified events were accelerating, braking and steering. Despite the good classification results, only a few events could be detected with this technique.
the smartphone. For these monitoring purposes, the design of experiments is intended to
extract representative real data from the Monterrey city’s specific environment.
[Figure: architecture of the proposed system. The human wears a biometric wristband and the vehicle's ECU is read through an OBD reader; both send their electric signals to the smartphone, which also captures the vehicle dynamics and the environment conditions (weather, traffic, etc.). Raw and processed data (biometrics, driving style/vehicle dynamics) are stored locally and in the cloud.]
The human is the driver who maneuvers the vehicle, giving the desired inputs to travel from one point to another. The human is the control subsystem of the human–vehicle system: feedback is sensed through the eyes, and the brain decides how to modify the different inputs with the arms or legs, i.e., steering, accelerating or braking, shifting gears, activating the lights, etc. The human is always affected by the physiological and emotional factors of daily life. Since these factors affect the driver's decisions, they are reflected in the driving style, which differs between drivers.
The vehicle is the machine that does the work of moving from one place to another. Cars are complex machines whose different subsystems combine to provide efficiency, comfort, and security to the users. Current vehicles require a driver to modify their control inputs in order to function. The vehicle has built-in sensors to monitor the cabin and the environment so that the subsystem controls can adjust their signals.
Three sensing devices are considered for this experimental setup: a biometric wristband, an OBD reader, and a smartphone. The biometric wristband monitors the driver's vital signs to determine their health status based on heart rate (HR), temperature, or skin resistance. The OBD reader communicates with and monitors the vehicle's ECU. Since the ECU is the "brain" of the vehicle, it holds the values of the sensors installed on the vehicle in addition to the internal tests and calculations the ECU performs. The connection is made through the OBD port, and the reader interacts with the ECU based on the OBD-II protocol.
Finally, the smartphone monitors the vehicle dynamics using its built-in sensors, e.g., inertial measurement units, GPS, magnetometer, sound level, etc. The smartphone's sensors are accessible through applications that can read and save the data. In this proposed human–vehicle interaction monitoring system, the smartphone has two objectives: to capture the vehicle dynamics and to store the data from the other devices. The smartphone connects via Bluetooth with the OBD reader and the wristband and saves all the data so it can be processed offline. In addition, to preserve the information, the retrieved data are also saved in the cloud using the smartphone's network services.
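Since the three Bluetooth streams arrive at different rates, the offline processing has to align them on a common time base. One simple option is nearest-timestamp matching, sketched below with hypothetical sample rates and values:

```python
from bisect import bisect_left

def nearest_sample(timestamps, values, t):
    """Return the value whose timestamp is closest to t.
    timestamps must be sorted in ascending order."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]

# Hypothetical streams: wristband heart rate at ~1 Hz and OBD speed at
# ~0.5 Hz, aligned onto a few smartphone timestamps.
hr_t, hr_v = [0.0, 1.0, 2.0], [72, 75, 74]
obd_t, obd_v = [0.0, 2.0], [30.0, 35.0]
fused = [(t, nearest_sample(hr_t, hr_v, t), nearest_sample(obd_t, obd_v, t))
         for t in (0.0, 0.9, 1.6)]
```

Each fused row then carries one smartphone timestamp with the closest wristband and ECU readings, ready for the monitoring algorithms.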
In Figure 1, the physical setup inside the vehicle is presented. Note that the driver wears the biometric wristband while driving. The CAN reader plugged into the OBD-II port communicates via Bluetooth with the smartphone, which is fixed on a flat surface inside the vehicle cabin; the smartphone collects the data from its built-in sensors while also collecting and saving the data from the vehicle's ECU. Two apps run at the same time on the smartphone to monitor the vehicle dynamics: the Androsensor™ and Torque™ apps.
Figure 3. Experimental conditions used to train and validate the algorithms of the proposed human–vehicle interaction monitoring system.
the driver. Table A1 in Appendix A.1 presents a brief description of each variable and the
relation with its number on the PCA performance.
Four principal components have been selected to better understand the distribution of the data among each component. The data representation for these components is around 48.34%, i.e., the first four components represent almost half of the data variance, while the remaining 34 components represent the other half. In Figure 4, components 1 and 2 are plotted, and three clusters can be distinguished: two clusters in the negative and positive extremes of component 1, and one cluster decoupled along component 2 (in the positive extreme values). Similar clusters can be distinguished when plotting any combination of the first four components. Table 1 details the clusters generated by the first four principal components.
Figure 4. PCA on raw data: plot of Components 1 and 2 with clustered variables.
Variables grouped in cluster 1 have similar linear behavior during the tests. The speed and acceleration are closely related to the Throttle Position (12), and this to the vehicle's longitudinal acceleration Acc_Y (17), as well as to the Trip Time whilst Moving (14); this last signal increases when the car is moving and decreases when the car stops. An interesting correlation in the human–vehicle interaction appears in this cluster: the BVP feature (35), which is related to heart rate and respiratory rhythm [70], is directly proportional to the vehicle's speed and acceleration.
On the other hand, cluster 2, located on the negative side of Component 1, is formed by Fuel Used (6), Torque (13), Gyr_Z (27), Heart rate (34), and EDA (36). These variables have similar linear behavior during the tests and inverse behavior compared with that found in cluster 1. The fuel consumed and the motor's torque are related since the torque increases when the driver requests more power, and more fuel has to be burned to provide that power. The inverse relationship with respect to cluster 1 can be observed between the Fuel Used (6) and the Trip Time whilst Moving (14): there will be less fuel consumption if the car travels faster and arrives at its destination as soon as possible. With respect to the human–vehicle interaction analysis, the EDA, which represents the skin resistance (e.g., when someone is stressed and starts to sweat), is correlated with the motor's torque and with the vehicle's yaw motions (Gyr_Z), for instance, in aggressive cornering situations.
Cluster 3 is found on the positive side of Component 2 and is related to the geolocation of the vehicle and its vertical dynamics. On the other hand, cluster 4 is located on the positive side of Component 3 and is formed by CO2 Instantaneous emissions (3), Kilometers Per Litre Instant (7) and Lin_Acc_Y (23). In this group, the pollutant emissions and fuel consumption are linearly related because CO2 is a product of fuel consumption and is proportional to it through ratios such as combustion efficiency and
At first sight, the 18 variables clustered in the five groups shown in Table 1 represent almost 50% of the data variance in the 44 experiments. The remaining variables are less representative in general, but they could be more specific for certain vehicle maneuvers. In fact, to capture more than 90% of the variance in the dataset, 18 principal components must be considered, equivalent to 26 variables. In this sense, PCA has proven useful for data dimension reduction in this application: the dataset was reduced from a total of 38 variables to 26 variables. Given the congruence observed when reviewing the resulting data features, this reduced dataset is considered for training and validating the algorithms in the next section. In addition, this dataset is representative enough of the vehicle behavior and driver status, as well as of their interaction, to propose a human–vehicle interaction monitoring system.
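The component-count selection used above (four components for roughly 48% of the variance, eighteen for more than 90%) reduces to a cumulative sum over the explained-variance spectrum. A sketch with an illustrative spectrum (not the actual eigenvalues of this dataset):

```python
def components_for_variance(explained_pct, target_pct):
    """Smallest number of leading principal components whose cumulative
    explained variance reaches target_pct.
    explained_pct must be sorted in descending order."""
    total = 0.0
    for k, pct in enumerate(explained_pct, start=1):
        total += pct
        if total >= target_pct:
            return k
    return len(explained_pct)

# Illustrative explained-variance percentages for a 5-variable toy case.
spectrum = [50.0, 25.0, 15.0, 6.0, 4.0]
```

With this toy spectrum, three components already cover 90% of the variance, whereas all five are needed for 100%; the same rule applied to the 38-variable dataset yields the 18-component cut reported above.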
5. Results
The results are presented according to the design of the proposed human–vehicle interaction monitoring system in Section 3. First, the decision matrix results for selecting the monitoring algorithms are discussed. Then, the selected monitoring algorithms are trained and tested using experimental data (the dataset reduced according to the PCA results) and the corresponding indexes predefined in the literature (ESR, RRMSE, and RE).
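The exact definitions of the ESR, RRMSE and RE indexes are given in the design methodology and are not reproduced in this excerpt. Plausible forms of the two indexes discussed most below can be sketched as follows, assuming RRMSE normalizes the RMSE by the mean of the reference signal and RE compares only the cumulative end-of-route totals:

```python
from math import sqrt

def rrmse_pct(ref, est):
    """Relative RMSE [%]: RMSE of the estimate normalized by the mean
    of the reference signal (assumed definition)."""
    n = len(ref)
    rmse = sqrt(sum((r - e) ** 2 for r, e in zip(ref, est)) / n)
    return 100.0 * rmse / (sum(ref) / n)

def re_pct(ref, est):
    """Relative error [%] of the cumulative totals, i.e., comparing
    only the end-of-route sums (assumed definition)."""
    return 100.0 * abs(sum(est) - sum(ref)) / sum(ref)
```

Under these definitions, an estimate that oscillates around the reference can have a large RRMSE (poor transient tracking) and still a near-zero RE (correct total), which matches how the two indexes disagree in the fuel consumption results below.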
Figure 6 presents the decision matrix for the review of the state of the art related to polluting emissions monitoring algorithms. The most used techniques proposed as alternatives to compute vehicle emissions are fuzzy logic approaches [47], neural networks [49,50], statistical methods [48], deep learning techniques [72] and phenomenological models [12,73]. In this case, inertial and GPS signals are used from smartphones, and fuel injection and vehicle acceleration data from the vehicle's ECU. The result of the decision analysis recommends using the phenomenological model presented in [73], which is based on stoichiometric computations according to European Union standards. In addition, a second alternative considers data-driven models using OBD readings for the instantaneous CO2 emissions calculation.
In a similar way, Figure 7 presents the decision matrix for the driving style algorithms. For this feature, there are several approaches based on heuristic methods [1,3], artificial intelligence algorithms [16,41,58], statistical approaches [14] and phenomenological models [58]. Smartphone data (mainly IMU and GPS), OBD readings, vision systems, wheel encoders, and radars are common signals used in algorithms to experimentally determine the driving style. In this case, the result of the decision analysis recommends the heuristic-based rules proposed in [1], which use smartphone built-in sensors to monitor driving maneuvers such as sudden acceleration, sudden braking, and sharp turns. These events are then used to categorize the driving style as aggressive or non-aggressive. As a second option, the statistical algorithm proposed in [14] is a candidate for the driving style computations due to its implementation simplicity; in this case, the driving style is categorized as calm, normal, or aggressive.
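A heuristic event-counting classifier in the spirit of [1] can be sketched as follows; the thresholds and the aggressive/non-aggressive decision ratio are illustrative assumptions, not the calibrated values of that work:

```python
def classify_driving_style(long_acc, acc_thr=3.0, brake_thr=-3.0,
                           aggressive_ratio=0.1):
    """Label a trip from longitudinal acceleration samples [m/s^2].
    Samples above acc_thr count as sudden accelerations and samples
    below brake_thr as sudden brakings; a trip is aggressive when the
    fraction of such events exceeds aggressive_ratio."""
    events = sum(1 for a in long_acc if a > acc_thr or a < brake_thr)
    if events / len(long_acc) > aggressive_ratio:
        return "aggressive"
    return "non-aggressive"

# Hypothetical accelerometer traces for two short trips.
calm_trip = [0.5, -0.4, 1.0, -0.8, 0.2, 0.1, -0.3, 0.6]
harsh_trip = [3.5, -4.0, 2.0, 3.8, -3.5, 0.5, 4.2, -0.2]
```

Sharp-turn detection would add an analogous threshold on lateral acceleration; a three-class calm/normal/aggressive variant as in [14] would simply use two event-rate cut points instead of one.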
[Plot: fuel consumption [Km/L] versus time [s], comparing the phenomenological model, the OBD signal-based algorithm and the reference signal for two experiments.]
Figure 9. Fuel consumption algorithm comparison in experiments 12 (a) and 24 (b).
According to the ESR index, in general, the OBD signal-based algorithm has less error variance (33.53%) than the phenomenological model (77.11%). In addition, the RRMSE index is lower with the OBD signal-based algorithm, demonstrating that the dynamic estimation of the fuel consumption is better explained by this data-driven model: the average RRMSE index among the 44 experiments is 37.64%, versus 60.11% obtained by the phenomenological model. On the other hand, the RE index is better (16.07% on average) with the phenomenological model than with the data-driven model (21.73% on average); this means that the fuel consumption estimation in liters disregarding the transient dynamics, i.e., comparing only the consumption at the end of the route, is better with the phenomenological model.
Referring to the experiments performed by the same driver on the same route in Figure 3, e.g., experiments 1 to 10 performed by driver 1 on route 1, the highest consumption occurs during the morning tests and the lowest during midday trips. In the same way, for experiments 11 to 20 performed by driver 2 on route 2, the highest consumption occurred during evening trips and the lowest during midday experiments. This is an indication of the influence of morning and evening traffic in the city, especially in a city the size of Monterrey (the second largest metropolitan area in Mexico, with more than 5 million inhabitants).
In Figure 10a, a comparison of fuel consumption against traffic in the time domain is presented, using experiment 5. The traffic is computed based on the trip route, according to the traffic maps from Google. The traffic depends on the hour and day of the trip; Figure 10b illustrates the recurrent traffic corresponding to experiment 5 (driver 1, route 1, Wednesday morning). Using these traffic maps, the traffic is labeled into four classes: 1. Low (blue), 2. Medium (orange), 3. High (red), and 4. Very high (maroon). Then, using the GPS data collected during the trip, the traffic label is assigned depending on the position of the vehicle on the map.
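The labeling step can be sketched as a nearest-neighbor lookup from each GPS sample to map points annotated with a traffic class; the coordinates and labels below are hypothetical, and planar distance is only a rough approximation over a small area:

```python
from math import hypot

# Hypothetical points along the route annotated with the traffic classes
# read from the map: 1 low, 2 medium, 3 high, 4 very high.
labeled_points = [((25.67, -100.31), 1),
                  ((25.68, -100.32), 3),
                  ((25.69, -100.33), 4)]

def traffic_label(lat, lon):
    """Assign to a GPS sample the traffic class of the nearest
    annotated map point (planar small-area approximation)."""
    nearest = min(labeled_points,
                  key=lambda p: hypot(p[0][0] - lat, p[0][1] - lon))
    return nearest[1]
```

Each GPS fix of the trip is thus converted into a traffic class, which is the signal plotted against fuel consumption in Figure 10a.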
[Plot: fuel consumption [L/Km] and traffic class versus time [s] for experiment 5.]
Figure 10. (a) Analysis of fuel consumption with respect to traffic and (b) Google traffic map, for experiment 5.
It can be observed that fuel consumption increases when there is less traffic in the experiment, because the vehicle is allowed to accelerate more (green circles in Figure 10a). When the traffic increases, the consumption decreases because less acceleration is needed and idle time is more frequent.
5.2.2. Emissions
Monitoring the emissions feature is essential because of the high contribution of vehicles to greenhouse gases. Diminishing the ecological footprint caused by humans is considered a worldwide objective, and one point that has been worked on in several ways is emissions reduction. To reduce emissions, it is necessary to monitor them and observe the amount of gases emitted during transportation. This work aims to monitor CO2 emissions and obtain an algorithm capable of computing them.
The CO2 emissions are computed with the algorithms selected according to the decision matrix of Figure 6, using the 44 collected experiments. The inputs of both algorithms, the phenomenological model [73] and the OBD signal-based algorithm, were filtered and then fed into the functions. In Figure 11, the comparison for experiments 13 and 33 is presented: the green line is the output of the phenomenological model, the red line is the output of the OBD signal-based algorithm, and the black line represents the reference data from the vehicle's ECU, given in grams per kilometer. Qualitatively, both algorithms have similar performances; they follow the reference dynamics in general, but with some mismatches due to the high-frequency content. The system identification could be improved by considering pre-processing tasks to eliminate or compensate for the fast changes in the instantaneous emissions computation. According to Appendix A.3, the quantitative results of these evaluations show that the transient error measured by all indexes is lower for the OBD signal-based algorithm. For instance, the RE index indicates that the OBD signal-based algorithm can correctly predict the polluting emissions with 89% effectiveness, i.e., with an average error of 11%.
Figure 11. Comparison between CO2 emission algorithms in experiments 13 (a) and 33 (b).
The final amount of CO2 emitted in these experiments is well approximated by either
model, although slightly better by the phenomenological model [73]. This approximation is
better than that of the transient dynamics because the estimation considers the
cumulative emissions during the trip, in contrast to the instantaneous emissions, whose
measurements vary when the vehicle is idle or when it is reducing/increasing speed.
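The gap between instantaneous and cumulative readings can be illustrated with a minimal sketch that integrates the instantaneous per-kilometer rate over the distance actually traveled; the function name and the sample data are illustrative, not taken from the experiments.

```python
# Sketch: total trip CO2 from instantaneous per-distance emissions.
# e_gpkm[i] is the instantaneous emission rate in g/km at sample i,
# v_kmh[i] the vehicle speed, and dt the sampling period in seconds.
def trip_co2_kg(e_gpkm, v_kmh, dt=1.0):
    total_g = 0.0
    for e, v in zip(e_gpkm, v_kmh):
        dist_km = v / 3600.0 * dt   # distance covered during this sample
        total_g += e * dist_km      # g/km * km = grams
    return total_g / 1000.0

# Idle samples (v = 0) contribute nothing to the total even if the
# instantaneous g/km reading spikes, which is why the cumulative figure
# is smoother than the transient one.
print(round(trip_co2_kg([300, 900, 300], [60, 0, 60], dt=60), 3))  # 0.6
```

The 900 g/km idle spike in the middle sample is weighted by zero distance, so it never reaches the trip total.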
According to the cluster of experiments by route presented in Figure 3, it can be
noticed that in experiments 1 to 10, the largest percentage of emissions occurs during
morning tests (odd-numbered experiments) and the lowest emissions during midday trips
(even-numbered experiments). In the same way, for experiments 11 to 20, the largest
emissions occurred during evening trips (even-numbered experiments) and the lowest during
midday experiments (odd-numbered experiments). This is an indication of the influence of
morning and evening traffic in the city. Figure 12 presents a comparison of CO2 emissions
against vehicle speed in the time domain. It can be observed that emissions increase when
speed decreases because, as the vehicle idles and/or does not advance, the instantaneous
emissions per kilometer increase; thus, the vehicle releases more CO2 into the air per kilometer.
Figure 12. CO2 emissions with respect to the vehicle speed. Note that below 20 km/h (marked
by the green arrows), the CO2 emissions per kilometer increase drastically because the vehicle
keeps liberating pollutants even during idling stops.
Table 2. Threshold definition for driving style using the heuristic-based rules method.
Figure 13. Thresholds for driver 1 using the heuristic method. In (a), the yaw rate
measurement when the vehicle turns left and right, for aggressive and non-aggressive
driving styles; in (b), the longitudinal acceleration in braking and accelerating modes,
considering aggressive and non-aggressive driving styles.
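A minimal sketch of the heuristic rules follows, assuming simple symmetric thresholds on the yaw rate and longitudinal acceleration; the numeric thresholds and the 10% trip-level ratio are illustrative placeholders, not the calibrated per-driver values behind Figure 13.

```python
# Sketch: heuristic-rule driving-style flag from smartphone inertial data.
# Threshold values are illustrative placeholders, not the calibrated
# per-driver thresholds of Figure 13.
YAW_THRESH_DPS = 30.0   # deg/s, turning aggressiveness
ACC_THRESH_MS2 = 3.0    # m/s^2, accelerating/braking aggressiveness

def is_aggressive(yaw_rate_dps, accel_ms2):
    """A sample is aggressive if either signal exceeds its threshold."""
    return abs(yaw_rate_dps) > YAW_THRESH_DPS or abs(accel_ms2) > ACC_THRESH_MS2

def trip_style(samples, aggressive_ratio=0.1):
    """Label a trip aggressive if more than 10% of samples trip a threshold."""
    flags = [is_aggressive(y, a) for y, a in samples]
    return "aggressive" if sum(flags) / len(flags) > aggressive_ratio else "non-aggressive"

print(trip_style([(35, 1.0), (5, 0.5), (10, 3.5), (2, 0.1)]))  # aggressive
```

Two of the four samples exceed a threshold here, so the trip-level label is aggressive.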
On the other hand, the statistical approach [14], based on OBD data, consists of
dividing the trip into three fundamental events: launch, acceleration, and brake. The
launch event is related to the acceleration from the idle position (vehicle stopped but
with the engine on) to a certain speed. Launch is an event that provides information
mostly about city driving, where frequent brakes and accelerations occur due to bumps,
traffic, traffic lights, and other causes. The acceleration event is related to the
vehicle already in movement when the user wants to increase the speed. The braking event
also has a high relationship with driving style, because frequent and/or harsh brakes
indicate a more aggressive style; besides, it is related to more fuel combustion since
braking implies energy loss. Table 3 presents each event and how its start and stop are determined.
Figure 14a presents an example of event detection on a certain trip. The events are
plotted over the speed curve: the start of each event is marked with an asterisk and the
end with a circle. Blue marks indicate acceleration events, red marks braking events, and
green marks launching events. As can be observed, several events can be detected in a
particular driving experiment.
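The event segmentation can be sketched from the speed trace alone; the start/stop conditions below are simplified stand-ins for the rules of Table 3 (launch is taken as any speed rise starting from near standstill), so event boundaries will differ from the paper's.

```python
# Sketch: split a speed trace into launch / acceleration / brake events.
# "launch" is a speed rise starting below the standstill threshold.
def detect_events(speed_kmh, standstill_kmh=1.0):
    events = []          # (kind, start_index, end_index) tuples
    start, kind = None, None
    for i in range(1, len(speed_kmh)):
        prev, cur = speed_kmh[i - 1], speed_kmh[i]
        if cur > prev:
            new_kind = "launch" if prev < standstill_kmh else "acceleration"
        elif cur < prev:
            new_kind = "brake"
        else:
            new_kind = None  # constant speed: no event
        if new_kind != kind:
            if kind is not None:
                events.append((kind, start, i - 1))
            start, kind = i - 1, new_kind
    if kind is not None:
        events.append((kind, start, len(speed_kmh) - 1))
    return events

print(detect_events([0, 10, 25, 40, 40, 30, 20]))
# [('launch', 0, 1), ('acceleration', 1, 3), ('brake', 4, 6)]
```

The returned start/end indices correspond to the asterisk and circle markers of Figure 14a.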
Figure 14. (a) Detected launch, acceleration, and brake events over the speed curve;
(b) feature extraction from the longitudinal acceleration for the acceleration event, with
calm, normal, and aggressive bands; (c) score map over speed with the Calm (33%),
Normal (66%), and Aggressive (100%) thresholds.
Once all the events from a driving test are detected and labeled, the features associated
with the corresponding measurements are extracted from each event in order to classify it
into a specific driving style level: aggressive, normal, or calm. For instance, Figure 14b
illustrates an example of feature extraction from the longitudinal acceleration for the
acceleration event; this assignment strongly depends on the driver's perception. Finally,
these features are condensed into a single score map considering their 90th percentile
value, because this percentile is a more stable predictor than the mean value, see Figure 14c.
This score map performs as a threshold map among the driving styles and normalizes
the measurement value to 33%, 66%, or 100% according to the driving style classification
illustrated in Figure 14c. Because each event is monitored through different measurements,
each measurement is normalized, and a weighted linear combination of the converted
scores is used to emit a final driving score that combines all the events of a particular
driving test; more details in [14].
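A hedged sketch of this scoring pipeline: 90th percentile feature extraction, banding into 33/66/100, and a weighted combination. The thresholds and weights below are illustrative, since the paper builds its score map per speed bin from experimental data.

```python
# Sketch: condense per-event features into a normalized driving score.
# Band edges and weights are illustrative, not the paper's calibrated map.
import statistics

def p90(values):
    """90th percentile, used instead of the mean for stability."""
    return statistics.quantiles(values, n=10)[-1]

def normalize(value, calm_thr, normal_thr):
    """Map a feature value to 33 (calm), 66 (normal) or 100 (aggressive)."""
    if value <= calm_thr:
        return 33
    if value <= normal_thr:
        return 66
    return 100

def trip_score(event_scores, weights):
    """Weighted linear combination of per-event normalized scores."""
    return sum(w * s for w, s in zip(weights, event_scores)) / sum(weights)

accels = [1.2, 1.5, 2.8, 1.1, 1.9, 2.2, 1.4, 1.6, 2.0, 1.3]  # m/s^2 per event
score_accel = normalize(p90(accels), calm_thr=1.5, normal_thr=2.5)
print(score_accel)  # 100
print(round(trip_score([100, 66, 33], [0.4, 0.3, 0.3]), 1))  # 69.7
```

The 90th percentile of the sample accelerations exceeds the upper band, so the acceleration events score as aggressive even though the mean would not.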
Table 4 presents the comparison of both algorithms, the heuristic-based rules proposed
in [1] and the statistical algorithm proposed in [14], for the nine experiments. The second
column states the driver's self-evaluation, which is considered the reference. The
following columns present the classification results of the algorithms associated with the
driving style. It is worth mentioning that the drivers followed the driving norms and
rules of Monterrey city at all times during the experiments.
By analyzing Table 4, the statistical algorithm (OBD data-based) reaches 78% effectiveness
in the driving style classification, while the heuristic method (smartphone data-based)
correctly identified 89% of the driving styles. However, in this latter approach, the
normal and calm styles perceived by the driver's self-evaluation are joined into the
non-aggressive style. In both cases, the misclassifications are related to detecting and
scoring an aggressive driving style, such that the threshold definitions could be changed
to get better results. It is important to consider that these types of algorithms require
varied and rich data to perform better and to extrapolate to any vehicle, driver, or type of road.
Looking at the scores obtained by the statistical approach, experiments 36 (aggressive
style by driver 1), 39 (aggressive style by driver 2), and 42 (aggressive style by driver
3) have the highest scores. Observing the real fuel consumption in liters of these three
experiments (Appendix A.2), it is found that they have the highest fuel consumption among
the driving style experiments (36 to 44). The same observation holds for the CO2
emissions. Thus, an aggressive driving style causes more fuel consumption and emits
more pollutants.
Figure 15. Driver's heart rate classified along the trip into Bradycardia, Light
Bradycardia, Normal, and Light Tachycardia bands.
Figure 16. Time histories of the driver's heart rate [beats/min], the vehicle speed
[km/h], and the traffic level (1. Low, 2. Medium, 3. High).
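The heart-rate banding used above can be sketched as simple range checks; the band edges below are hypothetical fractions of a resting rate, since the paper derives per-driver bands from the wristband's baseline measurements.

```python
# Sketch: classify heart-rate samples into the four bands used in the paper.
# Band edges are hypothetical placeholders around a resting rate.
def hr_band(hr_bpm, rest_bpm=70.0):
    if hr_bpm < 0.75 * rest_bpm:
        return "Bradycardia"
    if hr_bpm < 0.90 * rest_bpm:
        return "Light Bradycardia"
    if hr_bpm <= 1.15 * rest_bpm:
        return "Normal"
    return "Light Tachycardia"

trace = [50, 60, 72, 85]  # beats/min along a trip
print([hr_band(h) for h in trace])
# ['Bradycardia', 'Light Bradycardia', 'Normal', 'Light Tachycardia']
```

Aligning these labels in time with the speed and traffic traces is what exposes the tachycardia episodes during very high traffic.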
The above results confirm the relationship of the driver's heart rate with the driving
style and with sudden accelerations caused by external factors such as traffic or road
state. However, these are not the only factors that can influence driver behavior: even
the mood or activities that are not under the scope of this work (journey's purpose, kind
of companion, etc.) can bias the results. Further analysis of the collected signals, as
well as adding other monitoring devices such as cameras, electroencephalograms, or even
questionnaires, could help discern and explain these other factors not considered here.
Because the chosen algorithms used to estimate fuel consumption, emissions, driving
style, and heart condition are of low complexity and use conventional data available from
any smartphone and from any vehicle's CAN network, they can be integrated into a single
platform, and this platform can be replicated. Figure 17 shows a simple mobile application
that integrates the algorithms into a single human–vehicle interaction monitoring system.
The app allows the driver to create an account, upload the most representative vehicle
characteristics, monitor the key features of the human–vehicle interaction system in
real-time after login, and store the data for future statistics.
Figure 17. Mobile application of the proposed human–vehicle interaction monitoring system: (a) login
interface, (b) create account interface, (c) vehicle specifications interface, (d,e) examples of the
monitoring system in real-time.
6. Conclusions
This paper presents a detailed analysis of interactions among the most important
features of a vehicle monitoring and driver assistance system, highlighting the influence
of the driver on the vehicle performance and vice versa. The monitoring system integrates
reliable and feasible algorithms to monitor the fuel consumption, CO2 emissions, driving
style, and driver's health in real-time, and it uses a reduced data set (32% less than the
literature) obtained through a principal component analysis (PCA) in order to decrease the
transfer of data via Bluetooth between the used devices. For easy replicability, three
non-invasive devices are required: (1) an On-Board Diagnostic (OBD) connector, (2) a
smartphone, and (3) a biometric wristband. All of them are connected to a single data
acquisition system that collects the information, processes the algorithms, and displays
the results in real-time on a mobile application that is easy to interact with and understand.
PCA results demonstrate important correlations between driver and vehicle performance.
The most important features observed by the proposed integral monitoring system for the
driver and the vehicle are: (1) fuel consumption increases when there is less traffic
because lighter traffic allows the vehicle to accelerate more, (2) pollutant emissions
increase when speed decreases because, as the vehicle idles, the instantaneous emissions
per kilometer increase, (3) an aggressive driving style causes more fuel consumption and
emits more pollutants, (4) light tachycardia episodes occur in the driver exactly when the
traffic is very high, and (5) an aggressive driving style increases the adrenaline flowing
in the body, leading to an increase in the driver's heart rate.
According to the average index error over all the experiments, modeling results show
that the presented integral monitoring system for the driver and the vehicle correctly
predicts the fuel consumption index with 84% effectiveness, the polluting emissions with
89%, and the driving style with 89%.
Data Availability Statement: The data presented in this study are available on request from the
corresponding author.
Acknowledgments: Authors thank Tecnologico de Monterrey and CONACyT because of their partial
support for this work.
Conflicts of Interest: The authors declare no conflict of interest.
Appendix A
Appendix A.1
This Appendix describes the variables collected from the OBD-II scanner, the
smartphones, and the biometric wristband in the 44 experiments.
Appendix A.2
This Appendix presents the 44 experiments evaluated using the error indexes ESR,
RRMSE, and RE for the estimation of fuel consumption.
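Since the appendix reports only the index values, the sketch below uses commonly assumed definitions of ESR (error-to-signal ratio), RRMSE (RMSE relative to the reference mean), and RE (relative error on the trip total), all in percent; they may differ in detail from the authors' exact formulas.

```python
# Sketch of the three error indexes under assumed standard definitions.
import math

def esr(ref, est):
    """Error-to-signal ratio: error variance over reference variance, in %."""
    n = len(ref)
    mean_ref = sum(ref) / n
    err = sum((r - e) ** 2 for r, e in zip(ref, est)) / n
    sig = sum((r - mean_ref) ** 2 for r in ref) / n
    return 100.0 * err / sig

def rrmse(ref, est):
    """RMSE normalized by the mean of the reference signal, in %."""
    n = len(ref)
    rmse = math.sqrt(sum((r - e) ** 2 for r, e in zip(ref, est)) / n)
    return 100.0 * rmse / (sum(ref) / n)

def re_total(ref_total, est_total):
    """Relative error on the cumulative trip-level value, in %."""
    return 100.0 * abs(ref_total - est_total) / ref_total

print(round(re_total(1.16, 1.20), 2))  # 3.45
```

A trip-level relative error computed this way will not exactly match the tabulated RE values if the authors evaluate the index sample-wise rather than on the totals.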
Exp | Fuel Consumption [L]: Real, Phen., OBD | ESR [%]: Phen., OBD | RRMSE [%]: Phen., OBD | RE [%]: Phen., OBD
1 1.16 0.98 1.20 70.53 24.69 61.95 36.66 15.43 4.10
2 0.54 0.56 0.48 71.68 13.16 60.35 25.86 4.09 10.36
3 0.86 0.74 0.71 60.87 21.63 57.48 34.26 14.18 17.54
4 0.62 0.65 0.61 72.46 20.32 63.29 33.51 5.52 0.78
5 1.14 0.88 0.97 55.89 19.50 56.20 33.20 22.71 14.92
6 0.53 0.57 0.52 88.09 20.69 60.02 29.08 7.06 1.48
7 0.86 0.66 0.74 74.83 29.53 61.45 38.60 22.83 13.73
8 0.71 0.55 0.59 72.84 23.52 55.69 31.64 21.52 16.67
9 1.07 0.87 0.79 55.49 18.42 60.84 35.06 19.30 26.63
10 0.79 0.61 0.72 77.98 20.05 60.06 30.46 23.28 9.07
11 0.74 0.70 0.64 91.77 24.30 62.27 32.04 5.44 13.48
12 0.81 0.77 0.77 83.97 25.20 63.08 34.55 5.96 5.83
13 0.72 0.70 0.69 90.51 29.76 64.96 37.25 3.32 4.51
14 0.94 0.84 1.00 74.17 21.13 61.42 32.78 10.22 6.51
15 0.57 0.19 0.06 255.68 291.08 96.45 102.91 67.15 89.85
16 1.13 0.91 0.97 56.32 18.45 57.55 32.94 18.92 13.71
17 0.62 0.58 0.68 80.11 38.67 61.32 42.60 6.78 9.63
18 0.96 0.75 0.85 64.18 20.75 59.05 33.58 21.75 10.84
19 0.72 0.66 0.63 97.67 28.27 62.96 33.87 8.65 11.61
20 0.89 0.89 0.89 71.63 19.56 62.11 32.46 0.09 0.03
21 0.25 0.20 0.28 53.17 20.08 52.98 32.56 19.72 11.09
22 0.21 0.22 0.30 66.68 26.23 56.03 35.14 2.61 40.36
23 0.27 0.21 0.19 67.25 53.47 61.42 54.77 23.95 31.55
24 0.22 0.23 0.35 75.87 24.63 55.29 31.50 3.08 57.09
25 0.27 0.25 0.26 71.27 25.04 51.50 30.53 5.85 1.45
26 0.42 0.32 0.52 99.01 48.00 67.58 47.06 23.66 21.41
27 0.43 0.33 0.66 77.33 25.97 65.30 37.84 22.15 52.86
28 0.39 0.33 0.57 64.11 19.78 59.17 32.86 16.89 44.90
29 0.29 0.19 0.32 57.54 20.06 54.54 32.20 33.03 11.90
30 0.28 0.22 0.38 106.75 29.58 63.38 33.37 23.06 34.78
31 0.32 0.26 0.33 57.30 28.72 56.16 39.76 17.32 3.72
32 0.48 0.39 0.59 74.76 20.79 59.29 31.26 18.03 23.39
33 0.69 0.46 0.57 105.28 35.97 71.29 41.67 33.46 16.93
34 0.57 0.43 0.56 48.63 19.80 56.35 35.96 23.97 1.09
35 0.21 0.20 0.28 73.96 24.32 53.20 30.50 5.60 31.22
36 0.27 0.18 0.23 55.79 28.76 57.04 40.95 30.41 14.93
37 0.21 0.22 0.30 66.68 26.23 56.03 35.14 2.61 40.36
38 0.22 0.23 0.35 75.87 24.63 55.29 31.50 3.08 57.09
39 0.28 0.26 0.17 63.42 49.79 61.01 54.05 8.20 40.88
40 0.25 0.21 0.30 43.72 19.42 50.66 33.77 13.39 23.92
41 0.26 0.22 0.38 106.75 29.58 63.38 33.37 23.06 34.78
42 0.27 0.20 0.21 66.12 78.86 55.49 60.60 24.19 21.82
43 0.24 0.21 0.29 67.35 34.85 53.54 38.51 12.15 19.87
44 0.20 0.22 0.27 81.64 32.03 60.42 37.85 13.45 37.47
Mean: 0.54 0.46 0.53 77.11 33.53 60.11 37.64 16.07 21.73
Appendix A.3
This Appendix presents the 44 experiments evaluated using the error indexes ESR,
RRMSE, and RE for the estimation of CO2 emissions.
Exp | CO2 Emissions [Kg]: Real, Phen., OBD | ESR [%]: Phen., OBD | RRMSE [%]: Phen., OBD | RE [%]: Phen., OBD
1 1.53 1.62 1.78 160.71 160.30 52.24 52.17 5.84 16.55
2 1.10 1.16 1.19 68.89 58.90 47.77 44.18 5.55 7.91
3 1.70 1.63 1.69 64.79 57.42 40.30 37.94 4.11 0.92
4 1.25 1.27 1.32 55.30 47.03 42.68 39.36 1.40 5.20
5 1.73 1.97 1.91 101.26 95.38 63.69 61.82 13.98 10.44
6 1.19 1.16 1.25 91.30 121.93 52.22 60.35 2.67 4.91
7 1.70 1.46 1.55 75.81 60.22 50.78 45.26 14.14 8.96
8 1.31 1.54 1.52 66.15 63.22 43.96 42.98 17.48 16.05
9 2.21 1.71 1.74 118.39 117.13 76.79 76.37 22.84 21.25
10 1.46 1.37 1.48 71.87 47.92 53.53 43.71 5.80 1.49
11 1.41 1.50 1.45 50.18 46.71 38.75 37.39 6.48 3.04
12 1.77 1.81 1.78 58.20 44.47 41.80 36.54 2.76 0.61
13 1.44 1.22 1.29 75.97 53.08 49.88 41.69 15.50 10.03
14 1.82 1.91 2.10 97.25 78.62 49.58 44.58 4.88 15.01
15 0.03 0.07 0.08 444.12 700.92 111.03 139.49 113.62 153.43
16 1.89 1.92 1.94 76.92 72.29 47.35 45.91 1.76 2.97
17 1.33 1.56 1.50 70.22 65.52 46.77 45.17 17.87 13.37
18 1.57 1.97 1.79 97.66 88.95 55.52 52.99 25.53 14.35
19 1.48 1.57 1.55 107.29 87.99 54.87 49.69 5.73 4.70
20 1.89 1.84 1.87 78.26 69.03 49.37 46.37 2.73 1.40
21 0.58 0.58 0.59 59.25 45.58 43.96 38.56 0.21 0.97
22 0.49 0.58 0.58 56.00 54.33 43.55 42.90 17.99 18.03
23 0.42 0.41 0.46 206.56 206.09 111.18 111.05 1.77 10.03
24 0.47 0.63 0.60 63.33 67.27 44.75 46.12 34.37 27.85
25 0.54 0.63 0.61 58.39 43.88 40.90 35.45 16.44 12.96
26 0.86 1.03 1.20 69.38 119.13 39.48 51.74 19.48 38.98
27 1.01 0.96 1.27 66.14 76.34 46.13 49.56 4.59 26.20
28 0.89 1.01 1.17 77.54 89.98 43.85 47.23 13.58 32.39
29 0.64 0.56 0.62 67.03 54.91 44.59 40.36 13.09 4.30
30 0.63 0.64 0.68 52.47 34.97 32.64 26.65 2.21 8.24
31 0.76 0.55 0.72 71.12 44.42 44.18 34.91 27.29 4.69
32 1.06 1.07 1.14 49.72 54.05 36.20 37.74 1.30 6.98
33 1.18 1.22 1.26 69.41 60.90 35.38 33.14 3.59 7.01
34 1.13 0.99 1.19 92.17 81.57 55.03 51.77 12.04 5.36
35 0.42 0.58 0.58 92.75 106.13 49.33 52.77 37.13 37.74
36 0.64 0.50 0.59 71.34 57.66 53.24 47.87 22.66 8.92
37 0.49 0.58 0.58 56.00 54.33 43.55 42.90 17.99 18.03
38 0.47 0.63 0.60 63.33 67.27 44.75 46.12 34.37 27.85
39 0.68 0.53 0.70 67.26 48.78 45.89 39.08 22.48 2.82
40 0.54 0.54 0.59 67.95 62.65 46.38 44.54 0.08 7.74
41 0.63 0.64 0.68 52.47 34.97 32.64 26.65 2.21 8.24
42 0.67 0.45 0.72 68.48 27.18 50.52 31.83 32.81 7.20
43 0.52 0.49 0.58 63.68 74.26 51.15 55.24 6.14 10.74
44 0.47 0.60 0.60 71.19 48.71 43.01 35.58 28.28 28.02
Mean: 1.07 1.08 1.14 77.19 70.96 48.38 45.91 12.48 11.83
References
1. Chhabra, R.; Verma, S.; Rama Krishna, C. Detecting Aggressive Driving Behavior using Mobile Smartphone. In Proceedings of the
2nd International Conference on Communication, Computing and Networking, Chandigarh, India, 29–30 March 2018; Krishna,
C.R., Dutta, M., Kumar, R., Eds.; Springer: Singapore, 2019; Lecture Notes in Networks and Systems, Volume 46, pp. 513–521.
[CrossRef]
2. Saiprasert, C.; Pholprasit, T.; Thajchayapong, S. Detection of Driving Events using Sensory Data on Smartphone. Int. J. Intell.
Transp. Syst. Res. 2017, 15, 17–28. [CrossRef]
3. Arroyo, C.; Bergasa, L.M.; Romera, E. Adaptive fuzzy classifier to detect driving events from the inertial sensors of a smartphone.
In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro, Brazil,
1–4 November 2016; pp. 1896–1901.
4. Feng, Y.; Pickering, S.; Chappell, E.; Iravani, P.; Brace, C. A Support Vector Clustering Based Approach for Driving Style
Classification. Int. J. Mach. Learn. Comput. 2019, 9, 344–350. [CrossRef]
5. Perez, J.; Gonzalez, D.; Milanes, D. Vehicle control in ADAS applications: State of the Art. In Intelligent Transport Systems:
Technologies and Applications; Perallos, A., Hernandez-Jayo, U., Onieva, E., García-Zuazola, I.J.; Eds.; Wiley: Hoboken, NJ, USA,
2015; p. 206. [CrossRef]
6. Sentouh, C.; Nguyen, A.; Benloucif, M.; Popieul, J. Driver-Automation Cooperation Oriented Approach for Shared Control of
Lane Keeping Assist Systems. IEEE Trans. Control. Syst. Technol. 2019, 27, 1962–1978. [CrossRef]
7. Medero, A.; Sename, O.; Puig, V. LPV Lateral Control for ADAS Based on Driver Performance Monitoring; IFAC PapersOnLine;
Elsevier: Amsterdam, The Netherlands, 2022; pp. 685–690.
8. Holzinger, J.; Tkachenko, P.; Obereigner, G.; del Re, L. Context Aware Control of ADAS. In Proceedings of the American Control
Conference, Denver, CO, USA, 1–3 July 2020; pp. 2288–2293.
9. Iqbal, M.; Han, J.; Zhou, Z.; Towey, D.; Chen, T. Metamorphic testing of Advanced Driver-Assistance System (ADAS) simulation
platforms: Lane Keeping Assist System (LKAS) case studies. Inf. Softw. Technol. 2023, 155, 107104. [CrossRef]
10. Formentin, S.; Ongini, C.; Savaresi, S.M. A Smartphone-Based Energy-Oriented Driving Assistance System. Electr. Veh. Shar. Serv.
Smarter Cities 2017, 10, 171–189. [CrossRef]
11. Keytel, L.; Goedecke, J.; Noakes, T.; Hiiloskorpi, H.; Laukkanen, R.; van der Merwe, L.; Lambert, E. Prediction of energy
expenditure from heart rate monitoring during submaximal exercise. J. Sports Sci. 2005, 23, 289–297. [CrossRef]
12. Lehmann, A.; Gross, A. Towards Vehicle Emission Estimation from Smartphone Sensors. In Proceedings of the 18th IEEE
International Conference on Mobile Data Management (MDM), Daejeon, Republic of Korea, 29 May–1 June 2017;
pp. 154–163. [CrossRef]
13. Alqudah, Y.A.; Sababha, B.H. A Statistical Approach to Estimating Driving Events by a Smartphone. In Proceedings of the 2016
International Conference on Computational Science and Computational Intelligence, Las Vegas, NV, USA, 15–17 December
2016; Arabnia, H.R., Deligiannidis, L., Yang, M., Eds.; IEEE: Piscataway, NJ, USA, 2016; pp. 1021–1025. [CrossRef]
14. Ouali, T.; Shah, N.; Kim, B.; Fuente, D.; Gao, B. Driving Style Identification Algorithm with Real-World Data Based on Statistical
Approach; SAE Technical Paper; SAE International: Warrendale, PA, USA, 5 April 2016.
15. Warren, J.; Lipkowitz, J.; Sokolov, V. Clusters of Driving Behavior From Observational Smartphone Data. IEEE Intell. Transp. Syst.
Mag. 2019, 11, 171–180. [CrossRef]
16. Bejani, M.M.; Ghatee, M. A context aware system for driving style evaluation by an ensemble learning on smartphone sensors
data. Transp. Res. Part Emerg. Technol. 2018, 89, 303–320. [CrossRef]
17. Derbel, O.; Landry, R., Jr. Driving style assessment based on the GPS data and fuzzy inference systems. In Proceedings of the
2015 12th International Multi-Conference on Systems, Signals & Devices (SSD), Mahdia, Tunisia, 16–19 March 2015; pp.
1–8. [CrossRef]
18. Sun, R.; Zhang, Y.; Bai, H.; Su, W. Fuzzy logic based approach and sensitivity analysis of irregular driving detection algorithm. In
Proceedings of the 2016 IEEE Chinese Guidance, Navigation and Control Conference, Nanjing, China, 12–14 August 2016; IEEE:
Piscataway, NJ, USA, 2016; pp. 507–511. [CrossRef]
19. Corno, M.; Giani, P.; Tanelli, M.; Savaresi, S.M. Human-in-the-loop bicycle control via active heart rate regulation. IEEE Trans.
Control. Syst. Technol. 2014, 23, 1029–1040. [CrossRef]
20. Ponnan, S.; Theivadas, J.; HemaKumar, V.; Einarson, D. Driver monitoring and passenger interaction system using wearable
device in intelligent vehicle. Comput. Electr. Eng. 2022, 103, 108323. [CrossRef]
21. Orlovska, J.; Novakazi, F.; Lars-Ola, B.; Karlsson, M. Effects of the driving context on the usage of Automated Driver Assistance
Systems (ADAS)—Naturalistic Driving Study for ADAS Evaluation. Transp. Res. Interdiscip. Perspect. 2020, 4, 100093. [CrossRef]
22. Rezaei, M.; Yazdani, M.; Jafari, M.; Saadati, M. Gender differences in the use of ADAS technologies: A systematic review. Transp.
Res. Part F 2021, 78, 1–15. [CrossRef]
23. Zhao, X.; Wu, Y.; Rong, J.; Zhang, Y. Development of a driving simulator based eco-driving support system. Transp. Res. Part
Emerg. Technol. 2015, 58, 631–641. [CrossRef]
24. Reininger, M.; Miller, S.; Zhuang, Y.; Cappos, J. A first look at vehicle data collection via smartphone sensors. In Proceedings of
the Sensors Applications Symposium (SAS), Zadar, Croatia, 13–15 April 2015; pp. 1–6.
25. Cammaerts, K.; Morse, P.; Kidera, K. Improving Performance through the Use of Driver-in-the-Loop Simulations. ATZ Worldw.
2019, 121, 52–57. [CrossRef]
26. Khan, M.Q.; Lee, S. A Comprehensive Survey of Driving Monitoring and Assistance Systems. Sensors 2019, 19, 2574. [CrossRef]
27. Izquierdo-Reyes, J.; Ramirez-Mendoza, R.A.; Bustamante-Bello, M.R.; Navarro-Tuch, S.; Avila-Vazquez, R. Advanced driver
monitoring for assistance system (ADMAS). Int. J. Interact. Des. Manuf. (IJIDeM) 2018, 12, 187–197. [CrossRef]
28. Dehzangi, O.; Sahu, V.; Taherisadr, M.; Galster, S. Multi-modal system to detect on-the-road driver distraction. In Proceedings
of the 2018 21st International Conference on Intelligent Transportation Systems (ITSC), Maui, HI, USA, 4–7 November 2018;
pp. 2191–2196.
29. Peruzzini, M.; Foco, E.; Reboa, A. Toward the Definition of a Technological Set-up for Drivers’ Health Status Monitoring. In
Proceedings of the Transdisciplinary Engineering Methods for Social Innovation of Industry 4.0: Proceedings of the 25th ISPE Inc.
International Conference on Transdisciplinary Engineering, Singapore, 10–14 July 2017; Volume 7, p. 221.
30. Kong, W.; Zhou, Z.; Jiang, B.; Babiloni, F.; Borghini, G. Assessment of driving fatigue based on intra/inter-region phase
synchronization. Neurocomputing 2017, 219, 474–482. [CrossRef]
31. Tjolleng, A.; Jung, K.; Hong, W.; Lee, W.; Lee, B.; You, H.; Son, J.; Park, S. Classification of a Driver’s cognitive workload levels
using artificial neural network on ECG signals. Appl. Ergon. 2017, 59, 326–332. [CrossRef]
32. Chen, D.; Ma, Z.; Li, B.C.; Yan, Z.; Li, W. Drowsiness detection with electrooculography signal using a system dynamics approach.
J. Dyn. Syst. Meas. Control. 2017, 139. [CrossRef]
33. Vasudevan, K.; Das, A.P.; Sandhya, B.; Subith, P. Driver drowsiness monitoring by learning vehicle telemetry data. In Proceedings
of the 2017 10th International Conference on Human System Interactions (HSI), Ulsan, Republic of Korea, 17–19 July 2017;
pp. 270–276.
34. Araujo, R.; Igreja, A.; de Castro, R.; Araujo, R.E. Driving coach: A smartphone application to evaluate driving efficient patterns.
In Proceedings of the IEEE Intelligent Vehicles Symposium (IV), Madrid, Spain, 3–7 June 2012; IEEE: Piscataway, NJ, USA, 2012;
pp. 1005–1010. [CrossRef]
35. Casavola, A.; Prodi, G.; Rocca, G. Efficient gear shifting strategies for green driving policies. In Proceedings of the 2010
American Control Conference, Baltimore, MD, USA, 30 June–2 July 2010; pp. 4331–4336.
36. Astarita, V.; Festa, D.C.; Mongelli, D.W. EcoSmart: An Application for Smartphones for Monitoring Driving Economy. Adv. Mater.
Res. 2013, 827, 360–367. [CrossRef]
37. Zhou, M.; Jin, H.; Wang, W. A review of vehicle fuel consumption models to evaluate eco-driving and eco-routing. Transp. Res.
Part Transp. Environ. 2016, 49, 203–218. [CrossRef]
38. Cachón, L.; Pucher, E. Fuel Consumption Simulation Model of a CNG Vehicle Based on Real-World Emission Measurement.
In Proceedings of the 8th International Conference on Engines for Automobiles, Consiglio Nazionale delle Ricerche, Italy,
16 September 2007. [CrossRef]
39. Skog, I.; Handel, P. Indirect Instantaneous Car-Fuel Consumption Measurements. IEEE Trans. Instrum. Meas. 2014, 63, 3190–3198.
[CrossRef]
40. Orfila, O.; Saint Pierre, G.; Messias, M. An android based ecodriving assistance system to improve safety and efficiency of internal
combustion engine passenger cars. Transp. Res. Part Emerg. Technol. 2015, 58, 772–782. [CrossRef]
41. Karaduman, M.; Eren, H. Deep learning based traffic direction sign detection and determining driving style. In Proceedings of
the 2nd International Conference on Computer Science and Engineering, Antalya, Turkey, 5–8 October 2017; IEEE: New York, NY,
USA, 2017; pp. 1046–1050. [CrossRef]
42. Vilaça, A.; Aguiar, A.; Soares, C. Estimating fuel consumption from GPS data. In Proceedings of the Iberian Conference on
Pattern Recognition and Image Analysis, Santiago de Compostela, Spain, 17–19 June 2015; pp. 672–682.
43. Kanarachos, S.; Mathew, J.; Fitzpatrick, M.E. Instantaneous vehicle fuel consumption estimation using smartphones and recurrent
neural networks. Expert Syst. Appl. 2019, 120, 436–447. [CrossRef]
44. Zeng, W.; Miwa, T.; Morikawa, T. Exploring trip fuel consumption by machine learning from GPS and CAN bus data. J. East.
Asia Soc. Transp. Stud. 2015, 11, 906–921.
45. European Parliament. CO2 Emissions from Cars: Facts and Figures (Infographics); European Parliament: Strasbourg, France, 2019.
46. Forehead, H.; Huynh, N. Review of modelling air pollution from traffic at street-level-The state of the science. Environ. Pollut.
2018, 241, 775–786. [CrossRef]
47. Zarandi, M.F.; Faraji, M.; Karbasian, M. Interval type-2 fuzzy expert system for prediction of carbon monoxide concentration in
mega-cities. Appl. Soft Comput. 2012, 12, 291–301. [CrossRef]
48. Kwok, L.; Lam, Y.; Tam, C.Y. Developing a statistical based approach for predicting local air quality in complex terrain area.
Atmos. Pollut. Res. 2017, 8, 114–126. [CrossRef]
49. Cai, M.; Yin, Y.; Xie, M. Prediction of hourly air pollutant concentrations near urban arterials using artificial neural network
approach. Transp. Res. Part Transp. Environ. 2009, 14, 32–41. [CrossRef]
50. Azeez, O.S.; Pradhan, B.; Shafri, H.Z.; Shukla, N.; Lee, C.W.; Rizeei, H.M. Modeling of CO emissions from traffic vehicles using
artificial neural networks. Appl. Sci. 2019, 9, 313. [CrossRef]
51. Pucher, G. Deriving Traffic-Related CO2 Emission Factors with High Spatiotemporal Resolution from Extended Floating Car Data.
In The Rise of Big Spatial Data; Ivan, I., Singleton, A., Horák, J., Inspektor, T.; Eds.; Springer: Cham, Switzerland, 2017; pp. 55–68.
52. de Boer, G.; Krootjes, P. The Quality of Floating Car Data Benchmarked: An Alternative to Roadside Equipment? In Proceedings
of the 19th ITS World Congress ERTICO-ITS, Vienna, Austria, 22–26 October 2012.
53. Turkensteen, M. The accuracy of carbon emission and fuel consumption computations in green vehicle routing. Eur. J. Oper. Res.
2017, 262, 647–659. [CrossRef]
54. Abera, E.S.; Belay, A.; Abraham, A. Real-Time Vehicle Emission Monitoring and Location Tracking Framework. In Advances
in Nature and Biologically Inspired Computing, Pietermaritzburg, South Africa, 1–3 December 2015; Springer: Berlin, Germany,
pp. 211–221.
55. Sabiron, G.; Thibault, L.; Dégeilh, P.; Corde, G. Pollutant Emissions Estimation Framework for Real-Driving Emissions at
Microscopic Scale and Environmental Footprint Calculation. In Proceedings of the 2018 IEEE Intelligent Vehicles Symposium (IV),
Changshu, China, 26–30 June 2018.
56. Dörr, D.; Grabengiesser, D.; Gauterin, F. Online driving style recognition using fuzzy logic. In Proceedings of the 17th international
IEEE conference on intelligent transportation systems (ITSC), Qingdao, China, 8–11 October 2014; pp. 1021–1026.
57. Tillmann, W.; Hobbs, G. The accident-prone automobile driver: A study of the psychiatric and social background. Am. J.
Psychiatry 1949, 106, 321–331. [CrossRef] [PubMed]
58. Marina Martinez, C.; Heucke, M.; Wang, F.Y.; Gao, B.; Cao, D. Driving Style Recognition for Intelligent Vehicle Control and
Advanced Driver Assistance: A Survey. IEEE Trans. Intell. Transp. Syst. 2018, 19, 666–676. [CrossRef]
59. Meseguer, J.E.; Calafate, C.T.; Cano, J.C.; Manzoni, P. Assessing the impact of driving behavior on instantaneous fuel consumption.
In Proceedings of the 2015 12th Annual IEEE Consumer Communications and Networking Conference (CCNC), Las Vegas, NV,
USA, 9–12 January 2015; pp. 443–448.
60. Stoichkov, R. Android Smartphone Application for Driving Style Recognition. Ph.D. Thesis, Department of Electrical Engineering
and Information Technology Institute for Media Technology, Ilmenau, Germany, 2013.
61. Rizzo, N.; Sprissler, E.; Hong, Y.; Goel, S. Privacy preserving driving style recognition. In Proceedings of the 2015 International
Conference on Connected Vehicles and Expo (ICCVE), Shenzhen, China, 19–23 October 2015; pp. 232–237.
62. Zheng, W.; Nai, W.; Zhang, F.; Qin, W.; Dong, D. A novel set of driving style risk evaluation index system for UBI-based
differentiated commercial vehicle insurance in China. In Proceedings of the CICTP 2015, Beijing, China, 24–27 July 2015;
pp. 2510–2524.
63. Daptardar, S.; Lakshminarayanan, V.; Reddy, S.; Nair, S.; Sahoo, S.; Sinha, P. Hidden Markov Model based driving event detection
and driver profiling from mobile inertial sensor data. In Proceedings of the 2015 IEEE SENSORS, Busan, Korea, 1–4 November 2015; pp. 1–4.
[CrossRef]
64. Castignani, G.; Derrmann, T.; Frank, R.; Engel, T. Smartphone-Based Adaptive Driving Maneuver Detection: A Large-Scale
Evaluation Study. IEEE Trans. Intell. Transp. Syst. 2017, 18, 2330–2339. [CrossRef]
65. Izquierdo-Reyes, J.; Ramirez-Mendoza, R.A.; Bustamante-Bello, M.R.; Pons-Rovira, J.L.; Gonzalez-Vargas, J.E. Emotion recognition
for semi-autonomous vehicles framework. Int. J. Interact. Des. Manuf. (IJIDeM) 2018, 12, 1447–1454. [CrossRef]
66. Bener, A.; Lajunen, T.; Özkan, T.; Haigney, D. The effect of mobile phone use on driving style and driving skills. Int. J.
Crashworthiness 2006, 11, 459–465. [CrossRef]
67. Zheng, Y.; Hansen, J.H. Unsupervised Driving Performance Assessment using Free-Positioned Smartphones in Vehicles.
In Proceedings of the 2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC), Rio de Janeiro,
Brazil, 1–4 November 2016.
68. Savaresi, S.M.; Bittanti, S.; Montiglio, M. Identification of semi-physical and black-box non-linear models: the case of MR-dampers
for vehicles control. Automatica 2005, 41, 113–127.
69. Witters, M.; Swevers, J. Black-box model identification for a continuously variable, electro-hydraulic semi-active damper. Mech.
Syst. Signal Process. 2010, 24, 4–18. [CrossRef]
70. Barschdorff, D.; Zhang, W. Respiratory rhythm detection with photoplethysmographic methods. In
Proceedings of the 16th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Baltimore,
MD, USA, 3–6 November 1994; Volume 2, pp. 912–913.
71. Najjaran, S.; Rahmani, Z.; Hassanzadeh, M. Fuzzy predictive control strategy for plug-in hybrid electric vehicles over multiple
driving cycles. Int. J. Dynam. Control 2021, 10, 930–941. [CrossRef]
72. Singh, M.; Dubey, R. Deep Learning Model Based CO2 Emissions Prediction using Vehicle Telematics Sensors Data. IEEE Trans.
Intell. Veh. 2021. [CrossRef]
73. European Commission, European Environment Agency, Joint Research Centre. COPERT 4: Estimating Emissions from Road
Transport; European Environment Agency: Brussels, Belgium, 2011. [CrossRef]
74. Moczko, J.A. Advanced methods of heart rate signals processing and their usefulness in diagnosis support I. Mathematical heart
rate descriptors and virtual instrumentation. Comput. Methods Sci. Technol. 2002, 8, 65–76. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.