Self-driving Cars and Human Factors
I. INTRODUCTION
Automation has entered many aspects of our daily life,
including the way we transport ourselves. Adaptive Cruise
Control (ACC), lane-keeping assistance, and blind spot
assistance are being introduced into vehicles at a rapid pace.
Such systems provide information and advice (e.g., warnings,
suggested actions) or control the vehicle in specific
longitudinal or lateral tasks. Although fully automated cars
have been under investigation for about half a century (e.g.,
Levine & Athans, 1966; Burnham & Bekey, 1976; Ioannou &
Chien, 1993; Hall & Chaib-Draa, 2005; Naus et al., 2009),
they are not yet available for public use. The challenges of
vehicular automation are more than technical. Neale and Dingus (1998) stated that the hardest problems associated with an Automated Highway System (AHS) are "soft"; that is, they are human factors issues of safety, usability, and acceptance, as well as institutional issues. These are problems that are many times more difficult to overcome and must be overcome, largely, in parallel with the traditionally "hard" technological issues (p. 111).
II. CHALLENGES OF INTERACTION BETWEEN HUMAN AND AUTOMATION
One might be inclined to think that automation eventually reduces the human's task to the selection of the travel destination. However, the reality is that even with highly automated systems, the contribution of the human operator is crucial (Bainbridge, 1983). Using automation shifts the human's driving tasks from manual control to supervisory control of the conducted maneuvers (Geyer et al., 2011).
Being out of the loop may lead to overreliance, behavioral
adaptation, erratic mental workload, skill degradation, reduced
situation awareness, and an inadequate mental model of
automation capabilities (cf. Endsley & Kiris, 1995;
Parasuraman et al., 2000). In the following, we briefly revisit
these issues in the driving context.
Overreliance
Overreliance (or complacency) is defined as the situation in which the human does not question the performance of the automation and does not sufficiently cross-check the automation status. Distraction and poor judgment are two major causes of
accidents (Peters & Peters, 2002). Overreliance and loss of
Shared Control
Several researchers have argued that interactions
between human and automation should not merely consist of
activations and deactivations. They have proposed developing
appropriate frameworks to keep drivers involved in the control
loop (Stanton & Young, 2000), allow drivers to understand the
system's capability (Seppelt & Lee, 2007), and support the
acquisition of situation awareness with a minimum of
cognitive effort.
Shared control is a framework whereby human and
automation cooperate to achieve the required control action
together. This approach should realize automation benefits (e.g., fast response, accurate control) while avoiding problems such as out-of-the-loop unfamiliarity and mode errors (Flemisch et al., 2012). Abbink et al. (2012) developed a haptic gas pedal and steering wheel, which have been tested as a medium for shared control in car following and curve negotiation. The gas pedal stiffness adapts according to the headway to the vehicle being followed. The human can still overrule and change the distance by applying more or less force on the pedal. De Winter and Dodou (2011) provided a critical reflection of the literature on the effects of shared control on road safety. They argued that force feedback should not be provided continuously, but only when deviations beyond acceptable tolerance limits arise.
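As an illustration of such tolerance-based force feedback, the sketch below returns extra gas-pedal resistance only once the time headway leaves an acceptable band, so the driver can always press through and overrule it. The function name, thresholds, and gain are illustrative assumptions and are not taken from Abbink et al. (2012) or De Winter and Dodou (2011).

    def haptic_pedal_force(headway_s, desired_headway_s=1.5,
                           tolerance_s=0.3, gain_n_per_s=20.0,
                           max_force_n=30.0):
        """Sketch of tolerance-based haptic gas-pedal feedback.

        Extra pedal resistance (in newtons) is returned only when the
        current time headway is shorter than the desired headway by more
        than the tolerance band; inside the band the pedal behaves
        normally.  All numerical values are illustrative assumptions.
        """
        deviation_s = desired_headway_s - headway_s  # positive when following too closely
        if deviation_s <= tolerance_s:
            return 0.0  # within acceptable limits: no force feedback
        # Resistance grows with the remaining deviation but is capped so the
        # driver can always overrule the automation by pressing harder.
        return min(max_force_n, gain_n_per_s * (deviation_s - tolerance_s))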
Adaptive Automation
Variations in driving conditions (e.g., infrastructure, traffic rules, traffic density, and weather) and in the driver population (e.g., age, gender, and experience) justify designing automation systems that can adapt to these differences. Adaptive interfaces can reduce the driver's mental
workload by filtering the presentation of information
according to situational requirements. Piechulla et al. (2003)
implemented such a filter as a projective real-time workload
estimator based on an assessment of the current traffic
situation. In a driving-simulator study, Lee et al. (2007)
quantified driver sensitivity to different ranges of brake
duration and magnitude. They suggested that their findings
from the driving task. In addition, there is a risk that the driver
does not fully understand what the setting implies. The use of
additional icons can be distracting, while the learning and
remembering involved can impose an extra workload on
drivers. An alternative option is to use a system with an adaptive setting based on the driver's history of manual car following. The system can choose the average minimum headway distance and maximum speed that the driver held for more than a set period (e.g., one minute) within the most recent hour of driving. Such a mechanism prevents the driver from being confronted with an unexpected following distance and instead provides one that matches the driver's expectations. The CACC should be turned on by the driver, who should be informed of the system status by visual/auditory cues to maintain correct awareness.
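A minimal sketch of how such history-based setpoints could be derived is given below, assuming that headway and speed are logged at 1 Hz during manual car following; the sliding-window reading of "held for more than a set period," the function name, and all values are assumptions for illustration only.

    from statistics import mean

    def adaptive_cacc_setpoints(samples, window_s=60):
        """Derive CACC setpoints from the most recent hour of manual driving.

        `samples` is a list of (headway_m, speed_mps) pairs logged at 1 Hz.
        Assumed interpretation: the headway setpoint is the tightest headway
        the driver sustained for at least `window_s` seconds, and the set
        speed is the highest speed sustained for the same duration, with
        each window summarized by its mean so that brief spikes are ignored.
        """
        if len(samples) < window_s:
            return None  # not enough manual-driving history yet
        window_headways, window_speeds = [], []
        for i in range(len(samples) - window_s + 1):
            window = samples[i:i + window_s]
            window_headways.append(mean(h for h, _ in window))
            window_speeds.append(mean(v for _, v in window))
        return {"headway_m": min(window_headways),
                "set_speed_mps": max(window_speeds)}

In such a scheme the setpoints would be re-estimated each time the driver engages the CACC, so the system reflects the driver's recent behavior rather than a factory default.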
Joining. When the system is on, the car should start
cruising at the set speed, resort to car following when
approaching a slower vehicle, and brake to a complete
standstill if needed. In other words, a platoon can be joined
from the rear without driver intervention. However, drivers
have to be informed about large speed differences, and may be
advised to change lanes to better follow their desired speed.
Here, adaptive automation may be used, monitoring the driver
state, and providing person-specific advice to the driver.
Drivers should be made aware of situations where joining is not feasible because of the maneuvers of other platoon members or other constraints, such as a maximum platoon length imposed by the road layout. Thus, transitions in the platoon that impose constraints on other platooning members should be communicated to the constrained members.
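To make the joining behavior concrete, the sketch below maps the situation described above onto a simple longitudinal-mode decision; the mode names, advisory labels, and thresholds are illustrative assumptions rather than part of the proposed system.

    def joining_logic(set_speed_mps, lead_speed_mps, gap_m,
                      standstill_gap_m=5.0, large_speed_diff_mps=8.0):
        """Sketch of longitudinal behavior while joining a platoon from the rear.

        Returns (mode, advisories): cruise at the set speed when the lane
        ahead is free, switch to car following when approaching a slower
        vehicle, brake to a standstill if needed, and advise the driver
        about large speed differences.  Thresholds are assumptions.
        """
        advisories = []
        if lead_speed_mps is None:
            return "CRUISE", advisories  # no vehicle ahead within sensor range
        if gap_m <= standstill_gap_m and lead_speed_mps < 0.5:
            return "STANDSTILL", advisories  # brake to a complete stop behind the platoon
        if set_speed_mps - lead_speed_mps > large_speed_diff_mps:
            # Inform the driver and suggest a lane change to better follow
            # the desired speed.
            advisories += ["LARGE_SPEED_DIFFERENCE", "CONSIDER_LANE_CHANGE"]
        mode = "FOLLOW" if lead_speed_mps < set_speed_mps else "CRUISE"
        return mode, advisories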
Platooning. The state of the system should be clearly communicated to the driver as "Platoon Mode" through a communication portal or icon. This mode can be announced audibly and repeated at certain intervals to keep the human aware. Again, using adaptive automation, the frequency and intensity of these announcements should be raised when humans are potentially less attentive (e.g., when they do not provide any input for a prolonged time).
When platooning, humans should not experience
unannounced or abrupt changes. Topology changes in the
platoon and the splitting off and joining of other members
should be announced to avoid surprises and lowering of trust
in the automation. Humans should be aware of the limits of
maneuvers. For example, a human cannot close the headway to less than a certain threshold and should not steer abruptly. To
avoid sudden steering, haptic feedback can be used on the
steering wheel.
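The adaptive announcement policy sketched above (shorter interval and higher intensity when the driver appears inattentive) could look as follows; all names and values are assumptions for illustration.

    def platoon_mode_announcement(seconds_since_driver_input,
                                  base_interval_s=120, inattentive_after_s=90):
        """Sketch of adaptive "Platoon Mode" reminders.

        When the driver has provided no input for a prolonged time, the
        reminder interval is halved, the intensity is raised, and an audible
        channel is added to the visual one.  Values are assumptions.
        """
        if seconds_since_driver_input > inattentive_after_s:
            return {"interval_s": base_interval_s // 2,
                    "intensity": "high",
                    "channels": ("visual", "audio")}
        return {"interval_s": base_interval_s,
                "intensity": "normal",
                "channels": ("visual",)}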
Splitting. Drivers should be able to safely end their
platooning. One potential solution is that the human increases
the headway up to an allowable limit. When this limit is reached, the CACC system disengages and the driver is informed of the shutdown via an information portal.
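The splitting rule can be expressed as a simple headway check, as sketched below; the limit value and field names are illustrative assumptions.

    def check_split(driver_headway_s, max_headway_s=3.0):
        """Sketch of the splitting rule: once the driver has opened the
        headway up to the allowable limit, the CACC disengages and the
        driver is notified of the shutdown.  The limit is an assumption.
        """
        if driver_headway_s >= max_headway_s:
            return {"cacc_enabled": False,
                    "message": "CACC disengaged: platoon left"}
        return {"cacc_enabled": True, "message": None}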
VI. CONCLUSION
This paper reviewed several human-factors challenges of
automated driving. We applied the issues, needs, and solutions
for vehicle automation to the concept of CACC and proposed
an interaction mechanism between humans and CACC. The
proposed system has few modes, keeps drivers engaged in the
REFERENCES
NV, 1933–1938.
Rudin-Brown, C., & Parker, H. (2004). Behavioural adaptation to
adaptive cruise control (ACC): implications for preventive
strategies. Transportation Research Part F: Traffic Psychology
and Behaviour, 7, 59–76.
Sarter, N. B., & Woods, D. D. (1997). Team play with a powerful
and independent agent: Operational experiences and automation
surprises on the Airbus A-320. Human Factors, 39, 553–569.
Seppelt, B., & Lee, J. (2007). Making adaptive cruise control
(ACC) limits visible. International Journal of Human-Computer Studies, 65, 192–205.
Sheridan, T. B. (1999). Human supervisory control of aircraft, rail
and highway vehicles. Transactions of the Institute of
Measurement and Control, 21, 191–201.
Spiessl, W., & Hussmann, H. (2011). Assessing error recognition
in automated driving. IET Intelligent Transportation Systems, 5,
103–111.
Stanton, N., Young, M., & McCaulder, B. (1997). Drive-by-wire:
The case of driver workload and reclaiming control with
adaptive cruise control. Safety Science, 27, 149–159.
Stanton, N. A., & Young, M. S. (2000). A proposed
psychological model of driving automation. Theoretical Issues
in Ergonomics Science, 1, 315–331.
Stanton, N., & Young, M. (2005). Driver behavior with adaptive
cruise control. Ergonomics, 48, 1294–1313.
Vahidi, A., & Eskandarian, A. (2003). Research advances in intelligent collision avoidance and adaptive cruise control. IEEE Transactions on Intelligent Transportation Systems, 4, 143–152.
Van den Broek, T. H. A., Netten, B. D., Hoedemaeker, M., & Ploeg, J. (2010). The experimental setup of a large field operational test for cooperative driving vehicles at the A270. 13th International IEEE Annual Conference on Intelligent Transportation Systems, Madeira Island, Portugal, 198–203.
Victor, T. (2000). On the need for driver attention support
systems. Journal of Traffic Medicine, 28.
Ward, N. J. (2000). Automation of task processes: An example of
Intelligent Transportation Systems. Human Factors and
Ergonomics in Manufacturing, 10, 395–408.
Wickens, C. D. (2008). Multiple resources and mental workload. Human Factors, 50, 449–455.
Wilde, G. J. S. (1988). Risk homeostasis theory and traffic accidents: Propositions, deductions and discussions of dissension in recent reactions. Ergonomics, 31, 441–468.
Young, M. S., & Stanton, N. (2002). Attention and automation: New perspectives on mental underload and performance. Theoretical Issues in Ergonomics Science, 3, 178–194.
Zhang, P., & McDonald, M. (2005). Manual vs. adaptive cruise
control – Can driver's expectation be matched? Transportation Research Part C: Emerging Technologies, 13, 421–431.