A survey of socially interactive robots
Abstract
This paper reviews “socially interactive robots”: robots for which social human–robot interaction is important. We begin
by discussing the context for socially interactive robots, emphasizing the relationship to other research fields and the different
forms of “social robots”. We then present a taxonomy of design methods and system components used to build socially
interactive robots. Finally, we describe the impact of these robots on humans and discuss open issues. An expanded version
of this paper, which contains a survey and taxonomy of current applications, is available as a technical report [T. Fong, I.
Nourbakhsh, K. Dautenhahn, A survey of socially interactive robots: concepts, design and applications, Technical Report No.
CMU-RI-TR-02-29, Robotics Institute, Carnegie Mellon University, 2002].
© 2003 Elsevier Science B.V. All rights reserved.
Keywords: Human–robot interaction; Interaction aware robot; Sociable robot; Social robot; Socially interactive robot
underlying assumption is that humans prefer to interact with machines in the same way that they interact with other people. A survey and taxonomy of current applications is given in [60].

Socially interactive robots operate as partners, peers or assistants, which means that they need to exhibit a certain degree of adaptability and flexibility to drive the interaction with a wide range of humans. Socially interactive robots can have different shapes and functions, ranging from robots whose sole purpose and only task is to engage people in social interactions (Kismet, Cog, etc.) to robots that are engineered to adhere to social norms in order to fulfill a range of tasks in human-inhabited environments (Pearl, Sage, etc.) [18,117,127,140].

Some socially interactive robots use deep models of human interaction and pro-actively encourage social interaction. Others show their social competence only in reaction to human behavior, relying on humans to attribute mental states and emotions to the robot [39,45,55,125]. Regardless of function, building a socially interactive robot requires considering the human in the loop: as designer, as observer, and as interaction partner.

1.4. Why socially interactive robots?

Socially interactive robots are important for domains in which robots must exhibit peer-to-peer interaction skills, either because such skills are required for solving specific tasks, or because the primary function of the robot is to interact socially with people. A discussion of application domains, design spaces, and desirable social skills for robots is given in [42,43].

One area where social interaction is desirable is that of “robot as persuasive machine” [58], i.e., the robot is used to change the behavior, feelings or attitudes of humans. This is the case when robots mediate human–human interaction, as in autism therapy [162]. Another area is “robot as avatar” [123], in which the robot functions as a representation of, or representative for, the human. For example, if a robot is used for remote communication, it may need to act socially in order to effectively convey information.

In certain scenarios, it may be desirable for a robot to develop its interaction skills over time. For example, a pet robot that accompanies a child through his childhood may need to improve its skills in order to maintain the child’s interest. Learned development of social (and other) skills is a primary concern of epigenetic robotics [44,169].

Some researchers design socially interactive robots simply to study embodied models of social behavior. For this use, the challenge is to build robots that have an intrinsic notion of sociality, that develop social skills and bond with people, and that can show empathy and true understanding. At present, such robots remain a distant goal [39,44], the achievement of which will require contributions from other research areas such as artificial life, developmental psychology and sociology [133].

Although socially interactive robots have already been used with success, much work remains to increase their effectiveness. For example, in order for socially interactive robots to be accepted as “natural” interaction partners, they need more sophisticated social skills, such as the ability to recognize social context and convention.

Additionally, socially interactive robots will eventually need to support a wide range of users: different genders, different cultural and social backgrounds, different ages, etc. In many current applications, social robots engage only in short-term interaction (e.g., a museum tour) and can afford to treat all humans in the same manner. But, as soon as a robot becomes part of a person’s life, that robot will need to be able to treat him as a distinct individual [40].

In the following, we closely examine the concepts raised in this introductory section. We begin by describing different design methods. Then, we present a taxonomy of system components, focusing on the design issues unique to socially interactive robots. We conclude by discussing open issues and core challenges.

2. Methodology

2.1. Design approaches

Humans are experts in social interaction. Thus, if technology adheres to human social expectations, people will find the interaction enjoyable, feeling empowered and competent [130]. Many researchers, therefore, explore the design space of anthropomorphic (or zoomorphic) robots, trying to endow their
creations with characteristics of intentional agents. For this reason, more and more robots are being equipped with faces, speech recognition, lip-reading skills, and other features and capacities that make robot–human interaction “human-like” or at least “creature-like” [41,48].

From a design perspective, we can classify how socially interactive robots are built in two primary ways. With the first approach, “biologically inspired”, designers try to create robots that internally simulate, or mimic, the social intelligence found in living creatures. With the second approach, “functionally designed”, the goal is to construct a robot that appears outwardly to be socially intelligent, even if the internal design does not have a basis in science.

Robots have limited perceptual, cognitive, and behavioral abilities compared to humans. Thus, for the foreseeable future, there will continue to be a significant imbalance in social sophistication between human and robot [20]. As with expert systems, however, it is possible that robots may become highly sophisticated in restricted areas of socialization, e.g., infant-caretaker relations.

Finally, differences in design methodology mean that the evaluation and success criteria are almost always different for different robots. Thus, it is hard to compare socially interactive robots outside of their target environment and use.

2.1.1. Biologically inspired

With the “biologically inspired” approach, designers try to create robots that internally simulate, or mimic, the social behavior or intelligence found in living creatures. Biologically inspired designs are based on theories drawn from natural and social sciences, including anthropology, cognitive science, developmental psychology, ethology, sociology, structure of interaction, and theory of mind. Generally speaking, these theories are used to guide the design of robot cognitive, behavioral, motivational (drives and emotions), motor and perceptual systems.

Two primary arguments are made for drawing inspiration from biological systems. First, numerous researchers contend that nature is the best model for “life-like” activity. The hypothesis is that in order for a robot to be understandable by humans, it must have a naturalistic embodiment, it must interact with its environment in the same way living creatures do, and it must perceive the same things that humans find to be salient and relevant [169].

The second rationale for biological inspiration is that it allows us to directly examine, test and refine those scientific theories upon which the design is based [1]. This is particularly true with humanoid robots. Cog, for example, is a general-purpose humanoid platform intended for exploring theories and models of intelligent behavior and learning [140].

Some of the theories commonly used in biologically inspired design are as follows.

Ethology. This refers to the observational study of animals in their natural setting [95]. Ethology can serve as a basis for design because it describes the types of activity (comfort-seeking, play, etc.) a robot needs to exhibit in order to appear life-like [4]. Ethology is also useful for addressing a range of behavioral issues such as concurrency, motivation, and instinct.

Structure of interaction. Analysis of interactional structure (such as instruction, cooperation, etc.) can help focus the design of perception and cognition systems by identifying key interaction patterns [162]. Dautenhahn, Ogden and Quick use explicit representations of interactional structure to design “interaction aware” robots [48]. Dialogue models, such as turn-taking in conversation, can also be used in design, as in [104].

Theory of mind. Theory of mind refers to those social skills that allow humans to correctly attribute beliefs, goals, perceptions, feelings, and desires to themselves and others [163]. One of the critical precursors to these skills is joint (or shared) attention: the ability to selectively attend to an object of mutual interest [7]. Joint attention can aid design by providing guidelines for recognizing and producing social behaviors such as gaze direction, pointing gestures, etc. [23,140].

Developmental psychology. Developmental psychology has been cited as an effective mechanism for creating robots that engage in natural social exchanges. As an example, the design of Kismet’s “synthetic nervous system”, particularly the perceptual and behavioral aspects, is heavily inspired by the social development of human infants [18]. Additionally, theories of child cognitive development, such as Vygotsky’s “child in society” [92], can offer a framework for constructing robot architecture and social interaction design [44].
A widely used scheme for describing personality traits has been the “Big Five Inventory” [76]. The “Big Five”, which was developed through lexical analysis, describes personality in terms of five traits:

• extroversion (sociable, outgoing, confident);
• agreeableness (friendly, nice, pleasant);
• conscientiousness (helpful, hard-working);
• neuroticism (emotional stability, adjustment);
• openness (intelligent, imaginative, flexible).

Common alternatives to the “Big Five” are questionnaire-based scales such as the Myers–Briggs Type Indicator (MBTI) [112].
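To make this concrete, a designer might encode a “Big Five” profile as numeric trait scores that modulate behavior parameters. The following Python sketch is purely illustrative: the trait names follow the inventory above, but the mapping from traits to speech rate and personal space is a hypothetical example, not a scheme from the literature.

    from dataclasses import dataclass

    @dataclass
    class BigFiveProfile:
        """Trait scores in [0, 1], following the "Big Five" inventory [76]."""
        extroversion: float
        agreeableness: float
        conscientiousness: float
        neuroticism: float
        openness: float

        def speech_rate(self) -> float:
            # Hypothetical mapping: more extroverted robots speak faster.
            return 1.5 + 1.5 * self.extroversion

        def personal_space_m(self) -> float:
            # Hypothetical mapping: more neurotic robots keep greater distance.
            return 0.5 + 1.0 * self.neuroticism

    # A "friendly helper" personality expressed as trait scores.
    helper = BigFiveProfile(0.8, 0.9, 0.7, 0.2, 0.6)
    print(helper.speech_rate(), helper.personal_space_m())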
2.6.2. Personality in social robots

There is reason to believe that if a robot had a compelling personality, people would be more willing to interact with it and to establish a relationship with it [18,79]. In particular, personality may provide a useful affordance, giving users a way to model and understand robot behavior [144].

In designing robot personality, there are numerous questions that need to be addressed. Should the robot have a designed or learned personality? Should it mimic a specific human personality, exhibiting specific traits? Is it beneficial to encourage a specific type of interaction?

There are five common personality types used in social robots.

Tool-like. Used for robots that operate as “smart appliances”. Because these robots perform service tasks on command, they exhibit traits usually associated with tools (dependability, reliability, etc.).

Pet or creature. These toy and entertainment robots exhibit characteristics that are associated with domesticated animals (cats, dogs, etc.).

Cartoon. These robots exhibit caricatured personalities, such as seen in animation. Exaggerated traits (e.g., shyness) are easy to portray and can be useful for facilitating interaction with non-specialists.

Artificial being. Inspired by literature and film, primarily science fiction, these robots tend to display artificial (e.g., mechanistic) characteristics.

Human-like. Robots are often designed to exhibit human personality traits. The extent to which a robot must have (or appear to have) human personality depends on its use.

Robot personality is conveyed in numerous ways. Emotions are often used to portray stereotyped personalities: timid, friendly, etc. [168]. A robot’s embodiment (size, shape, color), its motion, and the manner in which it communicates (e.g., natural language) also contribute strongly [144]. Finally, the tasks a robot performs may also influence the way its personality is perceived.

2.7. Human-oriented perception

To interact meaningfully with humans, social robots must be able to perceive the world as humans do, i.e., by sensing and interpreting the same phenomena that humans observe. This means that, in addition to the perception required for conventional functions (localization, navigation, obstacle avoidance), social robots must possess perceptual abilities similar to those of humans.

In particular, social robots need perception that is human-oriented: optimized for interacting with humans and on a human level. They need to be able to track human features (faces, bodies, hands). They also need to be capable of interpreting human speech, including affective speech, discrete commands, and natural language. Finally, they often must have the capacity to recognize facial expressions, gestures, and human activity.

Similarity of perception requires more than similarity of sensors. It is also important that humans and robots find the same types of stimuli salient [23]. Moreover, robot perception may need to mimic the way human perception works. For example, the human ocular-motor system is based on foveate vision, uses saccadic eye movements, and exhibits specific visual behaviors (e.g., glancing). Thus, to be readily understood, a robot may need to have similar visual motor control [18,21,25].

2.7.1. Types of perception

Most human-oriented perception is based on passive sensing, typically computer vision and spoken language recognition. Passive sensors, such as CCD cameras, are cheap, require minimal infrastructure, and can be used for a wide range of perception tasks [2,36,66,118].

Active sensors (ladar, ultrasonic sonar, etc.), though perhaps less flexible than their passive counterparts, have also received attention. In particular, active
sensors are often used for detecting and localizing humans in dynamic settings.

2.7.2. People tracking

For human–robot interaction, the challenge is to find efficient methods for tracking people in the presence of occlusions, variable illumination, moving cameras, and varying backgrounds. A broad survey of human tracking is presented in [66]. Specific robotics applications are reviewed in [26,114,127,154].
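As an illustration of the tracking problem, the sketch below implements a minimal constant-velocity Kalman filter that smooths noisy image-plane detections of a person and coasts through short occlusions (frames where the detector returns nothing). This is a generic textbook filter, not the method of any system cited above, and all noise parameters are arbitrary assumptions.

    import numpy as np

    class PersonTracker:
        """Constant-velocity Kalman filter over image coordinates (x, y)."""

        def __init__(self, x, y, dt=0.1):
            self.state = np.array([x, y, 0.0, 0.0])  # position and velocity
            self.P = np.eye(4) * 10.0                # state covariance
            self.F = np.eye(4)                       # constant-velocity motion model
            self.F[0, 2] = self.F[1, 3] = dt
            self.H = np.eye(2, 4)                    # we observe position only
            self.Q = np.eye(4) * 0.01                # process noise (assumed)
            self.R = np.eye(2) * 4.0                 # measurement noise (assumed)

        def step(self, detection):
            # Predict: coast along the motion model (handles brief occlusion).
            self.state = self.F @ self.state
            self.P = self.F @ self.P @ self.F.T + self.Q
            if detection is not None:                # update only when detected
                innovation = np.asarray(detection, float) - self.H @ self.state
                S = self.H @ self.P @ self.H.T + self.R
                K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
                self.state = self.state + K @ innovation
                self.P = (np.eye(4) - K @ self.H) @ self.P
            return self.state[:2]                    # current position estimate

    tracker = PersonTracker(320, 240)
    for z in [(322, 241), None, None, (330, 244)]:   # None marks occluded frames
        print(tracker.step(z))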
2.7.3. Speech recognition

Speech recognition is generally a two-step process: signal processing (to transform an audio signal into feature vectors) followed by graph search (to match utterances to a vocabulary). Most current systems use hidden Markov models to stochastically determine the most probable match. An excellent introduction to speech recognition is [129].

Human speech contains three types of information: who the speaker is, what the speaker said, and how the speaker said it [18]. Depending on what information the robot requires, it may need to perform speaker tracking, dialogue management, or emotion analysis. Recent applications of speech in robotics include [18,91,103,120,148].
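The two-step structure described above can be illustrated with a toy Viterbi decoder: given feature vectors already quantized to discrete symbols, it searches for the most probable state path through a small hidden Markov model. Real recognizers are far more elaborate (continuous densities, language models, large vocabularies); the states, symbols, and probabilities below are invented solely for illustration.

    import numpy as np

    def viterbi(obs, start_p, trans_p, emit_p):
        """Most probable HMM state path for a discrete observation sequence."""
        T, n = len(obs), len(start_p)
        logp = np.full((T, n), -np.inf)       # best log-probability per state
        back = np.zeros((T, n), dtype=int)    # backpointers for the graph search
        logp[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
        for t in range(1, T):
            for s in range(n):
                scores = logp[t - 1] + np.log(trans_p[:, s])
                back[t, s] = np.argmax(scores)
                logp[t, s] = scores[back[t, s]] + np.log(emit_p[s, obs[t]])
        path = [int(np.argmax(logp[-1]))]     # trace the best path backwards
        for t in range(T - 1, 0, -1):
            path.append(int(back[t, path[-1]]))
        return path[::-1]

    # Toy two-state model (say, two phones) over three quantized feature symbols.
    start = np.array([0.6, 0.4])
    trans = np.array([[0.7, 0.3],
                      [0.4, 0.6]])
    emit = np.array([[0.5, 0.4, 0.1],
                     [0.1, 0.3, 0.6]])
    print(viterbi([0, 1, 2, 2], start, trans, emit))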
2.7.4. Gesture recognition

When humans converse, we use gestures to clarify speech and to compactly convey geometric information (location, direction, etc.). Very often, a speaker will use hand movement (speed and range of motion) to indicate urgency and will point to disambiguate spoken directions (e.g., “I parked the car over there”).

Although there are many ways to recognize gestures, vision-based recognition has several advantages over other methods. Vision does not require the user to master or wear special hardware. Additionally, vision is passive and can have a large workspace. Two excellent overviews of vision-based gesture recognition are [124,166]. Details of specific systems appear in [85,161,167].
2.7.5. Facial perception

Face detection and recognition. A widely used approach for identifying people is face detection. Two comprehensive surveys are [34,63]. A large number of real-time face detection and tracking systems have been developed in recent years, such as [139,140,158].

Facial expression. Since Darwin [37], facial expressions have been considered to convey emotion. More recently, facial expressions have also been thought to function as social signals of intent. A comprehensive review of facial expression recognition (including a review of ethical and psychological concerns) is [94]. A survey of older techniques is [136].

There are three basic approaches to facial expression recognition [94]. Image motion techniques identify facial muscle actions in image sequences. Anatomical models track facial features, such as the distance between eyes and nose. Principal component analysis (PCA) reduces image-based representations of faces into principal components such as eigenfaces or holons.
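The PCA approach can be sketched in a few lines: face images are flattened to vectors, the principal directions (“eigenfaces”) of a training set are obtained from a singular value decomposition, and a new face is summarized by its projection coefficients. Random arrays stand in for real images here; this is a schematic of the idea, not a working expression recognizer.

    import numpy as np

    rng = np.random.default_rng(0)
    faces = rng.random((20, 32 * 32))      # 20 training "faces" of 32x32 pixels

    mean_face = faces.mean(axis=0)
    centered = faces - mean_face
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    eigenfaces = Vt[:8]                    # keep the top 8 principal directions

    def encode(image):
        """Project a flattened face image onto the eigenface basis."""
        return eigenfaces @ (image - mean_face)

    def decode(coeffs):
        """Reconstruct an approximate face from its PCA coefficients."""
        return mean_face + eigenfaces.T @ coeffs

    probe = rng.random(32 * 32)
    coeffs = encode(probe)                 # a compact 8-number description
    print(np.linalg.norm(probe - decode(coeffs)).round(3))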
Gaze tracking. Gaze is a good indicator of what a person is looking at and paying attention to. A person’s gaze direction is determined by two factors: head orientation and eye orientation. Although numerous vision systems track head orientation, few researchers have attempted to track eye gaze using only passive vision. Furthermore, such trackers have not proven to be highly accurate [158]. Gaze tracking research includes [139,152].
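The two-factor decomposition suggests a simple geometric composition: an approximate gaze ray is the eye-in-head direction rotated by the head’s orientation. The sketch below works with yaw angles only, which grossly simplifies real gaze estimation (no pitch, no head translation, and none of the accuracy problems noted above).

    import math

    def gaze_direction(head_yaw_deg, eye_yaw_deg):
        """Approximate gaze as head orientation plus eye-in-head orientation.

        Both angles are yaw only, in degrees; returns a world-frame unit vector.
        """
        total = math.radians(head_yaw_deg + eye_yaw_deg)
        return (math.cos(total), math.sin(total))

    # Head turned 20 degrees left, eyes a further 10 degrees left in the head.
    print(gaze_direction(20.0, 10.0))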
2.8. User modeling

In order to interact with people in a human-like manner, socially interactive robots must perceive human social behavior [18]. Detecting and recognizing human action and communication provides a good starting point. More important, however, is being able to interpret and react to behavior. A key mechanism for doing this is user modeling.

User modeling can be quantitative, based on the evaluation of parameters or metrics. The stereotype approach, for example, classifies users into different subgroups (stereotypes), based on the measurement of pre-defined features for each subgroup [155]. User modeling may also be qualitative in nature. Interactional structure analysis, story- and script-based matching, and belief-desire-intention (BDI) models all identify subjective aspects of behavior.

There are many types of user models: cognitive, attentional, etc. A user model generally contains attributes that describe a user, or group of users. Models may be static (defined a priori) or dynamic (adapted or learned). Information about users may be acquired explicitly (through questioning) or implicitly (inferred through observation). The former can be time consuming, and the latter difficult, especially if the user population is diverse [69].

User models are employed for a variety of purposes. First, user models help the robot understand human behavior and dialogue. Second, user models shape and control the feedback (e.g., interaction pacing) given to the user. Finally, user models are useful for adapting the robot’s behavior to accommodate users with varying skills, experience, and knowledge.

Fong et al. [59] employ a stereotype user model to adapt human–robot dialogue and robot behavior to different users. Pineau et al. discuss the use of a quantitative temporal Bayes net to manage individual-specific interaction between a nurse robot and elderly individuals. Schulte et al. [143] describe a memory-based learner used by a tour robot to improve its ability to interact with different people.
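A minimal rendering of the stereotype approach [155] might classify a user from a few pre-defined features and then look up interaction parameters for each stereotype, roughly in the spirit of the systems above. The features, subgroups, and parameter values below are invented for illustration and are not taken from the cited work.

    from dataclasses import dataclass

    @dataclass
    class UserFeatures:
        age: int                          # pre-defined, measurable features
        has_used_robots_before: bool

    def classify(user: UserFeatures) -> str:
        """Assign a user to a stereotype based on pre-defined features."""
        if user.age < 12:
            return "child"
        if not user.has_used_robots_before:
            return "novice adult"
        return "experienced adult"

    # Per-stereotype interaction parameters (hypothetical values).
    PACING = {
        "child": {"speech_rate": 0.8, "confirm_each_step": True},
        "novice adult": {"speech_rate": 1.0, "confirm_each_step": True},
        "experienced adult": {"speech_rate": 1.3, "confirm_each_step": False},
    }

    user = UserFeatures(age=9, has_used_robots_before=False)
    stereotype = classify(user)
    print(stereotype, PACING[stereotype])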
2.9. Socially situated learning

In socially situated learning, an individual interacts with his social environment to acquire new competencies. Humans and some animals (e.g., primates) learn through a variety of techniques, including direct tutelage, observational conditioning, goal emulation, and imitation [64]. One prevalent form of influence is local, or stimulus, enhancement, in which a teacher actively manipulates the perceived environment to direct the learner’s attention to relevant stimuli [96].

2.9.1. Robot social learning

For social robots, learning is used for transferring skills, tasks, and information. Learning is important because the knowledge of the teacher, or model, and robot may be very different. Additionally, because of differences in sensing and perception, the model and robot may have very different views of the world. Thus, learning is often essential for improving communication, facilitating interaction, and sharing knowledge [80].

A number of studies in robot social learning have focused on robot–robot interaction. Some of the earliest work focused on cooperative, or group, behavior [6,100]. A large research community continues to investigate group social learning, often referred to as “swarm intelligence” and “collective robotics”. Other robot–robot work has addressed the use of “leader following” [38,72], inter-personal communication [13,15,149], imitation [14,65], and multi-robot formations [109].

In recent years, there has been significant effort to understand how social learning can occur through human–robot interaction. One approach is to create sequences of known behaviors to match a human model [102]. Another approach is to match observations (e.g., motion sequences) to known behaviors, such as motor primitives [51,52]. Recently, Kaplan et al. [77] have explored the use of animal training techniques to teach an autonomous pet robot to perform complex tasks. The most common social learning method, however, is imitation.

2.9.2. Imitation

Imitation is an important mechanism for learning behaviors socially in primates and other animal species [46]. At present, there is no commonly accepted definition of “imitation” in the animal and human psychology literature. An extensive discussion is given in [71]. Researchers often refer to Thorpe’s definition [157], which defines imitation as the “copying of a novel or otherwise improbable act or utterance, or some act for which there is clearly no instinctive tendency”.

With robots, imitation relies upon the robot having many perceptual, cognitive, and motor capabilities [24]. Researchers often simplify the environment or situation to make the problem tractable. For example, active markers or constrained perception (e.g., white objects on a black background) may be employed to make tracking of the model amenable.

Breazeal and Scassellati [24] argue that even if a robot has the skills necessary for imitation, there are still several questions that must be addressed:

• How does the robot know when to imitate? In order for imitation to be useful, the robot must decide not only when to start/stop imitating, but also when it is appropriate (based on the social context, the availability of a good model, etc.).
• How does the robot know what to imitate? Faced with a stream of sensory data, the robot must decide which of the model’s actions are relevant to the task, which are part of the instruction process, and which are circumstantial.
• How does the robot map observed action into behavior? Once the robot has identified and observed salient features of the model’s actions, it must ascertain how to reproduce these actions through its behavior.
• How does the robot evaluate its behavior, correct errors, and recognize when it has achieved its goal? In order for the robot to improve its performance, it must be able to measure to what degree its imitation is accurate and to recognize when there are errors.

Imitation has been used as a mechanism for learning simple motor skills from observation, such as block stacking [89] or pendulum balancing [141]. Imitation has also been applied to the learning of sensor–motor associations [3] and for constructing task representations [116].
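One of the simplest ways to realize the observation-matching idea from Section 2.9.1 (matching motion sequences to known behaviors such as motor primitives [51,52]) is nearest-neighbor matching between an observed trajectory and a library of stored primitives. The primitives and the distance measure below are toy assumptions, not the methods of the cited systems.

    import numpy as np

    # A library of stored motor primitives: short joint-angle trajectories.
    PRIMITIVES = {
        "reach": np.linspace(0.0, 1.0, 10),
        "wave": np.sin(np.linspace(0.0, 2 * np.pi, 10)),
        "retract": np.linspace(1.0, 0.0, 10),
    }

    def match_primitive(observed):
        """Return the primitive whose trajectory best matches the observation."""
        observed = np.asarray(observed, float)
        distances = {name: np.linalg.norm(observed - traj)
                     for name, traj in PRIMITIVES.items()}
        return min(distances, key=distances.get)

    # A noisy observation of the demonstrator, roughly a "wave" motion.
    rng = np.random.default_rng(1)
    observation = np.sin(np.linspace(0.0, 2 * np.pi, 10)) + rng.normal(0, 0.1, 10)
    print(match_primitive(observation))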
2.10. Intentionality

Dennett [50] contends that humans use three strategies to understand and predict behavior. The physical stance (predictions based on physical characteristics) and the design stance (predictions based on the design and functionality of artifacts) are sufficient to explain simple devices. With complex systems (e.g., humans), however, we often do not have sufficient information to perform physical or design analysis. Instead, we tend to adopt an intentional stance and assume that the system’s actions result from its beliefs and desires.

In order for a robot to interact socially, therefore, it needs to provide evidence that it is intentional (even if it is not intrinsic [138]). For example, a robot could demonstrate goal-directed behaviors, or it could exhibit attentional capacity. If it does so, then the human will consider the robot to act in a rational manner.

2.10.1. Attention

Scassellati [139] discusses the recognition and production of joint attention behaviors in Cog. Just as humans use a variety of physical social cues to indicate which object is currently under consideration, Cog performs gaze following, imperative pointing, and declarative pointing.

Kopp and Gardenfors [84] also claim that attentional capacity is a fundamental requirement for intentionality. In their model, a robot must be able to identify relevant objects in the scene, direct its sensors towards an object, and maintain its focus on the selected object.
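Read procedurally, these three requirements (identify a relevant object, direct the sensors towards it, maintain focus on it) suggest a control loop of the following shape. This is only a schematic rendering of the verbal description; the objects and salience scores are made up.

    def most_salient(objects):
        """Pick the most relevant object in the scene (salience assumed given)."""
        return max(objects, key=lambda o: o["salience"])

    def attend(scene_over_time):
        """Select a target, orient towards it, and keep focusing on it."""
        target = None
        for objects in scene_over_time:       # one list of objects per timestep
            if target is None or all(o["name"] != target for o in objects):
                target = most_salient(objects)["name"]   # (re)select a target
            print("orienting sensors towards", target)   # direct the sensors
            # Maintaining focus: keep the same target while it stays visible,
            # even if another object momentarily becomes more salient.

    scene = [
        [{"name": "ball", "salience": 0.9}, {"name": "cup", "salience": 0.4}],
        [{"name": "ball", "salience": 0.7}, {"name": "cup", "salience": 0.8}],
        [{"name": "cup", "salience": 0.8}],   # the ball has left the scene
    ]
    attend(scene)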
Marom and Hayes [96–98] consider attention to be a collection of mechanisms that determine the significance of stimuli. Their research focuses on the development of pre-learning attentional mechanisms, which help reduce the amount of information that an individual has to deal with.

2.10.2. Expression

Kozima and Yano [82,83] argue that to be intentional, a robot must exhibit goal-directed behavior. To do so, it must possess a sensorimotor system, a repertoire of behaviors (innate reflexes), drives that trigger these behaviors, a value system for evaluating perceptual input, and an adaptation mechanism.

Breazeal and Scassellati [22] describe how Kismet conveys intentionality through motor actions and facial expressions. In particular, by exhibiting proto-social responses (affective, exploratory, protective, and regulatory), the robot provides cues for interpreting its actions as intentional.

Schulte et al. [143] discuss how a caricatured human face and simple emotion expression can convey intention during spontaneous short-term interaction. For example, a tour guide robot might have the intention of making progress while giving a tour. Its facial expression and recorded speech playback can communicate this information.

3. Discussion

3.1. Human perception of social robots

A key difference between conventional and socially interactive robots is that the way in which a human perceives a robot establishes expectations that guide his interaction with it. This perception, especially of the robot’s intelligence, autonomy, and capabilities, is influenced by numerous factors, both intrinsic and extrinsic.

Clearly, the human’s preconceptions, knowledge, and prior exposure to the robot (or similar robots) have a strong influence. Additionally, aspects of the robot’s design (embodiment, dialogue, etc.) may play a significant role. Finally, the human’s experience over time will undoubtedly shape his judgment, i.e., initial impressions will change as he gains familiarity with the robot.
In the following, we briefly present studies that have examined how these factors affect human–robot interaction, particularly in the way in which humans relate to, and work with, social robots.

3.1.1. Attitudes towards robots

Bumby and Dautenhahn conducted a study to identify how people, specifically children, perceive robots and what type of behavior they may exhibit when interacting with robots [27]. They found that children tend to conceive of robots as geometric forms with human features (i.e., a strong anthropomorphic pre-disposition). Moreover, children tend to attribute free will, preferences, emotion, and male gender to the robots, even without external cueing.

In [78], Khan describes a survey to investigate people’s attitudes towards intelligent service robots. A review of robots in literature and film, followed by an interview study, was used to design the survey questionnaire. The survey revealed that people’s attitudes are strongly influenced by science fiction. Two significant findings were: (1) a robot with machine-like appearance, serious personality, and round shape is preferred; (2) verbal communication using a human-like voice is highly desired.

3.1.2. Field studies

Thus far, few studies have investigated people’s willingness to closely interact with social robots. Given that we expect social robots to play increasingly larger roles in daily life, there is a strong need for field studies to examine how people behave when robots are introduced into their activities.

Scheeff et al. [142] conducted two studies to observe how a range of people interact with a creature-like social robot, in both laboratory and public conditions. In these studies, children were observed to be more engaged than adults and had responses that varied with gender and age. Also, a friendly robot personality was reported to have prompted qualitatively better interaction than an angry personality.

In [75], Huttenrauch and Severinson-Eklund describe a long-term usage study of CERO, a service robot that assists motion-impaired people in an office environment (Fig. 9). The study was designed to observe interaction over time, especially after the user had fully integrated the robot into his work routine. A key finding was that robots need to be capable of social interaction, or at least aware of the social context, whenever they operate around people.

In [47], Dautenhahn and Werry describe a quantitative method for evaluating robot–human interactions, which is similar to the way ethologists use observation to evaluate animal behavior. This method has been used to study differences in interaction style when children play with a socially interactive robotic toy versus a non-robotic toy. Complementing this approach, Dautenhahn et al. [49] have also proposed qualitative techniques (based on conversation analysis) that focus on social context.

3.1.3. Effects of emotion

Cañamero and Fredslund [30] performed a study to evaluate how well humans can recognize facial expressions displayed by Feelix (Fig. 10). In this study, they asked test subjects to make subjective judgments of the emotions displayed on Feelix’s face and in pictures of humans. The results were very similar to those reported in other studies of facial expression recognition.

Bruce et al. [26] conducted a 2 × 2 full factorial experiment to explore how emotion expression and indication of attention affect a robot’s ability to engage humans. In the study, the robot exhibited different emotions based on its success at engaging and leading a person through a poll-taking task. The results suggest that having an expressive face and indicating attention with movement can help make a robot more compelling to interact with.

3.1.4. Effects of appearance and dialogue

One problem with dialogue is that it can lead to biased perceptions. For example, associations of stereotyped behavior can be created, which may lead users to attribute qualities to the robot that are inaccurate. Users may also form incorrect models, or make poor assumptions, about how the robot actually works. This can lead to serious consequences, the least of which is user error [61].

Kiesler and Goetz conducted a series of studies to understand the influence of a robot’s appearance and dialogue [79]. A primary contribution of this work is a set of
measures for characterizing the mental models used by people when they interact with robots. A significant finding was that neither ratings nor behavioral observations alone are sufficient to fully describe human responses to robots. In addition, Kiesler and Goetz concluded that dialogue more strongly influences the development and change of mental models than differences in appearance.

DiSalvo et al. [54] investigated how the features and size of humanoid robot faces contribute to the perception of humanness. In this study, they analyzed 48 robots and conducted surveys to measure people’s perception. Statistical analysis showed that the presence of certain features, the dimensions of the head, and the number of facial features greatly influence the perception of humanness.

3.1.5. Effects of personality

When a robot exhibits personality (whether intended by the designer or not), a number of effects occur. First, personality can serve as an affordance for interaction. A growing number of commercial products targeting the toy and entertainment markets, such as Tiger Electronics’ Furby (a creature-like robot), Hasbro’s My Real Baby (a robot doll), and Sony’s Aibo (a robot dog), focus on personality as a way to entice and foster effective interaction [18,60].

Personality can also impact task performance, in either a negative or positive sense. For example, Goetz and Kiesler examined the influence of two different robot personalities on user compliance with an exercise routine [67]. In their study, they found some evidence that simply creating a charming personality will not necessarily engender the best cooperation with a robotic assistant.

3.2. Open issues and questions

When we engage in social interaction, there is no guarantee that it will be meaningful or worthwhile. Sometimes, in spite of our best intentions, the interaction fails. Relationships, especially long-term ones, involve a myriad of factors, and making them succeed requires concerted effort.

In [165], Woods writes:

It seems paradoxical, but studies of the impact of automation reveal that design of automated systems is really the design of a new human–machine cooperative system. The design of automated systems is really the design of a team and requires provisions for the coordination between machine agents and practitioners.

In other words, humans and robots must be able to coordinate their actions so that they interact productively with each other. It is not appropriate (or even necessary) to make the robot as socially competent as possible. Rather, it is more important that the robot be compatible with the human’s needs, that it match application requirements, that it be understandable and believable, and that it provide the interactional support the human expects.

As we have seen, building a social robot involves numerous design issues. Although much progress has already been made toward solving these problems, much work remains. This is due, in part, to the broad range of applications for which social robots are being developed. Additionally, however, there are many research questions that remain to be answered, including the following.

What are the minimal criteria for a robot to be social? Social behavior includes such a wide range of phenomena that it is not evident which features a robot must have in order to show social awareness or intelligence. Clearly, a robot’s design depends on its intended use, the complexity of the social environment, and the sophistication of the interaction. But to what extent does social robot design need to reflect theories of human social intelligence?

How do we evaluate social robots? Many researchers contend that adding social interaction capabilities will improve robot performance, e.g., by increasing usability. Thus far, however, little experimental evidence exists to support this claim. What is needed is a systematic study of how “social features” impact human–robot interaction in the context of different application domains [43]. The problem is that it is difficult to determine which metrics are most appropriate for evaluating social “effectiveness”. Should we use human performance metrics? Should we apply psychological, sociological or HCI measures? How do we account for cross-cultural differences and individual needs?

What differentiates social robots from robots that exhibit good human–robot interaction? Although
conventional HRI design does not directly address the issues presented in this paper, it does involve techniques that indirectly support social interaction. For example, HCI methods (e.g., contextual inquiry) are often used to ensure that the interaction will match user needs. The question is: are social robots so different from traditional robots that we need different interactional design techniques?

What underlying social issues may influence future technical development? An observation made by Restivo is that “robotics engineers seem to be driven to program out aspects of being human that for one reason or another they do not like or that make them personally uncomfortable” [134]. If this is true, does that mean that social robots will always be “benign” by design? If our goal is for social robots to eventually have a place in human society, should we not investigate what could be the negative consequences of social robots?

Are there ethical issues that we need to be concerned with? For social robots to become more and more sophisticated, they will need increasingly better computational models of individuals, or at least, of humans in general. Detailed user modeling, however, may not be acceptable, especially if it involves privacy concerns. A related question is that of user monitoring. If a social robot has a model of an individual, should it be capable of recognizing when a person is acting erratically and taking action?

How do we design for long-term interaction? To date, research in social robots has focused exclusively on short duration interaction, ranging from periods of several minutes (e.g., tour-guiding) to several weeks, such as in [75]. Little is known about interaction over longer periods. To remain engaging and empowering for months, or years, will social robots need to be capable of long-term adaptiveness, associations, and memory? Also, how can we determine whether long-term human–robot relationships may cause ill effects?

3.3. Summary

As we look ahead, it seems clear that social robots will play an ever larger role in our world, working for and in cooperation with humans. Social robots will assist in health care, rehabilitation, and therapy. Social robots will work in close proximity to humans, serving as tour guides, office assistants, and household staff. Social robots will engage us, entertain us, and enlighten us.

Central to the success of social robots will be close and effective interaction between humans and robots. Thus, although it is important to continue enhancing autonomous capabilities, we must not neglect improving the human–robot relationship. The challenge is not merely to develop techniques that allow social robots to succeed in limited tasks, but also to find ways that social robots can participate in the full richness of human society.

Acknowledgements

We would like to thank the participants of the Robot as Partner: An Exploration of Social Robots workshop (2002 IEEE International Conference on Intelligent Robots and Systems) for inspiring this paper. We would also like to thank Cynthia Breazeal, Lola Cañamero, and Sara Kiesler for their insightful comments. This work was partially supported by EPSRC grant GR/M62648.

References

[1] B. Adams, C. Breazeal, R.A. Brooks, B. Scassellati, Humanoid robots: a new kind of tool, IEEE Intelligent Systems 15 (4) (2000) 25–31.
[2] J. Aggarwal, Q. Cai, Human motion analysis: A review, Computer Vision and Image Understanding 73 (3) (1999) 428–440.
[3] P. Andry, P. Gaussier, S. Moga, J.P. Banquet, Learning and communication via imitation: An autonomous robot perspective, IEEE Transactions on Systems, Man and Cybernetics 31 (5) (2001).
[4] R. Arkin, M. Fujita, T. Takagi, R. Hasekawa, An ethological and emotional basis for human–robot interaction, Robotics and Autonomous Systems 42 (2003) 191–201 (this issue).
[5] C. Armon-Jones, The social functions of emotions, in: R. Harré (Ed.), The Social Construction of Emotions, Basil Blackwell, Oxford, 1985.
[6] T. Balch, R. Arkin, Communication in reactive multiagent robotic systems, Autonomous Robots 1 (1994).
[7] S. Baron-Cohen, Mindblindness: An Essay on Autism and Theory of Mind, MIT Press, Cambridge, MA, 1995.
[8] C. Bartneck, M. Okada, Robotic user interfaces, in: Proceedings of the Human and Computer Conference, 2001.
[9] C. Bartneck, eMuu—an emotional embodied character for the ambient intelligent home, Ph.D. Thesis, Technical University Eindhoven, The Netherlands, 2002.
[10] R. Beckers, et al., From local actions to global tasks: Stigmergy and collective robotics, in: Proceedings of Artificial Life IV, 1996.
[11] A. Billard, Robota: clever toy and educational tool, Robotics and Autonomous Systems 42 (2003) 259–269 (this issue).
[12] A. Billard, K. Dautenhahn, Grounding communication in situated, social robots, in: Proceedings of the Towards Intelligent Mobile Robots Conference, Report No. UMCS-97-9-1, Department of Computer Science, Manchester University, 1997.
[13] A. Billard, K. Dautenhahn, Grounding communication in autonomous robots: An experimental study, Robotics and Autonomous Systems 24 (1–2) (1998) 71–81.
[14] A. Billard, K. Dautenhahn, Experiments in learning by imitation: Grounding and use of communication in robotic agents, Adaptive Behavior 7 (3–4) (1999).
[15] A. Billard, G. Hayes, Learning to communicate through imitation in autonomous robots, in: Proceedings of the International Conference on Artificial Neural Networks, 1997.
[16] E. Bonabeau, M. Dorigo, G. Theraulaz, Swarm Intelligence: From Natural to Artificial Systems, Oxford University Press, Oxford, 1999.
[17] C. Breazeal, A motivational system for regulating human–robot interaction, in: Proceedings of the National Conference on Artificial Intelligence, Madison, WI, 1998, pp. 54–61.
[18] C. Breazeal, Designing Sociable Robots, MIT Press, Cambridge, MA, 2002.
[19] C. Breazeal, Toward sociable robots, Robotics and Autonomous Systems 42 (2003) 167–175 (this issue).
[20] C. Breazeal, Designing sociable robots: Lessons learned, in: K. Dautenhahn, et al. (Eds.), Socially Intelligent Agents: Creating Relationships with Computers and Robots, Kluwer Academic Publishers, Dordrecht, 2002.
[21] C. Breazeal, P. Fitzpatrick, That certain look: Social amplification of animate vision, in: Proceedings of the AAAI Fall Symposium on Society of Intelligence Agents—The Human in the Loop, 2000.
[22] C. Breazeal, B. Scassellati, How to build robots that make friends and influence people, in: Proceedings of the International Conference on Intelligent Robots and Systems, 1999.
[23] C. Breazeal, B. Scassellati, A context-dependent attention system for a social robot, in: Proceedings of the International Joint Conference on Artificial Intelligence, Stockholm, Sweden, 1999, pp. 1146–1153.
[24] C. Breazeal, B. Scassellati, Challenges in building robots that imitate people, in: K. Dautenhahn, C. Nehaniv (Eds.), Imitation in Animals and Artifacts, MIT Press, Cambridge, MA, 2001.
[25] C. Breazeal, A. Edsinger, P. Fitzpatrick, B. Scassellati, Active vision systems for sociable robots, IEEE Transactions on Systems, Man and Cybernetics 31 (5) (2001).
[26] A. Bruce, I. Nourbakhsh, R. Simmons, The role of expressiveness and attention in human–robot interaction, in: Proceedings of the AAAI Fall Symposium Emotional and Intelligent II: The Tangled Knot of Social Cognition, 2001.
[27] K. Bumby, K. Dautenhahn, Investigating children’s attitudes towards robots: A case study, in: Proceedings of the Cognitive Technology Conference, 1999.
[28] J. Cahn, The generation of affect in synthesized speech, Journal of American Voice I/O Society 8 (1990) 1–19.
[29] L. Cañamero, Modeling motivations and emotions as a basis for intelligent behavior, in: W. Johnson (Ed.), Proceedings of the International Conference on Autonomous Agents.
[30] L. Cañamero, J. Fredslund, I show you how I like you—can you read it in my face? IEEE Transactions on Systems, Man and Cybernetics 31 (5) (2001).
[31] L. Cañamero (Ed.), Emotional and Intelligent II: The Tangled Knot of Social Cognition, Technical Report No. FS-01-02, AAAI Press, 2001.
[32] J. Cassell, Nudge, nudge, wink, wink: Elements of face-to-face conversation for embodied conversational agents, in: J. Cassell, et al. (Eds.), Embodied Conversational Agents, MIT Press, Cambridge, MA, 1999.
[33] J. Cassell, et al. (Eds.), Embodied Conversational Agents, MIT Press, Cambridge, MA, 1999.
[34] R. Chellappa, et al., Human and machine recognition of faces: A survey, Proceedings of the IEEE 83 (5) (1995).
[35] M. Coulson, Expressing emotion through body movement: A component process approach, in: R. Aylett, L. Cañamero (Eds.), Animating Expressive Characters for Social Interactions, SSAISB Press, 2002.
[36] J. Crowley, Vision for man–machine interaction, Robotics and Autonomous Systems 19 (1997) 347–358.
[37] C. Darwin, The Expression of Emotions in Man and Animals, Oxford University Press, Oxford, 1998.
[38] K. Dautenhahn, Getting to know each other—artificial social intelligence for autonomous robots, Robotics and Autonomous Systems 16 (1995) 333–356.
[39] K. Dautenhahn, I could be you—the phenomenological dimension of social understanding, Cybernetics and Systems Journal 28 (5) (1997).
[40] K. Dautenhahn, The art of designing socially intelligent agents—science, fiction, and the human in the loop, Applied Artificial Intelligence Journal 12 (7–8) (1998) 573–617.
[41] K. Dautenhahn, Socially intelligent agents and the primate social brain—towards a science of social minds, in: Proceedings of the AAAI Fall Symposium on Society of Intelligence Agents, 2000.
[42] K. Dautenhahn, Roles and functions of robots in human society—implications from research in autism therapy, Robotica, to appear.
[43] K. Dautenhahn, Design spaces and niche spaces of believable social robots, in: Proceedings of the International Workshop on Robots and Human Interactive Communication, 2002.
[44] K. Dautenhahn, A. Billard, Bringing up robots or—the psychology of socially intelligent robots: From theory to implementation, in: Proceedings of the Autonomous Agents, 1999.
[45] K. Dautenhahn, C. Nehaniv, Living with socially intelligent agents: A cognitive technology view, in: K. Dautenhahn (Ed.), Human Cognition and Social Agent Technology, Benjamin, New York, 2000.
[46] K. Dautenhahn, C. Nehaniv (Eds.), Imitation in Animals and Artifacts, MIT Press, Cambridge, MA, 2001.
[47] K. Dautenhahn, I. Werry, A quantitative technique for analysing robot–human interactions, in: Proceedings of the International Conference on Intelligent Robots and Systems, 2002.
[48] K. Dautenhahn, B. Ogden, T. Quick, From embodied to socially embedded agents—implications for interaction-aware robots, Cognitive Systems Research 3 (3) (2002) (Special Issue on Situated and Embodied Cognition).
[49] K. Dautenhahn, I. Werry, J. Rae, P. Dickerson, Robotic playmates: Analysing interactive competencies of children with autism playing with a mobile robot, in: K. Dautenhahn, et al. (Eds.), Socially Intelligent Agents: Creating Relationships with Computers and Robots, Kluwer Academic Publishers, Dordrecht, 2002.
[50] D. Dennett, The Intentional Stance, MIT Press, Cambridge, MA, 1987.
[51] J. Demiris, G. Hayes, Imitative learning mechanisms in robots and humans, in: Proceedings of the European Workshop on Learning Robotics, 1996.
[52] J. Demiris, G. Hayes, Active and passive routes to imitation, in: Proceedings of the AISB Symposium on Imitation in Animals and Artifacts, 1999.
[53] J.-L. Deneubourg, et al., The dynamic of collective sorting: robot-like ants and ant-like robots, in: Proceedings of the International Conference on Simulation of Adaptive Behavior, 2000.
[54] C. DiSalvo, et al., All robots are not equal: The design and perception of humanoid robot heads, in: Proceedings of the Conference on Designing Interactive Systems, 2002.
[55] B. Duffy, Anthropomorphism and the social robot, Robotics and Autonomous Systems 42 (2003) 177–190 (this issue).
[56] P. Ekman, W. Friesen, Measuring facial movement with the facial action coding system, in: Emotion in the Human Face, Cambridge University Press, Cambridge, 1982.
[57] P. Ekman, Basic emotions, in: T. Dalgleish, M. Power (Eds.), Handbook of Cognition and Emotion, Wiley, New York, 1999.
[58] B. Fogg, Introduction: Persuasive technologies, Communications of the ACM 42 (5) (1999).
[59] T. Fong, C. Thorpe, C. Baur, Collaboration, dialogue, and human–robot interaction, in: Proceedings of the International Symposium on Robotics Research, 2001.
[60] T. Fong, I. Nourbakhsh, K. Dautenhahn, A survey of socially interactive robots: concepts, design, and applications, Technical Report No. CMU-RI-TR-02-29, Robotics Institute, Carnegie Mellon University, 2002.
[61] T. Fong, C. Thorpe, C. Baur, Robot, asker of questions, Robotics and Autonomous Systems 42 (2003) 235–243 (this issue).
[62] N. Frijda, Recognition of emotion, Advances in Experimental Social Psychology 4 (1969).
[63] T. Fromherz, P. Stucki, M. Bichsel, A survey of face recognition, MML Technical Report No. 97.01, Department of Computer Science, University of Zurich, 1997.
[64] B. Galef, Imitation in animals: History, definition, and interpretation of data from the psychological laboratory, in: Social Learning: Psychological and Biological Perspectives, Erlbaum, London, 1988.
[65] P. Gaussier, et al., From perception–action loops to imitation processes: A bottom-up approach of learning by imitation, Applied Artificial Intelligence Journal 12 (7–8) (1998).
[66] D. Gavrilla, The visual analysis of human movement: A survey, Computer Vision and Image Understanding 73 (1) (1999).
[67] J. Goetz, S. Kiesler, Cooperation with a robotic assistant, in: Proceedings of CHI, 2002.
[68] D. Goldberg, M. Mataric, Interference as a tool for designing and evaluating multi-robot controllers, in: Proceedings AAAI-97, Providence, RI, 1997, pp. 637–642.
[69] D. Goren-Bar, Designing model-based intelligent dialogue systems, in: M. Rossi, K. Siau (Eds.), Information Modeling in the New Millennium, Idea Group, 2001.
[70] E. Hall, The hidden dimension: Man’s use of space in public and private, The Bodley Head Ltd., 1966.
[71] C. Heyes, B. Galef, Social Learning in Animals: The Roots of Culture, Academic Press, 1996.
[72] G. Hayes, J. Demiris, A robot controller using learning by imitation, in: Proceedings of the International Symposium on Intelligent Robotic Systems, 1994.
[73] O. Holland, Grey Walter: The pioneer of real artificial life, in: C. Langton, K. Shimohara (Eds.), Proceedings of the International Workshop on Artificial Life, MIT Press, Cambridge, MA.
[74] E. Hudlicka, Increasing SIA architecture realism by modeling and adapting to affect and personality, in: K. Dautenhahn, et al. (Eds.), Socially Intelligent Agents: Creating Relationships with Computers and Robots, Kluwer Academic Publishers, Dordrecht, 2002.
[75] H. Hüttenrauch, K. Severinson-Eklund, Fetch-and-carry with CERO: Observations from a long-term user study, in: Proceedings of the International Workshop on Robots and Human Communication, 2002.
[76] O. John, The ‘Big Five’ factor taxonomy: Dimensions of personality in the natural language and in questionnaires, in: L. Pervin (Ed.), Handbook of Personality: Theory and Research, Guilford, 1990.
[77] F. Kaplan, et al., Taming robots with clicker training: A solution for teaching complex behaviors, in: Proceedings of the European Workshop on Learning Robots, 2001.
[78] Z. Khan, Attitudes towards intelligent service robots, Technical Report No. TRITA-NA-P9821, NADA, KTH, Stockholm, Sweden, 1998.
[79] S. Kiesler, J. Goetz, Mental models and cooperation with robotic assistants, in: Proceedings of CHI, 2002.
[80] V. Klingspor, J. Demiris, M. Kaiser, Human–robot-communication and machine learning, Applied Artificial Intelligence Journal 11 (1997).
[81] H. Kobayashi, F. Hara, A. Tange, A basic study on dynamic control of facial expressions for face robot, in: Proceedings of the International Workshop on Robots and Human Communication, 1994.
[82] H. Kozima, H. Yano, In search of ontogenetic prerequisites for embodied social intelligence, in: Proceedings of the Workshop on Emergence and Development of Embodied Cognition; International Conference on Cognitive Science, 2001.
[83] H. Kozima, H. Yano, A robot that learns to communicate with human caregivers, in: Proceedings of the International Workshop on Epigenetic Robotics, 2001.
[84] L. Kopp, P. Gardenfors, Attention as a Minimal Criterion of Intentionality in Robotics, vol. 89, Lund University Cognitive Studies, 2001.
[85] D. Kortenkamp, E. Huber, P. Bonasso, Recognizing and interpreting gestures on a mobile robot, in: Proceedings of the AAAI-96, Portland, OR, 1996, pp. 915–921.
[86] R. Krauss, P. Morrel-Samuels, C. Colasante, Do conversational hand gestures communicate? Journal of Personality and Social Psychology 61 (1991).
[87] M. Krieger, J.-B. Billeter, L. Keller, Ant-like task allocation and recruitment in cooperative robots, Nature 406 (6799) (2000).
[88] C. Kube, E. Bonabeau, Cooperative transport by ants and robots, Robotics and Autonomous Systems 30 (2000) 85–101.
[89] Y. Kuniyoshi, et al., Learning by watching: Extracting reusable task knowledge from visual observation of human performance, IEEE Transactions on Robotics and Automation 10 (6) (1994).
[90] M. Lansdale, T. Ormerod, Understanding Interfaces, Academic Press, New York, 1994.
[91] S. Lauria, G. Bugmann, T. Kyriacou, E. Klein, Mobile robot programming using natural language, Robotics and Autonomous Systems 38 (2002) 171–181.
[92] V. Lee, P. Gupta, Children’s Cognitive and Language Development, Blackwell Scientific Publications, Oxford, 1995.
[93] H. Lim, A. Ishii, A. Takanishi, Basic emotional walking using a biped humanoid robot, in: Proceedings of the IEEE SMC, 1999.
[94] C. Lisetti, D. Schiano, Automatic facial expression interpretation: Where human–computer interaction, artificial intelligence, and cognitive science intersect, Pragmatics and Cognition 8 (1) (2000).
[95] K. Lorenz, The Foundations of Ethology, Springer, Berlin, 1981.
[96] Y. Marom, G. Hayes, Preliminary approaches to attention for social learning, Informatics Research Report No. EDI-INF-RR-0084, University of Edinburgh, 1999.
[97] Y. Marom, G. Hayes, Attention and social situatedness for skill acquisition, Informatics Research Report No. EDI-INF-RR-0069, University of Edinburgh, 2001.
[98] Y. Marom, G. Hayes, Interacting with a robot to enhance its perceptual attention, Informatics Research Report No. EDI-INF-RR-0085, University of Edinburgh, 2001.
[99] D. Massaro, Perceiving Talking Faces: From Speech Perception to Behavioural Principles, MIT Press, Cambridge, MA, 1998.
[100] M. Mataric, Learning to behave socially, in: Proceedings of the International Conference on Simulation of Adaptive Behavior, 1994.
[101] M. Mataric, Issues and approaches in design of collective autonomous agents, Robotics and Autonomous Systems 16 (1995) 321–331.
[102] M. Mataric, et al., Behavior-based primitives for articulated control, in: Proceedings of the International Conference on Simulation of Adaptive Behavior, 1998.
[103] T. Matsui, H. Asoh, J. Fry, et al., Integrated natural spoken dialogue system of Jijo-2 mobile robot for office services, in: Proceedings of the AAAI-99, Orlando, FL, 1999, pp. 621–627.
[104] Y. Matsusaka, T. Kobayashi, Human interface of humanoid robot realizing group communication in real space, in: Proceedings of the International Symposium on Humanoid Robotics, 1999.
[105] D. McNeill, Hand and Mind: What Gestures Reveal About Thought, University of Chicago Press, Chicago, IL, 1992.
[106] C. Melhuish, O. Holland, S. Hoddell, Collective sorting and segregation in robots with minimal sensing, in: Proceedings of the International Conference on Simulation of Adaptive Behavior, 1998.
[107] F. Michaud, S. Caron, Roball—An autonomous toy-rolling robot, in: Proceedings of the Workshop on Interactive Robot Entertainment, 2000.
[108] F. Michaud, et al., Artificial emotion and social robotics, in: Proceedings of the International Symposium on Distributed Autonomous Robotic Systems, 2000.
[109] F. Michaud, et al., Dynamic robot formations using directional visual perception, in: Proceedings of the International Conference on Intelligent Robots and Systems, 2002.
[110] H. Mizoguchi, et al., Realization of expressive mobile robot, in: Proceedings of the International Conference on Robotics and Automation, 1997.
[111] I. Murray, J. Arnott, Towards the simulation of emotion in synthetic speech: A review of the literature on human vocal emotion, Journal of the Acoustical Society of America 93 (2) (1993).
[112] I. Myers, Introduction to Type, Consulting Psychologists Press, Palo Alto, CA, 1998.
[113] T. Nakata, et al., Expression of emotion and intention by robot body movement, in: Proceedings of the 5th International Conference on Autonomous Systems, 1998.
[114] Y. Nakauchi, R. Simmons, A social robot that stands in line, in: Proceedings of the International Conference on Intelligent Robots and Systems, 2000.
[115] W. Newman, M. Lamming, Interactive System Design, Addison-Wesley, Reading, MA, 1995.
[116] M. Nicolescu, M. Mataric, Learning and interacting in human–robot domains, IEEE Transactions on Systems, Man and Cybernetics 31 (5) (2001).
[117] I. Nourbakhsh, An affective mobile robot educator with a full-time job, Artificial Intelligence 114 (1–2) (1999) 95–124.
[118] A. Rowe, C. Rosenberg, I. Nourbakhsh, CMUcam: a low-overhead vision system, in: Proceedings of the International Conference on Intelligent Robots and Systems (IROS 2002), Lausanne, Switzerland, 2002.
[119] T. Ogata, S. Sugano, Emotional communication robot: WAMOEBA-2R emotion model and evaluation experiments, in: Proceedings of the International Conference on Humanoid Robots, 2000.
[120] H. Okuno, et al., Human–robot interaction through real-time auditory and visual multiple-talker tracking, in: Proceedings of the International Conference on Intelligent Robots and Systems, 2001.
[121] A. Paiva (Ed.), Affective Interactions: Towards a New Generation of Computer Interfaces, Lecture Notes in Computer Science/Lecture Notes in Artificial Intelligence, vol. 1914, Springer, Berlin, 2000.
[122] J. Panksepp, Affective Neuroscience, Oxford University Press, Oxford, 1998.
[123] E. Paulos, J. Canny, Designing personal tele-embodiment, Autonomous Robots 11 (1) (2001).
[124] V. Pavlovic, R. Sharma, T. Huang, Visual interpretation of hand gestures for human–computer interaction: A review, IEEE Transactions on Pattern Analysis and Machine Intelligence 19 (7) (1997).
[125] P. Persson, et al., Understanding socially intelligent agents—A multilayered phenomenon, IEEE Transactions on SMC 31 (5) (2001).
[126] R. Pfeifer, On the role of embodiment in the emergence of cognition and emotion, in: Proceedings of the Toyota Conference on Affective Minds, 1999.
[127] J. Pineau, M. Montemerlo, M. Pollack, N. Roy, S. Thrun, Towards robotic assistants in nursing homes: Challenges and results, Robotics and Autonomous Systems 42 (2003) 271–281 (this issue).
[128] R. Plutchik, Emotions: A general psychoevolutionary theory, in: K. Scherer, P. Ekman (Eds.), Approaches to Emotion, Erlbaum, London, 1984.
[129] L. Rabiner, B.-H. Juang, Fundamentals of Speech Recognition, Prentice-Hall, Englewood Cliffs, NJ, 1993.
[130] B. Reeves, C. Nass, The Media Equation, CSLI Publications, Stanford, 1996.
[131] J. Reichardt, Robots: Fact, Fiction, and Prediction, Viking Press, 1978.
[132] W. Reilly, Believable social and emotional agents, Ph.D. Thesis, Computer Science, Carnegie Mellon University, 1996.
[133] S. Restivo, Bringing up and booting up: Social theory and the emergence of socially intelligent robots, in: Proceedings of the IEEE Conference on SMC, 2001.
[134] S. Restivo, Romancing the robots: Social robots and society, in: Proceedings of the Robots as Partners: An Exploration of Social Robots Workshop, International Conference on Intelligent Robots and Systems (IROS 2002), Lausanne, Switzerland, 2002.
[135] A. Sage, Systems Engineering, Wiley, New York, 1992.
[136] A. Samal, P. Iyengar, Automatic recognition and analysis of human faces and facial expressions: A survey, Pattern Recognition 25 (1992).
[137] H. Schlosberg, Three dimensions of emotion, Psychological Review 61 (1954).
[138] J. Searle, Minds, Brains and Science, Harvard University Press, Cambridge, MA, 1984.
[139] B. Scassellati, Investigating models of social development using a humanoid robot, in: B. Webb, T. Consi (Eds.), Biorobotics, MIT Press, Cambridge, MA, 2000.
[140] B. Scassellati, Foundations for a theory of mind for a humanoid robot, Ph.D. Thesis, Department of Electrical Engineering and Computer Science, MIT, Cambridge, MA, 2001.
[141] S. Schaal, Robot learning from demonstration, in: Proceedings of the International Conference on Machine Learning, 1997.
[142] M. Scheeff, et al., Experiences with Sparky: A social robot, in: Proceedings of the Workshop on Interactive Robot Entertainment, 2000.
[143] J. Schulte, et al., Spontaneous, short-term interaction with mobile robots in public places, in: Proceedings of the International Conference on Robotics and Automation, 1999.
[144] K. Severinson-Eklundh, A. Green, H. Hüttenrauch, Social and collaborative aspects of interaction with a service robot, Robotics and Autonomous Systems 42 (2003) 223–234 (this issue).
[145] T. Sheridan, Eight ultimate challenges of human–robot communication, in: Proceedings of the International Workshop on Robots and Human Communication, 1997.
[146] C. Smith, H. Scott, A componential approach to the meaning of facial expressions, in: J. Russell, J. Fernandez-Dols (Eds.), The Psychology of Facial Expression, Cambridge University Press, Cambridge, 1997.
[147] R. Smith, S. Eppinger, A predictive model of sequential iteration in engineering design, Management Science 43 (8) (1997).
[148] D. Spiliotopoulos, et al., Human–robot interaction based on spoken natural language dialogue, in: Proceedings of the European Workshop on Service and Humanoid Robots, 2001.
[149] L. Steels, Emergent adaptive lexicons, in: Proceedings of the International Conference on SAB, 1996.
[150] L. Steels, F. Kaplan, AIBO's first words: The social learning of language and meaning, in: H. Gouzoules (Ed.), Evolution of Communication, vol. 4, No. 1, John Benjamins, New York, 2001.
[151] L. Steels, Language games for autonomous robots, IEEE Intelligent Systems 16 (5) (2001).
[152] R. Stiefelhagen, J. Yang, A. Waibel, Tracking focus of attention for human–robot communication, in: Proceedings of the International Conference on Humanoid Robots, 2001.
[153] K. Suzuki, et al., Intelligent agent system for human–robot interaction through artificial emotion, in: Proceedings of the IEEE SMC, 1998.
[154] R. Tanawongsuwan, et al., Robust tracking of people by a mobile robotic agent, Technical Report No. GIT-GVU-99-19, Georgia Institute of Technology, 1999.
[155] L. Terveen, An overview of human–computer collaboration, Knowledge-Based Systems 8 (2–3) (1994).
[156] F. Thomas, O. Johnston, Disney Animation: The Illusion of Life, Abbeville Press, 1981.
[157] W. Thorpe, Learning and Instinct in Animals, Methuen, London, 1963.
[158] K. Toyama, Look, Ma—No hands! Hands-free cursor control with real-time 3D face tracking, in: Proceedings of the Workshop on Perceptual User Interfaces, 1998.
[159] R. Vaughan, K. Støy, G. Sukhatme, M. Mataric, Go ahead, make my day: Robot conflict resolution by aggressive competition, in: Proceedings of the International Conference on SAB, 2000.
[160] J. Velasquez, A computational framework for emotion-based control, in: Proceedings of the Workshop on Grounding Emotions in Adaptive Systems; International Conference on SAB, 1998.
[161] S. Waldherr, R. Romero, S. Thrun, A gesture-based interface for human–robot interaction, Autonomous Robots 9 (2000).
[162] I. Werry, et al., Can social interaction skills be taught by a social agent? The role of a robotic mediator in autism therapy, in: Proceedings of the International Conference on Cognitive Technology, 2001.
[163] A. Whiten, Natural Theories of Mind, Basil Blackwell, Oxford, 1991.
[164] T. Willeke, et al., The history of the Mobot museum robot series: An evolutionary study, in: Proceedings of FLAIRS, 2001.
[165] D. Woods, Decomposing automation: Apparent simplicity, real complexity, in: R. Parasuraman, M. Mouloua (Eds.), Automation and Human Performance: Theory and Applications, Erlbaum, London, 1996.
[166] Y. Wu, T. Huang, Vision-based gesture recognition: A review, in: Gesture-Based Communication in HCI, Lecture Notes in Computer Science, vol. 1739, Springer, Berlin, 1999.
[167] G. Xu, et al., Toward robot guidance by hand gestures using monocular vision, in: Proceedings of the Hong Kong Symposium on Robotics Control, 1999.
[168] S. Yoon, et al., Motivation driven learning for interactive synthetic characters, in: Proceedings of the International Conference on Autonomous Agents, 2000.
[169] J. Zlatev, The Epigenesis of Meaning in Human Beings and Possibly in Robots, Lund University Cognitive Studies, vol. 79, Lund University, 1999.

Terrence Fong is a joint postdoctoral fellow at Carnegie Mellon University (CMU) and the Swiss Federal Institute of Technology/Lausanne (EPFL). He received his Ph.D. (2001) in Robotics from CMU. From 1990 to 1994, he worked at the NASA Ames Research Center, where he was co-investigator for virtual environment telerobotic field experiments. His research interests include human–robot interaction, PDA- and web-based interfaces, and field mobile robots.

Illah Nourbakhsh is an Assistant Professor of Robotics at Carnegie Mellon University (CMU) and co-founder of the Toy Robots Initiative at The Robotics Institute. He received his Ph.D. (1996) in computer science from Stanford. He is a founder and chief scientist of Blue Pumpkin Software, Inc. and Mobot, Inc. His current research projects include robot learning, believable robot personality, visual navigation and robot locomotion.

Kerstin Dautenhahn is a Reader in artificial intelligence in the Computer Science Department at the University of Hertfordshire, where she also serves as coordinator of the Adaptive Systems Research Group. She received her doctoral degree in natural sciences from the University of Bielefeld. Her research lies in the areas of socially intelligent agents and HCI, including virtual and robotic agents. She has served as guest editor for numerous special journal issues in AI, cybernetics, and artificial life, and recently co-edited the book "Socially Intelligent Agents—Creating Relationships with Computers and Robots".