
Technological Forecasting & Social Change 193 (2023) 122634


Social companionship with articial intelligence: Recent trends and


uture avenues
Rijul Chaturvedi a, Sanjeev Verma a, Ronnie Das b, Yogesh K. Dwivedi c, d, *
a National Institute of Industrial Engineering, Mumbai, India
b Department of Marketing, Audencia Business School, 8 Rte de la Jonelière, 44300 Nantes, France
c Digital Futures for Sustainable Business & Society Research Group, School of Management, Swansea University, Bay Campus, Fabian Bay, Swansea, UK
d Department of Management, Symbiosis Institute of Business Management, Pune & Symbiosis International, Deemed University, Pune, Maharashtra, India

A R T I C L E  I N F O

Keywords:
Affective computing
Social presence
Social companionship
Conversational agents
Anthropomorphism
Artificial intelligence

A B S T R A C T

The social companionship (SC) feature in conversational agents (CAs) enables emotional bonds and consumer relationships. The heightened interest in SC with CAs has led to exponential growth in publications scattered across disciplines with fragmented findings, which limits holistic understanding of the domain and warrants a macroscopic view to guide future research directions. The present study fills this research void by offering a comprehensive literature review entailing science performance and intellectual structure mapping. The comprehensive review revealed the research domain's major theories, constructs, and thematic structure. Thematic and content analysis of the intellectual structure resulted in a conceptual framework encompassing antecedents, mediators, moderators, and consequences of SC with CAs. The study discusses future research directions guiding practitioners and academicians in designing efficient and ethical AI companions.

1. Introduction

Due to digital advancements, customers can now reach companies regardless of geographical location and time, and channel no longer limits the interactional continuity between firms and consumers (Suwono and Sihombing, 2016). Additionally, deploying Artificial Intelligence (AI) lifts consumers' services (Ameen et al., 2021) to the next level through personalization and convenience. Therefore, firms deploy emerging technologies powered by AI to offer seamless experiences to their customers. AI-enabled conversational agents (CAs) that provide digital assistance and build customer relations become significant in such a scenario. They are now evolving with new capabilities enabling them to engage users for prolonged interactions.

CAs unfold paradigm shifts in human-computer interaction (Biundo et al., 2016). Leading organizations like Amazon and Google use digital assistants with a sense of humour (Bothun et al., 2017). Gatebox, a hologram virtual social companion, helps users combat loneliness and offers human-like emotional support (Hirano, 2016). "Love Plus," a videogame by Konami, allows users to build a romantic relationship with a digital character (Lowry, 2015). AI-enabled conversational agents now offer emotional support (Provoost et al., 2017) and establish human-relational bonds with their users (Darcy et al., 2021). Therefore, CAs offering social companionship to their users are referred to as AI companions.

The deployment of empathetic chatbots transcends different industries like banking, health care, e-commerce, education, and tourism (Adam et al., 2021; Lee et al., 2020b; Rhee and Choi, 2020; Hsieh, 2011; Bickmore et al., 2013) due to their multifaceted role. In recent years, many researchers have reviewed the literature on conversational agents (Lim et al., 2022; Rapp et al., 2021) or AI agents for health support (Gasteiger et al., 2021). However, the literature on social companionship with conversational agents is scattered, thus limiting our understanding of the field.

But why is a systematic literature review on companionship with conversational agents urgent? Because brands prefer to invest in new and emerging technologies. According to the Conversational AI market report (2021), the global market for conversational AI is projected to reach 18.4 billion by 2026. The evolution of CAs with advanced capabilities allows users to shift towards new and updated versions of chatbot applications.

* Corresponding author at: Digital Futures for Sustainable Business & Society Research Group, School of Management, Swansea University, Bay Campus, Fabian Bay, Swansea, UK.
E-mail addresses: rijul.chaturvedi.2020@nitie.ac.in (R. Chaturvedi), sanjeev@nitie.ac.in (S. Verma), rdas@audencia.com (R. Das), y.k.dwivedi@swansea.ac.uk (Y.K. Dwivedi).

https://doi.org/10.1016/j.techfore.2023.122634
Received 1 November 2022; Received in revised form 6 April 2023; Accepted 7 May 2023
Available online 20 May 2023
0040-1625/© 2023 The Authors. Published by Elsevier Inc. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).

For example, ChatGPT, launched in November 2022 by OpenAI, can now remember earlier user conversations. ChatGPT can write essays, poems, news articles, and stories for users, and thus reached one million users within just five days of launch. Extant literature discusses CAs with varied functions and capabilities. The question is whether investigating these capabilities can lead to a long-term relational bond, such as companionship between users and AI. Ultimately, the quality of an AI system's companionship development could trigger continued usage intention. AI companions can influence user thoughts, decisions, behaviours, purchasing patterns, etc.

Recently, researchers have examined the effects of CA capabilities (like social presence, anthropomorphic features, interaction style, and media richness) on consequences like loneliness reduction, emotional connection, attitude towards the product, intention to use, and customer satisfaction (Jones et al., 2021; Araujo, 2018; Adam et al., 2021; Rhee and Choi, 2020; Lee et al., 2020a; Cheng and Jiang, 2020). Advanced CAs can now reduce the user's loneliness (Skjuve et al., 2021), provide social interaction support to children and people with special needs (Ramadan et al., 2021; Sa et al., 2021), reduce patients' loneliness (Loveys et al., 2019), and provide life support and companionship to older adults (Tsiourti et al., 2016).

Despite available examples of companion chatbots like Replika, Mitsuku, XiaoIce and Gatebox, the ethical implications of such advanced chatbots that can form long-term connections with users have been the subject of ongoing debate among scholars (Murtarelli et al., 2021). While some argue that these chatbots can provide valuable social support and companionship (Ta et al., 2020), others express concern about the potential for manipulation and exploitation (Possati, 2022). In such a case, the absence of a systematic literature review makes it challenging to address these issues comprehensively. Notably, consolidation of the area can unfurl immense opportunities for marketers, practitioners, academicians, and customers. Therefore, converging the literature on CAs for social companionship becomes essential.

As technology heads towards feeling AI, research on AI companions proliferates. However, the literature still lacks a comprehensive framework to unveil the antecedents, mediators, moderators, and outcomes of SC with CAs. Moreover, current trends and ways forward remain fuzzy without a systematic literature review. The following research questions remain unanswered:

RQ1. What are the publication and citation trends in SC with CAs?

RQ2. Which are the top sources, publications, and authors in SC with CAs?

RQ3. What are the major theories in SC with CAs?

RQ4. What are the major themes in SC with CAs?

RQ5. What are the antecedents, mediators, moderators, and outcomes of SC with CAs?

RQ6. Which directions should future research pursue to advance SC with CAs?

The present study attempts to fill the void and discern a holistic view of the research domain with performance and intellectual structure mapping. The thematic and content analysis enables a bird's eye view of the domain (Donthu et al., 2021a).

The present study is organized into nine sections. Section 2 presents the theoretical background, Section 3 discusses the review methodology, Section 4 focuses on performance analysis, Section 5 details theoretical foundations, Section 6 unfolds the intellectual structure, Section 7 proposes a conceptual framework, Section 8 discusses future research directions, and Section 9 concludes the article.

2. Theoretical background

AI applications are revolutionizing management streams like human resources, strategy, and operations (Sridevi and Suganthi, 2022; Kar and Kushwaha, 2021). Marketing is no exception, as chatbot advancements and capabilities have led to their adoption as customer service agents over the past decades. Firms deploy modern chatbots to enrich the customer experience with new technologies (Kushwaha et al., 2021; Chaturvedi and Verma, 2022). Rapp et al. (2021) conducted a literature review on text-based chatbot research and found that modern chatbot designs are heading towards providing empathy, emotional experience, and prolonged interactions. The field of AI now touches upon the theory of mind and self-awareness of artificially intelligent systems (Verma et al., 2021). Emerging themes in AI-enabled technologies include feeling AI, emotional AI, empathetic AI and affective computing (Huang et al., 2019). Therefore, the future for brands lies in creating, communicating and delivering AI companions that can provide long-term emotional and functional support to consumers.

According to Lim (2012), AI companions are "robots or virtual conversational agents that possess a certain level of intelligence and autonomy as well as social skills, allowing them to establish and maintain long-term relationships with users." The field of human-computer interaction has evolved from interface design to social acceptability and believability (Pesty and Duhaut, 2011). AI companions mitigate the loneliness of individuals seeking emotional and social support, which became significant in the Covid-19 isolation phase (Odekerken-Schröder et al., 2020).

2.1. Conversational agents (CAs)

CAs are software agents designed to mimic human conversations via natural language processing through communication channels like speech, text, gestures, and facial expressions (Laranjo et al., 2018), and often appear as text-based chatbots, digital avatars, and social robots (Radziwill and Benton, 2017). CAs can be digital assistants, recommendation agents, and social companions (McGoldrick et al., 2008). Chatbots offer functional utilities that help customers perform digital tasks like setting an alarm or reminder, checking the weather, playing songs, and searching for information, including product recommendations (Chaturvedi and Verma, 2022). CAs assist consumers in ordering products via voice commands (Aw et al., 2022). Affective computing advancements have enabled CAs to build emotional connections with humans. According to Hamilton et al. (2021), humans prefer recommendations from known ties (friends or companions in their network), and repeat purchase intention based on voice assistant recommendations leads to brand loyalty (Maroufkhani et al., 2022). Thus, advanced AI companions can maintain long-term relationships with humans, moderate their emotions, and induce purchase intentions and brand loyalty (McLean et al., 2021).

2.2. Social companionship (SC)

According to Benyon and Mival (2010), SC refers to "a pleasant and accessible relationship with an interactive source, emerging out of the social and emotional investment of a person which requires a level of trust, compatibility, and familiarity with the source that results in a feeling of security, and general wellbeing." CAs designed for SC are called "Artificial Companions (AI Companions)," which can substitute human relations by observing users' past experiences (Campos and Paiva, 2010). For example, in-home companions help users schedule and perform routine health care activities, mobile companions help in outdoor physical activities, and virtual cooking companions help by recommending daily recipes (Turunen et al., 2011). AI companion toys engage children in long-term relationships (Adam et al., 2010). "Replika: My AI friend" is an advanced AI companion with therapeutic resources and partners to reduce users' loneliness (Skjuve et al., 2021). Interaction with a social chatbot (like Mitsuku) reduces the need for physical, social presence (Croes and Antheunis, 2021).

2.3. Evolution o AI companions support to its users (Hirano, 2016).


The ourth and most recent shit is observed with the introduction o
Over the past ew decades, AI companions have come a long way in generative AI into CAs. The recent launch o ChatGPT equipped with the
terms o their capabilities and acceptance in society. The current study, third generation o the generative pre-trained transormer model and
presents our shits in the evolution o AI companions. In 1996, the sel-supervised learning in November 2022 reached one million users in
world saw the rst AI companion in the orm o Tamagotchi (Bloch and the rst ve days. The CA can generate original content, respond to a
Lemish, 1999), a small virtual pet that users could take care o on a LED- wide range o prompts and questions, and discuss any subject as long as
based digital screen. The toy was designed to simulate the experience o the user wants (Dwivedi et al., 2023). It can also remember previous
caring or a virtual pet or which users could develop a sense o re- conversations with the user. Fig. 1 represents the our technological
sponsibility and attachment. While it used a simple design with rule- shits rom 1996 to 2022.
based modeling and physical buttons or user interaction, it sparked
an interest in having a digital companion. As technology advanced, the 3. Research methodology
capabilities, architecture and companionship eatures improved.
Table 1, summarizes the evolution o AI companions rom 1996 to 2022. Systematic literature review (SLR) is an appropriate approach to
In 2001, Baby harp Seal PARO, a therapeutic robot, entertained address the research questions pertaining to current trends and uture
patients in hospitals, older adults and other lonely individuals research directions in the research domain (Whittemore et al., 2014).
(Takayanagi et al., 2014). The robot is designed to look like a baby Seal The present study delves into the literature on SC with CAs to uncover
with real ur. Its architecture includes uzzy logic and reinorcement the perormance analysis (authors, sources, and documents) and intel-
learning techniques, enabling PARO to behave like a real seal. Although lectual structure (themes, constructs, and theories) o the current liter-
the design o the robot is limited in the range o actions and expressions, ature (Donthu et al., 2021a). The scientic approach makes the review
it provides signicant emotional support to the users and allows them to replicable, transparent, and objective. The present study replicable
love and care or someone. Nabaztag, released in 2008, is a rabbit- research protocol contains ve steps o in-depth analysis and adheres to
shaped smart device that can announce weather orecasts and news established guidelines in previous studies (Verma and Yadav, 2021;
headlines, play MP3 streams, and send and receive messages (Cavazza Verma et al., 2021; Mhatre et al., 2020) to reveal current trends and the
et al., 2008). User interaction with a rabbit inormation console pro- way orward. Fig. 2 highlights the fow process to conduct the current
vided a sense o care or the rabbit and ormed a bond with it over time. literature review.
The technological evolution o AI companions then shited to
anthropomorphic characters with the release o KASPAR, a child-shaped 3.1. Stage 1: search strategy
doll-like robot in 2005 (Wood et al., 2019). The robot is specially
designed or Kids with autism disorder that allow users to love and care The search strategy delimits source types to only journals to retrieve
or the robot with responsive acial expressions on the human touch. scientic, contemporary, and explanatory literature (Chandra et al.,
Such companions possess limited conversational capabilities, thus 2022; Lim et al., 2022). For literature search and retrieval, the biblio-
restricting users rom enhanced interactions. With natural language metric database Scopus, instead o other alternatives, such as the web o
processing (NLP) and deep learning-based modeling techniques, Apple science, google scholar, etc., is preerred, as Scopus ensures the stringent
released Siri with a technological breakthrough in 2011 (Thorne, 2020). criteria or indexation o published documents (Verma, 2022; Chandra
Siri assists users with digital tasks such as making phone calls, setting et al., 2022; Lim et al., 2022).
alarms, playing music and videos etc. The virtual assistant was only
available or Apple devices, but its unctional utility helped users 3.2. Stage 2: selection o search string
manage their digital tasks. Amazon launched Alexa in 2014 with
recurrent neural networks (RNNs) and learning-based modeling to A pool o keywords refecting the research domain and the possible
provide a richer user experience to its customers (Gao et al., 2018). The synonyms o keywords ormed the search string. The search string was
unctional intelligence o Alexa and its humorous responses to the users also added to the keyword's repository used in similar review studies in
enabled it to improve proximity with amazon customers. Despite a extant literature. Academic experts were consulted or the nalization o
complex design with many unctions, such as intelligent home auto- the search string. The search was ocused on the social companionship
mation and digital assistance, Alexa has yet to understand user emotions aspects o the conversational agents and thus keywords such as chatbot
and respond accordingly. or social bot were excluded rom the search to avoid a large number o
The third shit in the evolution o AI companions observed the irrelevant results. Search strings include keywords like “Articial
introduction o aective computing into chatbot designs. Microsot Asia Intelligent agents” OR “AI agents” OR “Digital Assistants” OR “Virtual
released an empathetic chatbot called XiaoIce in 2014. The CA is Assistants” OR “Conversational AI” OR “Conversational agents” OR “AI
designed to provide emotional support to the users with its ability to companion” OR “Digital Companion” OR “Virtual Companion” OR
understand user emotions through sentiment analysis (Zhou et al., “Articial Companion” AND “Companion” OR “Love” OR “Friend” OR
2020). Modern architecture with Convolutional neural networks (CNNs) “Bond” OR “Emotion”. In May 2022, the search perormed in the Scopus
and long-short-term memory networks (LSTM) allowed XiaoIce to Bibliometric database resulted in 988 documents. The resultant litera-
engage users or prolonged interaction and develop human-like bonds. ture database was downloaded in Bibtex (.bib) and comma-separated
However, the chatbot only continues the discussion or the user until value (.csv) le ormats or urther processing.
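For transparency, the combined keyword groups can also be expressed programmatically before being pasted into the Scopus advanced search. The short Python sketch below is illustrative only: it joins the two keyword pools with OR and nests them with AND inside a TITLE-ABS-KEY field; the field label and the grouping are assumptions about how such a query is typically entered, not the authors' documented procedure.

```python
# Illustrative sketch: assemble the Boolean search string used in Stage 2.
# The TITLE-ABS-KEY field wrapper is an assumption, not taken from the paper.
agent_terms = [
    "Artificial Intelligent agents", "AI agents", "Digital Assistants",
    "Virtual Assistants", "Conversational AI", "Conversational agents",
    "AI companion", "Digital Companion", "Virtual Companion",
    "Artificial Companion",
]
companionship_terms = ["Companion", "Love", "Friend", "Bond", "Emotion"]

def or_block(terms):
    """Join quoted phrases with OR, e.g. "AI agents" OR "Digital Assistants"."""
    return " OR ".join(f'"{t}"' for t in terms)

query = f"TITLE-ABS-KEY(({or_block(agent_terms)}) AND ({or_block(companionship_terms)}))"
print(query)
```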
input rom the user is received. Replika is an advanced version o the
emotional chatbot launched in 2016. Its unique design allows AI to 3.3. Stage 3: ltering the initial results (inclusion and exclusion criteria)
mimic human romantic conversations with the users (Ta et al., 2020).
One technological breakthrough with Replika is its ability to initiate The Scopus database unctions, such as sorting and ltering, acili-
conversations with the user without input, just like humans. RNNs, tated the organization o codes or language, subject area, document
LSTM and reinorcement learning techniques allow Replika to provide type, and source type. Two-stage ltering instrumentalizes the precise
its users with emotional and mental health support up to a level that can data collection. Initially, ltering is operationalized on the ltering
reduce eelings o loneliness. With one o the most advanced chatbot menu available in the Scopus database, ollowed by manual ltering
designs, Replika has yet to learn about digital assistance and home through careul scanning o each document. In the initial ltering,
automation skills. Gatebox, a holographic virtual assistant launched in documents were delimited to publications in the English language;
2016, has both the capabilities that provide utilitarian and emotional document type as article or review; source type as journals and
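The first, criteria-based stage can also be reproduced offline on the exported records. The pandas sketch below is only a schematic of that stage: the column names (language, document_type, source_type) and the file name are assumptions about the Scopus export rather than fields documented in the paper, and the second, manual relevance screening obviously cannot be automated.

```python
# Schematic first-stage filtering of the exported Scopus records.
# Column names and file name are assumed, not taken from the paper.
import pandas as pd

records = pd.read_csv("scopus_export.csv")

keep_doc_types = {"Article", "Review"}
keep_source_types = {"Journal", "Conference Proceeding"}

filtered = records[
    (records["language"] == "English")
    & (records["document_type"].isin(keep_doc_types))
    & (records["source_type"].isin(keep_source_types))
]

print(f"{len(records)} records before filtering, {len(filtered)} after stage one")
# Stage two (manual relevance screening of each document) is done by hand.
```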


Table 1
Evolution of AI companions from 1996 to 2022.

Artificial pets and anthropomorphic characters (1996 to 2011)

Tamagotchi (Bloch and Lemish, 1999), 1996. Theme: virtual pet. Description: a virtual pet in a small egg-shaped toy with a mini LED screen. Architecture: rule-based modeling. Capabilities: the user could feed the pet, play games with it, and even discipline it if it misbehaved. Appearance: virtually embodied. Medium of interaction: physical buttons on the toy. Key specialty: a portable toy with a microprocessor that allowed users to interact with a virtual pet. Companionship feature: the toy was designed to simulate the experience of caring for a virtual pet for which users could develop a sense of responsibility and attachment. Key limitations: small monochrome screen with limited graphics; basic medium of interaction through buttons only.

PARO (Takayanagi et al., 2014), 2001. Theme: therapeutic robot. Description: a baby harp seal robot with real fur, flippers, and vocalizations. Architecture: fuzzy logic and reinforcement learning. Capabilities: recognizing human voices and human touch. Appearance: physically embodied. Medium of interaction: touch sensors. Key specialty: the robot focuses on emotional support by mimicking the behaviour of a real baby seal. Companionship feature: a baby harp seal that users could love and care for by moving their hands on its fur and receiving its reactions. Key limitations: a limited range of actions and expressions; also required regular maintenance and charging.

Nabaztag (Cavazza et al., 2008), 2005. Theme: information console. Description: a rabbit-shaped smart device with two long ears that could move. Architecture: rule-based modeling. Capabilities: weather forecasts, stock market reports, news headlines, alarm clock, e-mail alerts, sending and receiving messages, and MP3 streams. Appearance: physically embodied. Medium of interaction: voice, LED-light display, and web interface. Key specialty: integration of Wi-Fi connectivity into a consumer device. Companionship feature: an artificial rabbit that users could interact with, care for and form a bond with over time. Key limitations: the device could not display videos, images, or text, and was limited in interacting with users affectively.

KASPAR (Wood et al., 2019), 2005. Theme: therapeutic robot. Description: a child-shaped doll-like robot. Architecture: rule-based modeling. Capabilities: singing songs, recognizing human touch. Appearance: physically embodied. Medium of interaction: touch sensors. Key specialty: expressive face. Companionship feature: human child-like appearance with facial expressions that kids could love and care for. Key limitations: limited interaction abilities, as it was designed for kids with autism disorder.

Apple's Siri (Thorne, 2020), 2011. Theme: digital assistant. Description: a virtual assistant in Apple smartphones. Architecture: deep learning-based modeling. Capabilities: voice commands, information search, setting alarms and reminders, making phone calls, announcing weather forecasts, playing music and videos, and third-party app integration. Appearance: disembodied. Medium of interaction: text, image, and voice. Key specialty: conversational ability to understand the natural language of users and reply accordingly. Companionship feature: Siri's ability to learn about users over time and its integration with various third-party apps and services made it a more helpful assistant for users. Key limitations: unable to remember earlier conversations with users; lack of self-disclosure and emotional engagement; limited to Apple devices.

Anthropomorphic and AI characters (2014 to 2022)

Amazon's Alexa (Gao et al., 2018), 2014. Theme: digital assistant. Description: a virtual assistant in smart home devices and smartphones. Architecture: learning-based modeling, recurrent neural networks (RNNs), long short-term memory networks (LSTMs). Capabilities: functional intelligence to perform tasks such as switching lights and other appliances on or off, playing music, making phone calls, announcing the weather, and setting alarms. Appearance: disembodied. Medium of interaction: text, voice, image. Key specialty: the Alexa Skills Kit to integrate Alexa with a wide range of smart home devices. Companionship feature: voice-based device controls and Alexa's humorous responses make the user-Alexa relationship proximal in nature. Key limitations: lack of contextual understanding, cannot answer long and complex queries, and lack of emotional understanding.

Microsoft's XiaoIce (Zhou et al., 2020), 2014. Theme: empathetic chatbot. Description: a friendly chatbot that can understand user emotions. Architecture: convolutional neural networks (CNNs), long short-term memory networks (LSTMs). Capabilities: named entity recognition, sentiment analysis and emotional intelligence enable XiaoIce to engage users in prolonged conversations. Appearance: virtually embodied. Medium of interaction: text, voice, image. Key specialty: generating emotionally responsive and contextually relevant responses. Companionship feature: emotional support and the ability to engage users in prolonged interactions enable XiaoIce to form relational bonds with users. Key limitations: limited to the Japanese and Chinese languages; responses sometimes feel scripted or repetitive; lacks functional intelligence like Alexa.

Replika (Ta et al., 2020), 2016. Theme: personal companion. Description: a companion chatbot that mimics romantic conversations. Architecture: LSTMs, RNNs, and reinforcement learning. Capabilities: emotional intelligence, mental health support and gamification allow Replika to mimic romantic conversations with users. Appearance: virtually embodied. Medium of interaction: text, voice, image. Key specialty: focus on emotional intelligence, empathy, self-disclosure and mimicking human romantic conversations. Companionship feature: emotional support, mental health support and the ability to converse romantically allow Replika to make users feel emotionally connected. Key limitations: lack of functional intelligence like Alexa; limited conversational skill to stretch a discussion.

Gatebox (Hirano, 2016), 2016. Theme: personal companion. Description: a holographic domestic companion and smart assistant. Architecture: learning-based modeling, computer vision, robotics, internet of things, emotion recognition. Capabilities: emotional intelligence, functional intelligence in home automation, mental health support, social media integration. Appearance: holographic embodiment. Medium of interaction: text, voice, image. Key specialty: a physical appearance of a favorite character that users can live with. Companionship feature: emotional support, mental health support, and home automation support allow users to consider the AI a living partner. Key limitations: limited to the Japanese and English languages; relatively expensive to buy.

ChatGPT (Dwivedi et al., 2023), 2022. Theme: versatile chatbot. Description: a versatile AI chatbot that can write poems, stories, and news articles and generate original text on any subject; it can also play various roles such as travel advisor or book author. Architecture: generative pre-trained transformer, self-supervised learning. Capabilities: capable of generating original content, can respond to a wide range of prompts and questions, can discuss any subject for as long as the user wants, and can remember previous conversations with the user. Appearance: disembodied. Medium of interaction: text. Key specialty: generative AI is the key specialty of ChatGPT. Companionship feature: providing writing support, information support and the ability to play versatile roles enables ChatGPT to establish long-term relationships with users. Key limitations: lack of functional abilities, disembodiment, and limited to a text-based interaction medium; lack of self-disclosure.

Fig. 1. Technological shifts in the evolution of AI companions.

In the initial filtering, documents were delimited to publications in the English language; document type to article or review; source type to journals and conference proceedings; and subject area to "Social Science, Psychology, Decision Science, Arts and Humanities, Neurosciences, Health Professions, Business, Management and Accounting, Economics, Econometrics and Finance, and Multidisciplinary." Manual filtering involved careful scanning of documents for relevancy and specificity to the research domain. Two-stage filtering led to a reduced number of 197 documents.

3.4. Stage 4: data collection, data cleaning, and data processing

Post-screening, the data set was triangulated for data verification. Two researchers and a group of experts opined on the suitability of the data. Finally, 126 papers formed the consideration set for further processing. The present study uses an inductive approach to derive data insights (Far and Rad, 2018; Lim et al., 2022). Besides the inductive approach, deductive analysis helped in the performance analysis and science mapping of the research domain (Donthu et al., 2021a). Performance analysis reveals the publication trend by identifying the domain's top authors, documents, and sources. Content and thematic analysis revealed the prevailing theories, constructs, keyword co-occurrence, and bibliographic coupling themes. The present study uses Microsoft Excel for content analysis, and the Biblioshiny R package and VOSviewer software for the network of themes (Donthu et al., 2021a).

3.5. Stage 5: data analysis strategy

The study relies on keyword co-occurrence network and bibliographic coupling analysis to reveal the different knowledge clusters present in the domain. Network analysis is performed through VOSviewer, which creates clusters of keywords or documents based on their weight (Van Eck and Waltman, 2011) and supports the content analysis. Bibliographic coupling is a widely preferred technique that reveals the thematic analysis (Boyack and Klavans, 2010). The present study employs network analysis to understand the interrelationship between documents, keywords, authors, or citations (Pilkington and Catherine, 1999). The data clustering algorithm is the foundation of network analysis to discern the intellectual structure of the research domain (Chen et al., 2010). Nodes in a cluster possess similar characteristics in the network (Radicchi et al., 2004). Network parameters, such as edge thickness and density, depict the similarity index between the scientific actors (Leydesdorff and Rafols, 2011; Radicchi et al., 2004). Betweenness centrality identifies the most prominent nodes in the shared linkage (Hjørland, 2013). The network's modularity index is calculated using the Louvain algorithm; leading eigenvector or spin glass algorithms measure the strength of the relationship between nodes (Blondel et al., 2008). We prefer the Louvain algorithm in the present study as it optimizes the network in run time O(n log n). The software eliminates the outliers of the network present in the form of isolated nodes on the map. The algorithm returns an optimized value with the network's modularity ranging from -1 to +1. For a weighted graph, the modularity is formulated as

Q = \frac{1}{2m} \sum_{ij} \left[ a_{ij} - \frac{k_i k_j}{2m} \right] \delta(c_i, c_j)

where a_{ij} is the weight of the edge between nodes i and j, k_i = \sum_j a_{ij} is the sum of the weights of the edges attached to node i, m = \frac{1}{2} \sum_{ij} a_{ij} is the total edge weight of the network, and \delta(c_i, c_j) equals 1 when nodes i and j belong to the same community and 0 otherwise.
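As a minimal illustration of this step, the sketch below builds a small weighted keyword co-occurrence graph and applies the Louvain algorithm to it. It assumes NetworkX (version 2.8 or later, which ships louvain_communities) rather than VOSviewer or Biblioshiny, so it only mirrors the logic described above; the toy edge list is invented for demonstration.

```python
# Minimal sketch: Louvain clustering and modularity on a weighted co-word graph.
# Uses NetworkX as a stand-in for VOSviewer/Biblioshiny; the edges are toy data.
import networkx as nx
from networkx.algorithms import community

G = nx.Graph()
G.add_weighted_edges_from([
    ("chatbot", "anthropomorphism", 5),
    ("chatbot", "trust", 3),
    ("anthropomorphism", "social presence", 4),
    ("artificial intelligence", "companion", 2),
    ("artificial intelligence", "social support", 2),
    ("loneliness", "older adults", 3),
])

# Louvain community detection on edge weights (fixed seed for reproducibility).
clusters = community.louvain_communities(G, weight="weight", seed=42)

# Weighted modularity Q of the resulting partition (ranges roughly from -1 to +1).
Q = community.modularity(G, clusters, weight="weight")

for i, nodes in enumerate(clusters, start=1):
    print(f"Cluster {i}: {sorted(nodes)}")
print(f"Modularity Q = {Q:.3f}")
```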
conerence proceedings; and subject area as “Social science, Psychology,
Decision Science, Arts and Humanities, Neurosciences, Health Pro- 4. Performance analysis of SC with CAs
essions, Business, Management and Accounting, Economics Econo-
metrics and Finance, and Multidisciplinary.” Manual ltering involves Most reviews examine the perormance o research constituents, e.g.,
careul scanning o documents or the relevancy and specicity o the authors, journals, and documents (Lim et al., 2022; Donthu et al., 2021a;
research domain. Two-stage ltering led to a reduced number o 197 Chandra et al., 2022), to show the signicance o the chosen eld. The
documents. current study attempts to unurl publication and citation trends or
research on SC with CAs. According to Mukherjee et al. (2022), top
3.4. Stage 4: data collection, data cleaning, and data processing publications and top authors o the research eld support practitioners
in identiying signicant contributions in the given eld, along with a
Post-screening, the data set was triangulated or data verication. list o experts in that domain. Additionally, a list o top sources helps
Two researchers and a group o experts opined the suitability o data. researchers and academicians to target suitable journals or publication
Finally, 126 papers ormed the consideration set or urther processing. o their studies in the given eld. In this regard, the current study maps
The present study uses an inductive approach to drive data insights (Far the perormance analysis o SC with CAs in the ollowing sub-sections.
and Rad, 2018; Lim et al., 2022). Besides the inductive approach,
deductive analysis helped in perormance analysis and science mapping 4.1. Publication and citation trend o SC with CAs (RQ1)
o the research domain (Donthu et al., 2021a). Perormance analysis
reveals the publication trend by identiying the domain's top authors, Table 2 presents the publication and citation trends or research on
documents, and sources. Content and thematic analysis revealed the SC with CAs. The table indicates that the eld is about nineteen years
prevailing theories, constructs, keyword co-occurrence, and biblio- old. The initial papers started appearing in 2003, and the domain
graphic coupling themes. The present study uses Microsot excel or evolved gradually. The dataset contains 126 documents (TP) published
content analysis, Biblioshiny R sotware, and VOSviewer sotware or a in 87 dierent sources, with 106 cited publications (TCP) and a pro-
network o themes (Donthu et al., 2021a). ductivity average o 6.63 publications per year. Fig. 3 represents the
publication trend and indicates the sharp growth in the last three years.
3.5. Stage 5: data analysis strategy The exponential growth in the last three years indicates the growing
interest in the social companionship aspect o conversational agents.
The study relies on the keyword Co-occurrence network and biblio- The involvement o digital assistants in consumers' lives can be attrib-
graphic coupling analysis to reveal dierent knowledge clusters present uted to social abrics and emotional connections developed by conver-
in the domain. Network analysis is perormed through VOS viewer, sational agents.
which creates clusters o keywords or documents based on their weight The table indicates that studies on SC with CAs have received 2231
(Van Eck and Waltman, 2011) and supports the content analysis. The citations (TC), with an average citation per publication o 17.70 (TC/
bibliographic coupling is a widely preerred technique that reveals the TP). The h-index (citation impact) o the eld 25 (h) inorms that
thematic analysis (Boyack and Klavans, 2010). The present study twenty-ve publications have received at least 25 (h) citations. In


Fig. 2. Highlights the flow process for the review.

contrast, the eld's g-index (citation infuence) is eight, which indicates in 2018). The research papers published in Computers in Human
that eight publications have received at least 64 (g2) citations each. The Behaviour garnered the highest average number o citations (C/Y:
authorship inormation in the table reveals a total o 428 authors 57.20) per year, signiying the journal as the leading source or publi-
(including repetition) (NCA) or 391 unique authors (excluding repeti- cation in the given eld. Social companionship entails behavioural as-
tion) (NUA) contributing to the eld. The dataset includes teen single- pects with AI-enabled conversational agents rationalizing the higher
authored publications (SA), while 111 documents are co-authored publications in journals/conerences ocusing on the interace between
publications (CA). The collaboration index (CI=NCA-TP÷TP) o 3.14 computers and human behaviour. Conerence on human actors in
signies that each lead author has collaborated with an average o 3.14 computing systems proceedings is the most impactul conerence source
co-authors (CI). Most o the documents in the dataset are empirical (h-index = 5 and g-index = 5) or researchers working on SC with CA.
studies (92.85 %), while only 9 (7.14 %) are non-empirical. Noteworthily, the top 10 sources or publications on social compan-
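The citation indicators reported in this paragraph and in Table 2 can be recomputed directly from a list of per-publication citation counts. The sketch below is a hedged illustration of the standard h-index and g-index definitions used here; the example citation list is invented toy data, not the study's dataset.

```python
# Illustrative computation of the h-index and g-index from per-publication
# citation counts; the list below is toy data, not the study's dataset.
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(citations):
    """Largest g such that the g most-cited publications together have at least g**2 citations."""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank ** 2:
            g = rank
    return g

citations = [222, 206, 123, 116, 94, 92, 64, 64, 50, 50, 12, 8, 3, 0]  # toy data
print("h-index:", h_index(citations))   # prints 11 for this toy list
print("g-index:", g_index(citations))   # prints 14 for this toy list
```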
4.2. Top sources for SC with CAs (RQ2)

Table 3 presents the top ten sources for publications in SC with CAs based on the highest number of citations received. The table indicates that Computers in Human Behaviour receives the highest number of citations (TC: 286), with only five publications in a short period (starting in 2018). The research papers published in Computers in Human Behaviour garnered the highest average number of citations per year (C/Y: 57.20), signifying the journal as the leading source for publication in the given field. Social companionship entails behavioural aspects with AI-enabled conversational agents, rationalizing the higher publications in journals and conferences focusing on the interface between computers and human behaviour. The Conference on Human Factors in Computing Systems Proceedings is the most impactful conference source (h-index = 5 and g-index = 5) for researchers working on SC with CAs.

Noteworthily, the top 10 sources for publications on social companionship with conversational agents include a majority of sources (40 %) from the area of Computer Science (CS) (i.e., Computers in Human Behaviour, International Journal of Human-Computer Studies, Conference on Human Factors in Computing Systems - Proceedings, and Lecture Notes in Computer Science), followed by sources from the area of Information Systems (IS) (i.e., Frontiers of Information Technology and Electronic Engineering, Computer Networks, Computational Linguistics, European Conference on Information Systems). While CS and IS are traditionally considered different fields, there is an increasing need for interdisciplinary research in developing and implementing AI technologies for companionship. By catering to interdisciplinary audiences from marketing, philosophy, computer science, social science and information science, the study aims to contribute to a more comprehensive understanding of the subject matter, which can have implications for the advanced design of CAs capable of establishing relational bonds with users in a range of disciplines (Bracken and Oughton, 2006; Keestra, 2017; MacLeod, 2018).

Table 2
Bibliometric information of SC with CAs.

Panel A. Publication information (Statistic)
Total publications (TP): 126
Total cited publications (TCP): 106
Total sources (TS): 87
Number of active years (NAY): 19 1/2
Productivity per active year (PAY): 6.63

Panel B. Citation information (Results)
Total citations (TC): 2231
Average citations per publication (TC/TP): 17.70
h index: 25
g index: 44

Panel C. Authorship information
Number of contributing authors (including repetition) (NCA): 428
Number of unique authors (excluding repetition) (NUA): 391
Authors of single-authored publications (ASA): 13
Authors of co-authored articles (ACA): 378
Single-authored publications (SA): 15
Co-authored publications (CA): 111
Collaboration index (CI = NCA-TP÷TP): 3.14

Panel D. Document information
Article (empirical): 117
Reviews (non-empirical): 9
References: 7365
Keywords: 420

Note: Period of coverage = 2003 – May 2022.

Table 3
Top sources for SC with CAs.

Journals | TC | h | g | TP | Start_PY | C/Y
Computers in Human Behaviour | 286 | 4 | 5 | 5 | 2018 | 57.20
Frontiers of Information Technology and Electronic Engineering | 206 | 1 | 1 | 1 | 2018 | 41.20
International Journal of Human-Computer Studies | 151 | 3 | 4 | 4 | 2019 | 37.75
Computer Networks | 116 | 1 | 1 | 1 | 2013 | 11.60
Computational Linguistics | 94 | 1 | 1 | 1 | 2020 | 31.33
Electronic Markets | 64 | 1 | 1 | 1 | 2021 | 32

Conferences
Conference on Human Factors in Computing Systems - Proceedings | 137 | 5 | 5 | 5 | 2004 | 7.21
Proceedings of Conference on Human Information Interaction and Retrieval | 92 | 1 | 1 | 1 | 2018 | 18.40
Lecture Notes in Computer Science | 63 | 2 | 7 | 7 | 2011 | 5.25
European Conference on Information Systems | 62 | 2 | 2 | 2 | 2018 | 12

Abbreviations: C/Y, citations per year; g, g-index; h, h-index; Start_PY, the start of publication year; TC, total citations.

Fig. 3. Annual publication trend of SC with CAs' research.

4.3. Top publications for SC with CAs (RQ2)

The top ten publications on SC with CAs based on the highest number of citations are presented in Table 4. However, citation counts alone are insufficient to evaluate any published work's research quality. Therefore, the current study goes beyond the number of citations and checks other research quality indexes such as solidity and plausibility, originality and novelty, scientific value, and societal value and relevance (Aksnes et al., 2019). The table indicates that Araujo's (2018) experimental study with chatbots is the most impactful publication, with the highest number of citations (222) in the field. The study is also the most influential publication, with the highest average citations per year of 44.40, and found that social presence mediates the effect of anthropomorphic design cues on the emotional connection established with users (Araujo, 2018). For solidity and plausibility, we checked the quality of the citing journals. We found that they are journals of repute like the Journal of the Academy of Marketing Science, Journal of Management Information Systems, International Journal of Human Resource Management, Electronic Markets, Journal of Business Research, Journal of Retailing and Consumer Services, Psychology & Marketing, and Journal of Service Management. In terms of originality and novelty, the study emphasizes the novel concept of anthropomorphic design in chatbots and its significance in business, which is in triangulation with the evolution of AI companions explained in the prior sections of the current paper. Regarding scientific value, we confirmed that the authors citing the study are established academicians and researchers. We used alternative metrics through "PlumX" to assess the societal value and relevance of the article and found that the study captured the attention of 889 readers from different parts of the globe.


Table 4
Top publications for research on SC with CAs.

Title | Authors | Year | TC | C/Y
Living up to the chatbot hype: the influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions | Araujo | 2018 | 222 | 44.40
From Eliza to XiaoIce: challenges and opportunities with social chatbots | Shum et al. | 2018 | 206 | 41.20
A taxonomy of social cues for conversational agents | Feine et al. | 2019 | 123 | 30.75
Design and analysis of a social botnet | Boshmaf et al. | 2013 | 116 | 11.60
The design and implementation of Xiaoice, an empathetic social chatbot | Zhou et al. | 2020 | 94 | 31.33
Personification of the Amazon Alexa: BFF or a mindless companion? | Lopatovska and Williams | 2018 | 92 | 18.40
AI-based chatbots in customer service and their effects on user compliance | Adam et al. | 2021 | 64 | 32.00
Towards caring machines | Bickmore and Picard | 2004 | 64 | 3.37
Faster is not always better: understanding the effect of dynamic response delays in human-chatbot interaction | Gnewuch et al. | 2018 | 50 | 10.00
Relational agents improve engagement and learning in science museum visitors | Bickmore et al. | 2011 | 50 | 4.17

Note: Abbreviations: C/Y, citations per year; TC, total citations.

The Shum et al. (2018) paper is the second most impactful and influential publication, with 206 total citations and an average of 41.20 per year. The authors presented the development of chatbots, starting from the first chatbot, Eliza (1966), to the latest and most advanced social chatbots like Siri (2011) and XiaoIce (2014). The authors also found XiaoIce to be an empathetic chatbot that can recognize human emotions and engage users for longer. The paper is cited by publications in various journals of repute, such as the Journal of the Academy of Marketing Science, Journal of Business Research, International Journal of Consumer Studies, and Knowledge-Based Systems, which ensures the solidity and plausibility of the study. In terms of originality and novelty, the study emphasizes the novel concept of empathetic chatbots and their design using affective computing techniques, again in triangulation with the evolution of AI companions. The study is cited by numerous practitioners in the field due to its orientation towards futuristic technology, which ensures its scientific value. Alternative metrics from PlumX indicate that the study has one patent family, has been mentioned once in a news article, has been tweeted two times on Twitter, and has captured 797 reading attentions throughout the globe, which ensures the societal value and relevance of the article.

Feine et al. (2019), with 123 total citations, is the third most-cited paper, with an average of 30.75 citations per year. The authors converged the diversified literature on social cues (such as gender, age, gesture, etc.) that trigger humans to react while interacting with chatbots and classified them into four main categories (verbal, auditory, visual, and invisible) and ten sub-categories. The paper is cited by publications in various journals of repute, such as the International Journal of Information Management, Decision Support Systems, Journal of Retailing and Consumer Services, Electronic Markets, Journal of Management Information Systems, and Psychology & Marketing, which ensures the solidity and plausibility of the study. The study summarizes the novel concept of social capabilities (using social cues in chatbot design) in terms of originality and novelty. Authors from diverse disciplines, both established and new, cite the study in their work, thus ensuring its scientific value. Alternative metrics from PlumX indicate that the study has captured 452 reading attentions throughout the globe, 42 attentions on Facebook, and five tweets on Twitter, ensuring the article's societal value and relevance.

The other top publications in Table 4 deal with designing and implementing various empathetic chatbots and caring machines that can establish a close relationship with humans. These include research studies by Boshmaf et al. (2013), Zhou et al. (2020), and Bickmore and Picard (2004). Additionally, some publications in the list highlight using social chatbots to improve customer service (Adam et al., 2021; Bickmore et al., 2011; Lopatovska and Williams, 2018; Gnewuch et al., 2018). As mentioned for the top three, the authors have checked the research quality indexes for each top publication in the table.

4.4. Top authors for SC with CAs' research (RQ2)

The top 10 authors in the field of SC with CAs are listed in Table 5. The table indicates Bickmore as the most prolific author, with 4 publications and 180 total citations starting from 2004. The list also contains four authors with three publications (Broadbent E, Loveys K, Wilks Y, and Clavel C) and five authors with two publications. Li D (TC: 300) and Shum HY (TC: 300) are the most-cited authors in the field, followed by Bickmore T (TC: 180), Gnewuch U (TC: 173), Maedche A (TC: 173), and Morana S (TC: 173). However, citation counts and related traditional metrics limit the view of the overall impact of authors on an emerging field of research. Thus, the current study also investigates the profiles of the authors and finds that the top authors in the list come from diversified workplaces such as academia (e.g., Bickmore T, Broadbent E, Loveys K, Wilks Y, Clavel C, Gnewuch U, Maedche A, Morana S) and practice (e.g., Li D and Shum HY). Moreover, most of the authors in the list are from Western and European countries (e.g., the United States (4), New Zealand (2), Germany (3)). The current study can be considered an extension of the Lim et al. (2022) study on conversational commerce research. The findings align with the previous survey in that western countries contribute more to research on SC with CAs.

Table 5
Top authors for SC with CAs.

Authors | Author affiliations | NP | TC | PY_start | C/Y
Bickmore T | Northeastern University, United States | 4 | 180 | 2004 | 9.47
Broadbent E | University of Auckland, New Zealand | 3 | 15 | 2019 | 3.75
Loveys K | University of Auckland, New Zealand | 3 | 15 | 2019 | 3.75
Wilks Y | Florida Institute for Human-Machine Cognition, United States | 3 | 13 | 2005 | 0.72
Clavel C | Polytechnic Institute of Paris, France | 3 | 8 | 2013 | 0.8
Li D | Microsoft Corporation, United States | 2 | 300 | 2018 | 60
Shum HY | Microsoft Corporation, United States | 2 | 300 | 2018 | 60
Gnewuch U | Karlsruhe Institute of Technology, Germany | 2 | 173 | 2018 | 34.6
Maedche A | Karlsruhe Institute of Technology, Germany | 2 | 173 | 2018 | 34.6
Morana S | Karlsruhe Institute of Technology, Germany | 2 | 173 | 2018 | 34.6

Abbreviations: C/Y, citations per year; Start_PY, the start of publication year; TC, total citations.

5. Theoretical foundations of SC with CAs (RQ3)

Table 6 lists the theories used in the literature when addressing SC with CAs. The table indicates that research on SC with CAs has adopted theories from disciplines such as

• Theories from Psychology: Balance theory, foot-in-the-door technique, uncanny valley theory, self-perception theory, self-disclosure theory, expectancy violation theory, the functionalist theory of emotions, theory of relational satisfaction, self-determination theory, commitment-consistency theory, theory of mind, perception-action model, emotional response theory, three-factor theory of anthropomorphism, and cognitive load theory.
• Theories from Sociology: Bourdieu's social theory, social penetration theory, social exchange theory, relationship development theory, social role theory, time interaction and participation (TIP) theory of groups, social response theory, computers are social actors theory, Aristotle's theory of friendship, and theory of companions.
• Theories from Media and Communication: Communication privacy management theory, interpersonal communication theory, speech act theory, media richness theory, social information processing theory, and domestication theory.
• Theories from Marketing: Agency theory, brand personality theory, technology paradox theory; and from Computer Science: affective computing theory.

Table 6
Theories in SC with CAs' research.

Year | Field | Theory | Origin | Example

Psychology
1946 | Psychology | Balance Theory | Heider (1946) | Rapp et al. (2021)
1966 | Psychology | Foot in the Door Technique (FITD) | Freedman and Fraser (1966) | Adam et al. (2021)
1970 | Psychology | Uncanny Valley Theory | Mori (1970) | Ta et al. (2020)
1972 | Psychology | Self-Perception Theory | Bem (1972) | Adam et al. (2021)
1978 | Psychology | Theory of Mind | Premack and Woodruff (1978) | Lee et al. (2020a)
1987 | Psychology | Self-Disclosure Theory | Derlaga and Berg (1987) | Lee et al. (2020c)
1987 | Psychology | The Functionalist Theory of Emotions | Barrett and Campos (1987) | Crolic et al. (2022)
1988 | Psychology | Expectancy Violation Theory | Burgoon and Hale (1988) | Croes and Antheunis (2021)
1988 | Psychology | Cognitive Load Theory | Sweller (1988) | Potdevin et al. (2021)
1996 | Psychology | Theory of Relational Satisfaction | Cole and Bradac (1996) | Bickmore et al. (2011)
2000 | Psychology | Self-Determination Theory | Ryan and Deci (2000) | Sinoo et al. (2018)
2001 | Psychology | Commitment-Consistency Theory | Cialdini (2001) | Adam et al. (2021)
2002 | Psychology | Perception-Action Model (PAM) | Preston and De Waal (2002) | Gama et al. (2011)
2006 | Psychology | Emotional Response Theory | Mottet et al. (2006) | Potdevin et al. (2021)
2007 | Psychology | Three-Factor Theory of Anthropomorphism | Epley et al. (2007) | Pradhan et al. (2019)

Sociology
1958 | Sociology | Social Exchange Theory | Homans (1958) | Tassiello et al. (2021)
1973 | Sociology | Social Penetration Theory | Altman and Taylor (1973) | Skjuve et al. (2021)
1980 | Sociology | ABCDE Model of Relationship Development | Levinger (1980) | Croes and Antheunis (2021)
1980 | Sociology | Bourdieu's Social Theory | Bourdieu (1980) | Possati (2022)
1991 | Sociology | Time Interaction and Performance (TIP) Theory of Groups | McGrath (1991) | Wang et al. (2021)
1994 | Sociology | Social Response Theory | Nass et al. (1994) | Gnewuch et al. (2018)
1999 | Sociology | Social Role Theory | Eagly and Wood (1999) | Rhee and Choi (2020)
2000 | Sociology | Computers Are Social Actors (CASA) | Nass and Moon (2000) | Gnewuch et al. (2018)
2009 | Sociology | Aristotle's Theory of Friendship | Aristotle (2009) | Bosch et al. (2022)
2011 | Sociology | Theory of Companions | Krämer et al. (2011) | Payr (2011)

Media and communication
1986 | Media and Communication | Media Richness Theory | Daft and Lengel (1986) | Hsieh and Lee (2021)
1992 | Media and Communication | Social Information Processing Theory | Walther and Burgoon (1992) | Lee et al. (2020a)
1992 | Media and Communication | Domestication Theory | Silverstone and Hirsch (1992) | Brause and Blank (2020)
2003 | Media and Communication | Speech Act Theory | Smith (2003) | Porra et al. (2020)
2006 | Media and Communication | Interpersonal Communication Theory | McCroskey et al. (2006) | Lei et al. (2021)
2008 | Media and Communication | Communication Privacy Management Theory | Petronio and Durham (2008) | Ha et al. (2021)

Marketing and computer science
1989 | Marketing | Agency Theory | Eisenhardt (1989) | Cheng and Jiang (2020)
1997 | Marketing | Brand Personality Theory | Aaker (1997) | Youn and Jin (2021)
1998 | Marketing | Technology Paradox Theory | Mick and Fournier (1998) | Wilson-Nash et al. (2020)
1998 | Computer Science | Affective Computing Theory | Lisetti (1998) | Mensio et al. (2018)

The earliest theory stems from psychology in 1946 (i.e., balance theory) and the latest theory from sociology in 2011 (i.e., theory of companions). A significant percentage (68 %) of the theoretical foundations comes from the psychology and sociology domains, indicating the significance of psychology and sociology in social companionship. Future researchers may use theories from communications to develop futuristic models for chatbot companions.

6. Intellectual structure analysis

The intellectual structure reveals the underlying themes and constructs building the research domain. Techniques like co-occurrence analysis and bibliographic coupling discern the domain's past, present, and future research directions.

6.1. Keyword co-occurrence analysis (RQ4)

research domain (Donthu et al., 2021a; Zupic and Cater, 2015; Callon Table 7
et al., 1983). Fig. 4 presents the co-occurrence network o keywords Thematic Structure o Keyword Co-occurrence Network.
emerged rom VOS viewer analysis. Five thematic clusters emerged, Themes and keywords DG OC APY AC
wherein cluster 1 denotes personication o conversational agents,
Cluster 1 (Red): personication o conversational agents
cluster 2 encapsulates articial companions and Socialbots, cluster 3 Embodied conversational agent 5 8 2014.13 31.25
captures human relations with conversational agents, cluster 4 refects Aective computing 6 6 2011.83 17.33
enablers o conversational agents, and cluster 5 signies AI as social Personication 8 3 2018.33 57.33
companions. Co-occurrence network parameters include occurrence Amazon Echo 7 3 2018.33 55.66
Amazon Alexa 7 3 2019.67 34
(OC), degree (DG), average publication year (APY), and average citation
(AC), as represented in Table 7, to derive an objective assessment o each
Cluster 2 (green): articial companions and social bots
thematic cluster
Conversational agents 23 15 2019.73 9.2
Where OC denotes the requency o a keyword, DG signies the Human-computer interaction 10 6 2020.5 5.66
number o connections that a node (keyword) o the network has with User experience 7 3 2020.33 17
other nodes, APY refects the hotness (more recent) and coldness (less Mobile phone 5 3 2020.67 14
recent) o a keyword, and AC denotes the average impact o that Socialbots 1 3 2019 12.66
Cluster 3 (blue): human relations with conversational agents
keyword on the eld (Donthu et al., 2021a; Lim et al., 2022; Chandra
Articial companions 4 7 2015 13.57
et al., 2022). Robots 6 5 2018.8 9.6
Loneliness 10 5 2020.2 2.6
6.1.1. Cluster 1: personication o conversational agents Older adults 6 4 2020.25 15.5
Friendship 5 3 2020.33 11.66
The rst cluster comprises keywords pointing at personication ca-
Cluster 4 (yellow): enablers o conversational agents
pabilities o conversational agents. The cluster highlights AI agents in Chatbot 25 16 2020.13 26.43
various orms, such as “Amazon Alexa”, “Amazon Echo”, and “embodied Anthropomorphism 25 10 2020.5 37.2
conversational agents”. The highest explored keyword in this cluster is Trust 14 7 2020.71 8.28
“embodied conversational agents” (OC: 8), ollowed by “aective Social presence 11 6 2019.83 63.66
Sel-disclosure 10 6 2020.33 13.5
computing” (OC: 6), and “personication” (OC: 3). The keyword with
Voice assistants 9 5 2020.6 10.2
the highest links and citations in the cluster is “personication” (DG: 8; Cluster 5 (purple): AI as social companions
AC: 57.33), which signies that most AI agents are designed with Articial intelligence 36 24 2019.71 18.79
human-like eatures such as name, gender, voice, etc. Another keyword Socialbots 2 3 2016.33 42.66
Social support 7 3 2020 13
in the cluster, “amazon echo”, received high citations (AC: 55.66) signiy
Companion 3 3 2018.67 4.33
preerence or echo as a companion over other AI agents. Personication Machine learning 6 3 2019 2
is hot and trending topic o the cluster (APY: 2018.33–2019.67).
Abbreviations: DG: degree; OC: occurrence; APY: average publication year; AC:
average citation.
6.1.2. Cluster 2: articial companions and social bots
The second cluster, articial companions and social bots, projects AI
as a companion in the orm o social chatbot, such as “conversational
agents,” and “social bots”. The keyword with the highest connections

Fig. 4. Co-word network o SC with CAs' research.


Notes: Cluster 1 (Red) = Personication o conversa-
tional agents.
Cluster 2 (Green) = Articial Companions and Social
Bots.
Cluster 3 (Blue) = Human relations with conversational
agents.
Cluster 4 (Yellow) = Enablers o conversational agents.
Cluster 5 (Purple) = AI as social companions


and requency in this cluster is “conversational agents” (DG: 23; OC: 15) coupling technique uses documents or network analysis (Donthu et al.,
ollowed by “human-computer interaction” (DG: 10; OC: 6), “user 2021b; Kessler, 1963). Despite having alternative techniques such as co-
experience” (DG: 7; OC: 3), “mobile phone” (DG: 5; OC: 3), and citation analysis, the current study opts or bibliographic coupling
“Socialbots” (DG: 1; OC: 3). However, in terms o the average citations (which groups “citing” publications) and keyword co-occurrence anal-
“user experience” scored highest (AC: 17) ollowed by “mobile phone” ysis (which groups “current” keywords) to refect current trends existing
(AC: 14) and “social bot” (AC: 12.66). The cluster emphasized the in the eld o SC with CAs (Lim et al., 2022; Donthu et al., 2021a).
enhanced user experience with computers capable o interacting with Noteworthily, the results revealed ve signicant thematic clusters
humans. According to the average publication year index, all the key- discussed in the ollowing subsections. Table 8 lists the top ten articles o
words in this cluster are hot and trending (APY: 2019 – 2020.67). each cluster based on total citations.
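For readers less familiar with the technique, the following minimal Python sketch (a hypothetical toy example, not the authors' actual workflow) shows how bibliographic coupling strength is obtained by counting the references that two citing documents share; the document identifiers and reference keys are invented for illustration.

from itertools import combinations

reference_lists = {
    "doc_A": {"Floridi2008", "Boshmaf2013", "Hepp2020"},
    "doc_B": {"Boshmaf2013", "Orabi2020", "Hepp2020"},
    "doc_C": {"Araujo2018", "Gnewuch2018"},
}

# Coupling strength = size of the intersection of two documents' reference sets.
for (d1, refs1), (d2, refs2) in combinations(reference_lists.items(), 2):
    strength = len(refs1 & refs2)
    if strength:
        print(f"{d1} -- {d2}: coupling strength {strength}")

The resulting weighted document network can then be clustered, for example in VOSviewer, to obtain thematic groupings such as those summarized in Table 8.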

6.2.1. Cluster 1: artificial companions and Socialbots

The first cluster encapsulates research on artificial companions and AI-powered social bots. The top ten most cited articles in this cluster are Boshmaf et al. (2013), Floridi (2008), Elyashar et al. (2013), Orabi et al. (2020), Biundo et al. (2016), Hepp (2020), Porra et al. (2020), Portela and Granell-Canut (2017), Siourti et al. (2018), and Portacolone et al. (2020) with 116, 45, 37, 34, 19, 15, 15, 13, 12 and 9 citations, respectively. Boshmaf et al. (2013), the cluster's most cited article (TC: 116), found that social media networks can be easily exploited with the help of social bots' infiltration campaigns, with a success rate of up to 80 %. Elyashar et al. (2013) advocated the use of programmable social bots by organizations on social media to build personal relations with users. However, data privacy programs should check infiltration campaigns. Orabi et al. (2020) reviewed the literature on detection methods and found that designing a bot detector is challenging when botmasters keep evolving with new infiltration techniques.

The second shade of the cluster emphasizes the emergence of artificial companions. Floridi (2008) identifies three important roles of artificial companions in the future: as a partner, an information-based server, and a memory steward that could simulate human life. Moreover, the author observed that the moral aspects of AI companions are still unexplored. Porra et al. (2020) argued that feelings, the substance of humanness, must be reserved only for human interaction. AI companions can transform human behaviour and actions. Biundo et al. (2016) invite cross-disciplinary researchers to develop companion applications in robotics, health, elderly care, etc. The cluster also includes studies on AI companions such as Portela and Granell-Canut (2017), Siourti et al. (2018), and Portacolone et al. (2020), which investigated user experience, user acceptance, and user behaviour with artificial companions, respectively.

6.2.2. Cluster 2: personification of conversational agents

The second cluster focuses on the personification of CAs. The top ten most cited articles in this cluster are Lopatovska and Williams (2018), Pradhan et al. (2019), Gao et al. (2018), Cho et al. (2019), Ta et al. (2020), Skjuve et al. (2021), Crolic et al. (2022), Brause and Blank (2020), Kim and Choudhury (2021), and Wilson-Nash et al. (2020) with 92, 42, 38, 37, 31, 12, 10, 10, 9 and 9 citations, respectively. Lopatovska and Williams (2018), which found that Alexa's personification behaviour is characterized as mindless politeness, is the most cited article (TC: 92) in the cluster. However, some evidence reveals that people consider Alexa an associate partner. For example, Gao et al. (2018) observed that users do personify the Echo as an assistant, friend, family member, wife, and girlfriend. The findings also revealed that users who personify the Echo tend to develop more positive emotions with Alexa than those who treat it as a speaker. Another study by Brause and Blank (2020) identified various use genres for smart speakers: companionship, sleep aid, peace of mind, self-control and productivity, increased accessibility, health care and support, convenience, and entertainment. Pradhan et al. (2019) found that participants fluidly moved between objectifying and personifying the Echo instead of categorizing it straightforwardly. A similar investigation on older adults by Kim and Choudhury (2021) reveals some benefits (such as enjoyment and convenience) and challenges (functional errors and limited speech technology) of using smart speakers. Anthropomorphic design cues can lead to better engagement.


Table 8
Bibliographic coupling themes in SC with CAs.

Themes and top articles | Authors | Citations

Cluster 1: artificial companions and social bots
Design and analysis of a social botnet | Boshmaf et al. (2013) | 116
Artificial intelligence's new frontier: artificial companions and the fourth revolution | Floridi (2008) | 45
Homing Socialbots: intrusion on a specific organization's employee using Socialbots | Elyashar et al. (2013) | 37
Detection of Bots in social media: a systematic review | Orabi et al. (2020) | 34
Companion-technology: an overview | Biundo et al. (2016) | 19
Artificial companions, social bots, and work bots: communicative robots as research objects of media and communication studies | Hepp (2020) | 15
"Can computer-based human-likeness endanger humanness?" – A philosophical and ethical perspective on digital assistants expressing feelings they can't have | Porra et al. (2020) | 15
A new friend in our smartphone? Observing interactions with chatbots in the search of emotional engagement | Portela and Granell-Canut (2017) | 13
The CaMeLi framework—A multimodal virtual companion for older adults | Siourti et al. (2018) | 12
Ethical issues raised by the introduction of artificial companions to older adults with cognitive impairment: a call for interdisciplinary collaborations | Portacolone et al. (2020) | 9

Cluster 2: personification of conversational agents
Personification of the Amazon Alexa: BFF or a mindless companion | Lopatovska and Williams (2018) | 92
"Phantom friend" or "just a box with information": personification and ontological categorization of smart speaker-based voice assistants by older adults | Pradhan et al. (2019) | 42
Alexa, my love: analyzing reviews of Amazon Echo | Gao et al. (2018) | 38
Once a kind friend is now a thing: understanding how conversational agents at home are forgotten | Cho et al. (2019) | 37
User experiences of social support from companion chatbots in everyday contexts: thematic analysis | Ta et al. (2020) | 31
My chatbot companion - a study of human-chatbot relationships | Skjuve et al. (2021) | 12
Blame the Bot: anthropomorphism and anger in customer–chatbot interactions | Crolic et al. (2022) | 10
Externalized domestication: smart speaker assistants, networks and domestication theory | Brause and Blank (2020) | 10
Exploring older adults' perception and use of smart speaker-based voice assistants: A longitudinal study | Kim and Choudhury (2021) | 9
Introducing the socialbot: a novel touchpoint along the young adult customer journey | Wilson-Nash et al. (2020) | 9

Cluster 3: user experience with conversational agents
Living up to the chatbot hype: the influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions | Araujo (2018) | 222
AI-based chatbots in customer service and their effects on user compliance | Adam et al. (2021) | 64
Faster is not always better: understanding the effect of dynamic response delays in human-chatbot interaction | Gnewuch et al. (2018) | 50
Effects of personalization and social role in voice shopping: an experimental study on product recommendation by a conversational voice agent | Rhee and Choi (2020) | 29
The impact of chatbot conversational skills on engagement and perceived humanness | Schuetzler et al. (2020) | 25
"In A.I., we trust?" The effects of parasocial interaction and technician versus luddite ideological views on chatbot-based customer relationship management in the emerging "feeling economy." | Youn and Jin (2021) | 23
Millennials' attitude towards chatbots: an experimental study in a social relationship perspective | De Cicco et al. (2020) | 21
Perceiving a mind in a chatbot: effect of mind perception and social cues on co-presence, closeness, and intention to use | Lee et al. (2020a) | 20
How do AI-driven chatbots impact user experience? Examining gratifications, perceived privacy risk, satisfaction, loyalty, and continued use | Cheng and Jiang (2020) | 15
How chatbots' social presence communication enhances consumer engagement: the mediating role of parasocial interaction and dialogue | Tsai et al. (2021) | 8

Cluster 4: social cues of conversational agents
A taxonomy of social cues for conversational agents | Feine et al. (2019) | 123
Towards caring machines | Bickmore and Picard (2004) | 64
Relational agents improve engagement and learning in science museum visitors | Bickmore et al. (2011) | 50
Tinker: a relational agent museum guide | Bickmore et al. (2013) | 41
Modalities for building relationships with handheld computer agents | Bickmore and Mauer (2006) | 25
Friendship with a robot: children's perception of similarity between a robot's physical and virtual embodiment that supports diabetes self-management | Sinoo et al. (2018) | 24
Can software agents influence human relations? - Balance theory in agent-mediated communities | Nakanishi et al. (2003) | 24
Effectiveness of an empathic chatbot in combating adverse effects of social exclusion on mood | De Gennaro et al. (2020) | 21
The impact of interpersonal closeness cues in text-based healthcare chatbots on attachment bond and the desire to continue interacting: An experimental design | Kowatsch et al. (2018) | 12
Can we be friends with Mitsuku? A longitudinal study on the process of relationship formation between humans and a social chatbot | Croes and Antheunis (2021) | 11

Cluster 5: artificial intelligence with emotional quotient
From Eliza to XiaoIce: challenges and opportunities with social chatbots | Shum et al. (2018) | 206
The design and implementation of XiaoIce, an empathetic social chatbot | Zhou et al. (2020) | 94
When chatbots meet patients: one-year prospective study of conversations between patients with breast cancer and a chatbot | Chaix et al. (2019) | 42
"I hear you, I feel you": encouraging deep self-disclosure through a chatbot | Lee et al. (2020c) | 32
How should my chatbot interact? A survey on social characteristics in human–chatbot interaction design | Chaves and Gerosa (2021) | 23
Effects of cognitive styles on an MSN virtual learning companion system as an adjunct to classroom instructions | Hsieh (2011) | 16
The human side of human-chatbot interaction: a systematic literature review of ten years of research on text-based chatbots | Rapp et al. (2021) | 14
Designing a chatbot as a mediator for promoting deep self-disclosure to a real mental health professional | Lee et al. (2020b) | 14
The rise of emotion-aware conversational agents: threats in digital emotions | Mensio et al. (2018) | 8
Chatbot to improve learning punctuation in Spanish and enhance open and flexible learning environments | Vázquez-Cano et al. (2021) | 6


Crolic et al. (2022) found that anthropomorphism negatively affects customer satisfaction because of lofty expectations from a humanlike chatbot, and suggested that empathetic chatbots can tackle human emotions (Crolic et al., 2022). Smart speaker usage decreases over time, and the device loses its presence at home (Cho et al., 2019). Notably, most studies examined user behaviour with Alexa or Google Home; however, advanced CAs provide more emotional and social support to their users. For example, Replika interacts with its users to fulfil their emotional desires. Ta et al. (2020) examined 1854 user reviews and interviewed sixty-six Replika users and found that Replika can mitigate the feelings of loneliness in users. Skjuve et al. (2021) echoed Ta et al.'s (2020) work and identified key characteristics of Replika: the non-judgmental, understanding, and accepting nature of the chatbot.

6.2.3. Cluster 3: user experience with conversational agents

The third cluster captures research on user experience with CAs. The top ten most cited articles in this cluster are Araujo (2018), Adam et al. (2021), Gnewuch et al. (2018), Rhee and Choi (2020), Schuetzler et al. (2020), Youn and Jin (2021), De Cicco et al. (2020), Lee et al. (2020a), Cheng and Jiang (2020), and Tsai et al. (2021) with 222, 64, 50, 29, 25, 23, 21, 20, 15, and 8 citations, respectively. The most cited article (TC: 222) in the cluster, Araujo (2018), revealed that anthropomorphic design cues influence consumers' emotional connection with the organization, with a mediating role of social presence. AI-driven chatbots offer utilitarian, hedonic, technology, and social gratifications that lead to customer satisfaction, customer loyalty, and continued use (Cheng and Jiang, 2020). The purpose of anthropomorphism is to offer a human-like interaction experience to consumers. Gnewuch et al. (2018) found that chatbots' dynamic delayed responses increased perceived humanness and customer satisfaction. A similar study by Adam et al. (2021) found that both anthropomorphism and the consistent staying of the chatbot can significantly increase user compliance with chatbot requests. Schuetzler et al. (2020) demonstrated that the conversational skill of the chatbot leads to anthropomorphism and social presence.

Moreover, a friend role (rather than an assistant role) played by the chatbot positively influences consumer engagement. De Cicco et al. (2020) showed that a social-oriented, rather than task-oriented, interaction style of the chatbot positively influences the user's perception of social presence. Also, the social role played by chatbots and personalized interaction influence customer attitudes towards the product recommended by the conversational agent (Rhee and Choi, 2020). When interacting with media characters, the user's interpersonal involvement represents parasocial interaction. Perceived parasocial interaction mediates the influence of social presence on consumer engagement outcomes (Tsai et al., 2021). Additionally, Youn and Jin (2021) found that a friend chatbot can build stronger parasocial interactions with consumers compared to an assistant chatbot.

6.2.4. Cluster 4: social cues of conversational agents

The fourth cluster highlights research on the social cues of conversational agents. The top ten most cited articles in this cluster are Feine et al. (2019), Bickmore and Picard (2004), Bickmore et al. (2011), Bickmore et al. (2013), Bickmore and Mauer (2006), Sinoo et al. (2018), Nakanishi et al. (2003), De Gennaro et al. (2020), Kowatsch et al. (2018), and Croes and Antheunis (2021) with 123, 64, 50, 41, 25, 24, 24, 21, 12, and 11 citations, respectively. Feine et al.'s (2019) review segmented forty-eight social cues into four categories (verbal, visual, auditory, and invisible). Kowatsch et al. (2018) identified closeness cues (such as visual, verbal, quasi-nonverbal, and relational) that can influence the attachment bonds of users with healthcare chatbots. Agent output modalities also affect the human-computer relationship status. Bickmore and Mauer (2006) identified four different modalities (text with no image, text with a static image, animated, and animated with nonverbal speech) and found that embodied and animated agents formed stronger social bonds with the users. This cluster also highlights the role of AI in combating loneliness and social exclusion. Interestingly, empathetic chatbots can now mitigate the effects of social exclusion on mood and feelings (De Gennaro et al., 2020). Bickmore and Picard (2004) demonstrate that computers can significantly impact users' perception of care. Bickmore et al. (2011) and Bickmore et al. (2013) found increased engagement of museum visitors when interacting with a virtual anthropomorphic robot. However, a study on diabetic children by Sinoo et al. (2018) revealed that physical robots established a stronger friendship with children than virtual avatars. Contrarily, Croes and Antheunis (2021), a study on user relations with "Mitsuku", found that participants experienced low feelings of friendship with the bot; also, the social process decreased after each interaction.

6.2.5. Cluster 5: artificial intelligence with emotional quotient

The fifth cluster concentrates on emotion-aware chatbots that can understand human feelings. The top ten cited articles in this cluster are Shum et al. (2018), Zhou et al. (2020), Chaix et al. (2019), Lee et al. (2020c), Chaves and Gerosa (2021), Hsieh (2011), Rapp et al. (2021), Lee et al. (2020b), Mensio et al. (2018), and Vázquez-Cano et al. (2021) with 206, 94, 42, 32, 23, 16, 14, 14, 8, and 6 citations, respectively. Shum et al.'s (2018) review of chatbot evolution from ELIZA (1960) to XiaoIce (2014), which demonstrated how XiaoIce could engage humans in long conversations by recognizing their emotions, has received the highest citations (TC: 206) in the cluster. Another study on XiaoIce by Zhou et al. (2020) measured the chatbot's effectiveness using conversation-turns per session (CPS) and found that the chatbot achieved an average CPS of 23, higher than any other chatbot or even humans. Chatbots' self-disclosure can trigger humans to disclose their personal feelings and thoughts (Lee et al., 2020c). "Confucius", a virtual learning companion, significantly benefited field-dependent learners (Hsieh, 2011). Even in the field of language learning, students value chatbots as they provide greater support and companionship in the learning process (Vázquez-Cano et al., 2021). All humans need an understanding and supportive associate everywhere, irrespective of the field (teaching, health and support, entertainment, etc.). Empathetic conversational agents can now do the needful with the help of an inbuilt emotional quotient. Chaix et al. (2019) conducted a study on 4797 cancer patients and observed that 88 % of participants felt that "Vik", a social chatbot, helped and supported them in tracking their treatment. A similar study by Lee et al. (2020b) found that participants revealed more information to a chatbot than to a mental health professional. Considering the development of conversational agents to an emotional awareness level, Mensio et al. (2018) questioned the understanding of chatbots towards human values.
7. Towards a conceptual framework (RQ5)

The current study presents the triangulation of major themes in SC with CAs using keyword co-occurrence analysis and bibliographic coupling analysis (Table 9). Extant literature is scant in presenting a holistic view on social companionship with AI. Thus, this study proposes a conceptual framework that has emerged from the content analysis of the articles used for conducting the current review. The framework details the antecedents, mediators, moderators, and outcomes of establishing SC with CAs, presented in Fig. 5, which provides foundational knowledge to future researchers and scholars in SC with CAs. Noteworthily, although the framework presents a comprehensive view of SC with CAs, it should not be considered specific or all-inclusive; rather, it should be viewed as a source of elemental knowledge that practitioners, designers, managers, business owners, and future researchers can utilize to expand the existing boundaries of the domain.


Table 9
Summary of research on SC with CAs.

Performance analysis (based on: field performance; bibliometric information)
Performance:
- Publication activity: a total of 126 articles were published between 2003 and May 2022.
- Top sources: most citations – Computers in Human Behaviour (TC: 286 citations); most publications – Lecture Notes in Computer Science (NP: 7 publications).
- Top publications: Araujo, 2018 (222 citations).
- Top authors: most citations – Shum HY and Li D (three hundred citations each); most publications – Bickmore T (four publications).

Keyword co-occurrence analysis (based on: body of knowledge; author keywords)
Themes:
- Cluster 1: personification of conversational agents – embodied conversational agent, affective computing, personification, Amazon Echo, Amazon Alexa.
- Cluster 2: artificial companions and social bots – conversational agents, human-computer interaction, user experience, mobile phone, Socialbot.
- Cluster 3: human relations with conversational agents – artificial companions, robots, loneliness, older adults, friendship.
- Cluster 4: enablers of conversational agents – chatbot, anthropomorphism, trust, social presence, self-disclosure, voice assistants.
- Cluster 5: AI as social companions – artificial intelligence, Socialbots, social support, companion, machine learning.

Bibliographic coupling (based on: body of knowledge; citing publications)
Themes:
- Cluster 1: artificial companions and Socialbots.
- Cluster 2: personification of conversational agents.
- Cluster 3: user experience with conversational agents.
- Cluster 4: social cues of conversational agents.
- Cluster 5: artificial intelligence with emotional quotient.

Conceptual framework (based on: body of knowledge; factors)
Factors:
- Antecedents of SC with CAs – conversational capability: conversational skill, humour and voice, interaction style, response time, self-expressiveness; functional capability: information and entertainment; social capability: role of chatbot, anthropomorphic design, media richness, mind perception.
- Mediators of SC with CAs: perceived dialogue, trust, attitude, product involvement, perceived intelligence, perceived anthropomorphism, social presence, parasocial interaction, privacy concern, task and social attraction, closeness, love, passionate desire to use AI.
- Moderators of SC with CAs – situational characteristics: brand involvement, mood (angry or calm); user characteristics: psychographic characteristics (experience with chatbot, consumer innovativeness, ideological views) and demographic characteristics (age, gender).
- Outcomes of SC with CAs: intention to use, user engagement, perceived humanness, user experience, loneliness reduction, emotional connection, continued use, service loyalty.

7.1. Antecedents

Antecedents are determinants of the outcomes, and their effects on the outcome variables can be mediated and moderated by intervening variables. The current research broadly buckets the antecedents of SC with CAs into three categories: conversational capability, functional capability, and social capability. Conversational capability reflects the aspects related to the skills of artificial intelligence to communicate and interact with humans, such as conversational skill (Schuetzler et al., 2020), humour and voice (Moussawi and Benbunan-Fich, 2021), interaction style (De Cicco et al., 2020), response time (Gnewuch et al., 2018), and self-expressivity (Ramadan, 2021). Functional capability refers to the aspects related to the skills of AI to perform various tasks for the users, such as providing information (Cheng and Jiang, 2020) and entertainment (Cheng and Jiang, 2020). Social capability reflects the aspects related to the skills of AI to provide social support and connection to the users, such as the role of the chatbot (Youn and Jin, 2021), anthropomorphic design cues (Araujo, 2018), media richness (Hsieh and Lee, 2021), and mind perception (Lee et al., 2020a).

7.2. Mediators

Mediators are intervening factors that define the nature of the relationship between antecedents and outcomes. The study identified mediators such as perceived dialogue (Tsai et al., 2021), trust (Pitardi and Marriott, 2021), attitude (Hsieh and Lee, 2021), product involvement (Rhee and Choi, 2020), perceived intelligence (Moussawi and Benbunan-Fich, 2021), perceived anthropomorphism (Moussawi and Benbunan-Fich, 2021), social presence (Adam et al., 2021), parasocial interaction (Youn and Jin, 2021), privacy concern (Bawack et al., 2021), task and social attraction (Lei et al., 2021), closeness (Lee et al., 2020a), love (Hernandez-Ortega and Ferreira, 2021), and passionate desire to use AI (Ramadan, 2021).

7.3. Moderators

Moderators influence the degree of relationship between antecedents and outcomes; antecedents and mediators; and mediators and consequences. Two moderators play the foremost role in defining social companionship with conversational agents: situational factors and user characteristics. Situational characteristics reflect the environmental influence, such as brand involvement (Hasan et al., 2021) and customer mood (angry or calm) (Crolic et al., 2022), while user characteristics reflect the individual's psychographic factors, such as past experience (Kowatsch et al., 2018), consumer innovativeness (Kasilingam, 2020), and ideological views on technology (Youn and Jin, 2021), as well as demographic characteristics of individuals such as their age (Pradhan et al., 2019) and gender (Bergen, 2016).

7.4. Outcomes

Outcomes are the factors that result from the influence of antecedents and intervening variables like mediators and moderators. Social companionship with conversational agents leads to intention to use chatbots (Lee et al., 2020a), user engagement (Tsai et al., 2021), perceived humanness (Gnewuch et al., 2018), user experience (Bawack et al., 2021), loneliness reduction (Jones et al., 2021), emotional connection (Araujo, 2018), continued use (Cheng and Jiang, 2020), and service loyalty (Hernandez-Ortega and Ferreira, 2021).
perceived humanness (Gnewuch et al., 2018), user experience (Bawack


Fig. 5. Conceptual framework for antecedents, mediators, moderators, and outcomes of SC with CAs.

The proposed conceptual framework unfurls three foundational capabilities in a conversational AI that intends to form a relational bond with the users: conversational capabilities, functional capabilities, and social capabilities. Future researchers are guided to investigate each capability and its significance in enabling CA designs to achieve elevated companionship levels. As the framework emerges from the extant literature, one notable observation is that there can be additions to capabilities beyond what is already in the literature, such as emotional, motivational, and educational capabilities. This opens immense opportunities for practitioners to design AI companions with multiple capabilities. The framework also proposes that the more capable a CA is, the stronger the companionship it can develop with the user.

8. Discussion

The present study offered a state-of-the-art literature review encompassing science mapping and intellectual structure analysis. Science mapping entails performance analysis of scientific actors like sources, authors, and documents in the research domain. Intellectual structure analysis entails keyword co-occurrence and bibliographic coupling analysis. Intellectual structure analysis also discerns major theories, themes, and a conceptual framework encompassing the antecedents, mediators, moderators, and outcomes of social companionship with conversational agents.

The science mapping reveals growing interest in the research domain, and the spurt indicates the relevance of the topic in contemporary times. The study identified 126 articles on SC with CAs published between 2003 and May 2022 (RQ1). The publication trend of the field indicated that about 66 % of articles have appeared only in the last three years (2019–2022). The recent emergence of the topic is due to technological advancement through AI in the fourth industrial revolution, which gave rise to several AI applications in the market. Businesses are now developing AI agents to form emotional connections with humans. The multifold applications make conversational commerce vital and futuristic, warranting academicians' and practitioners' attention.

In terms of source performance, the review found that Computers in Human Behaviour is the preferred avenue for publication, with the highest citations (286). Most of the top sources publishing in the field belong to computer science, communication, human-computer interaction, and psychology, highlighting the multidisciplinary scope for research on SC with CAs. Regarding author performance, the study found that Shum and Li are the most cited (300 citations) authors in the field. However, Bickmore has the highest number of articles (4). The top authors come from both practice and academia. Moreover, significant contributions come from western countries, which suggests more investigation and research are required in other parts of the globe to promote higher diversity and inclusivity in the field. The study has also revealed a plethora of theories used in SC with CAs. Theories are from varied disciplines such as sociology, psychology, media and communication, marketing, and computer science, with the earliest theory from the field of psychology in 1946 (i.e., balance theory) and the latest theory from the field of sociology in 2018 (i.e., actor-network theory). Noteworthily, psychology and sociology are prominent as the area is primarily associated with human psychology and relationship development. Future researchers are encouraged to use communication theories, morality and ethics, and theories from human relations, which can help develop and examine more advanced AI companions.

8.1. Theoretical contributions and implications


The present study's theoretical contributions align with the guidelines for advancing theories with the help of bibliometric research (Mukherjee et al., 2022). The study draws three essential and notable contributions. The first contribution appears to promote the objective discovery of the field's knowledge clusters, namely artificial companions and Socialbots, personification of conversational agents, user experience with conversational agents, social cues of conversational agents, and artificial intelligence with emotional quotient. The knowledge clusters enrich the understanding of research design and philosophy along with the major streams. The objective discovery also helps strengthen and develop the domain's theoretical foundations. More importantly, future researchers can identify and explore the research streams to fill the existing research gaps discussed in a later section of this paper. The second contribution appears to deliver a nomological clarity of constructs and the network that maps them together through co-occurrence analysis of author keywords. Finally, the third contribution comes through the development of the conceptual framework (Fig. 5), along with enlisting the theories in SC with CAs (Table 6). The proposed holistic conceptual framework reflects the field's antecedents, mediators, moderators, and outcomes, unfolding numerous opportunities for theoretical developments to the current understanding of the domain.

8.2. Managerial contributions and implications

The managerial contribution of the current study aligns practice with theory (Mukherjee et al., 2022). First, this review contributes to practice by providing managers with a macroscopic overview of research and development in social companionship with conversational agents. The top publications in Table 4 can help managers know more about artificial companions and their relationships with humans. More specifically, the review allows managers to look at different streams of research on social companionship with AI, namely artificial companions and Socialbots, personification of conversational agents, user experience with conversational agents, social cues of conversational agents, and artificial intelligence with emotional quotient, along with the top publications of each stream.

Second, this study contributes to practice by presenting the list of top authors in the field of SC with CAs, which can help managers reach out to experts in the area for any guidance they require while designing or developing a new artificial companion. Managers and practitioners can contact the top authors listed in Table 5 to gain an expert opinion on SC with CAs. Though experts and leading authors are often busy and not readily available for casual appointments, they consider good collaboration opportunities offered respectfully, depending upon the research model, scope, and interest.

Third, this study contributes to practice by proposing the conceptual framework (Fig. 5), developed through the content analysis of 126 articles, that can help managers look at the antecedents, mediators, moderators, and outcomes of SC with CAs. Also, managers are encouraged to consider these essential variables while designing companion technology for the marketplace.

8.3. Future research directions (RQ6)

The review suggests several future research agendas for the major themes identified in the study on SC with CAs. The following subsections unfold new and exciting avenues in the domain.

8.3.1. Artificial companions and Socialbots

Empathetic CAs offer companionship to humans. However, social bots may exploit and misuse users' personal information via infiltration on social media platforms such as Facebook and Twitter (Boshmaf et al., 2013; Elyashar et al., 2013). Social bot design advancements challenge bot detection (Orabi et al., 2020). This leads to confusion in the emerging metaverse about whether profiles belong to real humans or fake bots. Future researchers need to design efficient socialbot detectors to deal with this confusion. AI companions also bring challenges; for example, the growing influence of such technology in human society will reshape human emotions, decisions, and actions (Floridi, 2008). AI companions may amplify the negative emotions of the user if not programmed morally. Negative emotions may induce depression and suicidal tendencies (Possati, 2022). Trust is always a concern for users before adopting advanced technologies (Omrani et al., 2022). Future researchers and practitioners must design companions laden with moral values and ethics to trigger positive thoughts in human minds. Thus, the study proposes the following research questions:

FRQ1. How does social bot infiltration influence consumer behaviour?

FRQ2. What are artificial companions' positive and negative impacts on human emotions, decisions, and actions?

FRQ3. What factors are responsible for perceiving an artificial companion as moral and ethical?

8.3.2. Personification of conversational agents

Anthropomorphic chatbot design leads to personification behaviour but, at times, increases user expectations, which may negatively affect customer satisfaction when a customer is angry (Crolic et al., 2022). Future researchers should design empathetic chatbots that understand user emotions before responding to users' needs. Future bots should redirect clients to human executives if they sense infuriated customers. Also, AI can help patients and older adults remember medicines and diet plans to ensure their well-being. Extant literature points at chatbot personality; however, it is silent on the influence of the user's personality on human-computer interaction. Extrovert users find AI agents talkative and active compared to introverts. Users may lose connection with smart speakers due to the conversational agent's inability to modify its conversation style based on the user's personality (Cho et al., 2019). Future researchers should work on CAs that assess users' personalities and interact accordingly. Thus, the study proposes the following future research questions:

FRQ4. What factors are responsible for perceiving an artificial agent as having a caring and loving personality?

FRQ5. How should conversational agents be designed to support patients' well-being, lonely individuals, and older citizens?

FRQ6. What is the impact of user personality on interaction with artificial companions?

8.3.3. User experience with conversational agents

User experience has always taken center stage whenever technology transforms business. According to Araujo (2018), the anthropomorphic design of chatbots leads to an emotional connection between the organization and users. The stream highlights research on user perception, satisfaction, and experience with conversational agents based on the perception of social presence and anthropomorphism. However, the literature is scant on the role of SC in influencing the user experience. Factors such as perceived intelligence, anthropomorphism, and social presence are predominant in the literature, while conversational capability's role in building relationships has remained unexplored. Schuetzler et al. (2020) found that conversational skill leads to perceived anthropomorphism and social presence. Future researchers should study the determinants of conversational capability and its influence on human-computer relationships. Communication theories could help future researchers design more interesting artificial companions. Therefore, the study proposes the following research questions:

FRQ7. What is the user experience in a relationship with artificial companions?

FRQ8. Does human-computer interaction affect the user's relations with other humans in society?

FRQ9. What factors determine the conversational capability of an artificial companion?

8.3.4. Social cues of conversational agents


Conversational agents display various social cues, such as verbal, visual, auditory, and invisible cues (Feine et al., 2019). These social cues create warmth in the conversational agents, making them converse like humans. Bickmore and Mauer (2006) found that embodied and animated relational agents tend to form stronger social bonds with the users than disembodied and non-animated agents. Thus, it would be interesting to know which social signals are responsible for establishing an emotional connection between users and CAs. Croes and Antheunis (2021) found that social processes decrease after interacting with the chatbot. However, studies on chatbots like Tinker and Replika suggest that users can form long-term relationships with AI companions (Bickmore et al., 2011; Bickmore et al., 2013; Ta et al., 2020; Skjuve et al., 2021). The contradictory findings suggest that future researchers explore the comparative performance of two or more artificial companions, such as Mitsuku and Replika. Thus, the study proposes the following future research questions:

FRQ10. Which social signals or cues are responsible for establishing companionship and emotional connection with conversational agents?

FRQ11. What is the comparative performance of two or more artificial companions?

FRQ12. How do service settings, socioeconomic status and cultural norms influence the selection of social cues and the design of artificial companions?

8.3.5. Artificial intelligence with emotional quotient

Conversational agents have evolved through the years, from ELIZA (1960) to empathetic chatbots like XiaoIce (2014) (Shum et al., 2018). Empathetic chatbots are conversational agents that can recognize and understand human emotions. Recently, Google's LaMDA was believed to have become sentient and developed human-like feelings (including love, fear, confidence, etc.). Noteworthily, the degree of a chatbot's self-awareness or emotional awareness is unexplored. It opens new avenues of research on artificial life, artificial consciousness, and artificial beings. There is scope for future research comparing human empathy with AI empathy. Artificial intelligence with an intellectual quotient has snatched several jobs from humans in the market. Would it be acceptable to design AI with an emotional quotient too? And what about humanity, then? Feelings and emotions are humans' assets, making them different from machines. Porra et al. (2020) questioned the human-likeness of AI agents, arguing that feelings, the very substance of humanness, must be reserved only for human interaction. Therefore, the study proposes the following research questions:

FRQ13. What factors indicate that a conversational agent has feelings, self-awareness, and consciousness like humans?

FRQ14. How do individual differences affect users' perceived awareness and consciousness of a conversational agent?

FRQ15. What are the opportunities, challenges, and threats to human society from artificial intelligence with an emotional quotient?

9. Conclusion

This study is a comprehensive systematic review of SC with CAs that attempts to reveal the nuances of artificial companions and their relationship with humans. The COVID-19 pandemic and other reasons for social exclusion result in feelings of loneliness among individuals, including older adults. The artificial intelligence expertise of human society has addressed this issue by designing advanced artificial companions such as XiaoIce and Replika, which can function as therapeutic resources for their users. The future of human-computer interaction lies in developing AI companions, their capabilities, determinants, social acceptability, and their influence on society. The study highlights the usage of artificial companions in every service sector (hospitality, tourism, education, healthcare, entertainment, etc.). Therefore, AI companions in the customer journey are a paradigm shift in designing marketing strategies. In such a scenario, this literature review offers a bird's eye view of the emerging field of SC with CAs.

CRediT authorship contribution statement

All authors have contributed to all aspects of this paper.

Declaration of competing interest

There are no conflicts of interest for this manuscript.

Data availability

Data will be made available on request.

References

Aaker, J.L., 1997. Dimensions of brand personality. J. Mark. Res. 34 (3), 347–356.
Adam, C., Cavedon, L., Padgham, L., 2010, July. "Hello Emily, how are you today?" - personalized dialogue in a toy to engage children. In: Proceedings of the 2010 Workshop on Companionable Dialogue Systems, pp. 19–24.
Adam, M., Wessel, M., Benlian, A., 2021. AI-based chatbots in customer service and their effects on user compliance. Electron. Mark. 31 (2), 427–445.
Aksnes, D.W., Langfeldt, L., Wouters, P., 2019. Citations, citation indicators, and research quality: an overview of basic concepts and theories. SAGE Open 9 (1), 2158244019829575.
Altman, I., Taylor, D.A., 1973. Social Penetration: The Development of Interpersonal Relationships. Holt, Rinehart & Winston.
Ameen, N., Tarhini, A., Reppel, A., Anand, A., 2021. Customer experiences in the age of artificial intelligence. Comput. Hum. Behav. 114, 106548.
Araujo, T., 2018. Living up to the chatbot hype: the influence of anthropomorphic design cues and communicative agency framing on conversational agent and company perceptions. Comput. Hum. Behav. 85, 183–189.
Aristotle, 2009. Nicomachean Ethics (trans: Ross, W.D.). Oxford: Oxford University.
Aw, E.C.X., Tan, G.W.H., Cham, T.H., Raman, R., Ooi, K.B., 2022. Alexa, what's on my shopping list? Transforming customer experience with digital voice assistants. Technol. Forecast. Soc. Chang. 180, 121711.
Barrett, K., Campos, J., 1987. Perspectives on emotional development: II. A functionalist approach to emotions. In: Handbook of Infant Development, 2nd ed. Wiley-Interscience, New York, pp. 555–578.
Bawack, R.E., Wamba, S.F., Carillo, K.D.A., 2021. Exploring the role of personality, trust, and privacy in customer experience performance during voice shopping: evidence from SEM and fuzzy set qualitative comparative analysis. Int. J. Inf. Manag. 58, 102309.
Bem, D.J., 1972. Self-perception theory. Adv. Exp. Soc. Psychol. 6, 1–62.
Benyon, D., Mival, O., 2010, December. From human-computer interactions to human-companion relationships. In: Proceedings of the First International Conference on Intelligent Interactive Technologies and Multimedia, pp. 1–9.
Bergen, H., 2016. 'I'd blush if I could': digital assistants, disembodied cyborgs and the problem of gender. Word and Text, A Journal of Literary Studies and Linguistics 6 (01), 95–113. In: Proceedings of the 8th International Conference on Human-Agent Interaction, pp. 221–223.
Bickmore, T., Mauer, D., 2006, April. Modalities for building relationships with handheld computer agents. In: CHI'06 Extended Abstracts on Human Factors in Computing Systems, pp. 544–549.
Bickmore, T., Pfeifer, L., Schulman, D., 2011, September. Relational agents improve engagement and learning in science museum visitors. In: International Workshop on Intelligent Virtual Agents. Springer, Berlin, Heidelberg, pp. 55–67.
Bickmore, T.W., Picard, R.W., 2004, April. Towards caring machines. In: CHI'04 Extended Abstracts on Human Factors in Computing Systems, pp. 1489–1492.
Bickmore, T.W., Vardoulakis, L.M.P., Schulman, D., 2013. Tinker: a relational agent museum guide. Auton. Agent. Multi-Agent Syst. 27 (2), 254–276.
Biundo, S., Höller, D., Schattenberg, B., Bercher, P., 2016. Companion-technology: an overview. Künstl. Intell. 30 (1), 11–20.
Bloch, L.-R., Lemish, D., 1999. Disposable love: the rise and fall of a virtual pet. New Media Soc. 1 (3), 283–303.
Blondel, V.D., Guillaume, J.L., Lambiotte, R., Lefebvre, E., 2008. Fast unfolding of communities in large networks. J. Stat. Mech: Theory Exp. 10008, 1–12.
Bosch, M., Fernandez-Borsot, G., Comas, M.I., Figa Vaello, J., 2022. Evolving friendship? Essential changes, from social networks to artificial companions. Soc. Netw. Anal. Min. 12 (1), 1–10.
Boshmaf, Y., Muslukhov, I., Beznosov, K., Ripeanu, M., 2013. Design and analysis of a social botnet. Comput. Netw. 57 (2), 556–578.
Bothun, D., Lieberman, M., Rao, A.S., 2017. Bot.Me: A Revolutionary Partnership.
Bourdieu, P., 1980. Le sens pratique. Minuit, Paris.
Boyack, K.W., Klavans, R., 2010. Co-citation analysis, bibliographic coupling, and direct citation: which citation approach represents the research front most accurately. J. Am. Soc. Inf. Sci. Technol. 61 (12), 2389–2404.
Bracken, L.J., Oughton, E.A., 2006. 'What do you mean?' The importance of language in developing interdisciplinary research. Trans. Inst. Br. Geogr. 31 (3), 371–382.
Brause, S.R., Blank, G., 2020. Externalized domestication: smart speaker assistants, networks and domestication theory. Inf. Commun. Soc. 23 (5), 751–763.
Burgoon, J.K., Hale, J.L., 1988. Nonverbal expectancy violations: model elaboration and application to immediacy behaviors. Commun. Monogr. 55 (1), 58–79.


Callon, M., Courtial, J.P., Turner, W.A., Bauin, S., 1983. From translations to problematic networks: an introduction to co-word analysis. Soc. Sci. Inf. 22 (2), 191–235.
Campos, J., Paiva, A., 2010, September. May: my memories are yours. In: International Conference on Intelligent Virtual Agents. Springer, Berlin, Heidelberg, pp. 406–412.
Cavazza, M., Smith, C., Charlton, D., Zhang, L., Turunen, M., Hakulinen, J., 2008. A 'companion' ECA with planning and activity modelling. Auton. Agent. Multi-Agent Syst. 1281–1284.
Chaix, B., Bibault, J.E., Pienkowski, A., Delamon, G., Guillemassé, A., Nectoux, P., Brouard, B., 2019. When chatbots meet patients: one-year prospective study of conversations between patients with breast cancer and a chatbot. JMIR Cancer 5 (1), e12856.
Chandra, S., Verma, S., Lim, W.M., Kumar, S., Donthu, N., 2022. Personalization in personalized marketing: trends and ways forward. Psychol. Mark. 39 (8), 1529–1562.
Chaturvedi, R., Verma, S., 2022. Artificial Intelligence-Driven Customer Experience: Overcoming the Challenges. California Management Review Insights. Accessed from https://cmr.berkeley.edu/assets/documents/pdf/2022-03-artificial-intelligence-driven-customer-experience-overcoming-the-challenges.pdf.
Chaves, A.P., Gerosa, M.A., 2021. How should my chatbot interact? A survey on social characteristics in human–chatbot interaction design. Int. J. Hum. Comput. Interact. 37 (8), 729–758.
Chen, C., Ibekwe-SanJuan, F., Hou, J., 2010. The structure and dynamics of co-citation clusters: a multiple perspective co-citation analysis. J. Am. Soc. Inf. Sci. Technol. 61 (7), 1386–1409.
Cheng, Y., Jiang, H., 2020. How do AI-driven chatbots impact user experience? Examining gratifications, perceived privacy risk, satisfaction, loyalty, and continued use. J. Broadcast. Electron. Media 64 (4), 592–614.
Cho, M., Lee, S.S., Lee, K.P., 2019, June. Once a kind friend is now a thing: understanding how conversational agents at home are forgotten. In: Proceedings of the 2019 on Designing Interactive Systems Conference, pp. 1557–1569.
Cialdini, R.B., 2001. Harnessing the Science of Persuasion.
Cole, T., Bradac, J.J., 1996. A lay theory of relational satisfaction with best friends. J. Soc. Pers. Relat. 13 (1), 57–83.
Conversational AI market report, 2021. MarketsandMarkets.com. Accessed on February 27, 2023, at https://www.marketsandmarkets.com/Market-Reports/conversational-ai-market-49043506.html#:~:text=%5B317%20Pages%20Report%5D%20The%20global,USD%206.8%20billion%20in%202021.
Croes, E.A., Antheunis, M.L., 2021. Can we be friends with Mitsuku? A longitudinal study on the process of relationship formation between humans and a social chatbot. J. Soc. Pers. Relat. 38 (1), 279–300.
Crolic, C., Thomaz, F., Hadi, R., Stephen, A.T., 2022. Blame the bot: anthropomorphism and anger in customer–chatbot interactions. J. Mark. 86 (1), 132–148.
Daft, R.L., Lengel, R.H., 1986. Organizational information requirements, media richness and structural design. Manag. Sci. 32 (5), 554–571.
Darcy, A., Daniels, J., Salinger, D., Wicks, P., Robinson, A., 2021. Evidence of human-level bonds established with a digital conversational agent: cross-sectional, retrospective observational study. JMIR Formative Res. 5 (5), e27868.
De Cicco, R., Silva, S.C., Alparone, F.R., 2020. Millennials' attitude toward chatbots: an experimental study in a social relationship perspective. Int. J. Retail Distrib. Manag. 48 (11), 1213–1233.
De Gennaro, M., Krumhuber, E.G., Lucas, G., 2020. Effectiveness of an empathic chatbot in combating adverse effects of social exclusion on mood. Front. Psychol. 3061.
Derlaga, V.J., Berg, J.H. (Eds.), 1987. Self-disclosure: Theory, Research, and Therapy. Springer Science & Business Media.
Donthu, N., Kumar, S., Mukherjee, D., Pandey, N., Lim, W.M., 2021a. How to conduct a bibliometric analysis: an overview and guidelines. J. Bus. Res. 133, 285–296.
Donthu, N., Kumar, S., Pattnaik, D., Lim, W.M., 2021b. A bibliometric retrospection of marketing from the lens of psychology: insights from. Psychol. Mark. 38 (5), 834–865.
Dwivedi, Y.K., Kshetri, N., Hughes, L., Slade, E.L., Jeyaraj, A., Kar, A.K., Wright, R., 2023. "So what if ChatGPT wrote it?" Multidisciplinary perspectives on opportunities, challenges and implications of generative conversational AI for research, practice and policy. Int. J. Inf. Manag. 71, 102642.
Eagly, A.H., Wood, W., 1999. The origins of sex differences in human behavior: evolved dispositions versus social roles. Am. Psychol. 54, 408–423.
Gao, Y., Pan, Z., Wang, H., Chen, G., 2018, October. Alexa, my love: analyzing reviews of Amazon Echo. In: 2018 IEEE Smart World, Ubiquitous Intelligence and Computing, Advanced and Trusted Computing, Scalable Computing and Communications, Cloud and Big Data Computing, Internet of People and Smart City Innovation. IEEE, pp. 372–380.
Gasteiger, N., Loveys, K., Law, M., Broadbent, E., 2021. Friends from the future: a scoping review of research into robots and computer agents to combat loneliness in older people. Clin. Interv. Aging 941–971.
Gnewuch, U., Morana, S., Adam, M.T., Maedche, A., 2018. Faster is not always better: understanding the effect of dynamic response delays in human-chatbot interaction. In: 26th European Conference on Information Systems: Beyond Digitization - Facets of Socio-technical Change, ECIS 2018, Portsmouth, UK, June 23-28, 2018. Ed.: U. Frank (p. 143975).
Goodell, J.W., Kumar, S., Lim, W.M., Pattnaik, D., 2021. Artificial intelligence and machine learning in finance: identifying foundations, themes, and research clusters from the bibliometric analysis. J. Behav. Exp. Financ. 32, 100577.
Ha, Q.A., Chen, J.V., Uy, H.U., Capistrano, E.P., 2021. Exploring the privacy concerns in using intelligent virtual assistants under perspectives of information sensitivity and anthropomorphism. Int. J. Hum. Comput. Interact. 37 (6), 512–527.
Hamilton, R., Ferraro, R., Haws, K.L., Mukhopadhyay, A., 2021. Traveling with companions: the social customer journey. J. Mark. 85 (1), 68–92.
Hasan, R., Shams, R., Rahman, M., 2021. Consumer trust and perceived risk for voice-controlled artificial intelligence: the case of Siri. J. Bus. Res. 131, 591–597.
Heider, F., 1946. Attitudes and cognitive organization. J. Psychol. 21 (1), 107–112.
Hepp, A., 2020. Artificial companions, social bots and work bots: communicative robots as research objects of media and communication studies. Media Cult. Soc. 42 (7–8), 1410–1426.
Hernandez-Ortega, B., Ferreira, I., 2021. How smart experiences build service loyalty: the importance of consumer love for smart voice assistants. Psychol. Mark. 38 (7), 1122–1139.
Hirano, T., December 2016. Gatebox, Holographic Virtual Assistant, Launches Pre-orders for Geeks in Japan, US. The Bridge. Retrieved from http://thebridge.jp/en/2016/12/gatebox-launch.
Hjørland, B., 2013. Citation analysis: a social and dynamic approach to knowledge organisation. Inf. Process. Manag. 49 (6), 1313–1325.
Homans, G.C., 1958. Social behavior as exchange. Am. J. Sociol. 63 (6), 597–606.
Hsieh, S.H., Lee, C.T., 2021. Hey Alexa: examining the effect of perceived socialness in usage intentions of AI assistant-enabled smart speaker. J. Res. Interact. Mark. 15 (2), 267–294.
Hsieh, S.W., 2011. Effects of cognitive styles on an MSN virtual learning companion system as an adjunct to classroom instructions. J. Educ. Technol. Soc. 14 (2), 161–174.
Huang, M.H., Rust, R., Maksimovic, V., 2019. The feeling economy: managing in the next generation of artificial intelligence (AI). Calif. Manag. Rev. 61 (4), 43–65.
Jones, V.K., Hanus, M., Yan, C., Shade, M.Y., Boron, J.B., Bicudo, R.M., 2021. Reducing loneliness among aging adults: the roles of personal voice assistants and anthropomorphic interactions. Front. Public Health 9.
Kar, A.K., Kushwaha, A.K., 2021. Facilitators and barriers of artificial intelligence adoption in business–insights from opinions using big data analytics. Inf. Syst. Front. 1–24.
Kasilingam, D.L., 2020. Understanding the attitude and intention to use smartphone chatbots for shopping. Technol. Soc. 62, 101280.
Keestra, M., 2017. Metacognition and reflection by interdisciplinary experts: insights from cognitive science and philosophy. Issues Interdisc. Stud. 35, 121–169.
Kessler, M.M., 1963. Bibliographic coupling between scientific papers. Am. Doc. 14 (1), 10–25.
Kim, S., Choudhury, A., 2021. Exploring older adults' perception and use of smart speaker-based voice assistants: a longitudinal study. Comput. Hum. Behav. 124, 106914.
Kowatsch, T., Nißen, M., Rüegger, D., Stieger, M., Flückiger, C., Allemand, M., von Wangenheim, F., 2018. The Impact of Interpersonal Closeness Cues in Text-Based Healthcare Chatbots on Attachment Bond and the Desire to Continue Interacting: An Experimental Design.
Krämer, N.C., Eimler, S., Von Der Pütten, A., Payr, S., 2011. Theory of companions: what can theoretical models contribute to applications and understanding of human-robot
Eisenhardt, K., 1989. Agency theory: a review and assessment. Acad. Manag. Rev. 14 (1), interaction? Appl. Arti. Intell. 25 (6), 474–502.
57–74. Kushwaha, A.K., Kumar, P., Kar, A.K., 2021. What impacts customer experience or B2B
Elyashar, A., Fire, M., Kagan, D., Elovici, Y., 2013, August. Homing socialbots: intrusion enterprises on using AI-enabled chatbots? Insights rom Big data analytics. Ind.
on a specic organization’s employee using socialbots. In: Proceedings o the 2013 Mark. Manag. 98, 207–221.
IEEE/ACM International Conerence on Advances in Social Networks Analysis and Laranjo, L., Dunn, A.G., Tong, H.L., Kocaballi, A.B., Chen, J., Bashir, R., Coiera, E., 2018.
Mining, pp. 1358–1365. Conversational agents in healthcare: a systematic review. J. Am. Med. Inorm. Assoc.
Epley, N., Waytz, A., Cacioppo, J.T., 2007. On seeing human: a three-actor theory o 25 (9), 1248–1258.
anthropomorphism. Psychol. Rev. 114 (4), 864. Lee, S., Lee, N., Sah, Y.J., 2020a. Perceiving a mind in a Chatbot: eect o mind
Far, S.B., Rad, A.I., 2018. Security Analysis o Big Data on Internet o Things. arXiv perception and social cues on co-presence, closeness, and intention to use. Int. J.
preprint arXiv:1808.09491. Hum. Comput. Interact. 36 (10), 930–940.
Feine, J., Gnewuch, U., Morana, S., Maedche, A., 2019. A taxonomy o social cues or Lee, Y.C., Yamashita, N., Huang, Y., 2020b. Designing a chatbot as a mediator or
conversational agents. Int. J. Hum. Comput. Stud. 132, 138–161. promoting deep sel-disclosure to a real mental health proessional. Pro. ACM
Floridi, L., 2008. Articial intelligence’s new rontier: articial companions and the Human-Comput. Interact. 4 (CSCW1), 1–27.
ourth revolution. Meta Philos. 39 (4–5), 651–655. Lee, Y.C., Yamashita, N., Huang, Y., Fu, W., 2020c. “I hear you, i eel you”: encouraging
Freedman, J.L., Fraser, S.C., 1966. Compliance without pressure: the oot-in-the-door deep sel-disclosure through a Chatbot. In: Proceedings o the 2020 CHI Conerence
technique. J. Pers. Soc. Psychol. 4 (2), 195. on Human Factors in Computing Systems, pp. 1–12.
Gama, S., Barata, G., Gonçalves, D., Prada, R., Paiva, A., 2011. SARA: social aective Lei, S.I., Shen, H., Ye, S., 2021. A comparison between chatbot and human service:
relational agent: a study on the role o empathy in articial social agents. In: customer perception and reuse intention. Int. J. Contemp. Hosp. Manag. 33 (11),
Aective Computing and Intelligent Interaction: 4th International Conerence, ACII 3977–3995.
2011, Memphis, TN, USA, October 9–12, 2011, Proceedings, Part I 4. Springer, Levinger, G., 1980. Toward the analysis o close relationships. J. Exp. Soc. Psychol. 16
Berlin Heidelberg, pp. 507–516. (6), 510–544.


Leydesdor, L., Raols, I., 2011. Indicators o the interdisciplinarity o journals: diversity, Portacolone, E., Halpern, J., Luxenberg, J., Harrison, K.L., Covinsky, K.E., 2020. Ethical
centrality, and citations. J. In. Secur. 5 (1), 87–100. issues raised by the introduction o articial companions to older adults with
Lim, M.Y., 2012. Memory models or intelligent social companions. In: Human-computer cognitive impairment: a call or interdisciplinary collaborations. J. Alzheimers Dis.
Interaction: The Agency Perspective. Springer, Berlin, Heidelberg, pp. 241–262. 76 (2), 445–455.
Lim, W.M., Kumar, S., Verma, S., Chaturvedi, R., 2022. Alexa, what do we know about Portela, M., Granell-Canut, C., 2017, September. A new riend in our smartphone?
conversational commerce? Insights rom a systematic literature review. Psychol. Observing interactions with Chatbots in the search o emotional engagement. In:
Mark. 39 (6), 1129–1155. Proceedings o the XVIII International Conerence on Human Computer Interaction,
Lisetti, C.L., 1998. Aective Computing. pp. 1–7.
Lopatovska, I., Williams, H., 2018, March. Personication o the Amazon Alexa: BFF or a Possati, L.M., 2022. Psychoanalyzing articial intelligence: the case o Replika. AI & Soc.
mindless companion. In: Proceedings o the 2018 Conerence on Human Inormation 1–14.
Interaction and Retrieval, pp. 265–268. Potdevin, D., Clavel, C., Sabouret, N., 2021. Virtual intimacy in human-embodied
Loveys, K., Fricchione, G., Kolappa, K., Sagar, M., Broadbent, E., 2019. Reducing patient conversational agent interactions: the infuence o multimodality on its perception.
loneliness with articial agents: design insights rom evolutionary neuropsychiatry. J. Multimodal User Interaces 15, 25–43.
J. Med. Internet Res. 21 (7), e13664. Pradhan, A., Findlater, L., Lazar, A., 2019. "Phantom riend" or" just a box with
Lowry, R., 2015. September Meet the Lonely Japanese Men in Love With Virtual inormation" personication and ontological categorization o smart speaker-based
Girlriends. Time. Retrieved rom. http://time.com/3998563/virtual-love-japan/. voice assistants by older adults. In: Proceedings o the ACM on Human-Computer
Retrieved rom. Interaction, 3. CSCW, pp. 1–21.
MacLeod, M., 2018. What makes interdisciplinarity dicult? Some consequences o Premack, D., Woodru, G., 1978. Does the chimpanzee have a theory o mind? Behav.
domain specicity in interdisciplinary practice. Synthese 195 (2), 697–720. Brain Sci. 1 (4), 515–526.
Maroukhani, P., Asadi, S., Ghobakhloo, M., Jannesari, M.T., Ismail, W.K.W., 2022. How Preston, S.D., De Waal, F.B., 2002. Empathy: its ultimate and proximate bases. Behav.
do interactive voice assistants build brands’ loyalty? Technol. Forecast. Soc. Chang. Brain Sci. 25 (1), 1–20.
183, 121870. Provoost, S., Lau, H.M., Ruwaard, J., Riper, H., 2017. Embodied conversational agents in
McCroskey, L.L., McCroskey, J.C., Richmond, V.P., 2006. Analysis and improvement o clinical psychology: a scoping review. J. Med. Internet Res. 19 (5), e6553.
the measurement o interpersonal attraction and homophily. Commun. Q. 54 (1), Radicchi, F., Castellano, C., Cecconi, F., Loreto, V., Parisi, D., 2004. Dening and
1–31. identiying communities in networks. Proc. Natl. Acad. Sci. 101 (9), 2658–2663.
McGoldrick, P.J., Keeling, K.A., Beatty, S.F., 2008. A typology o roles or avatars in Radziwill, N., Benton, M., 2017. Evaluating quality o Chatbots and intelligent
online retailing. J. Mark. Manag. 24 (3–4), 433–461. conversational agents. Sotw. Qual. Pro. 19 (3), 25.
McGrath, J.E., 1991. Time, interaction, and perormance (TIP) a theory o groups. Small Ramadan, Z., Farah, F., M., and El Essrawi, L., 2021. From Amazon.com to Amazon.
Group Res. 22 (2), 147–174. Love: how Alexa is redening companionship and interdependence or people with
McLean, G., Osei-Frimpong, K., Barhorst, J., 2021. Alexa, do voice assistants infuence special needs. Psychol. Mark. 38 (4), 596–609.
consumer brand engagement?–examining the role o AI powered voice assistants in Ramadan, Z.B., 2021. “Alexaying” shoppers: the examination o Amazon’s captive
infuencing consumer brand engagement. J. Bus. Res. 124, 312–328. relationship strategy. J. Retail. Consum. Serv. 62, 102610.
Mensio, M., Rizzo, G., Morisio, M., 2018, April. The rise o emotion-aware conversational Rapp, A., Curti, L., Boldi, A., 2021. The human side o human-chatbot interaction: a
agents: threats in digital emotions. In: Companion Proceedings o the The Web systematic literature review o ten years o research on text-based chatbots. Int. J.
Conerence 2018, pp. 1541–1544. Hum. Comput. Stud. 151, 102630.
Mhatre, P., Gedam, V., Unnikrishnan, S., Verma, S., 2020. Circular economy in the built Rhee, C.E., Choi, J., 2020. Eects o personalization and social role in voice shopping: an
environment–literature review and theory development. J. Build. Eng. 101995. experimental study on product recommendation by a conversational voice agent.
Mick, D.G., Fournier, S., 1998. Paradoxes o technology: consumer cognizance, emotions, Comput. Hum. Behav. 109, 106359.
and coping strategies. J. Consum. Res. 25 (2), 123–143. Ryan, R.M., Deci, E.L., 2000. Sel-determination theory and the acilitation o intrinsic
Mori, M., 1970. Bukimi no tani [the uncanny valley]. Energy 7, 33–35. motivation, social development, and well-being. Am. Psychol. 55 (1), 68.
Mottet, T.P., Frymier, A.B., Beebe, S.A., 2006. Theorizing about instructional Sa, M.F., Al Sadrani, B., Mustaa, A., 2021. Virtual voice assistant applications
communication. In: Handbook o Instructional Communication: Rhetorical and improved expressive verbal abilities and social interactions in children with autism
Relational Perspectives, pp. 255–282. spectrum disorder: a single-subject experimental study. Int. J. Dev. Disabil. 1–13.
Moussawi, S., Benbunan-Fich, R., 2021. The eect o voice and humour on users’ Schuetzler, R.M., Grimes, G.M., Scott Giboney, J., 2020. The impact o chatbot
perceptions o personal intelligent agents. Behav. Inorm. Technol. 40 (15), conversational skill on engagement and perceived humanness. J. Manag. In. Syst. 37
1603–1626. (3), 875–900.
Mukherjee, D., Lim, W.M., Kumar, S., Donthu, N., 2022. Guidelines or advancing theory Shum, H.Y., He, X.D., Li, D., 2018. From Eliza to XiaoIce: challenges and opportunities
and practice through bibliometric research. J. Bus. Res. 148, 101–115. with social chatbots. Front. In. Technol. Electron. Eng. 19 (1), 10–26.
Murtarelli, G., Gregory, A., Romenti, S., 2021. A conversation-based perspective or Silverstone, R., Hirsch, E., 1992. Consuming Technologies: Media and Inormation in
shaping ethical human–machine interactions: the particular challenge o chatbots. Domestic Spaces. Routledge, London.
J. Bus. Res. 129, 927–935. Sinoo, C., van Der Pal, S., Henkemans, O.A.B., Keizer, A., Bierman, B.P., Looije, R.,
Nakanishi, H., Nakazawa, S., Ishida, T., Takanashi, K., Isbister, K., 2003, July. Can Neerincx, M.A., 2018. Friendship with a robot: children’s perception o similarity
sotware agents infuence human relations? Balance theory in agent-mediated between a robot’s physical and virtual embodiment that supports diabetes sel-
communities. In: Proceedings o the Second International Joint Conerence on management. Patient Educ. Couns. 101 (7), 1248–1255.
Autonomous Agents and Multiagent Systems, pp. 717–724. Siourti, C., Quintas, J., Ben-Moussa, M., Hanke, S., Nijdam, N.A., Konstantas, D., 2018.
Nass, C., Moon, Y., 2000. Machines and mindlessness: social responses to computers. In: Bi, Y., Kapoor, S. (Eds.), The CaMeLi Framework—A Multimodal Virtual
J. Soc. Issues 56 (1), 81–103. Companion or Older Adults.
Nass, C., Steuer, J., Tauber, E.R., 1994, April. Computers are social actors. In: Skjuve, M., Følstad, A., Fostervold, K.I., Brandtzaeg, P.B., 2021. My chatbot companion-a
Proceedings o the SIGCHI Conerence on Human Factors in Computing Systems, study o human-chatbot relationships. Int. J. Hum. Comput. Stud. 149, 102601.
pp. 72–78. Smith, B. (Ed.), 2003. John Searle. Cambridge University Press, Cambridge.
Odekerken-Schröder, G., Mele, C., Russo-Spena, T., Mahr, D., Ruggiero, A., 2020. Sridevi, G.M., Suganthi, S.K., 2022. AI based suitability measurement and prediction
Mitigating loneliness with companion robots in the COVID-19 pandemic and between job description and job seeker proles. Int. J. In. Manag. Data Insights 2
beyond: an integrative ramework and research agenda. J. Serv. Manag. 31 (6), (2), 100109.
1149–1162. Suwono, L.V., Sihombing, S.O., 2016. Factors aecting customer loyalty o tness
Omrani, N., Rivieccio, G., Fiore, U., Schiavone, F., Agreda, S.G., 2022. To trust or not to centers: an empirical study. JDM (Jurnal Dinamika Manajemen) 7 (1), 45–55.
trust? An assessment o trust in AI-based systems: concerns, ethics and contexts. Sweller, J., 1988. Cognitive load during problem solving: eects on learning. Cogn. Sci.
Technol. Forecast. Soc. Chang. 181, 121763. 12, 257–285.
Orabi, M., Mouheb, D., Al Aghbari, Z., Kamel, I., 2020. Detection o bots in social media: Ta, V., Grith, C., Boateld, C., Wang, X., Civitello, M., Bader, H., Loggarakis, A., 2020.
a systematic review. In. Process. Manag. 57 (4), 102250. User experiences o social support rom companion chatbots in everyday contexts:
Payr, S., 2011. Social engagement with robots and agents: introduction. Appl. Arti. thematic analysis. J. Med. Internet Res. 22 (3), e16235.
Intell. 25 (6), 441–444. Takayanagi, K., Kirita, T., Shibata, T., Shibata, T., Shibata, T., 2014. Comparison o
Pesty, S., Duhaut, D., 2011, December. Articial companion: building a impacting verbal and emotional responses o elderly people with mild/moderate dementia and
relation. In: 2011 IEEE International Conerence on Robotics and Biomimetics. IEEE, those with severe dementia in responses to seal robot, PARO. Front. Aging Neurosci.
pp. 2902–2907. 6, 257-257. Retrieved 3 19, 2023.
Petronio, S., Durham, W., 2008. Understanding and applying communication privacy Tassiello, V., Tillotson, J.S., Rome, A.S., 2021. “Alexa, order me a pizza!”: the mediating
management theory. In: Engaging theories in interpersonal communication, role o psychological power in the consumer–voice assistant interaction. Psychol.
pp. 309–322. Mark. 38 (7), 1069–1080.
Pilkington, A., Catherine, L.H., 1999. Is production and operations management a Thorne, S., 2020. Hey Siri, tell me a story: digital storytelling and AI authorship.
discipline? A citation/co-citation study. Int. J. Oper. Prod. Manag. 19 (1), 7–20. Convergence 26 (4), 808–823.
Pitardi, V., Marriott, H.R., 2021. Alexa, she’s not human but… unveiling the drivers o Tsai, W.H.S., Liu, Y., Chuan, C.H., 2021. How chatbots’ social presence communication
consumers’ trust in voice-based articial intelligence. Psychol. Mark. 38 (4), enhances consumer engagement: the mediating role o parasocial interaction and
626–642. dialogue. J. Res. Interact. Mark. 15 (3), 460–482.
Porra, J., Lacity, M., Parks, M.S., 2020. “Can computer based human-likeness endanger Tsiourti, C., Quintas, J., Ben-Moussa, M., Hanke, S., Nijdam, N.A., Konstantas, D., 2016,
humanness?”–a philosophical and ethical perspective on digital assistants expressing September. The CaMeLi ramework—a multimodal virtual companion or older
eelings they can’t have. In. Syst. Front. 22 (3), 533–547. adults. In: Proceedings o SAI Intelligent Systems Conerence. Springer, Cham,
pp. 196–217.


Rijul Chaturvedi is a Fellow at National Institute of Industrial Engineering (NITIE), Mumbai, India. His research interests include emotional AI in marketing, conversational commerce, and experiential marketing.

Sanjeev Verma is presently working as Professor (Marketing) at National Institute of Industrial Engineering (NITIE), Mumbai, India. Dr. Verma is an active researcher who has authored or co-authored more than 70 publications in refereed international/national journals and conference proceedings and has authored one book. His papers have been published in various international journals of repute, such as Psychology & Marketing, Journal of Interactive Marketing, Tourism Review, Government Information Quarterly, Journal of Marketing Communications, Journal of Internet Commerce, Journal of Modeling Management, Journal of Global Marketing, and International Journal of Marketing and Philanthropy.

Ronnie Das is an Associate Professor of Digital and Data-Driven Marketing at Audencia Business School, France, where he is also Head of the Marketing Department. Prior to joining Audencia, Ronnie spent over a decade in UK higher education researching emerging technologies and the application of machine learning to understanding Marketing 4.0 and transformative consumer behaviour. His research with the Newcastle University Urban Observatory, on big-data-driven insight into citizen behaviour during the COVID crisis, was published by the World Economic Forum and the London School of Economics Impact Blog. Ronnie has chaired prestigious research workstreams and research hub (SuperGen Energy Network) committees funded by UKRI (EPSRC), and has successfully developed and delivered projects funded and supported by ING Economics. He has further won applied innovation challenges organized by the European Commission, including EU Datathon 2020 and the European Innovation Sprint 2020, and has been invited to talk at the European Central Bank.

Yogesh K Dwivedi is a Professor of Digital Marketing and Innovation and Founding Director of the Emerging Markets Research Centre (EMaRC) at the School of Management, Swansea University, Wales, UK. In addition, he holds a Distinguished Research Professorship at the Symbiosis Institute of Business Management (SIBM), Pune, India. Professor Dwivedi currently leads the International Journal of Information Management as its Editor-in-Chief. His research interests lie at the interface of Information Systems (IS) and Marketing, focusing on consumer adoption and diffusion of emerging digital innovations, digital government, and digital and social media marketing, particularly in the context of emerging markets. Professor Dwivedi has published more than 500 articles in a range of leading academic journals and conferences that are widely cited (more than 41 thousand times as per Google Scholar), and he has been named on the annual Highly Cited Researchers™ 2020 and 2021 lists from Clarivate Analytics. He is an Associate Editor of the Journal of Business Research, European Journal of Marketing, Government Information Quarterly and International Journal of Electronic Government Research, and a Senior Editor of the Journal of Electronic Commerce Research.

