Abstract
In this paper, we identify three types of human–robot interaction by combining different collaborative industrial human–cobot scenarios with communicative modes. In particular, we address the scenarios independent, simultaneous, sequential, supportive, and apprenticeship, and map them to the communicative modes proxemics, kinesics, chronemics, haptics, speech, light signals, and GUIs. Concerning signals generated by the robot, we take a closer look at two important aspects of human–cobot communication: robot transparency and coordination of activity with the user. With regard to human signals that need to be interpreted by the robot, we focus on human turn-taking and on humans tutoring a robot. Our work is supplemented with existing studies showing the effects of utilizing communicative signals in human–robot collaboration. Moreover, we identify challenges arising from an increased use of interactivity in human–cobot interaction. The work is meant to provide targeted guidance for augmenting different collaborative industrial human–cobot scenarios with communicative modes, in order to foster smoother and more natural collaboration between human and robotic coworkers.
Data Availability
Data sharing is not applicable to this article as no datasets were generated or analysed.
Funding
This research was supported by the Vienna Science and Technology Fund (WWTF) project “Human tutoring of robots in industry” (NXT19-005), and the Austrian Research Promotion Agency (FFG) Ideen Lab 4.0 project CoBot Studio (872590).
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Animal and Human Rights
The paper presents theoretical work; therefore, this research did not involve human participants or animals, and no informed consent was necessary. The authors also have no relevant financial or non-financial interests to disclose.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Gross, S., Krenn, B. A Communicative Perspective on Human–Robot Collaboration in Industry: Mapping Communicative Modes on Collaborative Scenarios. Int J of Soc Robotics 16, 1315–1332 (2024). https://doi.org/10.1007/s12369-023-00991-5