
A Communicative Perspective on Human–Robot Collaboration in Industry: Mapping Communicative Modes on Collaborative Scenarios

Published in: International Journal of Social Robotics

Abstract

In this paper, we identify three types of human–robot interaction by combining different collaborative industrial human–cobot scenarios with communicative modes. In particular, we address the following scenarios: independent, simultaneous, sequential, supportive, and apprenticeship, and map them to the communicative modes proxemics, kinesics, chronemics, haptics, speech, light signals, and GUIs. Concerning signals generated by the robot, we take a closer look at two important aspects of human–cobot communication: robot transparency and coordination of activity with the user. With regard to human signals that need to be interpreted by the robot, we focus on human turn-taking and on humans tutoring a robot. Our work is supplemented with existing studies showing the effects of utilizing communicative signals in human–robot collaboration. Moreover, we identify challenges arising from an increased use of interactivity in human–cobot interaction. The work is meant to provide targeted guidance for augmenting different collaborative industrial human–cobot scenarios with communicative modes, in order to foster a smoother and more natural collaboration between human and robotic coworkers.
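Purely as an illustrative sketch (not taken from the paper itself), the two sides of this mapping can be represented as a small data structure. The scenario and mode names below come from the abstract; the single example pairing is a hypothetical placeholder, not the paper's actual mapping:

```python
from enum import Enum, auto

class Scenario(Enum):
    """Collaborative industrial human-cobot scenarios named in the abstract."""
    INDEPENDENT = auto()
    SIMULTANEOUS = auto()
    SEQUENTIAL = auto()
    SUPPORTIVE = auto()
    APPRENTICESHIP = auto()

class Mode(Enum):
    """Communicative modes named in the abstract."""
    PROXEMICS = auto()
    KINESICS = auto()
    CHRONEMICS = auto()
    HAPTICS = auto()
    SPEECH = auto()
    LIGHT_SIGNALS = auto()
    GUI = auto()

# Hypothetical placeholder entry only; the paper derives its own
# scenario-to-mode mapping, which is not reproduced here.
mapping: dict[Scenario, set[Mode]] = {
    Scenario.SEQUENTIAL: {Mode.LIGHT_SIGNALS, Mode.KINESICS},
}
```

A table of this shape makes it straightforward to look up which communicative modes a given collaborative scenario could be augmented with.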


Data Availability

Data sharing is not applicable to this article as no datasets were generated or analysed.

Notes

  1. https://www.iso.org/standard/62996.html (access: 10/07/2022).

  2. https://www.universal-robots.com/case-stories/temar-sp-z-oo/ (access: 08/04/2022).

  3. https://www.universal-robots.com/case-stories/vitesco-technologies/ (access: 08/04/2022).

  4. https://www.universal-robots.com/case-stories/temar-sp-z-oo/ (access: 08/02/2022).

  5. https://www.universal-robots.com/case-stories/vitesco-technologies/ (access: 08/02/2022).

  6. https://robotiq.com/de/produkte/adaptiver-3-finger-robotergreifer (access: 08/02/2022).


Funding

This research was supported by the Vienna Science and Technology Fund (WWTF) project “Human tutoring of robots in industry” (NXT19-005), and the Austrian Research Promotion Agency (FFG) Ideen Lab 4.0 project CoBot Studio (872590).

Author information


Corresponding author

Correspondence to Stephanie Gross.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Animal and Human Rights

The paper presents theoretical work; therefore, this research did not involve human participants or animals, and no informed consent was necessary. The authors have no relevant financial or non-financial interests to disclose.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Gross, S., Krenn, B. A Communicative Perspective on Human–Robot Collaboration in Industry: Mapping Communicative Modes on Collaborative Scenarios. Int J of Soc Robotics 16, 1315–1332 (2024). https://doi.org/10.1007/s12369-023-00991-5


