Abstract
Robots, traditionally confined to factories, are nowadays moving into domestic and assistive environments, where they must deal with complex object shapes, deformable materials, and pose uncertainties at human pace. To attain quick 3D perception, new cameras delivering registered depth and intensity images at high frame rates hold great promise, and many robotics researchers are therefore experimenting with structured-light RGBD and Time-of-Flight (ToF) cameras. In this paper both technologies are critically compared to help researchers evaluate their use on real robots. The focus is on 3D perception at close distances for the different types of objects that a robot may handle in a human environment. We review three robotics applications. The analysis of several performance aspects indicates that the two camera types are complementary: the user-friendliness and higher resolution of RGBD cameras are counterbalanced by the ability of ToF cameras to operate outdoors and to perceive fine details.
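For readers unfamiliar with how a lock-in ToF camera produces the depth values compared here, the sketch below (not taken from the paper) illustrates the standard principle: the sensor demodulates a periodically modulated infrared signal at each pixel and converts the recovered phase shift into metric depth. The 20 MHz modulation frequency and the four-bucket sampling convention are illustrative assumptions; actual values and sign conventions depend on the specific camera.

```python
import numpy as np

# Illustrative constants (assumptions, not taken from the paper).
C = 299_792_458.0   # speed of light in m/s
F_MOD = 20e6        # assumed modulation frequency in Hz

def four_bucket_phase(a0, a1, a2, a3):
    """Recover the per-pixel phase shift (radians) from four amplitude
    samples taken at 0, 90, 180 and 270 degrees of the modulation period.
    Sign conventions vary between devices; this is one common form."""
    return np.mod(np.arctan2(a3 - a1, a0 - a2), 2.0 * np.pi)

def phase_to_depth(phase):
    """Convert a phase shift in [0, 2*pi) to depth in metres.
    The unambiguous range is c / (2 * F_MOD), i.e. about 7.5 m at 20 MHz."""
    return (C * phase) / (4.0 * np.pi * F_MOD)

# Example: a phase shift of pi/4 corresponds to roughly 0.94 m at 20 MHz.
print(phase_to_depth(np.pi / 4))
```

This phase-based measurement explains two traits discussed in the paper: depth precision is largely independent of ambient texture (so ToF cameras can work outdoors), while the limited sensor resolution and per-pixel noise constrain the level of detail compared with structured-light RGBD devices.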
Additional information
This research is partially funded by the EU GARNICS project FP7-247947, by CSIC project MANIPlus 201350E102, by the Spanish Ministry of Science and Innovation under project PAU+ DPI2011-27510, and by the Catalan Research Commission under Grant SGR-155.
Cite this article
Alenyà, G., Foix, S. & Torras, C. Using ToF and RGBD cameras for 3D robot perception and manipulation in human environments. Intel Serv Robotics 7, 211–220 (2014). https://doi.org/10.1007/s11370-014-0159-5