
Advances in Intelligent Systems and Technologies

First International Conference on Machines, Computing and Management Technologies

A Review of Art and Real World Applications of Intelligent Perception Systems

Ceren Ergenc and Yifei LI, Communication Science, University of Amsterdam, 1012 WX Amsterdam, Netherlands.


Online First: 30 July 2022
Publisher Name: AnaPub Publications, Kenya.
ISSN (Online): 2959-3042
ISSN (Print): 2959-3034
ISBN (Online): 978-9914-9946-0-5
ISBN (Print): 978-9914-9946-3-6
Pages: 076-086

Abstract


Sensory data and AI/ML techniques are central to many robotics applications, which makes perception in robots an active research topic. These applications include object recognition, scene understanding, environment representation, activity identification, semantic location classification, object modeling, and pedestrian/human detection. Robotic perception, as used in this article, refers to the collection of machine learning (ML) techniques and methods that allow robots to process sensory data, draw conclusions, and act accordingly. Recent developments in ML, chiefly deep learning methodologies, have improved robotic perception systems, in turn making it possible to realize applications and activities that were previously out of reach. These advances in complex robotic tasks, human-robot interaction, decision-making, and intelligent reasoning are due in part to the rapid development and widespread adoption of ML algorithms. This article surveys real-world and state-of-the-art applications of intelligent perception systems in robots.
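The sense-infer-act pipeline described above can be sketched in miniature. The following is an illustrative example only, not a method from the article: it stands in for a learned perception model with a toy nearest-centroid classifier, and all sensor values, class labels, and actions are hypothetical.

```python
# Illustrative sense -> infer -> act loop for a robotic perception system.
# The centroids, feature values, and action mapping are hypothetical.

import math

# Hypothetical class prototypes: mean feature vectors (width_m, height_m)
# for two object classes a robot might need to recognise.
CENTROIDS = {
    "pedestrian": (0.5, 1.7),
    "vehicle": (1.8, 1.5),
}

def classify(features):
    """Nearest-centroid classification: a minimal stand-in for the
    learned perception models surveyed in the article."""
    return min(CENTROIDS, key=lambda c: math.dist(features, CENTROIDS[c]))

def act(label):
    """Map a perceived object class to a (hypothetical) robot action."""
    return "stop" if label == "pedestrian" else "slow_down"

# Simulated sensory observations: (width, height) in metres.
for obs in [(0.45, 1.65), (2.0, 1.4)]:
    label = classify(obs)
    print(obs, "->", label, "->", act(label))
```

In a real system the classifier would be a trained deep network and the observations would come from cameras, LiDAR, or RGB-D sensors, but the control flow (sense, infer a label, select an action) is the same.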

Keywords


Robotic Perception, Perception Systems, Environment Representation, Machine Learning, Artificial Intelligence


Cite this article


Ceren Ergenc and Yifei LI, “A Review of Art and Real World Applications of Intelligent Perception Systems”, Advances in Intelligent Systems and Technologies, pp. 076-086, 2022. doi:10.53759/aist/978-9914-9946-0-5_9

Copyright


© 2023 Ceren Ergenc and Yifei LI. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.