

Cognitive Emotion Aware Systems Using Multimodal Signals and Reinforcement Learning



Journal of Machine and Computing

Received On : 28 November 2024

Revised On : 04 January 2025

Accepted On : 18 March 2025

Published On : 05 July 2025

Volume 05, Issue 03

Pages : 1349-1362


Abstract


Predicting human behaviour is a complex task. Traditional methods often rely on explicit user input or external observation, which can be restrictive and impractical in real-world scenarios. As an alternative, Brain-Computer Interfaces (BCIs) offer a more direct and specific means of accessing cognitive and emotional states, providing valuable insights into human intentions and decision-making processes. This paper proposes a novel method that predicts and suggests personalised emotion-based activities for individual users based on multi-modal sensory data collected from the brain, body, and environment. Our method overcomes the limitations of conventional systems by incorporating multi-modal data collected throughout the day to better understand user context and intent. By analysing these data, we predict emotion-based activities for the user's day. We train our method using state-of-the-art, nature-inspired reinforcement learning algorithms and agent technology so that its recommendations are continuously optimised and personalised. The performance evaluation shows that the proposed method achieves an accuracy of 95.6% and an F1 score of 84%, 2 to 3% higher accuracy than state-of-the-art AI-based emotion detection methods.
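
To make the recommendation-and-feedback loop concrete, the sketch below is illustrative only and is not the authors' implementation: it shows how an epsilon-greedy contextual-bandit reinforcement learning agent could map fused multi-modal features (EEG, physiological, and environmental signals are assumed to be pre-processed into a single feature vector) to personalised activity suggestions and update itself from user feedback. The activity names, feature dimensionality, and hyperparameters are all hypothetical.

# Illustrative sketch (Python): epsilon-greedy contextual bandit for
# emotion-aware activity suggestion. Feature vectors, activity names and
# hyperparameters are hypothetical, not taken from the paper.
import numpy as np

ACTIVITIES = ["take_a_walk", "listen_to_music", "breathing_exercise"]  # hypothetical

class ActivityAgent:
    def __init__(self, n_features, n_actions, lr=0.05, epsilon=0.1):
        # One linear value model per candidate activity.
        self.weights = np.zeros((n_actions, n_features))
        self.lr = lr
        self.epsilon = epsilon

    def suggest(self, features):
        # Explore occasionally; otherwise pick the activity with the
        # highest predicted reward for the current context.
        if np.random.rand() < self.epsilon:
            return np.random.randint(len(self.weights))
        return int(np.argmax(self.weights @ features))

    def update(self, features, action, reward):
        # Move the chosen activity's value estimate toward the observed feedback.
        error = reward - self.weights[action] @ features
        self.weights[action] += self.lr * error * features

# Usage: fuse (hypothetical) brain, body and environment features into one vector.
agent = ActivityAgent(n_features=8, n_actions=len(ACTIVITIES))
context = np.random.rand(8)                # stand-in for fused multi-modal features
action = agent.suggest(context)
agent.update(context, action, reward=1.0)  # reward from explicit or implicit user feedback
print("Suggested activity:", ACTIVITIES[action])

In this framing, user feedback (explicit ratings or implicit engagement signals) plays the role of the reward that drives the continuous optimisation and personalisation described above.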


Keywords


Agent Technology, Brain-Computer Interfaces, Human Behavior, Personalized Daily Activities, Multi-Modal Sensory Data.


CRediT Author Statement


The authors confirm their contributions to the paper as follows:

Conceptualization: Ezil Sam Leni A, Revathi T and Niranchana Radhakrishnan; Methodology: Ezil Sam Leni A and Revathi T; Data Curation: Niranchana Radhakrishnan; Writing - Original Draft Preparation: Ezil Sam Leni A and Revathi T; Visualization: Revathi T and Niranchana Radhakrishnan; Investigation: Ezil Sam Leni A, Revathi T and Niranchana Radhakrishnan; Supervision: Revathi T and Niranchana Radhakrishnan; Validation: Ezil Sam Leni A and Revathi T; Writing - Reviewing and Editing: Ezil Sam Leni A, Revathi T and Niranchana Radhakrishnan. All authors reviewed the results and approved the final version of the manuscript.


Acknowledgements


The authors thank the Department of Computer Science and Engineering for supporting this research.


Funding


No funding was received to assist with the preparation of this manuscript.


Ethics declarations


Conflict of interest

The authors have no conflicts of interest to declare that are relevant to the content of this article.


Availability of data and materials


Data sharing is not applicable to this article as no new data were created or analysed in this study.


Author information


Contributions

All authors contributed equally to the paper, and all authors have read and agreed to the published version of the manuscript.


Corresponding author


Rights and permissions


Open Access: This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs (CC BY-NC-ND 4.0) license, a more restrictive license that allows you to redistribute the material but does not permit any changes whatsoever to the original, i.e. no derivatives of the original work. To view a copy of this license, visit https://creativecommons.org/licenses/by-nc-nd/4.0/


Cite this article


Ezil Sam Leni A, Revathi T and Niranchana Radhakrishnan, “Cognitive Emotion Aware Systems Using Multimodal Signals and Reinforcement Learning”, Journal of Machine and Computing, vol.5, no.3, pp. 1349-1362, July 2025, doi: 10.53759/7669/jmc202505106.


Copyright


© 2025 Ezil Sam Leni A, Revathi T and Niranchana Radhakrishnan. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.