Machine vs. human agents in moral dilemmas: evidence from EEG and behavioral data

Federico Cassioli, Davide Crivelli, Michela Balconi

Research output: Chapter in Book/Report/Conference proceeding › Chapter


The future involvement of automated technology in morally charged decisions is an ethical challenge with relevant consequences. In this study, a task composed of moral dilemmas was employed to assess how participants (n = 34) would evaluate the morality, consciousness, responsibility, intentionality, and emotional impact of the agent (human vs. machine) and of the type of behavior (action vs. inaction). Reaction times (RTs) and electroencephalography (EEG; delta, theta, beta, alpha, gamma bands) data were also gathered. Data showed that participants apply different moral rules depending on the agent, considering humans more moral, responsible, and conscious. The emotional impact derived from human actions was also perceived as more severe. Moreover, increased gamma power was found when participants assessed the intentionality and emotional impact of machines. Higher beta power in right frontal and frontocentral areas was detected when evaluating the emotional impact produced by the machine, whereas increased right temporal activation was detected when evaluating the human agent's emotional impact. Lastly, alpha desynchronization occurred in the left occipital area when evaluating the responsibility of the inaction behavior. These results might be interpreted as the application of different moral schemata depending on the agent involved, with a possible asymmetry effect in moral judgment.
Original language: English
Title of host publication: 2022 International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME)
Number of pages: 6
Publication status: Published - 2022


  • Measurement
  • Ethics
  • Mechatronics
  • Decision making
  • Electrophysiology
  • Electroencephalography
  • Behavioral sciences


