TY - JOUR
T1 - Gaze detection as a social cue to initiate natural human-robot collaboration in an assembly task
AU - Lavit Nicora, M.
AU - Prajod, P.
AU - Mondellini, M.
AU - Tauro, G.
AU - Vertechy, R.
AU - André, E.
AU - Malosio, M.
PY - 2024
Y1 - 2024
N2 - Introduction: In this work, we explore a potential approach to improving the human-robot collaboration experience by adapting cobot behavior based on natural cues from the operator. Methods: Inspired by the literature on human-human interactions, we conducted a Wizard-of-Oz study to examine whether a gaze towards the cobot can serve as a trigger for initiating joint activities in collaborative sessions. In this study, 37 participants engaged in an assembly task while their gaze behavior was analyzed. We employed a gaze-based attention recognition model to identify when the participants looked at the cobot. Results: Our results indicate that in most cases (83.74%), the joint activity is preceded by a gaze towards the cobot. Furthermore, during the entire assembly cycle, the participants tended to look at the cobot mostly around the time of the joint activity. Given these results, a fully integrated system that triggers the joint action only when the gaze is directed towards the cobot was piloted with 10 volunteers, one of whom was characterized by high-functioning Autism Spectrum Disorder. Even though they had never interacted with the robot and did not know about the gaze-based triggering system, most of them successfully collaborated with the cobot and reported a smooth and natural interaction experience. Discussion: To the best of our knowledge, this is the first study to analyze the natural gaze behavior of participants working on a joint activity with a robot during a collaborative assembly task and to attempt the full integration of an automated gaze-based triggering system.
AB - Introduction: In this work, we explore a potential approach to improving the human-robot collaboration experience by adapting cobot behavior based on natural cues from the operator. Methods: Inspired by the literature on human-human interactions, we conducted a Wizard-of-Oz study to examine whether a gaze towards the cobot can serve as a trigger for initiating joint activities in collaborative sessions. In this study, 37 participants engaged in an assembly task while their gaze behavior was analyzed. We employed a gaze-based attention recognition model to identify when the participants looked at the cobot. Results: Our results indicate that in most cases (83.74%), the joint activity is preceded by a gaze towards the cobot. Furthermore, during the entire assembly cycle, the participants tended to look at the cobot mostly around the time of the joint activity. Given these results, a fully integrated system that triggers the joint action only when the gaze is directed towards the cobot was piloted with 10 volunteers, one of whom was characterized by high-functioning Autism Spectrum Disorder. Even though they had never interacted with the robot and did not know about the gaze-based triggering system, most of them successfully collaborated with the cobot and reported a smooth and natural interaction experience. Discussion: To the best of our knowledge, this is the first study to analyze the natural gaze behavior of participants working on a joint activity with a robot during a collaborative assembly task and to attempt the full integration of an automated gaze-based triggering system.
KW - gaze estimation
KW - human-centered computing
KW - human-robot interaction
KW - industry 5.0
KW - natural behavior
UR - https://publicatt.unicatt.it/handle/10807/313611
UR - https://www.scopus.com/inward/citedby.uri?partnerID=HzOxMe3b&scp=85200053702&origin=inward
UR - https://www.scopus.com/inward/record.uri?partnerID=HzOxMe3b&scp=85200053702&origin=inward
U2 - 10.3389/frobt.2024.1394379
DO - 10.3389/frobt.2024.1394379
M3 - Article
SN - 2296-9144
VL - 11
SP - 1
EP - 12
JO - Frontiers in Robotics and AI
JF - Frontiers in Robotics and AI
IS - July
ER -