Publication:

Conveying Emotions to Robots Through Touch and Sound

 
dc.contributor.author: Ren, Qiaoqiao
dc.contributor.author: Proesmans, Remko
dc.contributor.author: Bossuyt, Frederick
dc.contributor.author: Vanfleteren, Jan
dc.contributor.author: Wyffels, Francis
dc.contributor.author: Belpaeme, Tony
dc.contributor.imecauthor: Ren, Qiaoqiao
dc.contributor.imecauthor: Proesmans, Remko
dc.contributor.imecauthor: Bossuyt, Frederick
dc.contributor.imecauthor: Vanfleteren, Jan
dc.contributor.imecauthor: Wyffels, Francis
dc.contributor.imecauthor: Belpaeme, Tony
dc.contributor.orcidimec: Proesmans, Remko::0000-0002-5925-625X
dc.contributor.orcidimec: Bossuyt, Frederick::0000-0003-3350-9295
dc.contributor.orcidimec: Vanfleteren, Jan::0000-0002-9654-7304
dc.contributor.orcidimec: Wyffels, Francis::0000-0002-5491-8349
dc.contributor.orcidimec: Belpaeme, Tony::0000-0001-5207-7745
dc.date.accessioned: 2025-08-18T03:58:56Z
dc.date.available: 2025-08-18T03:58:56Z
dc.date.issued: 2025
dc.description.abstract: Human emotions can be conveyed through nuanced touch gestures. However, little is understood about how consistently emotions can be conveyed to robots through touch. This study explores the consistency of touch-based emotional expression toward a robot by integrating tactile and auditory sensory readings of affective haptic expressions. We developed a piezoresistive pressure sensor and used a microphone to capture the touch and sound channels, respectively. In a study with 28 participants, each conveyed 10 emotions to a robot using spontaneous touch gestures. Our findings reveal a statistically significant consistency in emotion expression among participants. However, some emotions obtained low intraclass correlation values. Additionally, certain emotions with similar levels of arousal or valence did not exhibit significant differences in how they were conveyed. We subsequently constructed a multi-modal model integrating touch and audio features to decode the 10 emotions. A support vector machine (SVM) model demonstrated the highest accuracy, achieving 40% across the 10 classes, with "Attention" being the most accurately conveyed emotion at a balanced accuracy of 87.65%.
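The abstract describes fusing touch and audio features into a single SVM classifier over 10 emotion classes. A minimal sketch of that kind of pipeline, assuming simple early (feature-level) fusion with scikit-learn; the feature dimensions and random placeholder data are illustrative, not the authors' actual features or results:

```python
# Hypothetical early-fusion SVM sketch; dimensions and data are placeholders,
# not the pipeline or features used in the paper.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_touch, n_audio = 280, 16, 13   # e.g. pressure-map stats + audio coefficients
X_touch = rng.normal(size=(n_samples, n_touch))   # placeholder tactile features
X_audio = rng.normal(size=(n_samples, n_audio))   # placeholder audio features
y = rng.integers(0, 10, size=n_samples)           # 10 emotion classes

# Early fusion: concatenate per-sample feature vectors from both modalities.
X = np.hstack([X_touch, X_audio])

# Standardize, then fit an RBF-kernel SVM; evaluate with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())
```

On random data the mean score sits near the 10-class chance level (about 0.1); with real, informative features the same structure yields accuracies like those the abstract reports.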
dc.identifier.doi: 10.1007/978-981-96-3525-2_28
dc.identifier.eisbn: 978-981-96-3525-2
dc.identifier.isbn: 978-981-96-3524-5
dc.identifier.issn: 2945-9133
dc.identifier.uri: https://imec-publications.be/handle/20.500.12860/46088
dc.publisher: SPRINGER-VERLAG SINGAPORE PTE LTD
dc.source.beginpage: 329
dc.source.conference: 16th International Conference on Social Robotics (ICSR), Empowering Humanity: The Role of Social and Collaborative Robotics in Shaping Our Future
dc.source.conferencedate: 2024-10-24
dc.source.conferencelocation: Odense
dc.source.endpage: 339
dc.source.journal: Lecture Notes in Computer Science
dc.source.numberofpages: 11
dc.title: Conveying Emotions to Robots Through Touch and Sound
dc.type: Proceedings paper
dspace.entity.type: Publication
Files

Original bundle

Name: DS884_acc.pdf
Size: 1.83 MB
Format: Adobe Portable Document Format
Description: Accepted