Publication:

Touched by ChatGPT: Using an LLM to Drive Affective Tactile Interaction

 
cris.virtual.orcid: 0000-0001-5207-7745
dc.contributor.author: Ren, Qiaoqiao
dc.contributor.author: Belpaeme, Tony
dc.date.accessioned: 2026-03-24T13:39:11Z
dc.date.available: 2026-03-24T13:39:11Z
dc.date.created.wos: 2025-10-29
dc.date.issued: 2025
dc.description.abstract: Touch is a fundamental aspect of emotion-rich communication, playing a vital role in human interaction and offering significant potential for human-robot interaction. Previous research has demonstrated that a sparse representation of human touch can effectively convey social tactile signals. However, advances in human-robot tactile interaction remain limited: many humanoid robots have only simplistic capabilities, such as opening and closing their hands, which restricts nuanced tactile expression. In this study, we explore how a robot can use sparse representations of tactile vibrations to convey emotions to a person. To achieve this, we developed a wearable sleeve integrating a 5×5 grid of vibration motors, enabling the robot to communicate diverse tactile emotions and gestures. Using chained prompts with a Large Language Model (LLM), we generated distinct 10-second vibration patterns corresponding to 10 emotions (e.g., happiness, sadness, fear) and 6 touch gestures (e.g., pat, rub, tap). Participants (N=32) then rated each vibration stimulus on perceived valence and arousal. Participants accurately recognised the intended emotions, a result that aligns with earlier findings. These results highlight the LLM's ability to generate emotional haptic data and to convey emotions effectively through tactile signals. By translating complex emotional and tactile expressions into vibratory patterns, this research demonstrates how LLMs can enrich physical interaction between humans and robots.
dc.identifier.doi: 10.1109/HRI61500.2025.10973852
dc.identifier.isbn: 979-8-3503-7894-8
dc.identifier.issn: 2167-2121
dc.identifier.uri: https://imec-publications.be/handle/20.500.12860/58930
dc.language.iso: eng
dc.publisher: IEEE
dc.source.beginpage: 1563
dc.source.conference: 20th ACM/IEEE International Conference on Human-Robot Interaction (HRI)
dc.source.conferencedate: 2025-03-04
dc.source.conferencelocation: Melbourne
dc.source.endpage: 1567
dc.source.journal: 2025 20th ACM/IEEE International Conference on Human-Robot Interaction (HRI)
dc.source.numberofpages: 5
dc.title: Touched by ChatGPT: Using an LLM to Drive Affective Tactile Interaction
dc.type: Proceedings paper
dspace.entity.type: Publication