Publication:

When Action Speaks Louder than Words: Exploring Non-Verbal and Paraverbal Features in Dyadic Collaborative VR

Date: 2025-09-04
cris.virtual.orcid: 0000-0003-4714-5373
cris.virtual.orcid: 0000-0003-2557-3764
cris.virtual.orcid: 0000-0003-4011-219X
cris.virtual.orcid: 0000-0003-2056-1246
dc.contributor.author: Osei Tutu, Dennis
dc.contributor.author: Habibiabad, Sepideh
dc.contributor.author: Van den Noortgate, Wim
dc.contributor.author: Saldien, Jelle
dc.contributor.author: Bombeke, Klaas
dc.date.accessioned: 2026-01-22T09:58:03Z
dc.date.available: 2026-01-22T09:58:03Z
dc.date.created: 2025-09-21 (wos)
dc.date.issued: 2025-09-04
dc.description.abstract: Soft skills such as communication and collaboration are vital in both professional and educational settings, yet difficult to train and assess objectively. Traditional role-playing scenarios rely heavily on subjective trainer evaluations—either in real time, where subtle behaviors are missed, or through time-intensive post hoc analysis. Virtual reality (VR) offers a scalable alternative by immersing trainees in controlled, interactive scenarios while simultaneously capturing fine-grained behavioral signals. This study investigates how task design in VR shapes non-verbal and paraverbal behaviors during dyadic collaboration. We compared two puzzle tasks: Task 1, which provided shared visual access and dynamic gesturing, and Task 2, which required verbal coordination through separation and turn-taking. From multimodal tracking data, we extracted features including gaze behaviors (eye contact, joint attention), hand gestures, facial expressions, and speech activity, and compared them across tasks. A clustering analysis explored whether or not tasks could be differentiated by their behavioral profiles. Results showed that Task 2, the more constrained condition, led participants to focus more visually on their own workspaces, suggesting that interaction difficulty can reduce partner-directed attention. Gestures were more frequent in shared-visual tasks, while speech became longer and more structured when turn-taking was enforced. Joint attention increased when participants relied on verbal descriptions rather than on a visible shared reference. These findings highlight how VR can elicit distinct soft skill behaviors through scenario design, enabling data-driven analysis of collaboration. This work contributes to scalable assessment frameworks with applications in training, adaptive agents, and human-AI collaboration.
dc.description.wosFundingText: This research is part of the imec Smart Education Program, supported by covenant funding from the Flemish Government. There is no specific funding number associated with this program. This research was (partially) funded by the Flemish Government (AI Research Program).
dc.identifier.doi: 10.3390/s25175498
dc.identifier.issn: 1424-8220
dc.identifier.pmid: MEDLINE:40942927
dc.identifier.uri: https://imec-publications.be/handle/20.500.12860/58699
dc.language.iso: eng
dc.publisher: MDPI
dc.source.beginpage: 5498
dc.source.issue: 17
dc.source.journal: SENSORS
dc.source.numberofpages: 25
dc.source.volume: 25
dc.title: When Action Speaks Louder than Words: Exploring Non-Verbal and Paraverbal Features in Dyadic Collaborative VR
dc.type: Journal article
dspace.entity.type: Publication
Files

Original bundle

Name: sensors-25-05498.pdf
Size: 13.52 MB
Format: Adobe Portable Document Format
Description: Published