Abstract
This study investigates how interpersonal (speaker–partner) synchrony contributes to empathetic response generation in communication scenarios. To this end, we propose a model that incorporates multimodal directional (positive and negative) interpersonal synchrony, operationalized with cosine similarity, into empathetic response generation. We evaluate how incorporating these synchrony features affects the generated responses at both the language and empathy levels. In comparison experiments, models with multimodal synchrony generate responses that are closer to the ground-truth responses and more diverse than those of models without synchrony, demonstrating that these features are successfully integrated into the models. Additionally, we find that positive synchrony is linked to enhanced emotional reactions, reduced exploration, and improved interpretation, whereas negative synchrony is associated with reduced exploration and increased interpretation. These findings shed light on the connections between multimodal directional interpersonal synchrony and the emotional and cognitive aspects of empathy in artificial-intelligence applications.
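The abstract states that directional synchrony is operationalized with cosine similarity, where the sign separates positive from negative synchrony. A minimal sketch of that idea is below; the function name, input shapes, and the sign-based split are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def directional_synchrony(speaker, partner, eps=1e-8):
    """Per-frame cosine similarity between speaker and partner feature
    vectors (shape: [frames, features]). Positive values are taken as
    positive synchrony, negative values as negative synchrony.
    Illustrative sketch only, not the paper's implementation."""
    speaker = np.asarray(speaker, dtype=float)
    partner = np.asarray(partner, dtype=float)
    num = (speaker * partner).sum(axis=-1)
    den = np.linalg.norm(speaker, axis=-1) * np.linalg.norm(partner, axis=-1) + eps
    sim = num / den
    # Split the signed similarity into directional components.
    return np.clip(sim, 0.0, None), np.clip(sim, None, 0.0)
```

For example, aligned feature vectors yield a positive-synchrony value near 1, while opposed vectors yield a negative-synchrony value near -1.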
Original language | English
---|---
Article number | 434
Journal | Sensors
Volume | 25
Issue number | 2
State | Published - 2025/01
Keywords
- affective computing
- empathetic response generation
- multimodal learning
ASJC Scopus subject areas
- Analytical Chemistry
- Information Systems
- Atomic and Molecular Physics, and Optics
- Biochemistry
- Instrumentation
- Electrical and Electronic Engineering