Estimating Empathic States in Spoken Dialogue Data by Using Speech Parameters

Yahan Gao, Risu Na, Yuko Ohno, and Yoshihiko Hayashi

Keywords

Empathic state, Speech parameters, Machine learning, Spontaneous dialogue

Abstract

We have been investigating the characteristics of empathic states expressed in spontaneous utterances by analyzing speech data annotated with attitude and emotion categories. Building on these investigations, this paper examines a method for estimating empathic states from speech utterances. The idea is to use the annotated data as training data for a machine learning algorithm (SVM) to construct the relevant classifiers, which are arranged to reflect our definition of empathic states. The experimental results suggest that (1) dependencies exist between attitude and emotion, and (2) the estimation improves when the features are augmented with the categories of the preceding utterance. This paper further discusses details of the initial experimental results and proposes a research agenda for future studies.
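To make the described setup concrete, the sketch below illustrates one plausible arrangement: separate SVM classifiers for attitude and emotion trained on per-utterance speech parameters, with the preceding utterance's categories appended as extra features. The specific acoustic features, label sets, and data are illustrative assumptions only; the paper's actual feature set and classifier arrangement are not given in this abstract.

```python
# Minimal sketch (assumed setup, not the authors' exact method):
# two SVM classifiers, one for attitude and one for emotion, trained on
# speech parameters augmented with the preceding utterance's categories.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Toy acoustic features per utterance (e.g., F0 statistics, intensity, speech rate).
n_utts, n_feats = 200, 6
X_speech = rng.normal(size=(n_utts, n_feats))

# Toy annotations: attitude (0 = neutral, 1 = empathic) and emotion (0..2).
y_attitude = rng.integers(0, 2, size=n_utts)
y_emotion = rng.integers(0, 3, size=n_utts)

# Context augmentation: append the previous utterance's attitude/emotion labels
# (zeros for the first utterance in the dialogue).
prev = np.vstack([np.zeros((1, 2)),
                  np.column_stack([y_attitude, y_emotion])[:-1]])
X_aug = np.hstack([X_speech, prev])

# One classifier per category; an empathic state would be read off from the
# combination of the two outputs, per the paper's definition.
attitude_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
emotion_clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
attitude_clf.fit(X_aug, y_attitude)
emotion_clf.fit(X_aug, y_emotion)

print(attitude_clf.predict(X_aug[:5]), emotion_clf.predict(X_aug[:5]))
```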
