Are you talking to me? Improving the robustness of dialogue systems in a multi party HRI scenario by incorporating gaze direction and lip movement of attendees
Richter, Viktor and Carlmeyer, Birte and Lier, Florian and Meyer zu Borgsen, Sebastian and Schlangen, David and Kummert, Franz and Wachsmuth, Sven and Wrede, Britta
In this paper we present our humanoid robot “Meka”, participating in a multi-party human-robot dialogue scenario. Active arbitration of the robot’s attention based on multi-modal stimuli is utilised to attain persons who are outside of the robot’s field of view. We investigate the impact of this attention management and of addressee recognition on the robot’s capability to distinguish utterances directed at it from communication between humans. Based on the results of a user study, we show that mutual gaze at the end of an utterance, as a means of yielding a turn, is a substantial cue for addressee recognition. Verification of a speaker through the detection of lip movements can be used to further increase precision. Furthermore, we show that even a rather simplistic fusion of gaze and lip movement cues allows a considerable enhancement in addressee estimation, and can be altered to adapt to the requirements of a particular scenario.
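As a rough illustration of the kind of rule-based cue fusion the abstract refers to, the sketch below combines a mutual-gaze cue at the end of an utterance with a lip-movement check that can be toggled to trade recall for precision. All names, fields, and the decision rule are assumptions for illustration only, not the authors' implementation.

from dataclasses import dataclass


@dataclass
class UtteranceCues:
    """Per-utterance cues; values are illustrative, not from the paper's system."""
    gaze_at_robot_at_end: bool   # mutual gaze when the speaker yields the turn
    lip_movement_detected: bool  # speaker verified via detected lip movement


def is_robot_addressed(cues: UtteranceCues, require_lip_movement: bool = True) -> bool:
    """Simple rule-based fusion of gaze and lip-movement cues.

    Mutual gaze at the end of the utterance is treated as the primary
    addressee cue; enabling the lip-movement check makes the decision
    stricter, mirroring the adaptable fusion mentioned in the abstract.
    """
    if not cues.gaze_at_robot_at_end:
        return False
    if require_lip_movement and not cues.lip_movement_detected:
        return False
    return True


# Example: mutual gaze with lip movement is accepted; without lip movement,
# the stricter (high-precision) setting rejects the utterance.
print(is_robot_addressed(UtteranceCues(True, True)))   # True
print(is_robot_addressed(UtteranceCues(True, False)))  # False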
In Proceedings of the Fourth International Conference on Human-agent Interaction, 2016
@inproceedings{Richter-2016,
  author    = {Richter, Viktor and Carlmeyer, Birte and Lier, Florian and Meyer zu Borgsen, Sebastian and Schlangen, David and Kummert, Franz and Wachsmuth, Sven and Wrede, Britta},
  title     = {{Are you talking to me? Improving the robustness of dialogue systems in a multi party HRI scenario by incorporating gaze direction and lip movement of attendees}},
  booktitle = {Proceedings of the Fourth International Conference on Human-agent Interaction},
  location  = {Singapore},
  publisher = {ACM Digital Library},
  doi       = {10.1145/2974804.2974823},
  year      = {2016}
}