Show simple item record

dc.contributor.author: Ma, Qian
dc.date.accessioned: 2014-03-04T18:59:44Z
dc.date.available: 2014-03-04T18:59:44Z
dc.date.issued: 2010-12
dc.identifier.other: ma_qian_201012_ms
dc.identifier.uri: http://purl.galileo.usg.edu/uga_etd/ma_qian_201012_ms
dc.identifier.uri: http://hdl.handle.net/10724/26956
dc.description.abstract: Conversational gaze behavior is an important component of an embodied conversational agent (ECA). Without proper conversational gaze, conversational agents may be less persuasive, less emotive, and ultimately less believable or usable. While many conversational agent systems have been created for one-on-one interactions, there is a noticeable lack of multi-party-capable systems, i.e., systems capable of dealing with more than one user simultaneously. We present a conversational agent system capable of sensing and reacting to the conversational state of multiple users, using computer vision algorithms for head and mouth motion tracking.
dc.language: eng
dc.publisher: uga
dc.rights: public
dc.subject: Embodied conversational agent
dc.subject: Human-computer interaction
dc.subject: Head tracking
dc.subject: Mouth tracking
dc.subject: Eye gaze
dc.title: Enabling a multi-party conversational virtual agent through head and mouth motion tracking
dc.type: Thesis
dc.description.degree: MS
dc.description.department: Computer Science
dc.description.major: Computer Science
dc.description.advisor: Suchendra M. Bhandarkar
dc.description.committee: Suchendra M. Bhandarkar
dc.description.committee: Khaled Rasheed
dc.description.committee: Kyle Johnsen


Files in this item

There are no files associated with this item.

This item appears in the following Collection(s)