Show simple item record

dc.contributor.authorRadhakrishnan, Srigopika
dc.description.abstractHuman-robot interaction is an important aspect of developing robots. The degree to which a robot emulates human actions determines the success of its development. The trend of robots interacting with humans using 'emotions' started commercially with Sony's entertainment robots. It continued with Honda's Asimo and Sony's Qrio, which are not yet commercially available. In research, universities have long been creating robots that interact with humans as naturally as humans do with each other. Sony's entertainment robot is called AIBO, an acronym for Artificially Intelligent roBOt; 'aibo' in Japanese means 'companion'. The selling point of AIBO was its ability to act (and look) like a puppy or a dog. Asimo and Qrio are humanoids that act and look like humans (remember C-3PO from Star Wars!). This thesis explores the possibility of making AIBO more than just an entertainment robot by teaching it numbers and operators so that it can solve mathematical expressions. AIBO is also taught to recognize and respond to gestures. Neural networks are used as the learning algorithm to teach AIBO the numbers, operators, and gestures. AIBO looks at an expression, calculates the result, and presents the result to the onlooker. It also recognizes gestures and performs the corresponding actions.
dc.subjectImage Processing
dc.subjectNeural Networks
dc.subjectOffline Learning
dc.subjectOnline Testing
dc.titleExploring image processing capabilities of AIBO
dc.description.departmentArtificial Intelligence
dc.description.majorArtificial Intelligence
dc.description.advisorWalter D. Potter
dc.description.committeeWalter D. Potter
dc.description.committeeKhaled Rasheed
dc.description.committeeSuchendra Bhandarkar
