Exploring image processing capabilities of AIBO
Human-robot interaction is an important aspect of robot development; how closely a robot emulates human actions is a key measure of its success. The trend of robots interacting with humans through 'emotions' began commercially with Sony's entertainment robots and continued with Honda's Asimo and Sony's Qrio, neither of which is yet available on the commercial market. In research, universities have long been building robots that interact with humans as naturally as humans do with each other. Sony's entertainment robot is called AIBO, an acronym for Artificially Intelligent RoBOt; in Japanese, AIBO means 'companion'. AIBO's selling point was its ability to act (and look) like a puppy or a dog, whereas Asimo and Qrio are humanoids that act and look like humans (remember C-3PO from Star Wars!).

This thesis explores the possibility of making AIBO more than just an entertainment robot by teaching it numbers and operators so that it can solve mathematical expressions. AIBO is also taught to recognize and respond to gestures. Neural networks are used as the learning algorithm to teach AIBO the numbers, operators, and gestures. AIBO looks at an expression, calculates the result, and presents the result to the onlooker. It also recognizes gestures and performs actions in response to them.
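To illustrate the expression-solving step described above, here is a minimal sketch in Python. It assumes the vision stage (the trained neural network) has already classified each symbol AIBO sees into a token string such as "3", "+", or "7"; the function name and token format are illustrative, not taken from the thesis.

```python
def evaluate(tokens):
    """Evaluate a list of recognized tokens, e.g. ['3', '+', '4', '*', '2'],
    honoring the usual precedence of * and / over + and -."""
    # First operand seeds the additive stack; * and / are folded into the
    # top of the stack as they appear, so a final sum applies + and -.
    stack = [float(tokens[0])]
    i = 1
    while i < len(tokens):
        op, val = tokens[i], float(tokens[i + 1])
        if op == '*':
            stack[-1] *= val        # bind tightly to the previous term
        elif op == '/':
            stack[-1] /= val
        elif op == '+':
            stack.append(val)       # defer addition to the final sum
        elif op == '-':
            stack.append(-val)      # subtraction as adding a negative
        else:
            raise ValueError(f"unrecognized operator: {op!r}")
        i += 2
    return sum(stack)

print(evaluate(['3', '+', '4', '*', '2']))  # 11.0
```

In the thesis's pipeline, the interesting work is in the recognition stage; once the symbols are tokens, the arithmetic itself is straightforward, as this sketch shows.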