Monday, September 20, 2010

More realistic pet robots that recognize and respond to human emotions

Sony’s Aibo may be discontinued, but robotic pets of all shapes and sizes continue to stake a claim in the hearts of people around the world. Despite the apparent intelligence of some of these robot pets, their behavior usually amounts to nothing more than pre-programmed responses to stimuli, such as being patted in a particular spot or hearing a voice command. Real flesh-and-blood pets are far more complex, even discerning and responding to a person’s emotional state. Robotic pets could be headed in that direction: researchers in Taiwan are turning to neural networks to break the cycle of repetitive behavior in robot toys and endow them with something approaching emotional responses to interactions.
Building fully autonomous artificial creatures with intelligence akin to that of humans is a long-term goal of robot design and computer science. Along the way, home entertainment and utility devices have appeared, from "Tamagotchi" digital pets to domestic toy robots such as the Aibo robotic dog and even the Roomba robotic vacuum cleaner. At the same time, popular science-fiction culture has raised consumer expectations.
In an effort to build entertaining, realistic gadgets that respond to human interaction in ever more nuanced ways, mimicking the behavior of real pets or even people, the Taiwanese researchers are exploring a new design paradigm: a robot vision module that might one day recognize human facial expressions and respond appropriately.
"With current technologies in computing and electronics and knowledge in ethology, neuroscience and cognition, it is now possible to create embodied prototypes of artificial living toys acting in the physical world," Wei-Po Lee and colleagues at the National Sun Yat-sen University (NSYSU), Kaohsiung, explain.
There are three major issues to be considered in robot design, the team explains. The first is to construct an appropriate control architecture by which the robot can behave coherently. The second is to develop natural ways for the robot to interact with a person. The third is to embed emotional responses and behavior into the robot's computer.
The researchers hope to address all three issues by adopting a behavior-based control architecture, built around a neural network, that would let the owner of a robot pet reconfigure the device to "learn", or evolve, new behaviors while ensuring that the pet still functions properly in real time.
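To make that idea concrete, here is a minimal sketch in Python of the kind of trainable behavior controller the article describes in general terms: a single-layer neural network maps encoded stimuli to behavior activations, and a few gradient steps let an owner "teach" a new stimulus-response pairing. The stimulus and behavior names, network size, and training loop are all assumptions for illustration; the article does not detail the NSYSU team's actual implementation.

```python
import numpy as np

# Hypothetical stimulus and behavior vocabularies -- the real encoding
# used by the NSYSU team is not described in the article.
STIMULI = ["patted_head", "patted_back", "voice_command", "loud_noise"]
BEHAVIORS = ["wag_tail", "sit", "bark", "retreat"]

class BehaviorNet:
    """A tiny one-layer neural controller mapping stimuli to behaviors."""

    def __init__(self, n_in, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.1, size=(n_out, n_in))
        self.b = np.zeros(n_out)
        self.lr = lr

    def forward(self, x):
        # Sigmoid activations: one score per candidate behavior.
        z = self.w @ x + self.b
        return 1.0 / (1.0 + np.exp(-z))

    def act(self, x):
        # Pick the behavior with the highest activation.
        return BEHAVIORS[int(np.argmax(self.forward(x)))]

    def train(self, x, target):
        # One gradient step on squared error, so the owner can "teach"
        # the pet a new response to a stimulus.
        y = self.forward(x)
        grad = (y - target) * y * (1.0 - y)  # chain rule through the sigmoid
        self.w -= self.lr * np.outer(grad, x)
        self.b -= self.lr * grad

def encode(stimulus):
    # One-hot encoding of a stimulus name.
    x = np.zeros(len(STIMULI))
    x[STIMULI.index(stimulus)] = 1.0
    return x

net = BehaviorNet(len(STIMULI), len(BEHAVIORS))
# The owner repeatedly pairs "patted_head" with the desired "wag_tail".
target = np.zeros(len(BEHAVIORS))
target[BEHAVIORS.index("wag_tail")] = 1.0
for _ in range(200):
    net.train(encode("patted_head"), target)
print(net.act(encode("patted_head")))  # -> "wag_tail" after training
```

Because each decision is a single forward pass through a small network, a controller like this could plausibly run in real time on modest robot hardware, which is the constraint the researchers highlight.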
The team has evaluated its framework by building robot controllers that successfully carry out a variety of tasks. The researchers, along with other teams across the globe, are now working on vision modules for robots. The technique is not yet mature, but they ultimately hope to build a robot pet that can recognize its owner's facial expressions and respond accordingly. Such a development has major implications for a range of future interactive devices, computers, and functional robots.
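As a rough illustration of what such a vision module might look like, here is a hedged sketch in the same spirit: a nearest-centroid classifier over flattened grayscale face crops that maps a recognized expression to a pet behavior. The expression labels, image size, training data, and response table are all invented for the example; the article says nothing about the actual techniques these teams are using.

```python
import numpy as np

# Assumed label set and face-crop size -- not from the article.
EXPRESSIONS = ["happy", "sad", "angry", "neutral"]
IMG_PIXELS = 32 * 32

class ExpressionClassifier:
    """Nearest-centroid classifier over flattened grayscale face crops."""

    def __init__(self):
        self.centroids = {}

    def fit(self, images, labels):
        # Store the mean face vector for each expression label.
        for expr in set(labels):
            rows = images[[i for i, l in enumerate(labels) if l == expr]]
            self.centroids[expr] = rows.mean(axis=0)

    def predict(self, image):
        # Return the label of the nearest centroid in pixel space.
        return min(self.centroids,
                   key=lambda e: np.linalg.norm(image - self.centroids[e]))

# Synthetic stand-in data; a real robot would train on labeled face images.
rng = np.random.default_rng(1)
train_x = rng.random((40, IMG_PIXELS))
train_y = [EXPRESSIONS[i % 4] for i in range(40)]

clf = ExpressionClassifier()
clf.fit(train_x, train_y)

# The pet maps the recognized expression to a response (invented table).
RESPONSES = {"happy": "wag_tail", "sad": "nuzzle",
             "angry": "retreat", "neutral": "idle"}
expression = clf.predict(rng.random(IMG_PIXELS))
print(expression, "->", RESPONSES[expression])
```

A production system would use learned features rather than raw pixels, but the overall pipeline shape, image in, expression label out, behavior selected, is what connects the vision module to the behavior controller described above.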
